[Figure: grid with multiple grasping examples.]

Abstract

We introduce AO-Grasp, a grasp proposal method that generates stable and actionable 6 degree-of-freedom (DoF) grasps for articulated objects. Our generated grasps enable robots to interact with articulated objects, such as opening and closing cabinets and appliances. Given a segmented partial point cloud of a single articulated object, AO-Grasp predicts the best grasp points on the object with a novel Actionable Grasp Point Predictor model and then finds a corresponding grasp orientation for each point by leveraging a state-of-the-art rigid object grasping method. We train AO-Grasp on our new AO-Grasp Dataset, which contains 48K actionable parallel-jaw grasps on synthetic articulated objects. In simulation, AO-Grasp achieves higher grasp success rates than existing rigid object grasping and articulated object interaction baselines on both train and test categories. Additionally, we evaluate AO-Grasp on 120 real-world scenes of objects with varied geometries, articulation axes, and joint states, where AO-Grasp produces successful grasps on 67.5% of scenes, while the baseline succeeds on only 33.3%.
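
The two-stage pipeline described above can be summarized in a short sketch. The code below is illustrative only: predict_actionability and orient_grasp are hypothetical stand-ins for the learned Actionable Grasp Point Predictor and the rigid object grasping method, and all function names and shapes are assumptions, not the released API.

import numpy as np

def predict_actionability(points):
    """Stand-in for the Actionable Grasp Point Predictor.

    The real predictor is a learned model that scores every point of the
    segmented partial point cloud by how actionable a grasp there would be;
    random scores are used here so the sketch runs end to end.
    """
    return np.random.rand(len(points))

def orient_grasp(point, cloud):
    """Stand-in for the rigid object grasping method that proposes a grasp
    orientation at a given point. Returns an identity rotation here.
    """
    return np.eye(3)

def ao_grasp(cloud, top_k=5):
    """Two-stage pipeline sketched from the abstract: (1) predict the best
    grasp points, (2) attach a grasp orientation to each of them.
    """
    scores = predict_actionability(cloud)
    best = np.argsort(scores)[::-1][:top_k]  # indices of top-scoring points
    return [(cloud[i], orient_grasp(cloud[i], cloud)) for i in best]

# Example: a 2048-point partial point cloud of one articulated object.
proposals = ao_grasp(np.random.rand(2048, 3))
print(f"{len(proposals)} grasp proposals")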

AO-Grasp Dataset

We introduce the AO-Grasp Dataset, a dataset of simulated, actionable grasps on articulated objects. It contains 48K 6-DoF grasps for 61 instances from 5 common household furniture and appliance categories (Box, Dishwasher, Microwave, Safe, and TrashCan) from the PartNet-Mobility dataset.
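
For concreteness, one way to represent a single dataset entry is sketched below. The field names and example values (including the instance id) are assumptions for illustration, not the dataset's actual on-disk schema.

from dataclasses import dataclass
from typing import Tuple

@dataclass
class GraspEntry:
    """Illustrative record for one grasp; all field names are hypothetical."""
    category: str     # Box, Dishwasher, Microwave, Safe, or TrashCan
    instance_id: str  # PartNet-Mobility instance identifier
    joint_state: float  # articulation state at grasp time, e.g. opening fraction
    position: Tuple[float, float, float]  # gripper position (x, y, z)
    quaternion: Tuple[float, float, float, float]  # gripper orientation (x, y, z, w)

example = GraspEntry(
    category="Microwave",
    instance_id="7310",  # hypothetical instance id
    joint_state=0.25,
    position=(0.12, 0.03, 0.41),
    quaternion=(0.0, 0.0, 0.0, 1.0),
)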

Results

We conduct a quantitative evaluation of AO-Grasp and Contact-GraspNet (CGN) on 120 scenes of real-world objects with varied local geometries and articulation axes, in different joint states, and captured from different viewpoints. AO-Grasp produces successful grasps on 67.5% of scenes, while CGN only produces successful grasps on 33.3% of scenes.
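
As a sanity check, the per-scene success counts implied by the reported rates can be back-solved from the 120 scenes (81/120 = 67.5% and 40/120 ≈ 33.3%); the counts are derived here, not reported in the paper.

scenes = 120
successes = {"AO-Grasp": 81, "Contact-GraspNet": 40}  # implied by 67.5% and 33.3%

for method, hits in successes.items():
    print(f"{method}: {hits}/{scenes} scenes = {100 * hits / scenes:.1f}%")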

[Table: real-world grasp success rates for AO-Grasp and Contact-GraspNet.]

BibTeX

@article{morlans2023aograsp,
    title={AO-Grasp: Articulated Object Grasp Generation},
    author={Carlota Parés Morlans and Claire Chen and Yijia Weng and Michelle Yi and Yuying Huang and Nick Heppert and Linqi Zhou and Leonidas Guibas and Jeannette Bohg},
    year={2023},
    eprint={2310.15928},
    archivePrefix={arXiv},
    primaryClass={cs.RO}
}

Contact

If you have any questions, please contact us at aograsp[at]gmail[dot]com.