The automated assembly of complex products requires a system that can automatically plan a physically feasible sequence of actions for assembling many parts. In this paper, we present ASAP, a physics-based planning approach that automatically generates such sequences for general-shaped assemblies. ASAP accounts for gravity to design sequences in which each sub-assembly is physically stable, given a support surface and a limited number of parts that can be held. We apply efficient tree search algorithms to reduce the combinatorial complexity of determining such an assembly sequence. The search can be guided either by geometric heuristics or by graph neural networks trained on data with simulation labels. Finally, we show the superior performance of ASAP at generating physically realistic assembly sequence plans on a large dataset of hundreds of complex product assemblies. We further demonstrate the applicability of ASAP in both simulated and real-world robotic setups.
We apply the idea of assembly-by-disassembly: we plan a disassembly sequence, which is far less complex to find, and reverse it to obtain the assembly sequence.
We formulate disassembly sequence planning as a tree search, in which established techniques can be applied to find feasible disassembly sequences under a constrained evaluation budget.
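As a rough sketch of this framework (assuming hypothetical `is_removable` and `select_candidates` callbacks in place of the paper's actual physics checks and selection strategies), a budget-limited depth-first version of the search could look like:

```python
from typing import Callable, FrozenSet, List, Optional

def plan_disassembly(
    parts: FrozenSet[str],
    is_removable: Callable[[str, FrozenSet[str]], bool],   # hypothetical feasibility check
    select_candidates: Callable[[FrozenSet[str]], List[str]],  # part-selection strategy
    budget: int = 50,
) -> Optional[List[str]]:
    """Depth-first search for a feasible disassembly order.

    Returns parts in disassembly order, or None if no feasible
    sequence is found within the evaluation budget.
    """
    evaluations = [0]  # mutable counter shared across recursive calls

    def dfs(remaining: FrozenSet[str], order: List[str]) -> Optional[List[str]]:
        if len(remaining) <= 1:  # a single leftover part needs no further planning
            return order + list(remaining)
        for part in select_candidates(remaining):
            if evaluations[0] >= budget:
                return None      # evaluation budget exhausted
            evaluations[0] += 1
            if is_removable(part, remaining):
                result = dfs(remaining - {part}, order + [part])
                if result is not None:
                    return result
        return None

    return dfs(parts, [])
```

Reversing the returned disassembly order then yields the assembly sequence (assembly-by-disassembly).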
A feasible tree expansion is conditioned on multiple constraints, all of which we take into consideration.
Given a sub-assembly during the tree search, we devise different strategies for selecting which part to disassemble next, improving search efficiency.
We search for the most stable pose when the assembly is placed on a support surface, guided by a quasistatic pose estimator.
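To illustrate the idea of pose search (not the paper's actual estimator), the sketch below enumerates the 24 axis-aligned candidate orientations and scores each with a crude center-of-mass-height proxy standing in for the quasistatic pose estimator:

```python
import itertools
import numpy as np

def axis_aligned_rotations():
    """All 24 proper axis-aligned rotation matrices (candidate resting poses)."""
    mats = []
    for perm in itertools.permutations(range(3)):
        for signs in itertools.product((1, -1), repeat=3):
            R = np.zeros((3, 3))
            for row, (col, s) in enumerate(zip(perm, signs)):
                R[row, col] = s
            if np.isclose(np.linalg.det(R), 1.0):  # keep rotations, drop reflections
                mats.append(R)
    return mats

def most_stable_pose(vertices, score=None):
    """Choose the candidate orientation with the best stability score.

    `score` stands in for a quasistatic pose estimator; the default
    proxy prefers poses whose center of mass sits low above the lowest
    vertex (a crude heuristic, not the paper's actual estimator).
    """
    if score is None:
        def score(R):
            v = vertices @ R.T
            return -(v[:, 2].mean() - v[:, 2].min())  # negated CoM height
    return max(axis_aligned_rotations(), key=score)
```

For a tall box, for example, this proxy prefers laying the long axis flat on the support surface.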
We additionally propose a pose-reuse technique that reduces reorientations in the planned sequences, keeping visual consistency and improving the success rate.
To verify the physical feasibility of the assembly sequence, we develop an efficient gravitational stability check for multi-part, contact-rich assemblies. Given an assembly in a specific pose, the algorithm outputs the set of parts that must be held to keep the assembly stable.
Our stability check iteratively identifies unstable parts to be held until the assembly becomes stable. This avoids the combinatorial complexity of checking every subset of held parts, yields an order-of-magnitude speedup, and maintains reasonable accuracy.
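The iterative scheme can be sketched as follows, with a hypothetical `simulate_displacements` callback standing in for the physics rollout (it reports how far each free part moves under gravity when a given set of parts is fixed):

```python
def parts_to_hold(parts, simulate_displacements, tol=1e-3):
    """Iteratively find a set of parts to hold so the assembly is stable.

    Instead of searching over all subsets of held parts, we repeatedly
    hold the part that moves the most under gravity, re-simulate, and
    stop once every free part's displacement is below `tol`.
    """
    held = set()
    for _ in range(len(parts)):  # at most one new held part per iteration
        disp = simulate_displacements(held)
        free = {p: d for p, d in disp.items() if p not in held}
        if not free or max(free.values()) < tol:
            return held  # assembly is stable with this held set
        held.add(max(free, key=free.get))
    return held
```

Because each iteration fixes at least one part, the loop runs at most a linear number of simulations rather than an exponential number of subset checks.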
To benchmark our approach, we build a large 3D assembly dataset consisting of 2,146 assemblies ranging from 3 to more than 50 complex parts. We select 240 assemblies as the test set for benchmarking and use the rest to train the part-selection model.
We evaluate the percentage of assemblies from the test dataset that can be disassembled by ASAP within a specific computational budget. We compare ASAP against a naive Random Permutation baseline, a Genetic Algorithm baseline, and Assemble Them All. Comparisons are shown for different feasibility evaluation budgets (low = 50, high = 400) and different numbers of parts that can be held.
Results show that ASAP outperforms all three baselines by a significant margin. The previous methods also do not account for gravitational stability, producing assembly sequences with floating parts and visibly unstable poses. Note that we conduct quantitative comparisons without constraints from robotic manipulators, which isolates the impact of planning strategies on assembly success while abstracting away the nuances of robotic hardware and specialized tools.
We integrate ASAP with a robotic setup targeting real-world deployment, using a UFACTORY xArm 7 robotic arm and a Robotiq 2F-140 gripper. The motion of the arm and gripper is governed by grasp planning, inverse kinematics, and collision detection, and the orientation of the rotary table is planned by a heuristic that optimizes arm reachability. The integration readily extends to other robot arms and grippers. To our knowledge, ASAP is the first method to generate physically feasible robotic assembly plans from only the assembly and robot specifications, without additional human guidance. (Note that only the moving arm is shown in the videos.)
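The table-orientation heuristic can be sketched as a simple discretized search, with a hypothetical `reachable_fraction` callback standing in for the IK and collision-checking pipeline (it reports the fraction of required grasp poses the arm can reach at a given table angle):

```python
import math

def best_table_angle(reachable_fraction, num_angles=36):
    """Pick the rotary-table orientation that maximizes arm reachability.

    Evaluates `num_angles` evenly spaced table rotations and returns
    the angle (in radians) with the highest reachability score.
    """
    angles = [2 * math.pi * i / num_angles for i in range(num_angles)]
    return max(angles, key=reachable_fraction)
```

A finer discretization trades planning time for reachability; in practice a coarse grid is often enough since reachability varies smoothly with table angle.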
Here we demonstrate sim-to-real transfer on a real-world hardware setup with a 3D-printed beam assembly of 5 parts, showing a step-by-step correspondence between simulated ASAP plans and real hardware execution. To aid the spatial localization of assembly components, we use a laser-cut placemat that lets the robot determine the precise positioning of parts, reducing potential assembly errors. Direct sim-to-real transfer is non-trivial due to the tight millimeter-level clearances in assembly joints (required for stability) and inherent errors in part fabrication and arm localization; robustness could be further improved by incorporating vision or force feedback and adaptive manipulation skills.
We are motivated by the apparent ability of humans to intuit the correct disassembly order of assemblies. We therefore develop a novel data labeling tool and training pipeline to learn a part-selection strategy from human annotation. Human annotators are shown an interactive 3D view of an assembly and asked to select which part to remove next.
We label both the full assembly and partial assemblies that form a fully connected graph. Each assembly is independently labeled by three annotators and the majority vote label is used. To support the large-scale nature of this labeling job, the labels were crowd-sourced using the Amazon Mechanical Turk workforce.
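The majority-vote aggregation over three annotators can be sketched as follows (returning `None` for fully tied states, which can then be dropped or re-labeled; the tie-handling policy is an assumption, not stated in the text):

```python
from collections import Counter

def majority_label(annotations):
    """Majority vote over per-annotator part selections for one assembly state.

    `annotations` is a list of part ids chosen by different annotators;
    returns the most common choice, or None when the top choices tie.
    """
    counts = Counter(annotations).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return None  # ambiguous state: no majority winner
    return counts[0][0]
```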
Due to the high cost and noisy nature of human annotations, we only label a small subset of the full assembly dataset, and the resulting model does not reach the performance of the model trained on simulation labels. However, we believe this is a promising direction for future work.
@misc{tian2023asap,
      title={ASAP: Automated Sequence Planning for Complex Robotic Assembly with Physical Feasibility},
      author={Yunsheng Tian and Karl D. D. Willis and Bassel Al Omari and Jieliang Luo and Pingchuan Ma and Yichen Li and Farhad Javid and Edward Gu and Joshua Jacob and Shinjiro Sueda and Hui Li and Sachin Chitta and Wojciech Matusik},
      year={2023},
      eprint={2309.16909},
      archivePrefix={arXiv},
      primaryClass={cs.RO}
}