The "visual perception + hand-eye transformation + motion planning" paradigm of robotic coordinated grasping has demonstrated its feasibility in unstructured scenes such as logistics. However, further progress in handling complex and dynamic environments remains challenging. To address grasping tasks in which unknown targets require immediate deployment, this paper proposes a novel hand-eye coordinated method for progressive grasping guided by the texture keypoints of the target. First, we develop an efficient system for acquiring texture-matching templates, together with a feature-region estimation algorithm that filters the precisely registered texture feature points of the target. Then, we integrate optical flow estimation to update and track the robust texture region in real time, and design a feature-based servo grasping controller that maps the optical flow points of the highly registered texture region to robot joint velocities for precise tracking. Finally, we impose spatiotemporal constraints on the planned trajectory and decouple the target motion to achieve a progressive approach and rotationally invariant grasping for both dynamic and static targets. Comprehensive experiments demonstrate that the proposed tracking-and-grasping method exhibits low latency, high precision, and robustness under complex scenarios and dynamic disturbances, with an average position accuracy of approximately 5 mm, a rotation accuracy of approximately 0.02, and an overall grasping success rate of approximately 90%.
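The feature-based servo mapping summarized above can be illustrated with the classic image-based visual-servoing control law, which drives joint velocities from the error of tracked image points. This is a minimal sketch under stated assumptions, not the authors' actual controller: the function name, the stacked-point interaction matrix `L`, the robot Jacobian `J`, and the gain `lam` are all illustrative.

```python
import numpy as np

def ibvs_joint_velocity(feat, feat_des, L, J, lam=0.5):
    """Classic image-based visual-servoing law (illustrative sketch).

    feat, feat_des : (2k,) stacked image coordinates of k tracked points
                     (e.g., optical-flow points of a texture region)
    L : (2k, 6) interaction (image Jacobian) matrix of the points
    J : (6, n) robot Jacobian mapping joint rates to the camera twist
    lam : proportional gain
    Returns (n,) joint velocities that exponentially reduce the error.
    """
    e = feat - feat_des                     # feature error in the image plane
    v_cam = -lam * np.linalg.pinv(L) @ e    # desired camera twist (least squares)
    return np.linalg.pinv(J) @ v_cam        # joint velocities realizing that twist
```

A pseudo-inverse is used for both mappings so the sketch tolerates redundant points (2k > 6) and redundant joints (n > 6); any damping, constraint handling, or motion decoupling from the paper's method is omitted here.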
Funding: Supported by the National Key R&D Program of China (Grant No. 2024YFB4709800) and the Fundamental Research Funds for the Central Universities.