Instant Grasping Framework of Textured Objects Via Precise Point Matches and Normalized Target Poses
Source: Journal of Mechanisms and Robotics, 2025, Volume 17, Issue 8, page 81013-1
DOI: 10.1115/1.4068273
Publisher: The American Society of Mechanical Engineers (ASME)
Abstract: To reliably manipulate previously unknown objects in semi-structured environments, robots require rapid deployment and seamless transitions between pose estimation and grasping. This work proposes a novel two-stage robotic grasping method that instantly achieves accurate grasping without prior training. In the first stage, depth information and structured markers are used to construct compact templates for packaged targets, reducing noise and automating annotation. We then perform coarse matching and design a new variant of the iterative closest point algorithm, named adaptive template-based RANSAC and iterative closest point (ATSAC-ICP), for precise point cloud registration. The method extracts locally well-registered pairs, then regresses and optimizes the six-degree-of-freedom (6-DOF) pose to satisfy a confidence probability and a precision threshold. The second stage normalizes the target pose for consistent grasp planning based on scene and placement patterns. The proposed method is evaluated in several sets of experiments using various randomly selected textured objects. The results show that the pose errors are approximately ±2 mm and ±3 deg, and the grasp success rate exceeds 90%. Physical experiments, conducted under different lighting conditions and with external disturbances, demonstrate the method's effectiveness and applicability in grasping everyday objects.
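The abstract describes ATSAC-ICP as a coarse-to-fine pipeline: RANSAC-style hypothesis sampling to find locally well-registered pairs, followed by iterative-closest-point refinement of the 6-DOF pose. The paper's implementation is not part of this record, so the following is only a minimal sketch of a generic RANSAC-initialized ICP in Python/NumPy; the function names, the putative-correspondence input, and all thresholds are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of RANSAC-initialized ICP registration, in the spirit of the
# abstract's ATSAC-ICP. The adaptive template-based sampling and the
# confidence/precision stopping criteria of the paper are NOT reproduced;
# thresholds and iteration counts below are placeholder assumptions.
import numpy as np
from scipy.spatial import cKDTree

def rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping points P onto Q (Kabsch/SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # fix improper rotation (reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cq - R @ cp

def ransac_icp(src, dst, corr, n_iters=500, inlier_tol=0.005, icp_iters=30):
    """Coarse RANSAC alignment over putative correspondences `corr`
    (an (N, 2) array of index pairs into src/dst), then ICP refinement."""
    rng = np.random.default_rng(0)
    best_R, best_t, best_inliers = np.eye(3), np.zeros(3), 0
    # --- Stage 1: RANSAC over minimal 3-point samples of the correspondences ---
    for _ in range(n_iters):
        i = rng.choice(len(corr), size=3, replace=False)
        R, t = rigid_transform(src[corr[i, 0]], dst[corr[i, 1]])
        resid = np.linalg.norm(src[corr[:, 0]] @ R.T + t - dst[corr[:, 1]], axis=1)
        n_in = int((resid < inlier_tol).sum())
        if n_in > best_inliers:
            best_R, best_t, best_inliers = R, t, n_in
    # --- Stage 2: ICP refinement with nearest-neighbour matching ---
    tree = cKDTree(dst)
    R, t = best_R, best_t
    for _ in range(icp_iters):
        moved = src @ R.T + t
        d, j = tree.query(moved)
        mask = d < inlier_tol          # keep only locally well-registered pairs
        if mask.sum() < 3:             # degenerate: too few pairs to re-estimate
            break
        R, t = rigid_transform(src[mask], dst[j[mask]])
    return R, t
```

The closed-form Kabsch/SVD step yields the least-squares rigid transform at each iteration; how the paper adapts its sampling to the template and terminates on its confidence probability and precision threshold is specific to ATSAC-ICP and is not modeled here.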
contributor author | Luo, Yazhe | |
contributor author | Ruan, Sipu | |
contributor author | Li, Yifei | |
contributor author | Li, Jiting | |
contributor author | Chen, Diansheng | |
date accessioned | 2025-08-20T09:42:55Z | |
date available | 2025-08-20T09:42:55Z | |
date copyright | 2025-04-17 | |
date issued | 2025 | |
identifier issn | 1942-4302 | |
identifier other | jmr-24-1628.pdf | |
identifier uri | http://yetl.yabesh.ir/yetl1/handle/yetl/4308733 | |
publisher | The American Society of Mechanical Engineers (ASME) | |
title | Instant Grasping Framework of Textured Objects Via Precise Point Matches and Normalized Target Poses | |
type | Journal Paper | |
journal volume | 17 | |
journal issue | 8 | |
journal title | Journal of Mechanisms and Robotics | |
identifier doi | 10.1115/1.4068273 | |
journal firstpage | 81013-1 | |
journal lastpage | 81013-13 | |
page | 13 | |
tree | Journal of Mechanisms and Robotics; 2025; Volume 17; Issue 8 | |
contenttype | Fulltext |