YaBeSH Engineering and Technology Library



    Enhancing Robotic Grasping Detection Accuracy With the R2CNN Algorithm and Force-Closure

    Source: Journal of Computing and Information Science in Engineering; 2024; Volume 24; Issue 6; Page 61005-1
    Authors: Lin, Hsien-I; Shodiq, Muhammad Ahsan Fatwaddin; Chu, Hong-Qi
    DOI: 10.1115/1.4065311
    Publisher: The American Society of Mechanical Engineers (ASME)
    Abstract: This study uses an improved rotational region convolutional neural network (R2CNN) algorithm to detect the grasping bounding box for a robotic arm that reaches for supermarket goods. The algorithm computes the final predicted grasping bounding box without any additional architecture, which significantly improves the speed of grasp inference. We added a force-closure condition so that the final grasping bounding box achieves grasping stability in a physical sense. We experimentally demonstrated that a deep model can treat object detection and grasping detection as the same task, and we used transfer learning to improve the prediction accuracy of the grasping bounding box. In particular, ResNet-101 network weights originally trained for object detection were used to continue training on the Cornell dataset: for grasping detection, the object-detection weights served as the feature representation of the to-be-grasped objects and were fed to the network for further training. On 2828 test images, this method achieved nearly 98% accuracy at 14–17 frames per second.
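    The force-closure condition referenced in the abstract can be illustrated with the standard two-finger (antipodal) friction-cone test: a grasp is force-closure when the line joining the two finger contacts lies inside both contact friction cones. The Python sketch below is not the authors' implementation; the rectangle-to-contact helper, the friction coefficient, and all numeric values are illustrative assumptions, and in practice the contact normals would be estimated from the observed object surface rather than assumed.

    import numpy as np

    def grasp_contacts_from_rect(cx, cy, w, theta):
        """Finger contacts implied by a rotated grasp rectangle with
        centre (cx, cy), gripper opening w, and rotation theta: the two
        contacts sit at the ends of the closing axis. (Hypothetical
        helper; this rectangle parametrization is an assumption.)"""
        axis = np.array([np.cos(theta), np.sin(theta)])  # gripper closing axis
        centre = np.array([cx, cy])
        return centre - 0.5 * w * axis, centre + 0.5 * w * axis

    def antipodal_force_closure(p1, p2, n1, n2, mu=0.4):
        """Two-finger force-closure test under Coulomb friction.

        The grasp is force-closure when the line joining the contacts
        lies inside both friction cones, i.e. the angle between that
        line and each inward surface normal is at most arctan(mu)."""
        d = np.asarray(p2, float) - np.asarray(p1, float)
        d /= np.linalg.norm(d)
        half_angle = np.arctan(mu)                       # friction cone half-angle
        n1 = np.asarray(n1, float) / np.linalg.norm(n1)
        n2 = np.asarray(n2, float) / np.linalg.norm(n2)
        ang1 = np.arccos(np.clip(np.dot(d, n1), -1.0, 1.0))
        ang2 = np.arccos(np.clip(np.dot(-d, n2), -1.0, 1.0))
        return ang1 <= half_angle and ang2 <= half_angle

    if __name__ == "__main__":
        # Toy check: a 6 cm-wide grasp rectangle rotated by 30 deg, with
        # surface normals tilted 15 deg off the closing axis (illustrative
        # values, not taken from the paper).
        theta = np.deg2rad(30)
        p1, p2 = grasp_contacts_from_rect(cx=0.0, cy=0.0, w=0.06, theta=theta)
        axis = (p2 - p1) / np.linalg.norm(p2 - p1)
        tilt = np.deg2rad(15)
        rot = np.array([[np.cos(tilt), -np.sin(tilt)],
                        [np.sin(tilt),  np.cos(tilt)]])
        n1, n2 = rot @ axis, rot @ (-axis)               # inward-pointing normals
        # 15 deg < arctan(0.4) ~ 21.8 deg, so this grasp passes the test.
        print(antipodal_force_closure(p1, p2, n1, n2, mu=0.4))  # True

    Filtering candidate grasp rectangles with a test of this kind is what allows the predicted bounding box to achieve "grasping stability in a physical sense", as the abstract puts it.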


    URI: http://yetl.yabesh.ir/yetl1/handle/yetl/4303207
    Collections:
    • Journal of Computing and Information Science in Engineering

    Full item record

    contributor author: Lin, Hsien-I
    contributor author: Shodiq, Muhammad Ahsan Fatwaddin
    contributor author: Chu, Hong-Qi
    date accessioned: 2024-12-24T19:03:16Z
    date available: 2024-12-24T19:03:16Z
    date copyright: 5/9/2024 12:00:00 AM
    date issued: 2024
    identifier issn: 1530-9827
    identifier other: jcise_24_6_061005.pdf
    identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4303207
    publisher: The American Society of Mechanical Engineers (ASME)
    title: Enhancing Robotic Grasping Detection Accuracy With the R2CNN Algorithm and Force-Closure
    type: Journal Paper
    journal volume: 24
    journal issue: 6
    journal title: Journal of Computing and Information Science in Engineering
    identifier doi: 10.1115/1.4065311
    journal firstpage: 61005-1
    journal lastpage: 61005-16
    page: 16
    tree: Journal of Computing and Information Science in Engineering; 2024; Volume 24; Issue 6
    contenttype: Fulltext
    DSpace software copyright © 2002-2015 DuraSpace
    The DSpace digital library software localized into Persian by Yabesh for Iranian libraries | Contact Yabesh
     