YaBeSH Engineering and Technology Library


    Automated Object Manipulation Using Vision-Based Mobile Robotic System for Construction Applications

    Source: Journal of Computing in Civil Engineering, 2021, Volume 35, Issue 1, Page 04020058
    Author: Khashayar Asadi, Varun R. Haritsa, Kevin Han, John-Paul Ore
    DOI: 10.1061/(ASCE)CP.1943-5487.0000946
    Publisher: ASCE
    Abstract: In the last decade, automated object manipulation for construction applications has received much attention. However, most existing systems are fixed in place: static setups surrounded by the tools needed to manipulate objects within their workspace. Mobility is a key challenge for construction applications such as material handling and site cleaning. To fill this gap, this paper presents a mobile robotic system capable of vision-based object manipulation for construction applications. The system integrates scene understanding and autonomous navigation with object grasping. To achieve this, two stereo cameras and a robotic arm are mounted on a mobile platform. The integrated system uses a global-to-local control planning strategy to reach the objects of interest (in this study, bricks, wood sticks, and pipes). Scene perception, together with grasp and control planning, then enables the system to detect the objects of interest, pick them up, and place them in a predetermined location depending on the application. The system is implemented and validated in a construction-like environment for pick-and-place activities. The results demonstrate the effectiveness of this fully autonomous system, which uses solely onboard sensing for real-time applications and achieves end-effector positioning accuracy of less than a centimeter.
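    The pipeline described in the abstract (global-to-local navigation, stereo-based perception, then grasping and placing) can be illustrated with a minimal sketch. The Python code below is only a hypothetical outline of such a pick-and-place loop, not the authors' implementation; all class and method names (MobileManipulator, detect_objects, and so on) are assumptions introduced for illustration, and the perception and control steps are replaced with stubs.

    # Minimal illustrative sketch of a global-to-local pick-and-place loop,
    # loosely following the pipeline described in the abstract. Hypothetical
    # names only; perception and control are stubbed out.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class DetectedObject:
        label: str                      # e.g., "brick", "wood_stick", "pipe"
        position: Tuple[float, float]   # (x, y) in the map frame, meters

    class MobileManipulator:
        """Hypothetical interface to a mobile base with an arm and stereo cameras."""

        def global_navigate(self, goal: Tuple[float, float]) -> None:
            # Coarse base motion toward the region containing the target objects.
            print(f"[global] driving base toward {goal}")

        def local_approach(self, obj: DetectedObject) -> None:
            # Fine positioning once the object is within local sensing range.
            print(f"[local] refining pose near {obj.label} at {obj.position}")

        def detect_objects(self) -> List[DetectedObject]:
            # Stand-in for stereo-camera scene perception; returns canned detections.
            return [DetectedObject("brick", (2.0, 0.5)),
                    DetectedObject("pipe", (2.3, 0.7))]

        def grasp(self, obj: DetectedObject) -> bool:
            print(f"[grasp] picking {obj.label}")
            return True  # success is assumed in this sketch

        def place(self, drop_off: Tuple[float, float]) -> None:
            print(f"[place] releasing object at {drop_off}")

    def pick_and_place_mission(robot: MobileManipulator,
                               work_region: Tuple[float, float],
                               drop_off: Tuple[float, float]) -> None:
        # Global-to-local loop: navigate to the work area, perceive, then
        # approach, grasp, and place each detected object at the drop-off point.
        robot.global_navigate(work_region)
        for obj in robot.detect_objects():
            robot.local_approach(obj)
            if robot.grasp(obj):
                robot.global_navigate(drop_off)
                robot.place(drop_off)
                robot.global_navigate(work_region)  # return for the next object

    if __name__ == "__main__":
        pick_and_place_mission(MobileManipulator(),
                               work_region=(2.0, 0.5), drop_off=(0.0, 0.0))

    In the paper's setup the detection and approach steps are driven by onboard stereo vision rather than the canned values used here; the sketch only shows how the global navigation, local approach, grasping, and placement stages fit together.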


    URI: http://yetl.yabesh.ir/yetl1/handle/yetl/4269717
    Collections:
    • Journal of Computing in Civil Engineering

    Full item record

    contributor author: Khashayar Asadi
    contributor author: Varun R. Haritsa
    contributor author: Kevin Han
    contributor author: John-Paul Ore
    date accessioned: 2022-01-30T22:50:15Z
    date available: 2022-01-30T22:50:15Z
    date issued: 1/1/2021
    identifier other: (ASCE)CP.1943-5487.0000946.pdf
    identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4269717
    publisher: ASCE
    title: Automated Object Manipulation Using Vision-Based Mobile Robotic System for Construction Applications
    type: Journal Paper
    journal volume: 35
    journal issue: 1
    journal title: Journal of Computing in Civil Engineering
    identifier doi: 10.1061/(ASCE)CP.1943-5487.0000946
    journal firstpage: 04020058
    journal lastpage: 04020058-15
    pages: 15
    tree: Journal of Computing in Civil Engineering; 2021; Volume 35; Issue 1
    contenttype: Fulltext