YaBeSH Engineering and Technology Library


    Crowdsourcing Construction Activity Analysis from Jobsite Video Streams

    Source: Journal of Construction Engineering and Management, 2015, Volume 141, Issue 11
    Authors: Kaijian Liu, Mani Golparvar-Fard
    DOI: 10.1061/(ASCE)CO.1943-7862.0001010
    Publisher: American Society of Civil Engineers
    Abstract: The advent of affordable jobsite cameras is reshaping the way on-site construction activities are monitored. To facilitate the analysis of large collections of videos, research has focused on addressing the problem of manual workface assessment by recognizing worker and equipment activities using computer-vision algorithms. Despite the explosion of these methods, the ability to automatically recognize and understand worker and equipment activities from videos is still rather limited. The current algorithms require large-scale annotated workface assessment video data to learn models that can deal with the high degree of intraclass variability among activity categories. To address current limitations, this study proposes crowdsourcing the task of workface assessment from jobsite video streams. By introducing an intuitive web-based platform for massive marketplaces such as Amazon Mechanical Turk (AMT) and several automated methods, the intelligence of the crowd is engaged for interpreting jobsite videos. The goal is to overcome the limitations of the current practices of workface assessment and also provide significantly large empirical data sets together with their ground truth that can serve as the basis for developing video-based activity recognition methods. Six extensive experiments have shown that engaging nonexperts on AMT to annotate construction activities in jobsite videos can provide complete and detailed workface assessment results with 85% accuracy. It has been demonstrated that crowdsourcing has the potential to minimize time needed for workface assessment, provides ground truth for algorithmic developments, and most importantly allows on-site professionals to focus their time on the more important task of root-cause analysis and performance improvements.
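    The workflow the abstract describes — several non-expert AMT workers labeling the same video segment, with their answers aggregated and scored against expert ground truth — can be sketched as follows. This is an illustrative reconstruction, not the paper's actual pipeline; the function names, the majority-vote aggregation rule, and the activity labels are all assumptions.

    ```python
    from collections import Counter

    def aggregate_majority(worker_labels):
        """Majority vote over one segment's crowd labels (assumed aggregation rule)."""
        label, _ = Counter(worker_labels).most_common(1)[0]
        return label

    def workface_accuracy(crowd_annotations, ground_truth):
        """Fraction of video segments whose aggregated crowd label matches the expert label.

        crowd_annotations: one list of worker labels per segment
        ground_truth: expert labels, same segment order
        """
        agreed = sum(
            aggregate_majority(labels) == truth
            for labels, truth in zip(crowd_annotations, ground_truth)
        )
        return agreed / len(ground_truth)

    # Hypothetical example: 4 segments, 3 AMT workers each
    crowd = [
        ["erecting formwork", "erecting formwork", "idle"],
        ["placing concrete", "placing concrete", "placing concrete"],
        ["idle", "idle", "erecting formwork"],
        ["placing concrete", "idle", "idle"],
    ]
    truth = ["erecting formwork", "placing concrete", "idle", "placing concrete"]
    print(workface_accuracy(crowd, truth))  # 3 of 4 majority labels agree -> 0.75
    ```

    A scheme like this also yields the ground-truth data sets the abstract mentions: once crowd labels are validated against a small expert-labeled subset, the aggregated labels can serve as training data for video-based activity recognition.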
    • Download: PDF (41.62 MB)
    • Price: 5000 Rial

    URI
    http://yetl.yabesh.ir/yetl1/handle/yetl/81794
    Collections
    • Journal of Construction Engineering and Management

    Full item record

    contributor author: Kaijian Liu
    contributor author: Mani Golparvar-Fard
    date accessioned: 2017-05-08T22:30:43Z
    date available: 2017-05-08T22:30:43Z
    date copyright: November 2015
    date issued: 2015
    identifier other: 47632768.pdf
    identifier uri: http://yetl.yabesh.ir/yetl/handle/yetl/81794
    publisher: American Society of Civil Engineers
    title: Crowdsourcing Construction Activity Analysis from Jobsite Video Streams
    type: Journal Paper
    journal volume: 141
    journal issue: 11
    journal title: Journal of Construction Engineering and Management
    identifier doi: 10.1061/(ASCE)CO.1943-7862.0001010
    tree: Journal of Construction Engineering and Management; 2015; Volume 141; Issue 11
    contenttype: Fulltext
    DSpace software copyright © 2002-2015  DuraSpace
    The "DSpace" digital library software was localized into Persian by YaBeSH for Iranian libraries | Contact YaBeSH