contributor author | Ehsan Rezazadeh Azar | |
contributor author | Brenda McCabe | |
date accessioned | 2017-05-08T21:40:32Z | |
date available | 2017-05-08T21:40:32Z | |
date copyright | November 2012 | |
date issued | 2012 | |
identifier other | (asce)cp.1943-5487.0000186.pdf | |
identifier uri | http://yetl.yabesh.ir/yetl/handle/yetl/59155 | |
description abstract | Earthmoving plants are essential but costly resources in the construction of heavy civil engineering projects. In addition to proper allocation, ongoing control of this equipment is necessary to maintain and increase the productivity of earthmoving operations. Videos captured on construction sites are a potential tool for controlling earthmoving operations; however, the current practice of manually extracting data from surveillance videos is tedious, costly, and error-prone. Cutting-edge computer vision techniques have the potential to automate equipment-monitoring tasks. This paper evaluates combinations of existing object recognition and background subtraction algorithms for recognizing off-highway dump trucks in noisy video streams containing other active machines. Two detection algorithms, namely Haar–histogram of oriented gradients (HOG) and Blob-HOG, are presented and evaluated for their ability to recognize dump trucks in videos, as measured by both effectiveness and timeliness. The results of this study can help practitioners select a suitable approach to recognizing such equipment in videos for real-time applications such as productivity measurement, performance control, and proactive work-zone safety. | |
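The Blob-HOG pipeline named in the abstract (background subtraction to isolate moving blobs, then a HOG-based descriptor computed on each blob for classification) can be sketched roughly as follows. This is an illustrative assumption, not the paper's implementation: the simple frame-differencing background model, the thresholds, and the toy single-cell HOG descriptor are all stand-ins for the algorithms the paper actually evaluates.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    # Running-average background model: a minimal stand-in for the
    # background subtraction stage described in the abstract.
    return (1 - alpha) * bg + alpha * frame

def foreground_mask(bg, frame, thresh=25):
    # Pixels that differ strongly from the background model are foreground.
    return np.abs(frame - bg) > thresh

def blob_bbox(mask):
    # Bounding box (y0, y1, x0, x1) of all foreground pixels; a real system
    # would use connected-component analysis to separate multiple blobs.
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return (ys.min(), ys.max() + 1, xs.min(), xs.max() + 1)

def hog_descriptor(patch, nbins=9):
    # Toy single-cell histogram of oriented gradients: bin unsigned gradient
    # orientations weighted by gradient magnitude, then L2-normalize.
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned orientation in [0, pi)
    bins = np.minimum((ang / np.pi * nbins).astype(int), nbins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=nbins)
    n = np.linalg.norm(hist)
    return hist / n if n > 0 else hist

# Synthetic demo: static background plus one bright moving "truck" region.
bg = np.zeros((60, 80))
frame = bg.copy()
frame[20:40, 30:60] = 200.0          # the moving object
mask = foreground_mask(bg, frame)
box = blob_bbox(mask)                # → (20, 40, 30, 60)
y0, y1, x0, x1 = box
desc = hog_descriptor(frame[y0:y1, x0:x1])  # 9-bin descriptor for a classifier
```

In the paper's setting, the descriptor for each blob would be passed to a trained classifier to decide whether the blob is a dump truck; the Haar-HOG variant instead uses a Haar cascade rather than background subtraction to propose candidate regions.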
publisher | American Society of Civil Engineers | |
title | Automated Visual Recognition of Dump Trucks in Construction Videos | |
type | Journal Paper | |
journal volume | 26 | |
journal issue | 6 | |
journal title | Journal of Computing in Civil Engineering | |
identifier doi | 10.1061/(ASCE)CP.1943-5487.0000179 | |
tree | Journal of Computing in Civil Engineering:;2012:;Volume ( 026 ):;issue: 006 | |
contenttype | Fulltext | |