Comparative Study of Motion Features for Similarity-Based Modeling and Classification of Unsafe Actions in Construction
Source: Journal of Computing in Civil Engineering, 2014, Volume 28, Issue 5
DOI: 10.1061/(ASCE)CP.1943-5487.0000339
Publisher: American Society of Civil Engineers
Abstract: The rapid development of motion sensors and video processing has triggered growing attention to action recognition for safety and health analysis, as well as operation analysis, in construction. Specifically for occupational safety and health, worker behavior monitoring allows for the automatic detection of workers’ unsafe actions and for feedback on their behavior, both of which enable the proactive prevention of accidents by reducing the number of unsafe actions that occur. Previous studies provide insight into tracking human movements and recognizing actions, but further research is needed to understand the following characteristics of motion data, which can significantly affect classification performance: (1) the various motion data types extracted from motion capture data, (2) variations of postures and actions, and (3) temporal and sequential relations of motion data. This paper thus presents a modeling and classification methodology for the recognition of unsafe actions, particularly focusing on (1) the description and comparison of four motion data types (i.e., rotation angles, joint angles, position vectors, and movement direction) used as features for classification, (2) the estimation of actions’ mean trajectories in order to model various patterns of action, and (3) the classification of actions based on spatial-temporal similarity. Focusing on motion analysis, experiments were undertaken for the modeling and detection of actions during ladder climbing using a red, green, blue plus depth (RGB-D) sensor. Through the experimental study, we found that the proposed approach performs well (i.e., an accuracy of up to 99.5% in lab experiments), that rotation angles outperform joint angles and position vectors, and that movement direction explicitly improves the accuracy of motion classification when combined with each of the other three.
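To make the pipeline described in the abstract concrete, the sketch below shows one minimal way to turn RGB-D skeleton joint positions into a motion feature (here, a joint angle) and to classify a feature sequence against per-class mean trajectories by spatial-temporal similarity. This is an illustrative assumption, not the paper's exact method: the function names (joint_angle, dtw_distance, classify_action) are hypothetical, and dynamic time warping stands in for whatever similarity measure the authors actually use.

```python
# Hypothetical sketch of similarity-based action classification from skeleton data.
# Assumptions: joint positions are 3D coordinates from an RGB-D skeleton tracker,
# feature sequences are T x D arrays, and DTW is used as the spatial-temporal
# similarity measure (a stand-in for the paper's own measure).
import numpy as np

def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by 3D joint positions a, b, c."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def dtw_distance(x, y):
    """Dynamic-time-warping distance between two feature sequences (T x D arrays)."""
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify_action(sequence, mean_trajectories):
    """Return the label whose mean trajectory is most similar to the observed sequence."""
    return min(mean_trajectories, key=lambda label: dtw_distance(sequence, mean_trajectories[label]))
```

As a usage example, each row of `sequence` could stack the joint angles of one frame (e.g., elbows, knees, hips), and `mean_trajectories` could map hypothetical labels such as "safe_climb" and "unsafe_reach" to reference sequences obtained by averaging temporally aligned training examples, mirroring the mean-trajectory modeling step the abstract describes.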
contributor author | SangUk Han
contributor author | SangHyun Lee
contributor author | Feniosky Peña-Mora
date accessioned | 2017-05-08T21:41:04Z
date available | 2017-05-08T21:41:04Z
date copyright | September 2014
date issued | 2014
identifier other | (asce)cp.1943-5487.0000346.pdf
identifier uri | http://yetl.yabesh.ir/yetl/handle/yetl/59319
publisher | American Society of Civil Engineers
title | Comparative Study of Motion Features for Similarity-Based Modeling and Classification of Unsafe Actions in Construction
type | Journal Paper
journal volume | 28
journal issue | 5
journal title | Journal of Computing in Civil Engineering
identifier doi | 10.1061/(ASCE)CP.1943-5487.0000339
tree | Journal of Computing in Civil Engineering; 2014; Volume 28; Issue 5
contenttype | Fulltext