YaBeSH Engineering and Technology Library



    Radar–Vision Fusion Method for Traffic Event Detection with Roadside Perception

    Source: Journal of Transportation Engineering, Part A: Systems, 2025, Volume 151, Issue 8, Page 04025055-1
    Author: Haodong Liu
    DOI: 10.1061/JTEPBS.TEENG-8753
    Publisher: American Society of Civil Engineers
    Abstract: Detecting traffic flow and events on highways has conventionally been dominated by vision-centric approaches and rule-based methods. These systems focus on identifying both micro-level (e.g., wrong-way driving, abnormal parking, emergency lane usage) and macro-level (e.g., congestion, stop-and-go waves) traffic events. The development of 4D millimeter-wave radar technology has significantly expanded the application of roadside sensors. Using 4D radar data, however, presents several challenges, including the sparsity of detection points, the presence of noise, and the limited feature extraction capacity of existing roadside perception systems. An end-to-end deep learning method can greatly facilitate traffic event detection by integrating traffic participant detection and event detection processes. Moreover, it enables hardware optimization of edge computing devices and on-device data transmission. Hence, this paper proposes an end-to-end radar–vision fusion method for traffic event detection with roadside perception. First, we construct a highway event data set with radar–vision fusion in a 3D environment through joint simulation using Simulation of Urban Mobility (SUMO) and CARLA. Then, we develop an encoder-decoder network for feature extraction. Finally, we present an end-to-end approach for traffic event detection. We define the average event detection precision (AEDP) as a criterion for traffic event detection. Our method maintains robustness in detecting microscale traffic events and demonstrates a 20% improvement in congestion detection and a 32% improvement in stop-and-go wave detection compared with the rule-based method.
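    The abstract names AEDP as the paper's evaluation criterion but does not spell out its formula. As a rough illustration only, the sketch below assumes AEDP is the mean of per-event-class precision scores; the function name `aedp` and the input format are hypothetical and not taken from the paper.

```python
from collections import defaultdict

def aedp(detections):
    """Hypothetical 'average event detection precision' sketch: the mean of
    per-event-class precision (TP / (TP + FP)). The exact AEDP formula is
    not given in this record, so this is an assumed, illustrative
    definition only.

    detections: iterable of (event_class, is_true_positive) pairs, one per
    detected event, where is_true_positive flags whether the detection
    matched a ground-truth event.
    """
    true_pos = defaultdict(int)   # true positives per event class
    detected = defaultdict(int)   # all detections per event class
    for event_class, is_tp in detections:
        detected[event_class] += 1
        true_pos[event_class] += int(is_tp)
    per_class = [true_pos[c] / detected[c] for c in detected]
    return sum(per_class) / len(per_class) if per_class else 0.0

# Example: three event classes with mixed detection quality.
dets = [("congestion", True), ("congestion", False),
        ("wrong_way_driving", True), ("stop_and_go", True)]
print(round(aedp(dets), 3))  # (0.5 + 1.0 + 1.0) / 3 -> 0.833
```

    Averaging per class keeps rare but safety-critical micro-level events (e.g., wrong-way driving) from being swamped by frequent macro-level ones (e.g., congestion) in the overall score.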


    URI: http://yetl.yabesh.ir/yetl1/handle/yetl/4306853
    Collections
    • Journal of Transportation Engineering, Part A: Systems

    Full item record

    contributor author: Haodong Liu
    date accessioned: 2025-08-17T22:22:50Z
    date available: 2025-08-17T22:22:50Z
    date copyright: 8/1/2025
    date issued: 2025
    identifier other: JTEPBS.TEENG-8753.pdf
    identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4306853
    description abstract: (see Abstract above)
    publisher: American Society of Civil Engineers
    title: Radar–Vision Fusion Method for Traffic Event Detection with Roadside Perception
    type: Journal Article
    journal volume: 151
    journal issue: 8
    journal title: Journal of Transportation Engineering, Part A: Systems
    identifier doi: 10.1061/JTEPBS.TEENG-8753
    journal firstpage: 04025055-1
    journal lastpage: 04025055-11
    page: 11
    tree: Journal of Transportation Engineering, Part A: Systems; 2025; Volume 151; Issue 8
    contenttype: Fulltext