YaBeSH Engineering and Technology Library


    Multimodal Data Fusion and Deep Learning for Occupant-Centric Indoor Environmental Quality Classification

    Source: Journal of Computing in Civil Engineering, 2025, Volume 39, Issue 2, Page 04024061-1
    Author: Min Jae Lee, Ruichuan Zhang
    DOI: 10.1061/JCCEE5.CPENG-6249
    Publisher: American Society of Civil Engineers
    Abstract: Amid the growing recognition of the impact of indoor environmental conditions on buildings and on occupant comfort, health, and well-being, there has been an increasing focus on the assessment and modeling of indoor environmental quality (IEQ). Despite considerable advancements, existing IEQ modeling methodologies are often limited to singular comfort metrics, potentially neglecting the broader set of factors associated with occupant comfort and health. There is a need for more inclusive and occupant-centric IEQ assessment models that cover a wider spectrum of environmental parameters and occupant needs. Such models require integrating diverse environmental and occupant data, which poses challenges in leveraging data across modalities and time scales and in understanding temporal patterns, relationships, and trends. This paper proposes a novel framework for classifying IEQ conditions based on occupant self-reported comfort and health levels to address these challenges. The proposed framework leverages a multimodal data-fusion approach with Transformer-based models, aiming to accurately predict indoor comfort and health levels by integrating diverse data sources, including multidimensional IEQ data and multimodal occupant feedback. The framework was evaluated on classifying the IEQ conditions of selected public indoor spaces and achieved 97% and 96% accuracy in comfort- and health-based classifications, respectively, outperforming several baselines.
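
    For orientation only: the abstract describes fusing multidimensional IEQ measurements with multimodal occupant feedback and classifying comfort and health levels with Transformer-based models. The sketch below illustrates one way such a pipeline could be wired up in PyTorch; the class name MultimodalIEQClassifier, the tensor shapes, the early-fusion strategy, and all hyperparameters are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of a multimodal-fusion Transformer classifier in the spirit of the
    # framework described above. Layer sizes, modality dimensions, and class counts are
    # hypothetical; the paper's actual architecture and training setup are not reproduced.
    import torch
    import torch.nn as nn

    class MultimodalIEQClassifier(nn.Module):
        def __init__(self, ieq_dim=8, feedback_dim=16, d_model=64, n_heads=4,
                     n_layers=2, n_classes=3):
            super().__init__()
            # Per-modality projections map each time step into a shared embedding space.
            self.ieq_proj = nn.Linear(ieq_dim, d_model)          # multidimensional IEQ readings
            self.feedback_proj = nn.Linear(feedback_dim, d_model)  # encoded occupant feedback
            # Learned modality embeddings distinguish tokens from the two data sources.
            self.modality_embed = nn.Embedding(2, d_model)
            # Transformer encoder models temporal patterns and cross-modal relationships.
            layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
            # Classification head predicts a comfort (or health) level from the pooled sequence.
            self.head = nn.Linear(d_model, n_classes)

        def forward(self, ieq_seq, feedback_seq):
            # ieq_seq: (batch, T_ieq, ieq_dim); feedback_seq: (batch, T_fb, feedback_dim)
            ieq_tok = self.ieq_proj(ieq_seq) + self.modality_embed.weight[0]
            fb_tok = self.feedback_proj(feedback_seq) + self.modality_embed.weight[1]
            # Early fusion: concatenate the two token sequences along the time axis.
            tokens = torch.cat([ieq_tok, fb_tok], dim=1)
            encoded = self.encoder(tokens)
            # Mean-pool over the fused sequence, then classify.
            return self.head(encoded.mean(dim=1))

    if __name__ == "__main__":
        model = MultimodalIEQClassifier()
        ieq = torch.randn(4, 24, 8)       # e.g., 24 time steps of 8 IEQ parameters
        feedback = torch.randn(4, 6, 16)  # e.g., 6 encoded occupant-feedback entries
        logits = model(ieq, feedback)     # shape: (4, n_classes)
        print(logits.shape)

    Separate comfort and health classifiers could share this same fusion backbone; that design choice, like everything else in the sketch, is an assumption rather than a description of the published method.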


    URI
    http://yetl.yabesh.ir/yetl1/handle/yetl/4304979
    Collections
    • Journal of Computing in Civil Engineering

    Full item record

    contributor author: Min Jae Lee
    contributor author: Ruichuan Zhang
    date accessioned: 2025-04-20T10:34:23Z
    date available: 2025-04-20T10:34:23Z
    date copyright: 12/23/2024 12:00:00 AM
    date issued: 2025
    identifier other: JCCEE5.CPENG-6249.pdf
    identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4304979
    publisher: American Society of Civil Engineers
    title: Multimodal Data Fusion and Deep Learning for Occupant-Centric Indoor Environmental Quality Classification
    type: Journal Article
    journal volume: 39
    journal issue: 2
    journal title: Journal of Computing in Civil Engineering
    identifier doi: 10.1061/JCCEE5.CPENG-6249
    journal firstpage: 04024061-1
    journal lastpage: 04024061-11
    page: 11
    tree: Journal of Computing in Civil Engineering; 2025; Volume 39; Issue 2
    contenttype: Fulltext