YaBeSH Engineering and Technology Library


    Deep Multi-Modal U-Net Fusion Methodology of Thermal and Ultrasonic Images for Porosity Detection in Additive Manufacturing

    Source: Journal of Manufacturing Science and Engineering, 2023, Volume 145, Issue 6, page 061009-1
    Author: Zamiela, Christian; Jiang, Zhipeng; Stokes, Ryan; Tian, Zhenhua; Netchaev, Anton; Dickerson, Charles; Tian, Wenmeng; Bian, Linkan
    DOI: 10.1115/1.4056873
    Publisher: The American Society of Mechanical Engineers (ASME)
    Abstract: We developed a deep fusion methodology of nondestructive in-situ thermal and ex-situ ultrasonic images for porosity detection in laser-based additive manufacturing (LBAM). A core challenge in LBAM is the lack of fusion between successive layers of printed metal. Ultrasonic imaging can capture structural abnormalities by passing waves through successive layers. Alternatively, in-situ thermal images track the thermal history during fabrication. The proposed sensor fusion U-Net methodology fills the gap in fusing in-situ images with ex-situ images by employing a two-branch convolutional neural network (CNN) for feature extraction and segmentation to produce a 2D image of porosity. We modify the U-Net framework with inception and long short-term memory (LSTM) blocks. We validate the models by comparing our single-modality and fusion models against ground-truth X-ray computed tomography (XCT) images. The inception U-Net fusion model achieved the highest mean intersection-over-union (IoU) score of 0.93.
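
    The abstract describes the overall architecture at a high level: two modality-specific encoder branches whose features are fused and decoded into a 2D porosity segmentation map, validated with mean IoU against XCT ground truth. The record contains no code, so the following is only a minimal PyTorch sketch of that two-branch fusion idea; the layer widths, input sizes, plain convolutional blocks (standing in for the paper's inception/LSTM variants), and the TwoBranchFusionUNet/mean_iou names are illustrative assumptions, not the authors' implementation.

# Minimal sketch, assuming grayscale thermal and ultrasonic inputs of equal size.
# Not the authors' network: plain double-conv blocks stand in for the paper's
# inception/LSTM variants, and all layer widths are illustrative.
import torch
import torch.nn as nn


def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    # Standard U-Net-style double 3x3 convolution with batch norm and ReLU.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class Encoder(nn.Module):
    # Per-modality encoder: one full-resolution block, one downsampled block.
    def __init__(self, in_ch: int):
        super().__init__()
        self.block1 = conv_block(in_ch, 32)
        self.block2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        s1 = self.block1(x)              # full-resolution features (skip connection)
        s2 = self.block2(self.pool(s1))  # half-resolution features
        return s1, s2


class TwoBranchFusionUNet(nn.Module):
    # Two encoder branches (thermal, ultrasonic) fused by channel concatenation,
    # followed by a shared decoder that outputs per-pixel porosity logits.
    def __init__(self):
        super().__init__()
        self.thermal_enc = Encoder(in_ch=1)
        self.ultrasonic_enc = Encoder(in_ch=1)
        self.bottleneck = conv_block(64 + 64, 128)
        self.up = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.decoder = conv_block(64 + 32 + 32, 64)  # upsampled features + both skips
        self.head = nn.Conv2d(64, 1, kernel_size=1)

    def forward(self, thermal, ultrasonic):
        t1, t2 = self.thermal_enc(thermal)
        u1, u2 = self.ultrasonic_enc(ultrasonic)
        fused = self.bottleneck(torch.cat([t2, u2], dim=1))  # bottleneck-level fusion
        up = self.up(fused)
        out = self.decoder(torch.cat([up, t1, u1], dim=1))   # skip-level fusion
        return self.head(out)                                # sigmoid + threshold for a mask


def mean_iou(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    # Mean intersection-over-union between binary masks (the validation metric
    # the abstract reports against XCT ground truth).
    pred_b, target_b = pred.bool(), target.bool()
    inter = (pred_b & target_b).flatten(1).sum(dim=1).float()
    union = (pred_b | target_b).flatten(1).sum(dim=1).float()
    return ((inter + eps) / (union + eps)).mean()


if __name__ == "__main__":
    thermal = torch.randn(2, 1, 64, 64)      # batch of in-situ thermal frames
    ultrasonic = torch.randn(2, 1, 64, 64)   # co-registered ex-situ ultrasonic images
    model = TwoBranchFusionUNet()
    logits = model(thermal, ultrasonic)
    mask = torch.sigmoid(logits) > 0.5
    print(logits.shape, mean_iou(mask, mask).item())

    Concatenating the two encoders' features at the bottleneck (and their skip connections in the decoder) is one common way to realize this kind of sensor fusion; in the paper's setup, inception and LSTM blocks would take the place of the plain convolutional blocks in the encoder path.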


    URI: http://yetl.yabesh.ir/yetl1/handle/yetl/4292293
    Collections
    • Journal of Manufacturing Science and Engineering

    Full item record

    Contributor author: Zamiela, Christian
    Contributor author: Jiang, Zhipeng
    Contributor author: Stokes, Ryan
    Contributor author: Tian, Zhenhua
    Contributor author: Netchaev, Anton
    Contributor author: Dickerson, Charles
    Contributor author: Tian, Wenmeng
    Contributor author: Bian, Linkan
    Date accessioned: 2023-08-16T18:40:09Z
    Date available: 2023-08-16T18:40:09Z
    Date copyright: 2023-03-14
    Date issued: 2023
    Identifier (ISSN): 1087-1357
    Identifier (other): manu_145_6_061009.pdf
    Identifier (URI): http://yetl.yabesh.ir/yetl1/handle/yetl/4292293
    Publisher: The American Society of Mechanical Engineers (ASME)
    Title: Deep Multi-Modal U-Net Fusion Methodology of Thermal and Ultrasonic Images for Porosity Detection in Additive Manufacturing
    Type: Journal Paper
    Journal volume: 145
    Journal issue: 6
    Journal title: Journal of Manufacturing Science and Engineering
    Identifier (DOI): 10.1115/1.4056873
    Journal first page: 061009-1
    Journal last page: 061009-13
    Pages: 13
    Tree: Journal of Manufacturing Science and Engineering; 2023; Volume 145; Issue 6
    Content type: Fulltext