YaBeSH Engineering and Technology Library


    A Quantitative Insight Into the Role of Skip Connections in Deep Neural Networks of Low Complexity: A Case Study Directed at Fluid Flow Modeling

    Source: Journal of Computing and Information Science in Engineering, 2022, Volume 23, Issue 1, Page 014502
    Author: Choubineh, Abouzar; Chen, Jie; Coenen, Frans; Ma, Fei
    DOI: 10.1115/1.4054868
    Publisher: The American Society of Mechanical Engineers (ASME)
    Abstract: Deep feed-forward networks of high complexity backpropagate the gradient of the loss function from the final layers to the earlier ones. As a consequence, the gradient may shrink rapidly toward zero, a problem known as the vanishing gradient phenomenon, which prevents earlier layers from benefiting from further training. One of the most effective techniques for solving this problem is the use of skip connection (shortcut) schemes, which allow the gradient to be backpropagated directly to earlier layers. This paper investigates whether skip connections significantly affect the performance of deep neural networks of low complexity, or whether their inclusion has little or no effect. The analysis was conducted using four Convolutional Neural Networks (CNNs) that predict four different multiscale basis functions for the mixed Generalized Multiscale Finite Element Method (GMsFEM), applied to 249,375 samples. Three skip connection schemes were added to the base structure: Scheme 1 connects the first convolutional block to the last, Scheme 2 connects the middle block to the last, and Scheme 3 connects the middle block to both the last and the second-to-last blocks. The results demonstrate that the third scheme is the most effective: it increases the coefficient of determination (R²) by 0.0224–0.044 and decreases the Mean Squared Error (MSE) by 0.0027–0.0058 relative to the base structure. Hence, enriching the last convolutional blocks with information from neighboring blocks is more effective than enriching them with information from earlier blocks near the input layer.
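
    To make the three schemes concrete, here is a minimal PyTorch sketch. The six-block depth, constant channel width, addition-based merges, and the names SkipCNN and conv_block are illustrative assumptions, not the paper's exact architecture (which predicts GMsFEM multiscale basis functions).

        # Minimal sketch of the skip connection schemes described in the
        # abstract. Block count, channel width, and addition-based merges
        # are assumptions for illustration only.
        import torch
        import torch.nn as nn

        def conv_block(channels: int) -> nn.Sequential:
            # One convolutional block: 3x3 convolution + ReLU, preserving
            # the channel count so skip additions are shape-safe.
            return nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.ReLU(),
            )

        class SkipCNN(nn.Module):
            # scheme 0: base structure (no skips)
            # scheme 1: first block's output added at the last block's input
            # scheme 2: middle block's output added at the last block's input
            # scheme 3: middle block's output added at the inputs of both
            #           the last and the second-to-last blocks
            def __init__(self, channels: int = 16, scheme: int = 0):
                super().__init__()
                self.scheme = scheme
                self.blocks = nn.ModuleList(conv_block(channels) for _ in range(6))

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                outs = []
                for i, block in enumerate(self.blocks):
                    h = outs[-1] if outs else x
                    if i == 5 and self.scheme == 1:
                        h = h + outs[0]        # first -> last
                    if i == 5 and self.scheme in (2, 3):
                        h = h + outs[2]        # middle -> last
                    if i == 4 and self.scheme == 3:
                        h = h + outs[2]        # middle -> second-to-last
                    outs.append(block(h))
                return outs[-1]

        # Example: scheme 3 leaves the output shape unchanged.
        model = SkipCNN(channels=16, scheme=3)
        y = model(torch.randn(1, 16, 32, 32))  # -> shape (1, 16, 32, 32)

    Because zero-padding and a fixed channel width keep every block's output the same shape, a plain element-wise addition suffices for each merge; this is the usual choice when a skip spans blocks of equal width.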


    URI: http://yetl.yabesh.ir/yetl1/handle/yetl/4288130
    Collections
    • Journal of Computing and Information Science in Engineering

    Full item record

    contributor author: Choubineh, Abouzar; Chen, Jie; Coenen, Frans; Ma, Fei
    date accessioned: 2022-12-27T23:12:58Z
    date available: 2022-12-27T23:12:58Z
    date copyright: 7/18/2022
    date issued: 2022
    identifier issn: 1530-9827
    identifier other: jcise_23_1_014502.pdf
    identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4288130
    publisher: The American Society of Mechanical Engineers (ASME)
    title: A Quantitative Insight Into the Role of Skip Connections in Deep Neural Networks of Low Complexity: A Case Study Directed at Fluid Flow Modeling
    type: Journal Paper
    journal volume: 23
    journal issue: 1
    journal title: Journal of Computing and Information Science in Engineering
    identifier doi: 10.1115/1.4054868
    journal first page: 014502
    journal last page: 014502-9
    pages: 9
    tree: Journal of Computing and Information Science in Engineering; 2022; Volume 23; Issue 1
    contenttype: Fulltext
    DSpace software copyright © 2002-2015 DuraSpace
    The DSpace digital library software, localized into Persian by YaBeSH for Iranian libraries | Contact YaBeSH
     