YaBeSH Engineering and Technology Library



    Mean Squared Error May Lead You Astray When Optimizing Your Inverse Design Methods

    Source: Journal of Mechanical Design; 2024; Volume 147; Issue 2; page 21701-1
    Author: Habibi, Milad; Bernard, Shai; Wang, Jun; Fuge, Mark
    DOI: 10.1115/1.4066102
    Publisher: The American Society of Mechanical Engineers (ASME)
    Abstract: When performing time-intensive optimization tasks, such as those in topology or shape optimization, researchers have turned to machine-learned inverse design (ID) methods (i.e., predicting the optimized geometry from input conditions) to replace or warm start traditional optimizers. Such methods are often optimized to reduce the mean squared error (MSE) or binary cross entropy between the output and a training dataset of optimized designs. While convenient, we show that this choice may be myopic. Specifically, we compare two methods of optimizing the hyperparameters of easily reproducible machine learning models, including random forest, k-nearest neighbor, and deconvolutional neural network models, for predicting optimized topologies on three topology optimization problems. We show that, both under direct inverse design and when warm starting further topology optimization, using MSE metrics to tune hyperparameters produces less performant models than directly evaluating the objective function, though both produce designs that are almost one order of magnitude better than the common uniform initialization. We also illustrate how warm starting affects the convergence time, the types of solutions obtained during optimization, and the final designs. Overall, our initial results suggest that researchers may need to revisit common choices for evaluating ID methods, which subtly trade off factors in how an ID method will actually be used. We hope our open-source dataset and evaluation environment will spur additional research in those directions.
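    The core comparison described in the abstract (selecting surrogate hyperparameters by a proxy error such as MSE versus by directly evaluating the downstream design objective on the predictions) can be sketched in a few lines. The snippet below is an illustrative toy, not the paper's code, data, or models: the synthetic dataset, the random forest surrogate, and the design_objective function are stand-in assumptions chosen only to show how the two selection criteria are computed and can disagree.

    # Illustrative sketch (not the paper's code): compare selecting surrogate
    # hyperparameters by validation MSE versus by directly evaluating a design
    # objective on the predicted designs. Data and objective are synthetic stand-ins.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(400, 4))                    # toy "input conditions"
    Y = np.hstack([np.sin(3 * X), np.cos(2 * X)])     # toy "optimized designs"
    X_tr, X_val, Y_tr, Y_val = train_test_split(X, Y, test_size=0.25, random_state=0)

    def design_objective(designs):
        # Hypothetical downstream objective (lower is better) evaluated on predictions.
        return float(np.mean((designs.sum(axis=1) - 2.0) ** 2))

    candidates = [{"n_estimators": n, "max_depth": d}
                  for n in (10, 50, 100) for d in (3, None)]

    best_mse, best_obj = None, None
    for params in candidates:
        model = RandomForestRegressor(random_state=0, **params).fit(X_tr, Y_tr)
        pred = model.predict(X_val)
        mse = mean_squared_error(Y_val, pred)   # proxy metric on the predictions
        obj = design_objective(pred)            # downstream metric on the predictions
        if best_mse is None or mse < best_mse[0]:
            best_mse = (mse, params)
        if best_obj is None or obj < best_obj[0]:
            best_obj = (obj, params)

    print("Selected by MSE:      ", best_mse[1])
    print("Selected by objective:", best_obj[1])

    The two criteria can pick different hyperparameters because the proxy error averages element-wise deviation from the training designs, while the objective measures how the predicted design actually performs, which is the distinction the abstract draws when it warns that MSE may be a misleading tuning target.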


    URI
    http://yetl.yabesh.ir/yetl1/handle/yetl/4306424
    Collections
    • Journal of Mechanical Design

    Full item record

    contributor author: Habibi, Milad
    contributor author: Bernard, Shai
    contributor author: Wang, Jun
    contributor author: Fuge, Mark
    date accessioned: 2025-04-21T10:33:07Z
    date available: 2025-04-21T10:33:07Z
    date copyright: 8/28/2024 12:00:00 AM
    date issued: 2024
    identifier issn: 1050-0472
    identifier other: md_147_2_021701.pdf
    identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4306424
    publisher: The American Society of Mechanical Engineers (ASME)
    title: Mean Squared Error May Lead You Astray When Optimizing Your Inverse Design Methods
    type: Journal Paper
    journal volume: 147
    journal issue: 2
    journal title: Journal of Mechanical Design
    identifier doi: 10.1115/1.4066102
    journal firstpage: 21701-1
    journal lastpage: 21701-11
    page: 11
    tree: Journal of Mechanical Design; 2024; Volume 147; Issue 2
    contenttype: Fulltext