YaBeSH Engineering and Technology Library



    Optimal Test Selection for Prediction Uncertainty Reduction

Source: Journal of Verification, Validation and Uncertainty Quantification; 2016; Volume 1; Issue 4; Page 041002
Author: Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel
    DOI: 10.1115/1.4035204
    Publisher: The American Society of Mechanical Engineers (ASME)
Abstract: Economic factors and experimental limitations often lead to sparse and/or imprecise data for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework weighs the cost of additional data against its importance, measured by its impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. The paper then proposes a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions). The proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.
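    The abstract frames test selection as a constrained discrete optimization over the number of calibration and validation tests. As a rough, self-contained illustration of that kind of formulation only (not the authors' implementation), the Python sketch below enumerates feasible test allocations under an assumed budget and keeps the one minimizing a stand-in prediction-variance surrogate; the test types, costs, weights, budget, and the 1/(1 + n) uncertainty-reduction model are all invented for illustration.

        import itertools

        # Illustrative inputs only: test types, per-test costs, variance
        # weights, the budget, and the per-type cap are all assumptions.
        COSTS = {"cal_A": 3.0, "cal_B": 5.0, "val_A": 4.0}    # cost per test
        WEIGHTS = {"cal_A": 1.0, "cal_B": 2.0, "val_A": 1.5}  # variance contributions
        BUDGET = 30.0
        MAX_TESTS = 6  # upper bound on tests of any one type

        def prediction_variance(counts):
            """Assumed surrogate: each test type contributes WEIGHTS[t],
            shrunk by a factor 1/(1 + n) as n tests of that type are added."""
            return sum(WEIGHTS[t] / (1.0 + counts[t]) for t in counts)

        def best_allocation():
            """Enumerate all feasible integer test counts under the budget
            and return the allocation with the lowest surrogate variance."""
            types = sorted(COSTS)
            best = None
            for combo in itertools.product(range(MAX_TESTS + 1), repeat=len(types)):
                counts = dict(zip(types, combo))
                cost = sum(COSTS[t] * counts[t] for t in types)
                if cost > BUDGET:
                    continue  # infeasible: exceeds the testing budget
                var = prediction_variance(counts)
                if best is None or var < best[0]:
                    best = (var, cost, counts)
            return best

        if __name__ == "__main__":
            var, cost, counts = best_allocation()
            print(f"tests: {counts}  cost: {cost:.1f}  surrogate variance: {var:.3f}")

    In the paper itself, the objective would come from propagating the calibration and validation uncertainty through to the prediction of interest; the toy surrogate above merely stands in for that quantity so the combinatorial selection step can be shown end to end.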


    URI
    http://yetl.yabesh.ir/yetl1/handle/yetl/4236156
    Collections
    • Journal of Verification, Validation and Uncertainty Quantification

Full item record

contributor author: Mullins, Joshua
contributor author: Mahadevan, Sankaran
contributor author: Urbina, Angel
date accessioned: 2017-11-25T07:20:00Z
date available: 2017-11-25T07:20:00Z
date copyright: 2016/12/02
date issued: 2016
identifier issn: 2377-2158
identifier other: vvuq_001_04_041002.pdf
identifier uri: http://138.201.223.254:8080/yetl1/handle/yetl/4236156
publisher: The American Society of Mechanical Engineers (ASME)
title: Optimal Test Selection for Prediction Uncertainty Reduction
type: Journal Paper
journal volume: 1
journal issue: 4
journal title: Journal of Verification, Validation and Uncertainty Quantification
identifier doi: 10.1115/1.4035204
journal firstpage: 041002
journal lastpage: 041002-10
tree: Journal of Verification, Validation and Uncertainty Quantification; 2016; Volume 1; Issue 4
contenttype: Fulltext
DSpace software copyright © 2002-2015 DuraSpace
The "DSpace" digital library software was localized into Persian by YaBeSH for Iranian libraries | Contact YaBeSH
     