YaBeSH Engineering and Technology Library



    Causation Entropy Identifies Sparsity Structure for Parameter Estimation of Dynamic Systems

    Source: Journal of Computational and Nonlinear Dynamics, 2017, Volume 12, Issue 1, Page 011008
    Author: Kim, Pileun; Rogers, Jonathan; Sun, Jie; Bollt, Erik
    DOI: 10.1115/1.4034126
    Publisher: The American Society of Mechanical Engineers (ASME)
    Abstract: Parameter estimation is an important topic in the field of system identification. This paper explores the role of a new information theory measure of data dependency in parameter estimation problems. Causation entropy is a recently proposed information-theoretic measure of influence between components of multivariate time series data. Because causation entropy measures the influence of one dataset upon another, it is naturally related to the parameters of a dynamical system. In this paper, it is shown that by numerically estimating causation entropy from the outputs of a dynamic system, it is possible to uncover the internal parametric structure of the system and thus establish the relative magnitude of system parameters. In the simple case of linear systems subject to Gaussian uncertainty, it is first shown that causation entropy can be represented in closed form as the logarithm of a rational function of system parameters. For more general systems, a causation entropy estimator is proposed, which allows causation entropy to be numerically estimated from measurement data. Results are provided for discrete linear and nonlinear systems, thus showing that numerical estimates of causation entropy can be used to identify the dependencies between system states directly from output data. Causation entropy estimates can therefore be used to inform parameter estimation by reducing the size of the parameter set or to generate a more accurate initial guess for subsequent parameter optimization.
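
    As a rough illustration of the idea described in the abstract (not code from the paper), the sketch below simulates a sparse linear system driven by Gaussian noise and evaluates causation entropy between pairs of states using the Gaussian closed form, i.e., conditional mutual information computed from covariance log-determinants. Near-zero entries then flag absent couplings, recovering the sparsity structure of the system matrix. The system matrix, noise level, and threshold are illustrative assumptions, not values from the paper.

# A minimal sketch, assuming a Gaussian linear setting: estimate causation
# entropy C_{j->i | rest} from output data and read off the sparsity of A.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse linear system: x_{t+1} = A x_t + w_t, w_t ~ N(0, 0.1^2 I)
A = np.array([[0.6, 0.0, 0.2],
              [0.0, 0.5, 0.0],
              [0.3, 0.0, 0.4]])
n, T = A.shape[0], 20000
X = np.zeros((T, n))
for t in range(T - 1):
    X[t + 1] = A @ X[t] + 0.1 * rng.standard_normal(n)

def logdet_cov(*cols):
    """Log-determinant of the sample covariance of the stacked columns."""
    M = np.column_stack(cols)
    S = np.atleast_2d(np.cov(M, rowvar=False))
    return np.linalg.slogdet(S)[1]

def causation_entropy(X, j, i):
    """C_{j->i | rest}: I(x_i(t+1); x_j(t) | x_rest(t)) under a Gaussian
    assumption, computed in closed form from covariance log-determinants."""
    rest = [k for k in range(X.shape[1]) if k != j]
    Y, Xj, Z = X[1:, i], X[:-1, j], X[:-1, rest]
    return 0.5 * (logdet_cov(Y, Z) + logdet_cov(Xj, Z)
                  - logdet_cov(Z) - logdet_cov(Y, Xj, Z))

# C[i, j] should be numerically close to zero wherever A[i, j] = 0.
C = np.array([[causation_entropy(X, j, i) for j in range(n)] for i in range(n)])
print(np.round(C, 4))
print(np.abs(C) > 1e-3)   # estimated sparsity pattern (illustrative threshold)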


    URI
    http://yetl.yabesh.ir/yetl1/handle/yetl/4236348
    Collections
    • Journal of Computational and Nonlinear Dynamics

    Full item record

    contributor author: Kim, Pileun
    contributor author: Rogers, Jonathan
    contributor author: Sun, Jie
    contributor author: Bollt, Erik
    date accessioned: 2017-11-25T07:20:17Z
    date available: 2017-11-25T07:20:17Z
    date copyright: 2016/1/9
    date issued: 2017
    identifier issn: 1555-1415
    identifier other: cnd_012_01_011008.pdf
    identifier uri: http://138.201.223.254:8080/yetl1/handle/yetl/4236348
    publisher: The American Society of Mechanical Engineers (ASME)
    title: Causation Entropy Identifies Sparsity Structure for Parameter Estimation of Dynamic Systems
    type: Journal Paper
    journal volume: 12
    journal issue: 1
    journal title: Journal of Computational and Nonlinear Dynamics
    identifier doi: 10.1115/1.4034126
    journal firstpage: 11008
    journal lastpage: 011008-14
    tree: Journal of Computational and Nonlinear Dynamics, 2017, Volume 12, Issue 1
    contenttype: Fulltext
    DSpace software copyright © 2002-2015  DuraSpace
    DSpace digital library software localized into Persian by YaBeSH for Iranian libraries | Contact YaBeSH