YaBeSH Engineering and Technology Library


    On Some Shortcomings of Shannon Entropy as a Measure of Information Content in Indirect Measurements of Continuous Variables

    Source: Journal of Atmospheric and Oceanic Technology, 2018, Volume 35, Issue 5, page 1011
    Author: Petty, Grant W.
    DOI: 10.1175/JTECH-D-17-0056.1
    Publisher: American Meteorological Society
    Abstract: Shannon entropy has long been accepted as a primary basis for assessing the information content of sensor channels used for the remote sensing of atmospheric variables. It is not widely appreciated, however, that Shannon information content (SIC) can be misleading in retrieval problems involving nonlinear mappings between direct observations and retrieved variables and/or non-Gaussian prior and posterior PDFs. The potentially severe shortcomings of SIC are illustrated with simple experiments that reveal, for example, that a measurement can be judged to provide negative information even in cases in which the postretrieval PDF is undeniably improved over an informed prior based on climatology. Following previous authors writing mainly in the data assimilation and climate analysis literature, the Kullback–Leibler (KL) divergence, also commonly known as relative entropy, is shown to suffer from fewer obvious defects in this particular context. Yet, even KL divergence is blind to the expected magnitude of errors as typically measured by the error variance or root-mean-square error. Thus, neither information metric can necessarily be counted on to respond in a predictable way to changes in the precision or quality of a retrieved quantity.
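    The abstract's central contrast can be made concrete with a small numerical sketch. This is not code from the paper; the two four-bin distributions below are invented purely for illustration. SIC is computed as the entropy reduction from prior to posterior, and KL divergence as D(posterior ‖ prior). The example shows a posterior that is flatter than a sharply peaked climatological prior (so SIC comes out negative) yet clearly differs from it (so KL divergence is positive), mirroring the kind of disagreement between the two metrics the abstract describes.

    ```python
    import math

    def entropy(p):
        """Shannon entropy (bits) of a discrete distribution."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def kl_divergence(post, prior):
        """Kullback-Leibler divergence D(post || prior) in bits."""
        return sum(q * math.log2(q / p) for q, p in zip(post, prior) if q > 0)

    # Hypothetical 4-bin example: a sharply peaked climatological prior,
    # and a flatter posterior that shifts most of its mass to a different bin.
    prior     = [0.94, 0.02, 0.02, 0.02]
    posterior = [0.10, 0.60, 0.20, 0.10]

    # SIC = H(prior) - H(posterior): negative here, because the posterior
    # has higher entropy, even though it plainly corrects the prior.
    sic = entropy(prior) - entropy(posterior)

    # KL divergence is strictly positive here: it registers that the
    # measurement changed the state of knowledge.
    kl = kl_divergence(posterior, prior)

    print(f"SIC = {sic:.3f} bits")  # negative
    print(f"KL  = {kl:.3f} bits")   # positive
    ```

    Note that, as the abstract also cautions, neither number by itself says anything about the error variance or RMSE of the retrieved quantity.
    
    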

    URI: http://yetl.yabesh.ir/yetl1/handle/yetl/4261019
    Collections: Journal of Atmospheric and Oceanic Technology

    Full item record

    contributor author: Petty, Grant W.
    date accessioned: 2019-09-19T10:03:15Z
    date available: 2019-09-19T10:03:15Z
    date copyright: 2018-03-15
    date issued: 2018
    identifier other: jtech-d-17-0056.1.pdf
    identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4261019
    publisher: American Meteorological Society
    title: On Some Shortcomings of Shannon Entropy as a Measure of Information Content in Indirect Measurements of Continuous Variables
    type: Journal Paper
    journal volume: 35
    journal issue: 5
    journal title: Journal of Atmospheric and Oceanic Technology
    identifier doi: 10.1175/JTECH-D-17-0056.1
    journal first page: 1011
    journal last page: 1021
    tree: Journal of Atmospheric and Oceanic Technology; 2018; Volume 35; Issue 5
    content type: Fulltext
    DSpace software copyright © 2002-2015 DuraSpace
    "DSpace" digital library software localized into Persian by Yabesh for Iranian libraries | Contact Yabesh