On Some Shortcomings of Shannon Entropy as a Measure of Information Content in Indirect Measurements of Continuous Variables
Source: Journal of Atmospheric and Oceanic Technology, 2018, Volume 35, Issue 5, Page 1011
Author: Petty, Grant W.
DOI: 10.1175/JTECH-D-17-0056.1
Publisher: American Meteorological Society
Abstract: Shannon entropy has long been accepted as a primary basis for assessing the information content of sensor channels used for the remote sensing of atmospheric variables. It is not widely appreciated, however, that Shannon information content (SIC) can be misleading in retrieval problems involving nonlinear mappings between direct observations and retrieved variables and/or non-Gaussian prior and posterior PDFs. The potentially severe shortcomings of SIC are illustrated with simple experiments that reveal, for example, that a measurement can be judged to provide negative information even in cases in which the postretrieval PDF is undeniably improved over an informed prior based on climatology. Following previous authors, writing mainly in the data assimilation and climate analysis literature, the Kullback–Leibler (KL) divergence, also commonly known as relative entropy, is shown to suffer from fewer obvious defects in this particular context. Yet even KL divergence is blind to the expected magnitude of errors as typically measured by the error variance or root-mean-square error. Thus, neither information metric can necessarily be counted on to respond in a predictable way to changes in the precision or quality of a retrieved quantity.
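As a quick numerical illustration of the abstract's central point (a sketch, not taken from the paper's own experiments), the snippet below compares SIC and KL divergence for a hypothetical retrieval: a narrow two-regime climatological prior and a broader unimodal post-retrieval PDF. The grid, the distributions, and all parameter values are illustrative assumptions only.

```python
import numpy as np

# Discretize the retrieved variable on a fine grid so that differential
# entropies and KL divergence are approximated by sums over the grid.
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Hypothetical climatological prior: a sharply peaked two-regime mixture.
prior = 0.5 * gaussian(x, -3.0, 0.4) + 0.5 * gaussian(x, 3.0, 0.4)

# Hypothetical post-retrieval PDF: the measurement rules out one regime
# entirely but leaves a broader unimodal distribution around the other.
posterior = gaussian(x, 3.0, 1.5)

def entropy(p):
    """Differential entropy in nats, approximated on the grid."""
    p = np.clip(p, 1e-300, None)  # avoid log(0); tiny p contributes ~0
    return -np.sum(p * np.log(p)) * dx

def kl_divergence(p, q):
    """KL divergence D(p || q) in nats, approximated on the grid."""
    p = np.clip(p, 1e-300, None)
    q = np.clip(q, 1e-300, None)
    return np.sum(p * np.log(p / q)) * dx

sic = entropy(prior) - entropy(posterior)  # Shannon information content
kl = kl_divergence(posterior, prior)       # relative entropy, posterior vs. prior

print(f"SIC = {sic:+.3f} nats")  # comes out negative for these choices
print(f"KL  = {kl:+.3f} nats")   # always >= 0, positive when PDFs differ
```

Because the two-regime prior has lower differential entropy than the broad posterior, SIC is negative here even though the measurement has arguably improved the state of knowledge by eliminating one climatological regime; KL divergence, by contrast, remains positive. This mirrors the behavior the abstract describes, though the paper's actual experiments may differ.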
contributor author | Petty, Grant W.
date accessioned | 2019-09-19T10:03:15Z
date available | 2019-09-19T10:03:15Z
date copyright | 2018-03-15
date issued | 2018
identifier other | jtech-d-17-0056.1.pdf
identifier uri | http://yetl.yabesh.ir/yetl1/handle/yetl/4261019
publisher | American Meteorological Society
title | On Some Shortcomings of Shannon Entropy as a Measure of Information Content in Indirect Measurements of Continuous Variables
type | Journal Paper
journal volume | 35
journal issue | 5
journal title | Journal of Atmospheric and Oceanic Technology
identifier doi | 10.1175/JTECH-D-17-0056.1
journal firstpage | 1011
journal lastpage | 1021
tree | Journal of Atmospheric and Oceanic Technology; 2018; Volume 35, Issue 5
contenttype | Fulltext