Abstract
This paper focuses on illustrating 1) the equivalence between Stein's identity and De Bruijn's identity, and 2) two extensions of De Bruijn's identity. First, it is shown that Stein's identity is equivalent to De Bruijn's identity under additive noise channels with certain conditions. Second, for arbitrary but fixed input and noise distributions under additive noise channels, the first derivative of the differential entropy is expressed as a function of the posterior mean, and the second derivative of the differential entropy is expressed in terms of a function of the Fisher information. Several applications in a number of fields, such as signal processing and information theory, are presented to illustrate the usefulness of the results developed in this paper.
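For reference, the two identities named in the abstract are commonly stated as below; the notation (an additive noise channel $Y = X + \sqrt{t}\,W$ with standard Gaussian $W$) is a standard convention and not necessarily the exact formulation used in the paper.

```latex
% Stein's identity (Stein's lemma): for W ~ N(mu, sigma^2) and any
% sufficiently smooth g with finite E|g'(W)|,
\mathbb{E}\!\left[\, g(W)\,(W-\mu) \,\right] \;=\; \sigma^{2}\,\mathbb{E}\!\left[\, g'(W) \,\right]

% De Bruijn's identity: for Y_t = X + \sqrt{t}\, W, where W ~ N(0,1)
% is independent of X, h(.) is differential entropy, and J(.) is the
% Fisher information with respect to a location parameter,
\frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left( X + \sqrt{t}\, W \right)
  \;=\; \frac{1}{2}\, J\!\left( X + \sqrt{t}\, W \right)
```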
| Original language | English |
|---|---|
| Article number | 6248705 |
| Pages (from-to) | 7045-7067 |
| Number of pages | 23 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 58 |
| Issue number | 12 |
| DOIs | |
| Publication status | Published - 2012 |
| Externally published | Yes |
Keywords
- Bayesian Cramér-Rao lower bound (BCRLB)
- Costa's EPI
- Cramér-Rao lower bound (CRLB)
- De Bruijn's identity
- entropy power inequality (EPI)
- Fisher information inequality (FII)
- Stein's identity