Bounds on the Entropy of a Function of a Random Variable and Their Applications
Published in: IEEE Transactions on Information Theory, Vol. 64, No. 4, pp. 2220-2230
Main Authors:
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01-04-2018
Summary: It is well known that the entropy $H(X)$ of a discrete random variable $X$ is always greater than or equal to the entropy $H(f(X))$ of a function $f$ of $X$, with equality if and only if $f$ is one-to-one. In this paper, we give tight bounds on $H(f(X))$ when the function $f$ is not one-to-one, and we illustrate a few scenarios where this matters. As an intermediate step toward our main result, we derive a lower bound on the entropy of a probability distribution when only a bound on the ratio between the maximal and minimal probabilities is known. The lower bound improves on previous results in the literature, and it could find applications outside the present scenario.
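As a quick numerical illustration of the inequality $H(f(X)) \le H(X)$ stated in the summary, the following minimal Python sketch computes both entropies for an arbitrary example distribution and a non-one-to-one map $f$; the specific distribution and choice of $f$ are illustrative assumptions, not taken from the paper.

```python
# Sketch (not from the paper): for a non-one-to-one f, H(f(X)) <= H(X).
from math import log2
from collections import defaultdict

def entropy(probs):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(p * log2(p) for p in probs if p > 0)

# An example distribution on {0, 1, 2, 3} and a non-injective map f(x) = x mod 2.
p_x = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
f = lambda x: x % 2

# Push the distribution forward through f: Pr[f(X) = y] = sum of Pr[X = x] over x with f(x) = y.
p_fx = defaultdict(float)
for x, px in p_x.items():
    p_fx[f(x)] += px

print(f"H(X)    = {entropy(p_x.values()):.4f} bits")   # ~1.8464
print(f"H(f(X)) = {entropy(p_fx.values()):.4f} bits")  # ~0.9710, strictly less than H(X)
```

Because $f$ merges the symbols $\{0,2\}$ and $\{1,3\}$, information is lost and the entropy strictly decreases; the paper's contribution is to bound exactly how large $H(f(X))$ can remain in such non-injective cases.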
ISSN: 0018-9448, 1557-9654
DOI: | 10.1109/TIT.2017.2787181 |