Mutual Information Matrices Are Not Always Positive Semidefinite


Bibliographic Details
Published in: IEEE Transactions on Information Theory, Vol. 60, No. 5, pp. 2694–2696
Main Author: Jakobsen, Sune K.
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 01-05-2014
Description
Summary: For discrete random variables X_1, ..., X_n we construct an n-by-n matrix whose (i, j)-entry is the mutual information I(X_i; X_j) between X_i and X_j; in particular, the (i, i)-entry is the entropy H(X_i) = I(X_i; X_i) of X_i. This matrix, called the mutual information matrix of (X_1, ..., X_n), has been conjectured to be positive semidefinite. In this paper, we give counterexamples to the conjecture and show that it holds for up to three random variables.
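The construction described in the abstract can be sketched in code. The following is a minimal illustration, not taken from the paper: it assumes a joint pmf given as a dict mapping outcome tuples to probabilities, and `mutual_info_matrix` is a hypothetical helper name. Entry (i, j) is I(X_i; X_j) in bits, and the diagonal recovers H(X_i) because I(X_i; X_i) = H(X_i).

```python
import math

def mutual_info_matrix(joint):
    """Build the mutual information matrix of a joint pmf.

    `joint` maps outcome tuples (x_1, ..., x_n) to probabilities.
    Entry (i, j) is I(X_i; X_j); the diagonal holds H(X_i) = I(X_i; X_i).
    (Illustrative helper; names and representation are assumptions.)
    """
    n = len(next(iter(joint)))

    def marginal(idx):
        # pmf of the subset of variables listed in idx
        p = {}
        for outcome, pr in joint.items():
            key = tuple(outcome[i] for i in idx)
            p[key] = p.get(key, 0.0) + pr
        return p

    def mi(i, j):
        pij, pi, pj = marginal((i, j)), marginal((i,)), marginal((j,))
        return sum(pr * math.log2(pr / (pi[(x,)] * pj[(y,)]))
                   for (x, y), pr in pij.items() if pr > 0)

    return [[mi(i, j) for j in range(n)] for i in range(n)]

# Example: X a fair bit and Y = X, so H(X) = H(Y) = I(X; Y) = 1 bit.
M = mutual_info_matrix({(0, 0): 0.5, (1, 1): 0.5})
# For n = 2 the matrix is PSD iff its determinant is nonnegative,
# consistent with the paper's result that the conjecture holds for n <= 3.
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
```

For two variables the matrix [[H(X), I(X;Y)], [I(X;Y), H(Y)]] is always positive semidefinite since I(X;Y) <= min(H(X), H(Y)); the paper's counterexamples require more variables.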
ISSN: 0018-9448 (print); 1557-9654 (electronic)
DOI:10.1109/TIT.2014.2311434