Information inequalities: interpretations and applications

Abstract: We review and refine classical inequalities (of Shannon, Han, Shearer, etc.) for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. A duality between these upper bounds and new lower bounds for joint entropy is developed, as are connections to entropy power inequalities. Applications include a new upper bound on the number of independent sets in an arbitrary graph, and new determinantal inequalities for positive-definite matrices. This is joint work with Prasad Tetali (Georgia Tech).
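
For background, a standard statement of Shearer's lemma (given here as classical context, not the refined form developed in this work): if $X_1,\dots,X_n$ are jointly distributed random variables and $\mathcal{F}$ is a collection of subsets of $\{1,\dots,n\}$ in which every index appears in at least $k$ members, then
\[
H(X_1,\dots,X_n) \;\le\; \frac{1}{k} \sum_{S \in \mathcal{F}} H(X_S),
\]
where $X_S = (X_i : i \in S)$ and $H$ denotes joint entropy. Han's inequality is the special case in which $\mathcal{F}$ consists of all $(n-1)$-element subsets, so that $k = n-1$.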