Entropy power inequalities and monotonicity in central limit theorems

Abstract: New entropy power inequalities and Fisher information inequalities have recently been developed that relate the information in a sum of independent random variables to the information in sums over subsets of those variables, where arbitrary collections of subsets are allowed. I present a simple proof of these inequalities, developed jointly with Andrew Barron. The special case of singleton subsets is due to Shannon and to Stam, and the case of leave-one-out subsets is due to Artstein, Ball, Barthe and Naor. These inequalities demonstrate the monotonicity of convergence in central limit theorems, and they strengthen the interpretation of central limit theorems as formulations of the second law of thermodynamics. I will also touch upon related work on distributed estimation and matrix inequalities, carried out in various collaborations with A. Barron (Yale), A. Kagan and T. Yu (University of Maryland).
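
For concreteness, one convenient form of the entropy power inequality in question can be sketched as follows, assuming $h$ denotes differential entropy and each index appears in at most $r$ of the chosen subsets (the general statement is phrased in terms of fractional partitions, so this is a special case rather than the full result):
\[
e^{2h(X_1 + \cdots + X_n)} \;\geq\; \frac{1}{r} \sum_{s \in \mathcal{C}} e^{2h\left(\sum_{j \in s} X_j\right)},
\]
where $X_1, \ldots, X_n$ are independent random variables with densities and $\mathcal{C}$ is an arbitrary collection of subsets of $\{1, \ldots, n\}$. Taking $\mathcal{C}$ to be the singletons (with $r = 1$) recovers the Shannon–Stam entropy power inequality, while taking $\mathcal{C}$ to be the $n$ leave-one-out subsets (with $r = n - 1$) recovers the inequality of Artstein, Ball, Barthe and Naor, which for i.i.d. summands implies that the entropy of the standardized sums $S_n = (X_1 + \cdots + X_n)/\sqrt{n}$ is non-decreasing in $n$.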