Professor Chandra Nair
Department of Information Engineering, Chinese University of Hong Kong

Topic: Information Inequalities over Additive Structures

Abstract:

In this lecture series, we will review and explore various information inequalities over additive structures, discussing their implications and proofs across different mathematical contexts. Additive structures play an important role in several parts of information theory. For instance, the additive Gaussian noise model is a standard assumption in wireless settings. However, additive structures also play a significant role in other aspects of mathematics.

In the first two lectures we will consider continuous-valued random variables and establish some well-known inequalities, such as the entropy power inequality and its generalizations. The lecture series will begin with minimal assumptions about the audience’s background, though it will progress quickly, which may be challenging for those not previously exposed to information theory. Therefore, it may be advisable to review the basic information-theoretic measures, such as entropy, differential entropy, and mutual information, along with their properties, including the chain rule and the data-processing inequality. For example, familiarity with the contents of Chapter 2 of Cover and Thomas will be more than sufficient.
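As a quick refresher on these basic measures, the sketch below computes entropy, conditional entropy, and mutual information for a small joint distribution (the pmf is an arbitrary illustrative choice, not from the lectures), and checks the chain rule and the nonnegativity of mutual information numerically:

```python
import math

# Illustrative joint pmf p(x, y) over binary alphabets (values chosen arbitrarily).
p_xy = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

def H(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Marginal distributions of X and Y.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

H_xy = H(p_xy)
H_x, H_y = H(p_x), H(p_y)

# Chain rule: H(X, Y) = H(X) + H(Y | X), so H(Y | X) = H(X, Y) - H(X).
H_y_given_x = H_xy - H_x

# Mutual information: I(X; Y) = H(X) + H(Y) - H(X, Y), always nonnegative.
I_xy = H_x + H_y - H_xy

print(f"H(X,Y)        = {H_xy:.4f} bits")
print(f"H(X) + H(Y|X) = {H_x + H_y_given_x:.4f} bits  (chain rule)")
print(f"I(X;Y)        = {I_xy:.4f} bits  (nonnegative)")
```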

The main aim here is to provide two proofs:

a) Proof of a unified form of the Entropy Power Inequality and the Brascamp-Lieb inequality via a sub-additivity argument.

b) Proof of a generalized Entropy Power Inequality as a consequence of a two-point inequality concerning mutual information.

Each of these proofs utilizes techniques that have proven effective in other contexts.
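For reference, the classical entropy power inequality underlying both proofs states that for independent random variables $X$ and $Y$ on $\mathbb{R}$ with densities and differential entropies $h(X)$, $h(Y)$,

```latex
N(X) := \frac{1}{2\pi e}\, e^{2 h(X)}, \qquad N(X+Y) \;\ge\; N(X) + N(Y),
```

with equality if and only if $X$ and $Y$ are Gaussian. (In $\mathbb{R}^n$ the exponent becomes $2h(X)/n$.)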

In the third and fourth lectures, we will switch to random variables that take values in finite Abelian groups. The aim here is again to provide two proofs:

a) Proof of a family of inequalities analogous to the first result in the first part of this series (with uniform distributions replacing Gaussians).

b) Proof of the optimality of uniform distributions for an information functional devised by Gowers, Green, Manners, and Tao.

Interspersed throughout the lecture series will be open questions. In particular, we will discuss why these results should excite multi-user information theorists. The work of Gowers, Green, Manners, and Tao will be highlighted as particularly relevant.

If time permits, the lecture series will also establish a generalized equivalence theorem that maps inequalities about set sizes in additive combinatorics to entropic inequalities.
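A standard entry point to such equivalences (stated here only as illustrative background, not as the theorem of the lectures): a uniform random variable $U_A$ on a finite set $A$ satisfies $H(U_A) = \log |A|$, so cardinality inequalities translate into entropy inequalities. For instance, the trivial sumset bound mirrors subadditivity of entropy for independent $X$ and $Y$:

```latex
|A + B| \le |A|\,|B| \quad\longleftrightarrow\quad H(X+Y) \le H(X) + H(Y),
```

where the right-hand side follows because $X+Y$ is a function of the pair $(X, Y)$ and $H(X, Y) = H(X) + H(Y)$ under independence.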

Bio:

Chandra Nair is a Professor in the Department of Information Engineering at The Chinese University of Hong Kong. He serves as the Programme Director of the Mathematics and Information Engineering undergraduate program.

Chandra Nair received his Bachelor's degree (B.Tech, Electrical Engineering) from IIT Madras, India, and his Ph.D. from the Electrical Engineering department at Stanford University. He has been an Associate Editor for the IEEE Transactions on Information Theory and was a Distinguished Lecturer of the IEEE Information Theory Society. He is a Fellow of the IEEE. He is a co-recipient of the 2016 Information Theory Society Paper Award, and was a plenary speaker at the 2021 IEEE International Symposium on Information Theory.

His recent research interests and contributions are in developing ideas, tools, and techniques to tackle families of combinatorial and non-convex optimization problems arising primarily in the information sciences.

