Theoretical Statistics and Mathematics Unit, ISI Delhi

October 25, 2019 (Friday)
3:30 PM (Webinar)

Speaker:
Debraj Das,
IIT Kanpur

Title:
High Dimensional Central Limit Theorem

Abstract of Talk

In this talk, I will try to answer the simple question:
``\textit{When does the central limit theorem fail if the dimension increases with the sample size $n$?}''
Specifically, I am interested in the normal approximation of a suitably scaled version of the sum $\sum_{i=1}^{n}X_i$, uniformly over the class $\mathcal{A}=\{\prod_{j=1}^{p}(-\infty,x_j]:x_1,x_2,\dots, x_p \in \mathbb{R}\}$, where $X_1,\dots,X_n$ are zero-mean independent $p$-dimensional random vectors, each having independent and identically distributed (iid) components. By recent results of Chernozhukov et al. (2017), the CLT holds uniformly over $\mathcal{A}$ if $\log p_n=o(n^{1/7})$. They conjectured that, for the CLT to hold uniformly over the class of hyper-rectangles, the optimal rate is $(\log p_n)^3 = o(n)$. I show instead that, under some conditions, the CLT holds uniformly over $\mathcal{A}$ when $(\log p_n)^2=o(n)$. When $\log p_n =\epsilon \sqrt{n}$ for some sufficiently small $\epsilon>0$, the normal approximation is valid with an error $\epsilon$, uniformly over $\mathcal{A}$. Moreover, I show by an example that the uniform CLT over $\mathcal{A}$ fails if $\log p_n\geq 2\sqrt{n\log n}$. Hence the optimal growth rate of $\log p_n$ lies somewhere between $\sqrt{n}$ and $\sqrt{n\log n}$. The conjecture of Chernozhukov et al. (2017) is thus partially resolved and extended for the class $\mathcal{A}$.
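As an illustration of the quantity being controlled (not taken from the talk itself), the following sketch runs a small Monte Carlo check of the normal approximation over a single hyper-rectangle $A = (-\infty, x]^p$ from the class $\mathcal{A}$, using iid Rademacher components (mean zero, unit variance). The sample size, dimension, and rectangle are illustrative choices, not values from the results above.

```python
import math
import numpy as np

# Illustrative parameters (hypothetical, chosen for a quick simulation):
# n observations, dimension p, rectangle A = (-inf, x]^p, reps replications.
rng = np.random.default_rng(0)
n, p, x, reps = 200, 10, 1.0, 4000

# X_i are p-dimensional vectors with iid +/-1 (Rademacher) components;
# S = n^{-1/2} * sum_{i=1}^n X_i is the scaled sum from the abstract.
X = rng.choice([-1.0, 1.0], size=(reps, n, p))
S = X.sum(axis=1) / math.sqrt(n)

# Empirical P(S in A) versus its Gaussian limit. Since the coordinates of
# S are independent with each approximately N(0,1), the limit for this
# rectangle is Phi(x)^p.
emp = float(np.mean((S <= x).all(axis=1)))
Phi = 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
gauss = Phi ** p
print(f"empirical P(S in A) = {emp:.3f}, Gaussian limit Phi(x)^p = {gauss:.3f}")
```

For fixed $p$ the two probabilities agree up to the usual Berry--Esseen-type error; the results discussed above concern how fast $p = p_n$ may grow with $n$ before such agreement, uniform over all of $\mathcal{A}$, breaks down.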