
Inference in Increasing Dimension

Since the 1960s, statisticians have recognized the importance of characterizing the role of parameter dimension in statistical inference. To this end, pioneers such as Peter Huber introduced a new feature into asymptotics: the number of parameters, p, is allowed to grow with the sample size n. Following this line of thinking, statisticians such as Stephen Portnoy and Enno Mammen developed a set of seminal results in the 1980s and 1990s, sharply characterizing the limiting behavior of regression estimators and exploring the boundary growth rate of p relative to n. More recent results, aimed at more general statistical problems, include He and Shao (2000) and Spokoiny (2012). In this talk, I will give a systematic review of these results and present some new features. In particular, a simple criterion will be given that guarantees asymptotic normality for general problems under the boundary condition p^2/n -> 0. The proof rests on Talagrand's generic chaining for empirical processes. Smoothness will be shown, as always, to play a key role in the analysis.


Room 409