# Bias Reduction and Asymptotic Efficiency in Estimation of Smooth Functionals of High-Dimensional Covariance

## November 30 @ 11:00 am - 12:00 pm

Vladimir Koltchinskii (Georgia Institute of Technology)

E18-304

**Abstract:** We discuss a recent approach to bias reduction in the problem of estimating smooth functionals of high-dimensional parameters of statistical models. In particular, this approach has been developed for estimation of functionals of a covariance operator Σ : R^{d} → R^{d} of the form ⟨f(Σ), B⟩, based on n i.i.d. observations X_{1}, . . . , X_{n} sampled from the normal distribution with mean zero and covariance Σ, where f : R → R is a sufficiently smooth function and B is an operator with nuclear norm bounded by a constant. This includes such problems as estimation of bilinear forms (for instance, matrix entries in a given basis) of spectral projections of an unknown covariance, which are of importance in principal component analysis. A “bootstrap chain” bias reduction method, based on an approximate solution of a certain integral equation (the Wishart equation) on the cone of self-adjoint positive semidefinite operators, yields asymptotically efficient estimators of the functional ⟨f(Σ), B⟩ under proper assumptions on the growth of the dimension d and the smoothness of the function f. In particular, this holds under the assumption that d ≤ n^{α} for some α ∈ (0, 1) and that f belongs to a Besov space B^{s}_{∞,1}(R) for s > 1/(1 − α). The proof of asymptotic efficiency relies on a number of probabilistic and analytic tools (operator differentiability; Gaussian concentration; properties of Wishart operators and orthogonally invariant functions on the cone of positive semidefinite operators; information-theoretic lower bounds).
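The "bootstrap chain" construction mentioned in the abstract can be sketched as follows; this is a hedged reconstruction from the abstract's description (the operator names and truncation are assumptions), not the talk's exact formulation.

```latex
% Sample covariance and the Wishart operator T, acting on functions g
% defined on the cone of positive semidefinite operators:
\[
\hat\Sigma := \frac{1}{n}\sum_{i=1}^{n} X_i X_i^{\top},
\qquad
(\mathcal{T}g)(\Sigma) := \mathbb{E}_{\Sigma}\, g(\hat\Sigma).
\]
% If g solves the Wishart equation (T g)(Sigma) = f(Sigma), then
% g(\hat\Sigma) is an unbiased estimator of f(\Sigma).  An approximate
% solution is given by a truncated Neumann-type series (the "chain"):
\[
g_k := \sum_{j=0}^{k} (-1)^{j}\,(\mathcal{T}-\mathcal{I})^{j} f,
\]
% so the bias of the plug-in estimate g_k(\hat\Sigma) is governed by the
% remainder (T - I)^{k+1} f, which shrinks as the chain length k grows.
```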
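To see why bias reduction matters here, the following minimal numerical sketch (my illustration, not the talk's method) shows the bias of the naive plug-in estimator of ⟨f(Σ), B⟩ for f(x) = x² and B = I, i.e. tr(Σ̂²), which for Gaussian data has a bias of order d²/n and is therefore non-negligible when d² is comparable to n.

```python
import numpy as np

# Naive plug-in estimation of tr(Sigma^2) via tr(Sigma_hat^2).
# For X_i ~ N(0, I_d) with known mean zero,
#   E tr(Sigma_hat^2) = tr(Sigma^2)(1 + 1/n) + (tr Sigma)^2 / n,
# so the plug-in bias is (d^2 + d)/n when Sigma = I_d.
rng = np.random.default_rng(0)
d, n, reps = 20, 100, 200
target = d                      # tr(Sigma^2) = d for Sigma = I_d

estimates = []
for _ in range(reps):
    X = rng.standard_normal((n, d))
    S = X.T @ X / n             # sample covariance (mean known to be zero)
    estimates.append(np.trace(S @ S))

mean_est = float(np.mean(estimates))
theory_bias = (d**2 + d) / n    # predicted plug-in bias (= 4.2 here)
print(f"target {target}, plug-in mean {mean_est:.2f}, "
      f"predicted bias {theory_bias:.2f}")
```

The Monte Carlo average lands near target + predicted bias rather than near the target, which is the first-order bias that the bootstrap chain construction is designed to remove.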

**Biography:** Vladimir Koltchinskii is a professor of mathematics at Georgia Tech. His current research is primarily in high-dimensional statistics and probability.