Universality of regularized estimators in high dimensions (Yandi Shen -- U Chicago)
- Starts: 4:00 pm on Thursday, February 9, 2023
The study of regularized regression estimators is of fundamental importance in high dimensional inference. From the canonical linear model to many of its extensions, a by now well-established type of analysis has generated a remarkable suite of results: risk and generalization bounds, support recovery guarantees, oracle inequalities, and more. However, this type of analysis typically has a worst-case/minimax flavor and is usually only accurate up to multiplicative constants, thereby falling short of capturing the precise stochastic behavior of statistical estimators. In contrast, a recent line of work has focused on pointwise exact analysis, which leads to an exact characterization of key properties of an estimator, such as its risk or distribution. Unfortunately, this second type of analysis has thus far been mostly limited to the setting of a Gaussian design matrix, because it relies on Gaussian-specific techniques such as the Convex Gaussian Min-Max Theorem. In this talk, focusing on the high dimensional linear model, I will establish a universality framework that enables us to extend pointwise exact analysis to the setting of a non-Gaussian design matrix. The key ingredient is a Gaussian approximation argument for the objective function associated with the estimator, after which existing Gaussian analysis based on the Convex Gaussian Min-Max Theorem can be applied. I will then apply this new framework to well-known regularized estimators, such as the (debiased) lasso, ridge, and robust regression, in order to obtain guarantees on the risk, distribution, confidence intervals, residuals, and other key quantities of interest. This is joint work with Qiyang Han at Rutgers University.
- Location:
- CDS, 665 Comm Ave (Room 950)
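The following is a minimal numerical sketch, not the speakers' code, of the universality phenomenon the abstract describes: the lasso's estimation risk computed under an i.i.d. Gaussian design closely matches the risk under an i.i.d. Rademacher design with the same first two moments. All problem sizes, the sparsity level, the noise scale, and the penalty weight below are illustrative choices, not values from the talk.

```python
# Hypothetical illustration of design universality for the lasso:
# compare Monte Carlo estimates of the normalized estimation risk
# ||beta_hat - beta||^2 / p under Gaussian vs. Rademacher designs.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
# Illustrative problem dimensions (assumptions, not from the talk):
n, p, s, sigma, alpha = 400, 200, 20, 0.5, 0.1

def lasso_risk(design: str, reps: int = 50) -> float:
    """Average normalized risk of the lasso over `reps` independent draws."""
    risks = []
    for _ in range(reps):
        if design == "gaussian":
            X = rng.standard_normal((n, p))
        else:
            # Rademacher entries: mean 0 and variance 1, matching the Gaussian
            X = rng.choice([-1.0, 1.0], size=(n, p))
        beta = np.zeros(p)
        beta[:s] = 1.0  # s-sparse signal
        y = X @ beta + sigma * rng.standard_normal(n)
        # sklearn's Lasso minimizes (1/(2n))||y - Xw||^2 + alpha * ||w||_1
        beta_hat = Lasso(alpha=alpha, fit_intercept=False).fit(X, y).coef_
        risks.append(np.sum((beta_hat - beta) ** 2) / p)
    return float(np.mean(risks))

print(f"Gaussian design risk:   {lasso_risk('gaussian'):.4f}")
print(f"Rademacher design risk: {lasso_risk('rademacher'):.4f}")
# Universality predicts these two numbers agree as n and p grow proportionally.
```

In this sketch, the Gaussian case is the one where CGMT-based pointwise exact analysis applies directly; the near-identical Rademacher numbers illustrate the kind of statement the universality framework makes rigorous for non-Gaussian designs.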