Join the Salem Center for a talk by Panos Toulis (Chicago Booth).
Standard statistical inference in regression models, including bootstrap-based procedures, relies on assumptions about the asymptotics of the covariate/error distribution. These assumptions are generally strong; for example, they are typically violated by simple heavy-tailed distributions. In this talk, we propose a new paradigm of inference based on randomization theory. Our main method relies only on an invariance assumption on the regression errors (e.g., exchangeability), without requiring any additional independence or normality assumptions. We prove general conditions that guarantee the asymptotic validity of our method, which relate mostly to the “leverage structure” of the covariate/error distribution. This new approach to inference has three main advantages over standard methods, including the bootstrap: (1) it addresses the inference problem in a unified way, as it can accommodate complex error structures beyond i.i.d., whereas the bootstrap typically needs to be adapted to the task; (2) it works under weaker assumptions and does not rely on asymptotic normality or CLT results; and (3) in some settings it is even valid in finite samples. In extensive empirical evaluations, the randomization inference method performs favorably against many alternatives, including wild bootstrap variants and asymptotic robust-error methods.
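To give a flavor of the idea, the following is a minimal sketch of a residual randomization test for a single regression coefficient under exchangeable errors, written in plain NumPy. It is an illustration of the general technique, not the speaker's exact procedure; the function name, test statistic, and all settings are illustrative assumptions.

```python
import numpy as np

def residual_randomization_pvalue(X, y, j=1, beta0=0.0, n_draws=999, seed=0):
    """Randomization test of H0: beta_j = beta0 in y = X @ beta + eps,
    assuming only that the errors are exchangeable under H0.

    Illustrative sketch: permuting residuals is the invariance used here."""
    rng = np.random.default_rng(seed)
    # Fit the restricted model (impose beta_j = beta0) to obtain null residuals.
    y0 = y - beta0 * X[:, j]
    X0 = np.delete(X, j, axis=1)
    b0, *_ = np.linalg.lstsq(X0, y0, rcond=None)
    resid = y0 - X0 @ b0
    # Observed statistic: full-model OLS estimate of beta_j, centered at beta0.
    bhat, *_ = np.linalg.lstsq(X, y, rcond=None)
    t_obs = abs(bhat[j] - beta0)
    # Randomization distribution: permute the residuals (valid under
    # exchangeability), rebuild outcomes, and recompute the statistic.
    count = 0
    for _ in range(n_draws):
        y_star = beta0 * X[:, j] + X0 @ b0 + rng.permutation(resid)
        b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
        if abs(b_star[j] - beta0) >= t_obs:
            count += 1
    # Standard finite-sample p-value formula for randomization tests.
    return (1 + count) / (1 + n_draws)
```

Note that the error distribution never needs to be estimated and no normal approximation is invoked: only the permutation invariance of the errors under the null is used, which is why such tests can remain valid even with heavy-tailed errors, e.g. `rng.standard_t(df=2, size=n)` noise.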