Least tail-trimmed squares for infinite variance autoregressions


Department of Economics, University of North Carolina-Chapel Hill


We develop a robust least squares estimator for autoregressions with possibly heavy-tailed errors. Robustness to heavy tails is ensured by negligibly trimming the squared error according to extreme values of the error and regressors. Tail-trimming ensures asymptotic normality and super-$\sqrt{n}$-convergence with a rate comparable to the highest achieved amongst M-estimators for stationary data. Moreover, tail-trimming ensures robustness to heavy tails in both small and large samples. By comparison, existing robust estimators are not as robust in small samples, have a slower rate of convergence when the variance is infinite, or are not asymptotically normal. We present a consistent estimator of the covariance matrix and treat classic inference without knowledge of the rate of convergence. A simulation study demonstrates the sharpness and approximate normality of the estimator, and we apply the estimator to financial returns data. Finally, tail-trimming can be easily extended beyond least squares estimation for a linear stationary AR model. We discuss extensions to quasi-maximum likelihood for GARCH, weighted least squares for a possibly non-stationary random coefficient autoregression, and empirical likelihood for robust confidence region estimation, in each case for models with possibly heavy-tailed errors.
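To illustrate the idea described in the abstract, the following is a minimal sketch of tail-trimmed least squares for an AR(1). It is not the paper's exact estimator: the trimming fraction `k_frac` and the iterative re-fitting scheme are hypothetical simplifications, and the paper's fractile rules and asymptotic theory are not reproduced here. Observations whose residual or lagged regressor falls in the extreme sample tails are dropped, and ordinary least squares is computed on the retained sample.

```python
import numpy as np

def ltts_ar1(y, k_frac=0.05, n_iter=10):
    """Sketch of least tail-trimmed squares for y_t = rho * y_{t-1} + e_t.

    Trims observations with extreme |residual| or |regressor| (the
    trimming fraction k_frac is a hypothetical tuning choice) and
    re-fits OLS on the retained sample, iterating to update residuals.
    """
    y = np.asarray(y, dtype=float)
    x, z = y[:-1], y[1:]          # lagged regressor and response
    rho = 0.0                     # crude starting value
    for _ in range(n_iter):
        resid = z - rho * x
        # empirical tail thresholds for error and regressor
        c_e = np.quantile(np.abs(resid), 1.0 - k_frac)
        c_x = np.quantile(np.abs(x), 1.0 - k_frac)
        keep = (np.abs(resid) <= c_e) & (np.abs(x) <= c_x)
        # OLS on the tail-trimmed sample
        rho = np.sum(x[keep] * z[keep]) / np.sum(x[keep] ** 2)
    return rho

# Example: AR(1) with infinite-variance (Cauchy) errors
rng = np.random.default_rng(0)
n, rho_true = 5000, 0.5
e = rng.standard_cauchy(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = rho_true * y[t - 1] + e[t]

rho_hat = ltts_ar1(y)
```

Because the trimming is negligible (the trimmed fraction can shrink with the sample size), the estimator remains close to OLS on well-behaved data while discarding the extreme observations that would otherwise dominate the squared-error objective under heavy tails.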