This article explores the problem of estimating stationary autoregressive models from observed data using the Bayesian least absolute shrinkage and selection operator (LASSO). By characterizing the model in terms of partial autocorrelations, rather than coefficients, it becomes straightforward to guarantee that the estimated models are stationary. The form of the negative log-likelihood is exploited to derive simple expressions for the conditional likelihood functions, leading to efficient algorithms for computing the posterior mode by coordinate-wise descent and exploring the posterior distribution by Gibbs sampling. Both empirical Bayes and fully Bayesian methods are proposed for estimating the LASSO hyperparameter from the data. Simulations demonstrate that the Bayesian LASSO performs well in terms of prediction when compared with a standard autoregressive order selection method.
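The appeal of the partial autocorrelation parameterization is that any sequence of values in (-1, 1) corresponds to a stationary AR model, so stationarity is enforced by simple box constraints rather than a complicated region in coefficient space. A minimal sketch of the map from partial autocorrelations to AR coefficients, via the standard Levinson-Durbin recursion (the function name is illustrative, not taken from the paper):

```python
def pacf_to_ar(pacf):
    """Map partial autocorrelations (each in (-1, 1)) to AR coefficients
    using the Levinson-Durbin recursion; the resulting AR(p) model is
    stationary for any admissible input sequence."""
    phi = []
    for k, pi_k in enumerate(pacf, start=1):
        if not -1.0 < pi_k < 1.0:
            raise ValueError("partial autocorrelations must lie in (-1, 1)")
        prev = phi
        # phi^(k)_j = phi^(k-1)_j - pi_k * phi^(k-1)_{k-j},  j = 1..k-1
        phi = [prev[j] - pi_k * prev[k - 2 - j] for j in range(k - 1)] + [pi_k]
    return phi


# Example: an AR(2) model from partial autocorrelations (0.5, -0.3).
# The closed form for p = 2 is phi_1 = pi_1 * (1 - pi_2), phi_2 = pi_2.
print(pacf_to_ar([0.5, -0.3]))  # [0.65, -0.3]
```

For p = 2, stationarity of the output can be checked directly against the usual triangle conditions (phi_1 + phi_2 < 1, phi_2 - phi_1 < 1, |phi_2| < 1).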