This is a revision of a paper that I presented at the John Nankervis Memorial Conference in July 2013. The purposes are to describe the research produced jointly by John and me and to give some personal comments.

The problem of non-parametric spectral density estimation for discrete-time series in the presence of missing observations has a long history. In particular, the first consistent estimators of the spectral density were developed at about the same time as consistent estimators for non-parametric regression. On the other hand, while the theory of efficient (under the minimax mean integrated squared error criterion) and adaptive nonparametric regression estimation with missing data is by now well developed, no similar results have been proposed for the spectral density of a time series whose observations are missing according to an unknown stochastic process. This article develops the theory of efficient and adaptive estimation for a class of spectral densities that includes classical causal autoregressive moving-average time series. The developed theory shows how a missing mechanism affects the estimation and what penalty it imposes on the risk convergence. In particular, given the costs of a single observation in time series with and without missing data and a desired accuracy of estimation, the theory allows one to choose the more cost-effective time series. A numerical study confirms the asymptotic theory.
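The basic difficulty the abstract describes can be illustrated with a minimal sketch. The code below is not the authors' efficient/adaptive estimator; it is a classical amplitude-modulation-style workaround (all function names are illustrative): estimate each autocovariance from only those index pairs where both observations are present, then plug the estimates into a lag-window spectral estimate.

```python
import math

def missing_acov(x, obs, h):
    """Autocovariance at lag h, using only index pairs where both points are observed.
    `obs` is a 0/1 indicator sequence of the (assumed) missingness process."""
    xs = [x[t] for t in range(len(x)) if obs[t]]
    m = sum(xs) / len(xs)
    pairs = [(x[t], x[t + h]) for t in range(len(x) - h) if obs[t] and obs[t + h]]
    if not pairs:
        return 0.0
    return sum((a - m) * (b - m) for a, b in pairs) / len(pairs)

def spectral_estimate(x, obs, freq, max_lag):
    """Lag-window spectral density estimate at angular frequency `freq`,
    built from the pairwise-complete autocovariances above."""
    s = missing_acov(x, obs, 0)
    for h in range(1, max_lag + 1):
        w = 1 - h / (max_lag + 1)  # Bartlett lag window
        s += 2 * w * missing_acov(x, obs, h) * math.cos(freq * h)
    return s / (2 * math.pi)
```

The pairwise-complete counts shrink as the observation rate falls, which is one concrete way the missing mechanism imposes a penalty on the achievable risk.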

A frequency domain methodology is proposed for estimating parameters of covariance functions of stationary spatio-temporal processes. Finite Fourier transforms of the processes are defined at each location. Based on the joint distribution of these complex-valued random variables, an approximate likelihood function is constructed. The sampling properties of the estimators are investigated. It is observed that the expectation of these transforms can be considered to be a frequency domain analogue of the classical variogram. We call this measure the frequency variogram. The method is applied to simulated data and also to Pacific wind speed data considered earlier by Cressie and Huang (1999). The proposed method does not depend on distributional assumptions about the process.

The distributions of cointegration tests are affected when the innovation variance varies over time. In panels, one must also pay attention to dependence among units. To obtain a panel cointegration test robust to both heteroskedasticity and dependence, we adapt the nonlinear instruments method proposed for the Dickey–Fuller test by Chang (2002, J Econometrics 110, 261–292) to an error-correction framework. We show that IV-based testing of the no error-correction null in individual equations yields standard normal test statistics when computed with heteroskedasticity-robust standard errors. The result holds under endogenous regressors, irrespective of the number of integrated covariates and for any variance profile. A non-cointegration test combining single-equation tests retains these nice properties. In panels of fixed cross-sectional dimension, such test statistics from individual units are shown to be asymptotically independent even under dependence, leading to panel tests robust to dependence and heteroskedasticity. The tests perform well in finite panels.

A two-step estimation method is proposed for periodic autoregressive parameters via residuals when the observations consist of a trend plus a periodic autoregressive time series. The oracle efficiency of the proposed Yule–Walker-type estimator is established. The performance is illustrated by simulation studies and real data analysis.

We consider a model for the discrete nonboundary wavelet coefficients of autoregressive fractionally integrated moving average (ARFIMA) processes in each scale. Because the utility of the wavelet transform for long-range dependent processes, as many authors have explained in the semi-parametric literature, lies in the transformed processes being approximately white noise in each scale, there have been few studies in a parametric setting. In this article, we propose the model from the forms of the (generalized) spectral density functions (SDFs) of these coefficients. Since the discrete wavelet transform has the property of downsampling, we cannot directly represent these (generalized) SDFs. To overcome this problem, we define the discrete non-decimated nonboundary wavelet coefficients and compute their (generalized) SDFs. Using these functions and restricting the wavelet filters to the Daubechies wavelets and least asymmetric filters, we make explicit the (generalized) SDFs of the discrete nonboundary wavelet coefficients of ARFIMA processes in each scale. Additionally, we propose a model for the discrete nonboundary scaling coefficients in each scale.

In a recent paper, Harvey *et al.* (2013) (HLT) propose a new unit root test that allows for the possibility of multiple breaks in trend. Their proposed test is based on the infimum of the sequence (across all candidate break points) of local GLS detrended augmented Dickey–Fuller-type statistics. HLT show that the power of their unit root test is robust to the magnitude of any trend breaks. In contrast, HLT show that the power of the only alternative available procedure of Carrion-i-Silvestre *et al.* (2009), which employs a pretest-based approach, can be very low indeed (even zero) for the magnitudes of trend breaks typically observed in practice. Both HLT and Carrion-i-Silvestre *et al.* (2009) base their approaches on the assumption of homoskedastic shocks. In this article, we analyse the impact of non-stationary volatility (for example, single and multiple abrupt variance breaks, smooth transition variance breaks and trending variances) on the tests proposed in HLT. We show that the limiting null distribution of the HLT unit root test statistic is not pivotal under non-stationary volatility. A solution to the problem, which does not require the practitioner to specify a parametric model for volatility, is provided using the wild bootstrap and is shown to perform well in practice. A number of different possible implementations of the bootstrap algorithm are discussed.
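The wild bootstrap's robustness to non-stationary volatility comes from one simple device, which can be sketched in a few lines (the Rademacher multiplier shown here is one common choice among several; function and parameter names are illustrative):

```python
import random

def wild_bootstrap_residuals(residuals, seed=0):
    """One wild-bootstrap draw: multiply each residual by an independent
    Rademacher (+1/-1) variate. Signs are randomized but each |e_t| is kept,
    so any time-varying volatility pattern in the residuals is preserved
    in every bootstrap replicate."""
    rng = random.Random(seed)
    return [e * rng.choice((-1, 1)) for e in residuals]
```

Repeating this draw, re-estimating, and recomputing the test statistic yields a bootstrap null distribution that mimics the actual (unknown) volatility profile, which is why no parametric volatility model needs to be specified.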

This article investigates the statistical properties of the recently introduced quantile periodogram for time series with time-dependent variance. The asymptotic distribution of the quantile periodogram is derived in the case where the time series consists of i.i.d. random variables multiplied by a time-dependent scale parameter. It is shown that the time-dependent variance is represented approximately additively in the mean of the asymptotic distribution of the quantile periodogram. It is also shown that the strength of the representation is proportional to the squared quantile of the i.i.d. random variables, so that a stronger characterization is expected at upper and lower quantile levels if the time series is centred at zero. These properties are further demonstrated by simulation results. The series of daily returns from the Dow Jones Industrial Average, which is known to exhibit heteroscedastic volatility, serves to motivate the investigation.

We study inference and diagnostics for count time series regression models that include a feedback mechanism. In particular, we are interested in negative binomial processes for count time series. We study probabilistic properties and quasi-likelihood estimation for this class of processes. We show that the resulting estimators are consistent and asymptotically normally distributed. These facts enable us to construct probability integral transformation plots for assessing the adequacy of the assumed distribution. The key observation in developing the theory is a mean-parameterized form of the negative binomial distribution. For transactions data, it is seen that the negative binomial distribution offers a better fit than the Poisson distribution. This is an immediate consequence of the fact that transactions can be represented as a collection of individual activities that correspond to different trading strategies.
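The mean-parameterized negative binomial the abstract refers to is standard and easy to write down: with mean mu and dispersion r, the variance is mu + mu**2/r, so it nests Poisson-like behaviour as r grows. A minimal sketch of the pmf (the parameter names are this sketch's, not necessarily the paper's):

```python
import math

def nb_pmf(k, mu, r):
    """Mean-parameterized negative binomial pmf: E[Y] = mu and
    Var(Y) = mu + mu**2 / r, so overdispersion relative to the Poisson
    is controlled by the dispersion parameter r."""
    log_p = (math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
             + r * math.log(r / (r + mu)) + k * math.log(mu / (mu + r)))
    return math.exp(log_p)
```

The extra mu**2/r term in the variance is what lets this distribution fit overdispersed transactions counts better than the Poisson, whose variance is pinned to its mean.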

This paper is concerned with a version of the empirical likelihood method for spectral restrictions, which handles stationary time series data via the frequency domain approach. The asymptotic properties of frequency domain generalized empirical likelihood are studied for either strictly stationary processes with vanishing fourth-order cumulant spectral density function or linear processes generated by i.i.d. innovations with possibly non-zero fourth-order cumulant. Several statistics for testing parametric restrictions, over-identified spectral restrictions, and additional spectral restrictions are shown to have limiting chi-squared distributions. Some numerical results are presented to investigate the finite sample performance of the proposed procedures. Copyright © 2013 John Wiley & Sons, Ltd.

Based on the concept of a Lévy copula to describe the dependence structure of a multi-variate Lévy process, we present a new estimation procedure. We consider a parametric model for the marginal Lévy processes as well as for the Lévy copula and estimate the parameters by a two-step procedure. We first estimate the parameters of the marginal processes and then estimate in a second step only the dependence structure parameter. For infinite Lévy measures, we truncate the small jumps and base our statistical analysis on the large jumps of the model. A prominent example is a bivariate stable Lévy process, which allows for analytic calculations and, hence, for a comparison of different methods. We prove asymptotic normality of the parameter estimates from the two-step procedure, and in particular, we derive the Godambe information matrix, whose inverse is the covariance matrix of the normal limit law. A simulation study investigates the loss of efficiency because of the two-step procedure and the truncation.

The Gaussian assumption generally employed in many state-space models is usually not satisfied for real time series. Thus, in this work, a broad family of non-Gaussian models is defined by integrating and expanding previous work in the literature. The expansion is obtained at two levels: at the observational level, it allows for many distributions not previously considered, and at the latent state level, it involves an expanded specification for the system evolution. The class retains analytical availability of the marginal likelihood function, uncommon outside Gaussianity. This expansion considerably increases the applicability of the models and solves many previously existing problems such as long-term prediction, missing values and irregular temporal spacing. Inference about the state components can be performed because of the introduction of a new and exact smoothing procedure, in addition to filtered distributions. Inference for the hyperparameters is presented from the classical and Bayesian perspectives. The models appear to be competitive when compared with other non-Gaussian state-space models available. The methodology is applied to Gaussian and non-Gaussian dynamic linear models with time-varying means and variances and provides a computationally simple solution to inference in these models. The methodology is illustrated in a number of examples.

We study the limit law of a vector made up of normalized sums of functions of long-range dependent stationary Gaussian series. Depending on the memory parameter of the Gaussian series and on the Hermite ranks of the functions, the resulting limit law may be (a) a multi-variate Gaussian process involving dependent Brownian motion marginals, (b) a multi-variate process involving dependent Hermite processes as marginals or (c) a combination. We treat cases (a) and (b) in general and case (c) when the Hermite components involve ranks 1 and 2. We include a conjecture about case (c) when the Hermite ranks are arbitrary, although the conjecture can be resolved in some special cases.

This article explores the problem of estimating stationary autoregressive models from observed data using the Bayesian least absolute shrinkage and selection operator (LASSO). By characterizing the model in terms of partial autocorrelations, rather than coefficients, it becomes straightforward to guarantee that the estimated models are stationary. The form of the negative log-likelihood is exploited to derive simple expressions for the conditional likelihood functions, leading to efficient algorithms for computing the posterior mode by coordinate-wise descent and exploring the posterior distribution by Gibbs sampling. Both empirical Bayes and Bayesian methods are proposed for the estimation of the LASSO hyper-parameter from the data. Simulations demonstrate that the Bayesian LASSO performs well in terms of prediction when compared with a standard autoregressive order selection method.
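The stationarity guarantee from the partial autocorrelation parameterization rests on a classical fact: the Durbin–Levinson recursion maps any sequence of partial autocorrelations in (-1, 1) to the coefficients of a stationary AR model. A minimal sketch of that map (the function name is illustrative):

```python
def pacf_to_ar(pacf):
    """Map partial autocorrelations (each in (-1, 1)) to AR coefficients via
    the Durbin-Levinson recursion. Any such pacf sequence yields a stationary
    AR model, which is what makes this parameterization convenient for
    constrained estimation such as the Bayesian LASSO."""
    phi = []
    for k, r in enumerate(pacf, start=1):
        if not -1 < r < 1:
            raise ValueError("each partial autocorrelation must lie in (-1, 1)")
        # update order-(k-1) coefficients, then append the new reflection coefficient
        phi = [phi[j] - r * phi[k - 2 - j] for j in range(k - 1)] + [r]
    return phi
```

Shrinking or zeroing individual partial autocorrelations (as a LASSO-type penalty does) therefore never pushes the implied AR model outside the stationary region.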

The article reviews methods of inference for single and multiple change-points in time series, when data are of retrospective (off-line) type. The inferential methods reviewed for a single change-point in time series include likelihood, Bayes, Bayes-type and some relevant non-parametric methods. Inference for multiple change-points requires methods that can handle large data sets and can be implemented efficiently for estimating the number of change-points as well as their locations. Our review in this important area focuses on some of the recent advances in this direction. Greater emphasis is placed on multivariate data while reviewing inferential methods for a single change-point in time series. Throughout the article, more attention is paid to estimation of unknown change-point(s) in time series, and this is especially true in the case of multiple change-points. Some specific data sets for which change-point modelling has been carried out in the literature are provided as illustrative examples under both single and multiple change-point scenarios.

This article proposes a hybrid bootstrap approach to approximate the augmented Dickey–Fuller test by perturbing both the residual sequence and the minimand of the objective function. Since the random errors may be dependent, this allows conditional heteroscedasticity models to be included. The new bootstrap method is also applied to least absolute deviation-based unit root test statistics, which are efficient in handling heavy-tailed time series data. The asymptotic distributions of the resulting bootstrap tests are presented, and Monte Carlo studies demonstrate the usefulness of the proposed tests.

We consider a heteroscedastic nonparametric regression model with an autoregressive error process of finite known order *p*. The heteroscedasticity is incorporated using a scaling function defined at uniformly spaced design points on the interval [0,1]. We provide an innovative nonparametric estimator of the variance function and establish its consistency and asymptotic normality. We also propose a semiparametric estimator for the vector of autoregressive error process coefficients that is consistent and asymptotically normal as the sample size *T* grows. An explicit asymptotic variance–covariance matrix is obtained as well. Finally, the finite sample performance of the proposed method is tested in simulations.

This article first studies the non-stationarity of the first-order double AR model, which is defined by the random recurrence equation *y*_{t} = *φ*_{0}*y*_{t−1} + *η*_{t}√(*γ*_{0} + *α*_{0}*y*_{t−1}^{2}), where *γ*_{0} > 0, *α*_{0} ≥ 0, and {*η*_{t}} is a sequence of i.i.d. symmetric random variables. It is shown that the double AR(1) model is explosive under the condition E log|*φ*_{0} + *η*_{t}√*α*_{0}| ≥ 0. Based on this, it is shown that the quasi-maximum likelihood estimator of (*φ*_{0},*α*_{0}) is consistent and asymptotically normal so that the unit root problem does not exist in the double AR(1) model. Simulation studies are carried out to assess the performance of the quasi-maximum likelihood estimator in finite samples.

This article proposes a flexible set of transformed polynomial functions for modelling the conditional mean of autoregressive processes. These functions enjoy the same approximation-theoretic properties as polynomials and, at the same time, ensure that the process is strictly stationary, is ergodic, has fading memory and has bounded unconditional moments. The consistency and asymptotic normality of the least-squares estimator are easily obtained as a result. A Monte Carlo study provides evidence of good finite sample properties. Applications in empirical time-series modelling, structural economics and structural engineering problems show the usefulness of transformed polynomials in a wide range of settings.

The concentration of aerosol particles, largely caused by traffic volume and often enhanced during temperature inversion episodes in the cold season, can be a concern for human health in the urban environment. This particulate matter is typically recorded as PM_{10}, the total mass of particles below 10 μm in diameter. It is suspected that, within the PM_{10} class, ultrafine particles ( < 100 nm) may be responsible for causing respiratory and cardiovascular diseases. Because of their low mass, ultrafine particles are hard to detect, and researchers try to utilize PM_{10} in combination with nitrogen oxides NO_{x} and other trace gases to monitor their dynamic evolution. To meet pollution standards set by national government and European Union regulations, the city of Klagenfurt, Austria, began using calcium magnesium acetate as a deicer on 11 January 2012, hoping to literally glue pollutants to the ground and thereby reduce pollution concentrations. With the statistical methodology developed in this article, the dynamic evolution of PM_{10} and NO_{x} is traced for the time period starting 4 January and ending 25 January 2012, and a change in dynamics is found. The findings are based on on-line monitoring procedures that sequentially detect structural breaks in the mean and the parameter values of an autoregressive moving average process. These are defined in terms of model residuals and one-step-ahead predictors. Theoretical properties are studied, and a simulation study shows that the proposed procedures work well in finite samples.
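The residual-based sequential detection idea can be sketched for the simplest special case. This is a toy stand-in for the article's ARMA monitoring scheme, not its actual procedure: an AR(1) with known parameters, standardized one-step-ahead prediction errors accumulated into a CUSUM, and a √t-shaped boundary (all parameter names and the boundary shape are this sketch's assumptions):

```python
def ar1_residual_monitor(x, phi, mu, sigma, threshold):
    """Toy sequential detector: accumulate standardized one-step-ahead AR(1)
    prediction errors and flag a structural break the first time the CUSUM
    exits a sqrt(t)-shaped boundary. Returns the detection index, else None."""
    cusum = 0.0
    for t in range(1, len(x)):
        resid = (x[t] - mu) - phi * (x[t - 1] - mu)  # one-step-ahead prediction error
        cusum += resid / sigma
        if abs(cusum) > threshold * t ** 0.5:
            return t  # first boundary crossing
    return None
```

Under the null the residuals are mean-zero, so the CUSUM stays inside the boundary with high probability; after a mean break the residuals pick up a drift and the detector fires shortly after the break point.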

Consider an infinite-dimensional vector linear process. Under suitable assumptions on the parameter space, we provide consistent estimators of the autocovariance matrices. In particular, under causality, this includes the infinite-dimensional vector autoregressive (IVAR) process. In that case, we obtain consistent estimators for the parameter matrices. An explicit expression for the estimators is obtained for IVAR(1), under a fairly realistic parameter space. We also show that under some mild restrictions, the consistent estimator of the marginal large-dimensional variance–covariance matrix has the same convergence rate as in the case of i.i.d. samples.

The consistency of the quasi-maximum likelihood estimator for random coefficient autoregressive models requires that the coefficient be a non-degenerate random variable. In this article, we propose empirical likelihood methods based on weighted-score equations to construct a confidence interval for the coefficient. We do not need to distinguish whether the coefficient is random or deterministic and whether the process is stationary or non-stationary, and we present two classes of equations depending on whether a constant trend is included in the model. A simulation study confirms the good finite-sample behaviour of our resulting empirical likelihood-based confidence intervals. We also apply our methods to study US macroeconomic data.
