We introduce a wavelet characterization of continuous-time periodically correlated processes based on a linear combination of infinite-dimensional stationary processes. A finite truncation of this linear combination converges to the underlying process. First-order and second-order estimators based on the wavelets are presented. Using a simple algorithm, the periodically correlated process is simulated for a given autocovariance function. The proposed algorithm has two main advantages: it is fast, and it is distribution free. We show through four examples that the simulated data are periodically correlated with the desired period.

In this article, we propose a Bayesian non-parametric model for the analysis of multiple time series. We consider an autoregressive structure of order *p* for each of the series and borrow strength across the series by considering a common error population that also evolves in time. The error populations (distributions) are assumed to be non-parametric, with a law based on a series of dependent Polya trees with zero median. This dependence is of order *q* and is achieved via a dependent beta process that links the branching probabilities of the trees. We study the prior properties and show how to obtain posterior inference. The model is tested in a simulation study and illustrated with an analysis of the economic activity index of the 32 states of Mexico.

Of interest is comparing the out-of-sample forecasting performance of two competing models in the presence of possible instabilities. To that effect, we suggest using simple structural change tests, sup-Wald and *UDmax*, for changes in the mean of the loss differences. It is shown that the Giacomini and Rossi tests have undesirable power properties: power can be low and non-increasing as the alternative moves further from the null hypothesis. In contrast, our statistics are shown to have higher, monotonic power, especially the *UDmax* version. We use their empirical examples to show the practical relevance of the issues raised.

We propose a variance ratio-type unit root test in which the nuisance parameter cancels asymptotically under both the null of a unit root and a local-to-unity alternative. Critical values and asymptotic power curves can be computed using standard numerical techniques. Our test exhibits higher power than other tests that share the virtue of being free of tuning parameters. In fact, the local asymptotic power curves of our procedure come close to the power functions of the point optimal test, which, however, suffers from the drawback of requiring consistent correction for a nuisance parameter.

For discrete panel data, the dynamic relationship between successive observations is often of interest. We consider a dynamic probit model for short panel data. A problem with estimating the dynamic parameter of interest is that the model contains a large number of nuisance parameters, one for each individual. Heckman proposed maximum likelihood estimation of the dynamic parameter, which, however, does not perform well when the individual effects are large. We suggest new estimators for the dynamic parameter, based on the assumption that the individual parameters are random and possibly large. Theoretical properties of our estimators are derived, and a simulation study shows that they have some advantages compared with Heckman's estimator and the modified profile likelihood estimator for fixed effects.

The aim of this article is to estimate the probability distribution of power threshold generalized autoregressive conditional heteroskedasticity processes by establishing bounds for their finite-dimensional laws. These bounds depend only on the parameters of the model and on the distribution function of its independent generating process. The application of this study to some particular models allows us to conjecture that this procedure is an adequate alternative to the corresponding estimation using empirical distribution functions, and particularly useful in the development of control charts for models of this kind.

Quantile autoregression (QAR) is particularly attractive for censored data. However, unlike standard regression models, autoregressive models must account for censoring in both the response and the regressors. In this article, we show that existing censored quantile regression methods produce consistent estimators for QAR models when using only the fully observed regressors. A new algorithm is proposed to provide a censored QAR estimator by adopting imputation methods. The algorithm redistributes the probability mass of censored points appropriately and iterates towards self-consistent solutions. Monte Carlo simulations and empirical applications are conducted to demonstrate the merits of the proposed method.

Bartlett correction, which improves the coverage accuracies of confidence regions, is one of the desirable features of empirical likelihood. For empirical likelihood with dependent data, previous studies on the Bartlett correction are mainly concerned with Gaussian processes. By establishing the validity of Edgeworth expansion for the signed root empirical log-likelihood ratio statistics, we show that the Bartlett correction is applicable to empirical likelihood for short-memory time series with possibly non-Gaussian innovations. The Bartlett correction is established under the assumptions that the variance of the innovation is known and the mean of the underlying process is zero for a single parameter model. In particular, the order of the coverage errors of Bartlett-corrected confidence regions can be reduced from *O*(*n*^{−1}) to *O*(*n*^{−2}).

Perron and Zhu (2005) established the consistency, convergence rates and limiting distributions of parameter estimates in time trends with a change in slope, with or without a concurrent level change, for cases with I(1) or I(0) errors. We extend their analysis to the general case of fractionally integrated errors with memory parameter d^{∗}. Our results uncover interesting features: for example, with a level shift allowed, the convergence rate of the break date estimate is the same for all d^{∗}∈(−0.5,0.5); in other cases, it decreases as d^{∗} increases. We also provide results on the so-called spurious break issue.

The existing estimation methods for the model parameters of the unified GARCH–Itô model (Kim and Wang) require observations over a long period to obtain consistency. However, in practice, it is hard to believe that the structure of a stock price is stable over such a long period. In this article, we introduce an estimation method for the model parameters based on high-frequency financial data with a finite observation period. In particular, we establish a quasi-likelihood function for the daily integrated volatilities, and realized volatility estimators are adopted to estimate those integrated volatilities. The model parameters are estimated by maximizing the quasi-likelihood function. We establish asymptotic theory for the proposed estimator. A simulation study is conducted to check its finite-sample performance. We apply the proposed estimation approach to Bank of America stock price data.
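The realized volatility estimator used as input to the quasi-likelihood can be sketched as the sum of squared intraday log returns; the 5-minute sampling grid and the simulated price path below are illustrative assumptions, not part of the original method.

```python
import numpy as np

def realized_volatility(log_prices):
    """Realized volatility for one day: sum of squared intraday log returns."""
    returns = np.diff(log_prices)
    return np.sum(returns ** 2)

# Hypothetical example: one trading day of 5-minute log prices.
rng = np.random.default_rng(0)
true_daily_var = 0.02 ** 2          # assumed daily integrated variance
n = 78                              # 5-minute intervals in a 6.5-hour session
log_p = np.cumsum(rng.normal(0.0, np.sqrt(true_daily_var / n), n + 1))
rv = realized_volatility(log_p)     # noisy estimate of true_daily_var
```

In the article's setting, one such daily estimate per trading day would be plugged into the quasi-likelihood for the integrated volatilities.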

This article examines asymptotically point optimal tests for parameter instability in realistic circumstances, when little information about the unstable parameter process and the error distribution is available. We first show that, under a correctly specified error distribution, if the unstable parameter processes converge weakly to a Wiener process, then any asymptotically optimal tests for structural breaks and for time-varying parameters are asymptotically equivalent. This finding is then extended to a semi-parametric set-up in which the error distribution is treated as an unknown infinite-dimensional nuisance parameter. We find that semi-parametric tests can be adaptive without further restrictive conditions on the error distribution.

Multivariate processes with long-range dependence are found in a large number of applications, including finance, geophysics and neuroscience. For real-data applications, the correlation between time series is crucial. Standard estimates of correlation can be highly biased owing to phase shifts caused by differences in the autocorrelation properties of the processes. To address this issue, we introduce a semiparametric estimation of multivariate long-range dependent processes. The parameters of interest are the vector of long-range dependence parameters and the long-run covariance matrix, also called functional connectivity in neuroscience; this matrix characterizes the coupling between time series. The proposed multivariate wavelet-based Whittle estimator is shown to be consistent for both the long-range dependence parameters and the covariance matrix, and to encompass both stationary and nonstationary processes. A simulation study and a real-data example illustrate the finite-sample behaviour.

This article discusses filtering, prediction and simulation in univariate and multivariate noncausal processes. A closed-form functional estimator of the predictive density for noncausal and mixed processes is introduced that provides prediction intervals up to a finite horizon *H*. A state-space representation of a noncausal and mixed multivariate vector autoregressive process is derived in two ways: by partial fraction decomposition or from the real Jordan canonical form. A recursive BHHH algorithm for maximization of the approximate log-likelihood function is proposed, which calculates the filtered values of the unobserved causal and noncausal components of the process. The new methods are illustrated by a simulation study involving a univariate noncausal process with infinite variance.

Regularity conditions are given for the consistency of the Poisson quasi-maximum likelihood estimator of the conditional mean parameter of a count time series model. The asymptotic distribution of the estimator is studied when the parameter belongs to the interior of the parameter space and when it lies at the boundary. Tests for the significance of the parameters and for constant conditional mean are deduced. Applications to specific integer-valued autoregressive (INAR) and integer-valued generalized autoregressive conditional heteroscedasticity (INGARCH) models are considered. Numerical illustrations, Monte Carlo simulations and real data series are provided.
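As a concrete illustration of the Poisson QMLE idea, the simplest INGARCH-type specification, an INARCH(1) conditional mean λ_t = ω + αX_{t−1}, can be fitted by maximizing the Poisson quasi-log-likelihood. The model order, starting values and optimizer below are illustrative choices, not the article's exact setup.

```python
import numpy as np
from scipy.optimize import minimize

def poisson_qmle(x):
    """Poisson QMLE for the hypothetical INARCH(1) conditional mean
    lambda_t = omega + alpha * x_{t-1}."""
    def neg_qll(theta):
        omega, alpha = theta
        lam = omega + alpha * x[:-1]
        # Poisson quasi-log-likelihood with constant terms dropped
        return -np.sum(x[1:] * np.log(lam) - lam)
    res = minimize(neg_qll, x0=[1.0, 0.1],
                   bounds=[(1e-6, None), (0.0, 0.99)])
    return res.x

# Simulate an INARCH(1) count series and recover (omega, alpha).
rng = np.random.default_rng(1)
omega, alpha, n = 2.0, 0.4, 5000
x = np.empty(n)
x[0] = omega / (1 - alpha)          # start at the stationary mean
for t in range(1, n):
    x[t] = rng.poisson(omega + alpha * x[t - 1])
omega_hat, alpha_hat = poisson_qmle(x)
```

The estimator is consistent for the conditional mean parameters even if the counts are not truly Poisson, which is the robustness property the article exploits.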

Many studies record replicated time series epochs from different groups with the goal of using frequency domain properties to discriminate between the groups. In many applications, there is variation in cyclical patterns across time series from the same group. Although a number of frequency domain methods for the discriminant analysis of time series have been explored, there is a dearth of models and methods that account for within-group spectral variability. This article proposes a model for groups of time series in which transfer functions are modelled as stochastic variables that can account for both between-group and within-group differences in spectra identified from individual replicates. An ensuing discriminant analysis of stochastic cepstra under this model is developed to obtain parsimonious measures of relative power that optimally separate groups in the presence of within-group spectral variability. The approach possesses favourable properties in classifying new observations and can be consistently estimated through a simple discriminant analysis of a finite number of estimated cepstral coefficients. The benefits of accounting for within-group spectral variability are empirically illustrated in a simulation study and through an analysis of gait variability.

The aim of this article is to introduce a new resampling scheme for nonstationary time series, called the generalized resampling scheme (GRS). The proposed procedure generalizes the well-known subsampling procedure and is closely related to existing block bootstrap techniques. To document the usefulness of the GRS, we consider the example of a model with almost periodic phenomena in the mean and variance functions, for which the consistency of the proposed procedure is examined. Finally, we prove the consistency of the GRS for the spectral density matrix of nonstationary, multivariate almost periodically correlated time series, considering both the zero-mean and non-zero-mean cases. The consistency holds under general moment and *α*-mixing conditions for multivariate almost periodically correlated time series. Proving consistency in this case is difficult because the estimator of the spectral density matrix can be interpreted as a sum of random matrices whose dependence grows with the sample size.

In blind source separation, one assumes that the observed *p* time series are linear combinations of *p* latent uncorrelated weakly stationary time series. To estimate the unmixing matrix, which transforms the observed time series back to uncorrelated latent time series, second-order blind identification (SOBI) uses joint diagonalization of the covariance matrix and autocovariance matrices at several lags. In this article, we derive the limiting distribution of the well-known symmetric SOBI estimator under general conditions and compare its asymptotic efficiencies with those of the recently introduced deflation-based SOBI estimator. The theory is illustrated by finite-sample simulation studies.
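A minimal sketch of the idea behind SOBI is its one-lag special case (known as AMUSE), where jointly diagonalizing the covariance matrix and a single symmetrized autocovariance matrix reduces to whitening followed by an eigendecomposition. This is not the symmetric SOBI estimator studied in the article, and the AR(1) sources and mixing matrix below are hypothetical.

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE unmixing: whiten by the covariance, then diagonalize one
    symmetrized autocovariance of the whitened data."""
    X = X - X.mean(axis=0)
    n = X.shape[0]
    C0 = X.T @ X / n
    d, E = np.linalg.eigh(C0)
    W = E @ np.diag(d ** -0.5) @ E.T        # symmetric whitening matrix
    Z = X @ W
    C1 = Z[:-lag].T @ Z[lag:] / (n - lag)   # lag autocovariance, whitened data
    C1 = (C1 + C1.T) / 2                    # symmetrize
    _, U = np.linalg.eigh(C1)
    return U.T @ W                          # unmixing matrix

# Two AR(1) sources with distinct autocorrelations, mixed linearly.
rng = np.random.default_rng(2)
n = 20000
s1 = np.zeros(n); s2 = np.zeros(n)
for t in range(1, n):
    s1[t] = 0.9 * s1[t - 1] + rng.normal()
    s2[t] = -0.5 * s2[t - 1] + rng.normal()
S = np.column_stack([s1, s2])
A = np.array([[1.0, 2.0], [0.5, 1.5]])      # hypothetical mixing matrix
X = S @ A.T
Gamma = amuse(X)
S_hat = X @ Gamma.T                         # recovered sources, up to sign/order
```

Because the sources have distinct lag-1 autocorrelations, the recovered components match the latent sources up to sign and permutation; SOBI improves on this by diagonalizing several lags jointly.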

An order selection test is proposed to check the equality of two independent stationary time series in their correlation structures. The asymptotic distribution of the order selection test statistic under the null hypothesis is obtained. Furthermore, it is shown that the proposed test is consistent not only under any fixed alternative hypothesis but also under a sequence of local alternative hypotheses. A simulation study is conducted to examine the finite sample performance of the test in comparison to some existing methods. The proposed test is also applied to an analysis of a biomedical data set.

Identification and estimation of outliers in time series are proposed using empirical likelihood methods. Theory and applications are developed for stationary autoregressive models, with outliers distinguished into the usual additive and innovation types. Some other useful outlier types are considered as well. A simulation experiment studies the behaviour of the empirical likelihood-based method in finite samples and indicates that the proposed methods are preferable when dealing with non-Gaussian data. Our simulations suggest that the usual sequential procedure for multiple outlier detection is also suitable for the methods based on empirical likelihood.

This article explores the problem of estimating stationary autoregressive models from observed data using the Bayesian least absolute shrinkage and selection operator (LASSO). By characterizing the model in terms of partial autocorrelations, rather than coefficients, it becomes straightforward to guarantee that the estimated models are stationary. The form of the negative log-likelihood is exploited to derive simple expressions for the conditional likelihood functions, leading to efficient algorithms for computing the posterior mode by coordinate-wise descent and exploring the posterior distribution by Gibbs sampling. Both empirical Bayes and Bayesian methods are proposed for the estimation of the LASSO hyper-parameter from the data. Simulations demonstrate that the Bayesian LASSO performs well in terms of prediction when compared with a standard autoregressive order selection method.
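The reparameterization in terms of partial autocorrelations can be sketched via the Durbin–Levinson recursion: any point of (−1, 1)^p maps to the coefficients of a stationary AR(p) model, which is what makes the stationarity guarantee straightforward. The helper below is an illustrative sketch, not the article's implementation.

```python
import numpy as np

def pacf_to_ar(pacf):
    """Map partial autocorrelations in (-1, 1) to AR coefficients via the
    Durbin-Levinson recursion; the resulting model is always stationary."""
    phi = np.array([])
    for r in pacf:
        phi = np.append(phi - r * phi[::-1], r)
    return phi

# Any point of (-1, 1)^p yields a stationary AR(p) coefficient vector.
phi = pacf_to_ar([0.8, -0.5, 0.3])
# Stationarity check: roots of 1 - phi_1 z - ... - phi_p z^p must lie
# outside the unit circle (coefficients ordered highest degree first).
roots = np.roots(np.concatenate((-phi[::-1], [1.0])))
```

Placing the LASSO prior on the unconstrained partial autocorrelations therefore sidesteps the awkward stationarity constraints that the raw coefficients would require.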

The article reviews methods of inference for single and multiple change-points in time series, when data are of retrospective (off-line) type. The inferential methods reviewed for a single change-point in time series include likelihood, Bayes, Bayes-type and some relevant non-parametric methods. Inference for multiple change-points requires methods that can handle large data sets and can be implemented efficiently for estimating the number of change-points as well as their locations. Our review in this important area focuses on some of the recent advances in this direction. Greater emphasis is placed on multivariate data while reviewing inferential methods for a single change-point in time series. Throughout the article, more attention is paid to estimation of unknown change-point(s) in time series, and this is especially true in the case of multiple change-points. Some specific data sets for which change-point modelling has been carried out in the literature are provided as illustrative examples under both single and multiple change-point scenarios.

We consider a heteroscedastic nonparametric regression model with an autoregressive error process of finite known order *p*. The heteroscedasticity is incorporated through a scaling function defined at uniformly spaced design points on the interval [0,1]. We provide an innovative nonparametric estimator of the variance function and establish its consistency and asymptotic normality. We also propose a semiparametric estimator of the vector of autoregressive error process coefficients that is consistent and asymptotically normal as the sample size *T* increases. An explicit asymptotic variance–covariance matrix is obtained as well. Finally, the finite-sample performance of the proposed method is tested in simulations.

No abstract is available for this article.

No abstract is available for this article.

In this article, we propose a nonparametric procedure for validating the assumption of stationarity in multivariate locally stationary time series models. We develop a bootstrap-assisted test based on a Kolmogorov–Smirnov-type statistic, which tracks the deviation of the time-varying spectral density from its best stationary approximation. In contrast to all other nonparametric approaches proposed in the literature so far, the test statistic does not depend on any regularization parameters such as smoothing bandwidths or a window length, as usually required when segmenting the data. We additionally show how the new procedure can be used to identify the components in which non-stationarities occur and indicate possible extensions of the approach. We conclude with an extensive simulation study, which demonstrates the finite-sample properties of the new method and contains a comparison with existing approaches.

Long-memory effects can be found in many data sets from finance to hydrology. Therefore, models that can reflect these properties have become more popular in recent years. Mandelbrot–Van Ness fractional Lévy processes allow for such stationary long-memory effects in their increments and have been used in different settings ranging from fractionally integrated continuous-time ARMA–GARCH-type setups to general stochastic differential equations. However, their conditional distributions have not yet been considered in detail. In this article, we provide a closed formula for their conditional characteristic functions and suggest several applications to continuous-time ARMA–GARCH-type models with long memory.

A two-step approach to conditional value-at-risk estimation is considered. First, a generalized quasi-maximum likelihood estimator is employed to estimate the volatility parameter; then the empirical quantile of the residuals serves to estimate the theoretical quantile of the innovations. When the instrumental density *h* of the generalized quasi-maximum likelihood estimator is not the Gaussian density, both the volatility and quantile estimates are generally asymptotically biased. However, the two errors counterbalance each other and lead to a consistent estimator of the value at risk. We obtain the asymptotic behaviour of this estimator and show how to choose *h* optimally.
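A minimal sketch of the two-step procedure, assuming for illustration an ARCH(1) volatility model and the Gaussian instrumental density (the case where neither step is biased); the model choice, starting values and optimizer are assumptions, not the article's general setting.

```python
import numpy as np
from scipy.optimize import minimize

def two_step_var(returns, level=0.01):
    """Two-step conditional VaR sketch: Gaussian QMLE for a hypothetical
    ARCH(1) volatility, then the empirical quantile of the residuals."""
    r2 = returns ** 2
    def neg_qll(theta):
        omega, alpha = theta
        sig2 = omega + alpha * r2[:-1]
        return np.sum(np.log(sig2) + r2[1:] / sig2)
    omega, alpha = minimize(neg_qll,
                            x0=[0.5 * np.var(returns), 0.3],
                            bounds=[(1e-8, None), (0.0, 0.999)]).x
    sig2 = omega + alpha * r2[:-1]
    resid = returns[1:] / np.sqrt(sig2)
    q = np.quantile(resid, level)                 # innovation quantile estimate
    sig_next = np.sqrt(omega + alpha * returns[-1] ** 2)
    return q * sig_next                           # one-step-ahead VaR (return scale)

# Simulate ARCH(1) returns and compute a 1% conditional VaR.
rng = np.random.default_rng(3)
n, omega0, alpha0 = 10000, 1e-4, 0.3
r = np.zeros(n)
for t in range(1, n):
    s2 = omega0 + alpha0 * r[t - 1] ** 2
    r[t] = np.sqrt(s2) * rng.standard_normal()
var_1pct = two_step_var(r, 0.01)
```

With a non-Gaussian instrumental density, both intermediate estimates would be biased, but, as the article shows, their product still estimates the value at risk consistently.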

For autoregressive count data time series, a goodness-of-fit test based on the empirical joint probability generating function is considered. The underlying process is contained in a general class of Markovian models satisfying a drift condition. Asymptotic theory for the test statistic is provided, including a functional central limit theorem for the non-parametric estimation of the stationary distribution and a parametric bootstrap method. Connections between the new approach and existing tests for count data time series based on moment estimators appear in limiting scenarios. Finally, the test is applied to a real data set.
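The empirical joint probability generating function at the heart of the test can be sketched for consecutive pairs; the i.i.d. Poisson comparison below is only a sanity illustration under an assumed model, not the article's test statistic.

```python
import numpy as np

def empirical_joint_pgf(x, u, v):
    """Empirical joint PGF of consecutive pairs (X_t, X_{t+1}):
    (n-1)^{-1} * sum_t u^{X_t} * v^{X_{t+1}}, for u, v in [0, 1]."""
    return np.mean(u ** x[:-1] * v ** x[1:])

# Sanity check under an assumed i.i.d. Poisson(lam) model, where the joint
# PGF factorizes as exp(lam*(u-1)) * exp(lam*(v-1)).
rng = np.random.default_rng(4)
lam = 2.0
x = rng.poisson(lam, 100000)
g_hat = empirical_joint_pgf(x, 0.5, 0.7)
g_true = np.exp(lam * (0.5 - 1.0)) * np.exp(lam * (0.7 - 1.0))
```

The goodness-of-fit statistic then contrasts such an empirical joint PGF with the one implied by the fitted Markovian count model, with critical values obtained by parametric bootstrap.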

This work develops maximum likelihood-based unit root tests in the noncausal autoregressive (NCAR) model with a non-Gaussian error term formulated by Lanne and Saikkonen (2011, *Journal of Time Series Econometrics* 3, Issue 3, Article 2). Finite-sample properties of the tests are examined via Monte Carlo simulations. The results show that the size properties of the tests are satisfactory and that clear power gains against stationary NCAR alternatives can be achieved in comparison with available alternative tests. In an empirical application to a Finnish interest rate series, evidence in favour of an NCAR model with leptokurtic errors is found.

Stationary processes are a natural choice as statistical models for time series data, owing to their good estimating properties. In practice, however, alternative models are often proposed that sacrifice stationarity in favour of the greater modelling flexibility required by many real-life applications. We present a family of time-homogeneous Markov processes with nonparametric stationary densities, which retain the desirable statistical properties for inference, while achieving substantial modelling flexibility, matching those achievable with certain non-stationary models. A latent extension of the model enables exact inference through a trans-dimensional Markov chain Monte Carlo method. Numerical illustrations are presented.
