On testing for independence between generalized error models of several time series
We define generalized innovations associated with generalized error models having arbitrary distributions, that is, distributions that can be mixtures of continuous and discrete distributions. These models include stochastic volatility models and regime-switching models. We also propose statistics for testing independence between the generalized errors of these models, extending previous results of Duchesne, Ghoudi and Rémillard (2012) obtained for stochastic volatility models. We define families of empirical processes constructed from lagged generalized errors, and we show that their joint asymptotic distributions are Gaussian and independent of the estimated parameters of the individual time series. Möbius transformations of the empirical processes are used to obtain tractable covariances. Several test statistics are then proposed, based on Cramér-von Mises statistics and dependence measures, as well as graphical methods to visualize the dependence. In addition, numerical experiments are performed to assess the power of the proposed tests. Finally, to show the usefulness of our methodologies, examples of applications for financial data and crime data are given to cover both discrete and continuous cases. All developed methodologies are implemented in the CRAN package IndGenErrors.
💡 Research Summary
The paper introduces a comprehensive framework for testing conditional independence between the “generalized errors” (innovations) of multiple time‑series models that may have arbitrary marginal distributions, including continuous, discrete, or mixtures of both. Traditional approaches to independence testing in multivariate time‑series have been limited to continuous‑margin models such as stochastic volatility (SV) models. By leveraging a randomization technique originally proposed by Brockwell (2007), the authors define a generalized innovation
\(U_t = G_t(X_t^{-}) + V_t\,\Delta G_t(X_t)\),
where \(G_t\) is the conditional cumulative distribution function of the series given the past, \(\Delta G_t\) its jump at the observed value, and \(V_t\sim\mathcal{U}(0,1)\) an independent uniform random variable. This construction guarantees that, regardless of whether the marginal distribution is continuous, discrete, or a mixture, the transformed series \(\{U_t\}\) are i.i.d. uniform on \((0,1)\).