Long Memory in Nonlinear Processes

Reading time: 5 minutes

📝 Original Info

  • Title: Long Memory in Nonlinear Processes
  • ArXiv ID: 0706.1836
  • Date: 2008-12-02
  • Authors: Not listed in the source. (The original text does not name its authors; the chapter appears to come from a textbook or an edited volume.)

📝 Abstract

It is generally accepted that many time series of practical interest exhibit strong dependence, i.e., long memory. For such series, the sample autocorrelations decay slowly and log-log periodogram plots indicate a straight-line relationship. This necessitates a class of models for describing such behavior. A popular class of such models is the autoregressive fractionally integrated moving average (ARFIMA), which is a linear process. However, there is also a need for nonlinear long memory models. For example, series of returns on financial assets typically tend to show zero correlation, whereas their squares or absolute values exhibit long memory. Furthermore, the search for a realistic mechanism for generating long memory has led to the development of other nonlinear long memory models. In this chapter, we will present several nonlinear long memory models, and discuss the properties of the models, as well as associated parametric and semiparametric estimators.

📄 Full Content

It is generally accepted that many time series of practical interest exhibit strong dependence, i.e., long memory. For such series, the sample autocorrelations decay slowly and log-log periodogram plots indicate a straight-line relationship. This necessitates a class of models for describing such behavior. A popular class of such models is the autoregressive fractionally integrated moving average (ARFIMA) (see [Ade74], [GJ80], [Hos81]), which is a linear process. However, there is also a need for nonlinear long memory models. For example, series of returns on financial assets typically tend to show zero correlation, whereas their squares or absolute values exhibit long memory; see, e.g., [DGE93]. Furthermore, the search for a realistic mechanism for generating long memory has led to the development of other nonlinear long memory models (shot noise processes, special cases of which include the Parke and Taqqu-Levy models). In this chapter, we will present several nonlinear long memory models, and discuss the properties of the models, as well as associated parametric and semiparametric estimators.

Long memory has no universally accepted definition; nevertheless, the most commonly accepted definition of long memory for a weakly stationary process $X = \{X_t, t \in \mathbb{Z}\}$ is the regular variation of the autocovariance function: there exist $H \in (1/2, 1)$ and a slowly varying function $L$ such that

$$\operatorname{cov}(X_0, X_t) = L(t)\,|t|^{2H-2}. \tag{1}$$

Under this condition, it holds that:

$$\operatorname{var}(X_1 + \cdots + X_n) \sim \frac{L(n)}{H(2H-1)}\, n^{2H}, \qquad n \to \infty. \tag{2}$$

The condition (2) does not imply (1). Nevertheless, we will take (2) as an alternate definition of long memory. In both cases, the index $H$ will be referred to as the Hurst index of the process $X$. This definition can be expressed in terms of the parameter $d = H - 1/2$, which we will refer to as the memory parameter. The most famous long memory processes are fractional Gaussian noise and the ARFIMA$(p, d, q)$ process, whose memory parameter is $d$ and Hurst index is $H = 1/2 + d$. See for instance [Taq03] for a definition of these processes.
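As an illustration (not part of the original chapter), the slow hyperbolic decay in (1) can be checked numerically on a simulated ARFIMA(0, d, 0) series; the truncation length, sample size, and lags below are arbitrary choices:

```python
import numpy as np

def arfima0d0(n, d, burn=1000, n_coef=2000, seed=0):
    """Simulate ARFIMA(0, d, 0) via the truncated MA(inf) expansion of
    (1 - B)^(-d): psi_0 = 1, psi_j = psi_{j-1} * (j - 1 + d) / j."""
    rng = np.random.default_rng(seed)
    j = np.arange(1, n_coef)
    psi = np.concatenate(([1.0], np.cumprod((j - 1 + d) / j)))
    eps = rng.standard_normal(n + burn + n_coef)
    return np.convolve(eps, psi, mode="valid")[burn:burn + n]

def acf(x, lags):
    # sample autocorrelations at the given lags
    x = x - x.mean()
    v = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / v for k in lags])

d = 0.3                      # memory parameter, so H = d + 1/2 = 0.8
x = arfima0d0(100_000, d)
lags = [10, 20, 40, 80]
rho = acf(x, lags)
# rho should decay roughly like lags ** (2 * 0.8 - 2) = lags ** (-0.4):
# slowly, remaining clearly positive even at lag 80
```

The truncated MA(∞) recursion is a simple but slow simulator; exact methods (e.g. circulant embedding) exist for production use.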

The second-order properties of a stationary process are not sufficient to characterize it, unless it is a Gaussian process. Processes which are linear with respect to an i.i.d. sequence (strict sense linear processes) are also relatively well characterized by their second-order structure. In particular, weak convergence of the partial sum process of a Gaussian or strict sense linear long memory process $\{X_t\}$ with Hurst index $H$ can be easily derived. Define

$$S_n(t) = \sum_{k=1}^{[nt]} (X_k - \mathbb{E}[X_k])$$

in discrete time, or $S_n(t) = \int_0^{nt} (X_s - \mathbb{E}[X_s])\,ds$ in continuous time. Then $\operatorname{var}(S_n(1))^{-1/2} S_n(t)$ converges in distribution to a constant times the fractional Brownian motion with Hurst index $H$, that is, the Gaussian process $B_H$ with covariance function

$$\operatorname{cov}(B_H(s), B_H(t)) = \tfrac{1}{2}\left(|s|^{2H} + |t|^{2H} - |t - s|^{2H}\right).$$
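The $n^{2H}$ growth of the partial-sum variance can also be eyeballed numerically. The following sketch (parameter values, truncation length, and block sizes are arbitrary choices, not from the text) compares the variance of block sums at two block lengths:

```python
import numpy as np

rng = np.random.default_rng(3)

# Fractional noise with d = 0.25 (H = 0.75), truncated MA(inf) weights
# of (1 - B)^(-d).
d, n_coef, N = 0.25, 2000, 400_000
j = np.arange(1, n_coef)
psi = np.concatenate(([1.0], np.cumprod((j - 1 + d) / j)))
x = np.convolve(rng.standard_normal(N + n_coef), psi, mode="valid")[:N]

def block_sum_var(x, n):
    # sample variance of S_n over non-overlapping blocks of length n
    m = len(x) // n
    return x[:m * n].reshape(m, n).sum(axis=1).var()

ratio = block_sum_var(x, 2000) / block_sum_var(x, 500)
# var(S_n) growing like n^(2H) implies a ratio near
# (2000 / 500)^(2 * 0.75) = 8, well above the value 4 that an
# uncorrelated series would give
```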

In this chapter, we will introduce nonlinear long memory processes, whose second order structure is similar to that of Gaussian or linear processes, but which may differ greatly from these processes in many other aspects. In Section 2, we will present these models and their second-order properties, and the weak convergence of their partial sum process. These models include conditionally heteroscedastic processes (Section 2.1) and models related to point processes (Section 2.2). In Section 3, we will consider the problem of estimating the Hurst index or memory parameter of these processes.

These models are defined by

$$X_t = \sigma_t v_t, \tag{3}$$

where $\{v_t\}$ is an independent identically distributed series with finite variance and $\sigma_t^2$ is the so-called volatility. We now give examples.

The Long Memory Stochastic Volatility (LMSV) and Long Memory Stochastic Duration (LMSD) models are defined by Equation (3), where $\sigma_t^2 = \exp(h_t)$ and $\{h_t\}$ is an unobservable Gaussian long memory process with memory parameter $d \in (0, 1/2)$, independent of $\{v_t\}$. The multiplicative innovation series $\{v_t\}$ is assumed to have zero mean in the LMSV model, and positive support with unit mean in the LMSD model. The LMSV model was first introduced by [BCdL98] and [Har98] to describe returns on financial assets, while the LMSD model was proposed by [DHH05] to describe durations between transactions on stocks.
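A minimal simulation sketch of the LMSV model, assuming the multiplicative form $X_t = \sigma_t v_t$ with $\sigma_t^2 = \exp(h_t)$ described above (the latent-process generator, scaling constant, and lag are illustrative choices, not values from the chapter). It shows the stylized fact motivating the model: the series itself is uncorrelated while its squares are not.

```python
import numpy as np

rng = np.random.default_rng(1)

def frac_noise(n, d, n_coef=2000):
    # Gaussian long memory series via the truncated MA(inf)
    # expansion of (1 - B)^(-d)
    j = np.arange(1, n_coef)
    psi = np.concatenate(([1.0], np.cumprod((j - 1 + d) / j)))
    eps = rng.standard_normal(n + n_coef)
    return np.convolve(eps, psi, mode="valid")[:n]

n, d = 200_000, 0.4
h = 0.5 * frac_noise(n, d)   # latent log-volatility, memory parameter d
v = rng.standard_normal(n)   # i.i.d. innovations, zero mean (LMSV case)
x = np.exp(h / 2) * v        # X_t = sigma_t v_t with sigma_t^2 = exp(h_t)

def acf1(y, k):
    # sample autocorrelation of y at lag k
    y = y - y.mean()
    return np.dot(y[:-k], y[k:]) / np.dot(y, y)

lag = 50
r_x = acf1(x, lag)        # close to zero: returns are uncorrelated
r_x2 = acf1(x ** 2, lag)  # positive: squares inherit the memory of h_t
```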

Using the moment generating function of a Gaussian distribution, it can be shown (see [Har98]) for the LMSV/LMSD model that for any real $s \geq 0$,

$$\rho_s(j) \sim c_s\, L(j)\, j^{2d-1}, \qquad j \to \infty,$$

where $\rho_s(j)$ denotes the autocorrelation of $\{|X_t|^s\}$ at lag $j$ and $c_s$ is a constant, with the convention that $s = 0$ corresponds to the logarithmic transformation. As shown in [SV02], the same result holds under more general conditions without the requirement that $\{h_t\}$ be Gaussian.

In the LMSV model, assuming that $\{h_t\}$ and $\{v_t\}$ are functions of a multivariate Gaussian process, [Rob01] obtained similar results on the autocorrelations of $\{|X_t|^s\}$ with $s > 0$ even if $\{h_t\}$ is not independent of $\{v_t\}$. Similar results were obtained in [SV02], allowing for dependence between $\{h_t\}$ and $\{v_t\}$.

The LMSV process is an uncorrelated sequence, but powers of LMSV or LMSD series may exhibit long memory. [SV02] proved the convergence of the centered and renormalized partial sums of any absolute power of these processes.
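The memory-parameter estimation problem mentioned above can be previewed with the classical Geweke-Porter-Hudak (GPH) log-periodogram regression, a standard semiparametric estimator. This is a sketch, not the chapter's own implementation; the $m \approx \sqrt{n}$ bandwidth and the demo parameters are common but arbitrary choices:

```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH log-periodogram estimate of the memory parameter d:
    OLS slope of log I(lambda_j) on -log(4 sin^2(lambda_j / 2))
    over the first m Fourier frequencies."""
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                      # common bandwidth choice
    lam = 2 * np.pi * np.arange(1, m + 1) / n  # Fourier frequencies
    I = np.abs(np.fft.fft(x - np.mean(x))[1:m + 1]) ** 2 / (2 * np.pi * n)
    reg = -np.log(4 * np.sin(lam / 2) ** 2)
    reg_c = reg - reg.mean()
    return np.dot(reg_c, np.log(I)) / np.dot(reg_c, reg_c)

# Check on simulated fractional noise with d = 0.3
# (truncated MA(inf) weights of (1 - B)^(-d)).
rng = np.random.default_rng(2)
n = n_coef = 20_000
j = np.arange(1, n_coef)
psi = np.concatenate(([1.0], np.cumprod((j - 1 + 0.3) / j)))
x = np.convolve(rng.standard_normal(n + n_coef), psi, mode="valid")[:n]
d_hat = gph_estimate(x)    # should land near 0.3, up to sampling error
```

For LMSV data the same regression is typically applied to the log-squared or absolute series, where an additive noise term biases the estimate downward; robust variants exist for that case.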


Reference

This content is AI-processed based on open access ArXiv data.
