Time-varying Coefficients Estimation in Differential Equation Models with Noisy Time-varying Covariates
We study the problem of estimating time-varying coefficients in ordinary differential equations. Existing theory applies only to the case in which the associated state variables are observed without measurement error, as presented in \cite{chenwu08b,chenwu08}. The difficulty arises from the quadratic functional of the observations that must be handled, in place of the linear functional that appears when the state variables contain no measurement error. We derive the asymptotic bias and variance of the previously proposed two-step estimators using quadratic regression functional theory.
Research Summary
This paper tackles the challenging problem of estimating time-varying coefficients in ordinary differential equation (ODE) models when the covariates themselves are observed with measurement error. Earlier work, most notably Chen and Wu (2008), developed a two-step estimator under the unrealistic assumption that the state variables are measured without error. In that setting the second-step regression is linear in the observed covariates, so standard nonparametric regression theory supplies bias and variance formulas. In many practical applications (pharmacokinetics, ecological modeling, engineering systems), however, the covariates are noisy time series. When noisy covariates are inserted directly into the regression, the resulting estimating equation involves a quadratic functional of the data rather than a linear one, and the classical theory breaks down.
The authors preserve the appealing two-step structure but explicitly model the measurement error in both stages. In the first stage, the noisy covariate $X(t)$ and its derivative $\dot X(t)$ are estimated nonparametrically using local polynomial smoothing with kernel $K$ and bandwidth $h$. This stage introduces a smoothing bias of order $h^{2}$ and a variance term proportional to $(nh)^{-1}$, where $n$ is the number of observation times. Crucially, the derivative estimator inherits additional noise because it is a linear combination of the noisy observations.
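The first-stage smoother can be sketched as follows. This is a minimal illustration rather than the paper's implementation: it assumes a local quadratic fit with an Epanechnikov kernel, and the function and variable names (`local_poly_smooth`, `t_obs`, `y_obs`) are hypothetical.

```python
import numpy as np

def local_poly_smooth(t_obs, y_obs, t_grid, h, degree=2):
    """Local polynomial fit at each grid point.

    Returns estimates of X(t) (the intercept of the local fit) and
    dX/dt (the slope) at each point of t_grid, using an Epanechnikov
    kernel with bandwidth h.
    """
    X_hat = np.empty(len(t_grid))
    dX_hat = np.empty(len(t_grid))
    for i, t0 in enumerate(t_grid):
        u = (t_obs - t0) / h
        # Epanechnikov kernel weights, zero outside |u| <= 1
        w = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)
        # Polynomial design matrix in (t - t0): columns 1, (t-t0), (t-t0)^2, ...
        D = np.vander(t_obs - t0, degree + 1, increasing=True)
        # Weighted least squares via square-root weighting
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(sw[:, None] * D, sw * y_obs, rcond=None)
        X_hat[i] = coef[0]    # intercept estimates X(t0)
        dX_hat[i] = coef[1]   # slope estimates dX/dt at t0
    return X_hat, dX_hat
```

Because the slope estimate is a weighted linear combination of the noisy observations, its variance is inflated relative to the intercept, which is exactly the extra derivative noise the summary refers to.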
In the second stage the estimated derivative $\hat{\dot X}(t)$ is regressed on the estimated covariate $\hat X(t)$ to recover the time-varying coefficient $\beta(t)$. Since both $\hat{\dot X}(t)$ and $\hat X(t)$ contain estimation error, the regression functional becomes quadratic in the original noisy data. To analyse this, the paper adopts the quadratic regression functional framework, which treats the estimator as a U-statistic of order two. By expanding the U-statistic and applying kernel moment calculations, the authors derive explicit asymptotic expressions for the bias and variance of $\hat\beta(t)$.
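The second stage can be sketched similarly. The sketch below assumes the scalar model $\dot X(t) = \beta(t)\,X(t)$ and a kernel-weighted least-squares fit at each evaluation point; the helper name `second_stage_beta` and the choice of Epanechnikov weights are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def second_stage_beta(t_grid, X_hat, dX_hat, t_eval, b):
    """Kernel-weighted least squares of the estimated derivative on the
    estimated state, under the model dX/dt = beta(t) * X(t).

    At each t0 in t_eval, beta(t0) solves a locally weighted
    least-squares problem, giving a ratio of kernel-weighted sums.
    Both inputs are themselves estimates, so this expression is
    quadratic in the original noisy observations.
    """
    beta_hat = np.empty(len(t_eval))
    for i, t0 in enumerate(t_eval):
        u = (t_grid - t0) / b
        w = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)
        beta_hat[i] = np.sum(w * dX_hat * X_hat) / np.sum(w * X_hat**2)
    return beta_hat
```

Plugging the first-stage outputs $\hat X$ and $\hat{\dot X}$ into this ratio makes the numerator and denominator quadratic forms in the raw data, which is why U-statistic (quadratic functional) theory, rather than linear smoothing theory, governs the asymptotics.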