Some Uniform Limit Results in Additive Regression Model
We establish some uniform limit results in the setting of additive regression model estimation. Our results allow us to construct asymptotic 100% confidence bands for the additive components. These results are stated in the framework of i.i.d. random vectors when the marginal integration estimation method is used.
💡 Research Summary
The paper addresses a fundamental problem in additive regression models (ARMs), namely the uniform inference for each additive component when the model is estimated by the marginal integration (MI) method. An ARM expresses a response Y as a sum of unknown smooth functions of individual covariates, Y = μ + ∑_{j=1}^d f_j(X_j) + ε, with ε i.i.d. mean‑zero noise. While pointwise consistency and asymptotic normality for the component estimators have been studied extensively, practitioners often need confidence statements that hold simultaneously over a whole interval of the covariate. This paper fills that gap by establishing sup‑norm (uniform) convergence rates, deriving a functional central limit theorem for the MI estimators, and constructing asymptotically exact 100% confidence bands for each f_j.
Methodology
- Estimation – The authors first estimate the full multivariate regression function m(x) = μ + ∑_{j=1}^d f_j(x_j) using a Nadaraya–Watson kernel smoother with a product kernel K_h and bandwidth h_n that shrinks to zero. To reduce bias, a local polynomial of order r is employed. The marginal integration step then isolates each component:

\hat f_j(x) = ∫ \hat m(x_1, …, x_d) d\hat P_{−j}(x_{−j}) − \hat μ,

where \hat P_{−j} is the empirical marginal distribution of the remaining covariates x_{−j}.
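The MI step above can be sketched in a few lines of NumPy. This is a minimal illustration rather than the paper's implementation: `nw_estimate` and `marginal_integration` are hypothetical helper names, a Gaussian kernel is used for convenience (the paper assumes a compactly supported kernel), and the bandwidth is picked by hand for the toy sample.

```python
import numpy as np

def nw_estimate(x, X, Y, h):
    """Nadaraya-Watson estimate of m(x) with a Gaussian product kernel."""
    w = np.exp(-0.5 * ((X - x) / h) ** 2).prod(axis=1)  # product-kernel weights
    return np.sum(w * Y) / np.sum(w)

def marginal_integration(j, grid, X, Y, h):
    """MI estimate of f_j on `grid`: average the m-hat estimate over the
    empirical distribution of the remaining covariates, then recenter."""
    fj = np.empty(len(grid))
    for t, xj in enumerate(grid):
        Xfix = X.copy()
        Xfix[:, j] = xj  # fix the j-th coordinate at the grid point
        fj[t] = np.mean([nw_estimate(xi, X, Y, h) for xi in Xfix])
    return fj - fj.mean()  # crude recentering in place of subtracting mu-hat

# Toy additive model: Y = sin(2*pi*X1) + X2^2 + eps
rng = np.random.default_rng(0)
n, d = 200, 2
X = rng.uniform(0.0, 1.0, size=(n, d))
Y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=n)
grid = np.linspace(0.1, 0.9, 9)
f1_hat = marginal_integration(0, grid, X, Y, h=0.15)
```

Plotting `f1_hat` against a recentred sin(2πx) shows the MI estimate recovering the first component; in the asymptotic theory the bandwidth would scale as n^{−1/(2s+d)} rather than being fixed by hand.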
- Assumptions – The analysis assumes (i) i.i.d. observations (X_i, Y_i); (ii) a product kernel of bounded support and order ≥ s; (iii) a bandwidth satisfying h_n → 0 and n h_n^d / log n → ∞; (iv) each f_j belongs to a Hölder class of smoothness s > 0; (v) ε has a finite fourth moment.
- Uniform Consistency – Under these conditions the paper proves

‖\hat f_j − f_j‖_∞ = O_p\big( √{log n/(n h_n^d)} + h_n^s \big).

The logarithmic factor arises from covering‑number arguments needed to control the supremum over a continuum of points. The optimal bandwidth, balancing the bias term h_n^s against the stochastic error √{log n/(n h_n^d)}, is h_n ∝ n^{−1/(2s+d)}.
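The bias/stochastic trade-off can be checked concretely: at the rate-optimal bandwidth h_n = c·n^{−1/(2s+d)} both terms decay at the same polynomial rate and differ only by the √(log n) factor. A small sketch (the values of s and d below are illustrative):

```python
import math

def error_terms(n, s, d, c=1.0):
    """Bias (h**s) and stochastic (sqrt(log n / (n * h**d))) error terms
    evaluated at the rate-optimal bandwidth h = c * n**(-1/(2s+d))."""
    h = c * n ** (-1.0 / (2 * s + d))
    return h ** s, math.sqrt(math.log(n) / (n * h ** d))

for n in (10**3, 10**5, 10**7):
    bias, stoch = error_terms(n, s=2, d=3)
    # with c = 1 the ratio stoch/bias equals sqrt(log n) exactly
    print(f"n={n:>8}  bias={bias:.5f}  stochastic={stoch:.5f}  ratio={stoch/bias:.2f}")
```

The slowly growing ratio makes visible why the sup-norm rate carries the extra √(log n) factor relative to the pointwise rate.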
- Functional Central Limit Theorem – By centering and scaling the estimator, the authors show that the stochastic process

G_n^j(x) = √{n h_n^d} (\hat f_j(x) − f_j(x)) / \hat σ_j(x)

converges in distribution in ℓ^∞ (the space of bounded functions on the interval under study) to a centered Gaussian process; this functional limit is what delivers the asymptotically exact confidence bands for each f_j.
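The FCLT scaling translates directly into uniform confidence bands. The display below is a schematic reconstruction from that scaling, not the paper's exact statement; q_{1−α} denotes the (1−α)-quantile of sup_{x∈I} |G^j(x)| for the limiting Gaussian process G^j and the interval I over which uniformity holds:

```latex
% Schematic uniform band implied by the FCLT scaling
% \sqrt{n h_n^d}\,(\hat f_j - f_j)/\hat\sigma_j \rightsquigarrow G^j:
\mathbb{P}\left( f_j(x) \in \left[\, \hat f_j(x) \pm
      q_{1-\alpha}\,\frac{\hat\sigma_j(x)}{\sqrt{n\,h_n^d}} \,\right]
      \ \text{for all } x \in I \right) \;\longrightarrow\; 1-\alpha .
```

In practice q_{1−α} would be obtained from the distribution of the supremum of the limiting process, e.g. by simulation or a Gumbel-type approximation.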