Uniform Bahadur Representation for Local Polynomial Estimates of M-Regression and Its Application to The Additive Model

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the original ArXiv source.

We use local polynomial fitting to estimate the nonparametric M-regression function for strongly mixing stationary processes ${(Y_{i},\underline{X}_{i})}$. We establish a strong uniform consistency rate for the Bahadur representation of estimators of the regression function and its derivatives. These results are fundamental for statistical inference and for applications that involve plugging such estimators into other functionals, where some control over higher-order terms is required. We apply our results to the estimation of an additive M-regression model.


💡 Research Summary

The paper develops a rigorous uniform Bahadur representation for local polynomial estimators of non-parametric M-regression when the data form a strongly mixing stationary process. Let $\{(Y_i,\mathbf X_i)\}_{i\ge 1}$ be a strictly stationary $\alpha$-mixing sequence with mixing coefficients decaying sufficiently fast. The regression function is defined as the minimizer of the conditional expected loss, $m(\mathbf x)=\arg\min_{\theta} E\{\rho(Y-\theta)\mid\mathbf X=\mathbf x\}$, where $\rho$ is a convex, twice-differentiable loss and $\psi=\rho'$ satisfies standard moment conditions.
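For concreteness, a minimal sketch (an illustrative choice, not one prescribed by the paper) of a loss meeting these conditions is the pseudo-Huber family: it is convex and twice differentiable, quadratic near zero, asymptotically linear in the tails, and its score $\psi=\rho'$ is smooth and bounded.

```python
import numpy as np

def rho(u, delta=1.0):
    # pseudo-Huber loss: convex, twice differentiable,
    # ~ u^2/2 near 0, ~ delta*|u| for large |u| (robust to outliers)
    return delta**2 * (np.sqrt(1.0 + (u / delta) ** 2) - 1.0)

def psi(u, delta=1.0):
    # score psi = rho': smooth, odd, bounded by delta
    return u / np.sqrt(1.0 + (u / delta) ** 2)
```

Setting `delta` large recovers (locally) the quadratic loss of mean regression; small `delta` behaves like median regression.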

Using a local polynomial of order $p$ with bandwidth $h_n$ (satisfying $h_n\to 0$ and $nh_n^d\to\infty$), the authors prove that for any multi-index $\mathbf k$ with $|\mathbf k|\le p$, the estimator of the derivative $m_{\mathbf k}$ admits, schematically, the representation

$$\hat m_{\mathbf k}(\mathbf x)-m_{\mathbf k}(\mathbf x)=\beta_{\mathbf k}(\mathbf x)\,h_n^{\,p+1-|\mathbf k|}+\frac{1}{n h_n^{\,d+|\mathbf k|}}\sum_{i=1}^{n}\psi\big(Y_i-m(\mathbf X_i)\big)\,W_{\mathbf k}\!\Big(\frac{\mathbf X_i-\mathbf x}{h_n}\Big)+R_{n,\mathbf k}(\mathbf x),$$

where $W_{\mathbf k}$ is the equivalent-kernel weight determined by the kernel $K$ and the polynomial order $p$, $\beta_{\mathbf k}$ is a deterministic bias function, and the remainder satisfies $\sup_{\mathbf x\in\mathcal C}|R_{n,\mathbf k}(\mathbf x)|=O_{a.s.}\big(\{\log n/(n h_n^{d})\}^{3/4}\big)$ uniformly over compact sets $\mathcal C$.

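The estimator being analyzed can be sketched numerically. Below is a minimal local linear ($d=1$, $p=1$) M-fit computed by iteratively reweighted least squares, with a Gaussian kernel and the pseudo-Huber loss; all of these concrete choices (kernel, loss, IRLS solver, bandwidth) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def local_linear_m(x0, X, Y, h, delta=1.0, iters=25):
    """Local linear M-estimate of (m(x0), m'(x0)) via IRLS.

    Minimizes sum_i K((X_i - x0)/h) * rho(Y_i - b0 - b1*(X_i - x0))
    for the pseudo-Huber rho; IRLS weight psi(r)/r = 1/sqrt(1 + (r/delta)^2).
    """
    Z = np.column_stack([np.ones_like(X), X - x0])   # local design matrix
    K = np.exp(-0.5 * ((X - x0) / h) ** 2)           # Gaussian kernel weights
    beta = np.zeros(2)
    for _ in range(iters):
        r = Y - Z @ beta
        w = K / np.sqrt(1.0 + (r / delta) ** 2)      # kernel * psi(r)/r
        ZW = Z * w[:, None]
        beta = np.linalg.solve(Z.T @ ZW + 1e-10 * np.eye(2), ZW.T @ Y)
    return beta                                      # beta[0] ~ m(x0), beta[1] ~ m'(x0)

# toy example: m(x) = sin(pi * x), so m(0) = 0 and m'(0) = pi
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 800)
Y = np.sin(np.pi * X) + 0.1 * rng.standard_normal(800)
b = local_linear_m(0.0, X, Y, h=0.15)
```

Raising `p` adds higher-order columns to `Z`, and for $d>1$ the design holds multivariate monomials indexed by the multi-indices $\mathbf k$; the Bahadur representation above controls exactly how far such plug-in estimates deviate from their linearization.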
