Robust estimates in generalized partially linear models


In this paper, we introduce a family of robust estimates for the parametric and nonparametric components under a generalized partially linear model, where the data are modeled by $y_i|(\mathbf{x}_i,t_i)\sim F(\cdot,\mu_i)$ with $\mu_i=H(\eta(t_i)+\mathbf{x}_i^{\mathrm{T}}\beta)$, for some known distribution function $F$ and link function $H$. It is shown that the estimates of $\beta$ are root-$n$ consistent and asymptotically normal. Through a Monte Carlo study, the performance of these estimators is compared with that of the classical ones.


💡 Research Summary

This paper addresses the problem of robust estimation in generalized partially linear models (GPLMs), a class of semiparametric regression models that combine a parametric linear component with a nonparametric smooth function. The authors consider observations $(y_i, \mathbf{x}_i, t_i)$ such that the conditional distribution of the response given the covariates follows a known exponential-family distribution $F(\cdot,\mu_i)$ with mean $\mu_i = H\big(\eta(t_i) + \mathbf{x}_i^{\mathrm{T}}\beta\big)$. Here $H$ is a known link function (e.g., logit, log) and $\eta(\cdot)$ is an unknown smooth function of a single scalar covariate $t$. Traditional inference for GPLMs relies on maximum likelihood or generalized estimating equations, which are highly sensitive to outliers or contaminated observations.
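To make the model concrete, the following sketch simulates data from a GPLM with a logit link. The particular choices here ($\eta(t)=\sin(2\pi t)$, $\beta=(1.5,-1.0)$, Bernoulli responses) are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical choices for illustration: eta(t) = sin(2*pi*t),
# beta = (1.5, -1.0), logit link H(u) = 1/(1+exp(-u)),
# so y_i | (x_i, t_i) ~ Bernoulli(mu_i), i.e. F(., mu_i) is Bernoulli.
n = 500
beta = np.array([1.5, -1.0])
t = rng.uniform(0.0, 1.0, size=n)
x = rng.normal(size=(n, 2))

eta = np.sin(2 * np.pi * t)              # unknown smooth component eta(t_i)
linpred = eta + x @ beta                 # eta(t_i) + x_i^T beta
mu = 1.0 / (1.0 + np.exp(-linpred))      # mean mu_i = H(linear predictor)
y = rng.binomial(1, mu)                  # draw responses from F(., mu_i)

print(y.shape, float(mu.min()), float(mu.max()))
```

Under a log link one would instead set `mu = np.exp(linpred)` and draw Poisson responses; the structure of the model is unchanged.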

To overcome this vulnerability, the authors develop a robust M-estimation framework. They introduce a bounded loss function $\rho$ and its derivative $\psi = \rho'$, together with observation-specific weights $w_i$ derived from a kernel smoother. The estimating equations take the general M-estimation form

$$\sum_{i=1}^{n} \psi\big(y_i,\, \eta(t_i) + \mathbf{x}_i^{\mathrm{T}}\beta\big)\, w_i\, \mathbf{x}_i = \mathbf{0}.$$
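The effect of a bounded score function can be sketched in a simplified setting. The snippet below is not the paper's estimator: it uses an identity link, a known (constant) $\eta$, the Huber $\psi$ as the bounded score, and unit weights $w_i$, solving the resulting estimating equation by iteratively reweighted least squares:

```python
import numpy as np

def huber_psi(r, c=1.345):
    """Bounded score psi = rho' for the Huber loss (assumption: c = 1.345)."""
    return np.clip(r, -c, c)

def robust_fit(x, y, w=None, n_iter=50, tol=1e-8):
    """Solve sum_i psi(y_i - x_i^T beta) * w_i * x_i = 0 by iteratively
    reweighted least squares. Illustrative sketch: identity link, no
    nonparametric component, known residual scale."""
    n, p = x.shape
    if w is None:
        w = np.ones(n)
    beta = np.linalg.lstsq(x, y, rcond=None)[0]   # least-squares start
    for _ in range(n_iter):
        r = y - x @ beta
        u = np.ones_like(r)                       # IRLS weight psi(r)/r
        mask = np.abs(r) > 1e-12
        u[mask] = huber_psi(r[mask]) / r[mask]
        W = w * u
        beta_new = np.linalg.solve(x.T @ (W[:, None] * x), x.T @ (W * y))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Contaminated sample: the bounded psi downweights the gross outliers,
# so the fit should stay near the true coefficients.
rng = np.random.default_rng(1)
x = rng.normal(size=(200, 2))
beta_true = np.array([2.0, -1.0])
y = x @ beta_true + rng.normal(scale=0.5, size=200)
y[:10] += 20.0                                    # gross outliers
beta_hat = robust_fit(x, y)
print(beta_hat)
```

Because $\psi$ is bounded, each outlier's contribution to the estimating equation is capped at $c\,w_i\,\mathbf{x}_i$, which is what yields the bounded influence the paper is after; an unbounded score ($\psi(r)=r$, i.e. least squares) would let the ten contaminated points dominate the fit.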

