Nonparametric Conditional Inference for Regression Coefficients with Application to Configural Polysampling
We consider inference procedures, conditional on an observed ancillary statistic, for regression coefficients under a linear regression setup in which the unknown error distribution is specified nonparametrically. We establish conditional asymptotic normality of the regression coefficient estimators under regularity conditions, and formally justify the approach of plugging kernel-type density estimators into conditional inference procedures. Simulation results show that the approach yields accurate conditional coverage probabilities when used to construct confidence intervals. The plug-in approach can be applied in conjunction with configural polysampling to derive robust conditional estimators adaptive to a confrontation of contrasting scenarios. We demonstrate this by investigating, in a simulation study, the conditional mean squared error of location estimators under various confrontations, thereby extending configural polysampling to a nonparametric context.
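To make the confrontation idea concrete, here is a minimal numerical sketch (in Python, an illustrative assumption since the paper supplies no code) of conditional location estimation under a Gaussian/slash confrontation: for each error density in the confrontation, the Pitman-type estimator below is the mean of the conditional density of the location parameter given the configuration, computed by brute-force quadrature. The slash density, grid settings, and function names are illustrative choices, not the paper's exact setup.

```python
import numpy as np

def gaussian_pdf(z):
    return np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)

def slash_pdf(z):
    # Slash density (standard normal divided by an independent Uniform(0,1)):
    # f(z) = (phi(0) - phi(z)) / z^2 for z != 0, and phi(0)/2 at z = 0.
    phi0 = 1.0 / np.sqrt(2.0 * np.pi)
    z = np.asarray(z, dtype=float)
    out = np.full_like(z, phi0 / 2.0)
    nz = z != 0.0
    out[nz] = (phi0 - phi0 * np.exp(-0.5 * z[nz] ** 2)) / z[nz] ** 2
    return out

def pitman_location(x, pdf, half_width=10.0, n_grid=2001):
    # The conditional density of mu given the configuration is proportional
    # to prod_i pdf(x_i - mu); the Pitman-type estimate is its mean, computed
    # here by quadrature on a grid centered at the sample median.
    mu = np.median(x) + np.linspace(-half_width, half_width, n_grid)
    log_dens = np.sum(np.log(pdf(x[:, None] - mu[None, :]) + 1e-300), axis=0)
    w = np.exp(log_dens - log_dens.max())
    return np.sum(mu * w) / np.sum(w)

rng = np.random.default_rng(0)
x = rng.standard_normal(20) + 3.0  # sample with true location 3
print("Gaussian-optimal estimate:", pitman_location(x, gaussian_pdf))
print("slash-optimal estimate:   ", pitman_location(x, slash_pdf))
```

Configural polysampling then compares such scenario-specific estimators, and compromises between them, by their conditional mean squared error under each member of the confrontation.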
💡 Research Summary
The paper tackles the problem of inference for regression coefficients in a linear model when the error distribution is completely unspecified. Instead of imposing a normality assumption, the authors adopt a nonparametric view of the error term and condition all inference on an observed ancillary statistic, specifically the standardized residual vector. Under mild regularity conditions on the design matrix and on the kernel density estimator (bandwidth \(h_n\) shrinking to zero while \(nh_n \to \infty\)), they prove that the ordinary least-squares estimator \(\hat\beta\) is asymptotically normal conditional on the ancillary statistic. Formally,
\[
\sqrt{n}\,(\hat\beta - \beta) \,\big|\, A_n \;\xrightarrow{d}\; N(0,\, V),
\]
where \(A_n\) denotes the ancillary statistic and \(V\) is the limiting covariance matrix determined by the design and the error distribution.
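The plug-in step itself can be sketched in the one-sample location case. Classical conditional inference gives the conditional density of the location parameter \(\mu\) given the configuration ancillary as proportional to \(\prod_i f(x_i - \mu)\); the sketch below (a minimal illustration, not the paper's exact procedure) replaces the unknown \(f\) with a Gaussian-kernel estimate built from centered residuals, using a Silverman-type bandwidth satisfying \(h_n \to 0\) and \(nh_n \to \infty\), and reads an equal-tailed interval off the resulting conditional distribution.

```python
import numpy as np

def kde_pdf(resid, h):
    # Gaussian-kernel density estimate of the error density from residuals.
    def f_hat(z):
        z = np.asarray(z, dtype=float)
        u = (z[..., None] - resid) / h
        return np.mean(np.exp(-0.5 * u ** 2), axis=-1) / (h * np.sqrt(2.0 * np.pi))
    return f_hat

def conditional_interval(x, level=0.95, n_grid=2001):
    resid = x - np.mean(x)                       # centered residuals
    h = 1.06 * np.std(resid) * len(x) ** (-0.2)  # Silverman-type bandwidth
    f_hat = kde_pdf(resid, h)
    # Plug-in conditional density of mu, proportional to prod_i f_hat(x_i - mu),
    # evaluated on a grid and normalized; endpoints are read off its CDF.
    mu = np.mean(x) + np.linspace(-5.0, 5.0, n_grid) * np.std(x)
    log_dens = np.sum(np.log(f_hat(x[:, None] - mu[None, :]) + 1e-300), axis=0)
    cdf = np.cumsum(np.exp(log_dens - log_dens.max()))
    cdf /= cdf[-1]
    alpha = 1.0 - level
    lo = mu[np.searchsorted(cdf, alpha / 2.0)]
    hi = mu[np.searchsorted(cdf, 1.0 - alpha / 2.0)]
    return lo, hi

rng = np.random.default_rng(1)
x = rng.standard_t(3, size=30) + 2.0  # heavy-tailed errors, true location 2
print("95% conditional interval:", conditional_interval(x))
```

In the regression setting the same recipe would, in principle, replace \(x_i - \mu\) with \(y_i - x_i^\top\beta\) and the centered residuals with least-squares residuals, which is the kind of plug-in scheme the asymptotic result above is meant to justify.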