Adaptive almost full recovery in sparse nonparametric models


We observe an unknown function of $d$ variables $f(\boldsymbol{t})$, $\boldsymbol{t} \in[0,1]^d$, in the Gaussian white noise model of intensity $\varepsilon>0$. We assume that the function $f$ is regular and that it is a sum of $k$-variate functions, where $k$ varies from $1$ to $s$ ($1\leq s\leq d$). These component functions are unknown to us, and only a few of them are nonzero. In this article, we address the problem of identifying the nonzero components of $f$ almost fully in the case when $d=d_\varepsilon\to \infty$ as $\varepsilon\to 0$ and $s$ is either fixed or $s=s_\varepsilon\to \infty$, $s=o(d)$ as $\varepsilon\to 0$. This may be viewed as a variable selection problem. We derive the conditions under which almost full variable selection in the model at hand is possible and provide a selection procedure that achieves this type of selection. The procedure is adaptive to the level of sparsity described by the sparsity index $\beta\in(0,1)$. We also derive conditions that make almost full variable selection in the model of our interest impossible. In view of these conditions, the proposed selector is seen to be asymptotically optimal. The theoretical findings are illustrated numerically.
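The observation model and component structure described in the abstract can be written out explicitly. The following is a sketch under standard conventions (the $d$-parameter white noise model and an ANOVA-type decomposition of $f$); the notation $f_u$, $W$ is illustrative and not taken verbatim from the paper:

```latex
% Gaussian white noise model: we observe the process X_\varepsilon on [0,1]^d,
% where W is a d-parameter Brownian sheet and \varepsilon > 0 is the noise intensity:
dX_\varepsilon(\boldsymbol{t}) = f(\boldsymbol{t})\,d\boldsymbol{t}
   + \varepsilon\, dW(\boldsymbol{t}), \qquad \boldsymbol{t}\in[0,1]^d .

% Structural assumption: f is a sum of k-variate components, 1 \le k \le s,
% indexed by subsets u of {1,...,d} of size at most s:
f(\boldsymbol{t}) = \sum_{\substack{u \subseteq \{1,\dots,d\} \\ 1 \le |u| \le s}}
   f_u(\boldsymbol{t}_u),
% where \boldsymbol{t}_u collects the coordinates of \boldsymbol{t} indexed by u,
% only a few of the components f_u are nonzero, and the degree of sparsity
% is governed by the index \beta \in (0,1).
```

Variable selection here means recovering (almost fully) the set of subsets $u$ for which $f_u \neq 0$.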


