Aggregating Algorithm competing with Banach lattices
The paper deals with the online regression setting in which the signals belong to a Banach lattice. Our algorithms work in a semi-online setting where all the inputs are known in advance, while the outcomes are unknown and revealed step by step. We apply the Aggregating Algorithm to construct a prediction method whose cumulative loss over all the input vectors is comparable with the cumulative loss of any linear functional on the Banach lattice. As a by-product, we obtain an algorithm that takes signals from an arbitrary domain; its cumulative loss is comparable with the cumulative loss of any predictor function from the Besov and Triebel-Lizorkin spaces. We describe several applications of our setting.
💡 Research Summary
The paper addresses online regression in a setting where the input signals belong to a Banach lattice, a broad class of complete normed vector spaces equipped with a lattice (partial-order) structure. Unlike classical online learning that assumes inputs live in Euclidean space or a reproducing kernel Hilbert space, the authors consider a "semi-online" scenario: the entire sequence of input vectors \(x_1, \dots, x_T\) is known in advance, while the corresponding outcomes \(y_t\) are revealed one by one. This model captures situations such as pre-collected sensor data where only the labels arrive in real time.
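To make the protocol concrete, here is a minimal Python sketch of the semi-online loop under square loss. The learner interface (`receive_inputs`, `predict`, `observe`) and the trivial running-mean baseline are our own illustrative choices, not part of the paper.

```python
# Minimal sketch of the semi-online protocol: all inputs are available
# before prediction starts, while outcomes are revealed one per round.
# The RunningMean baseline is purely illustrative.

class RunningMean:
    """Ignores the inputs; predicts the average of the outcomes seen so far."""
    def receive_inputs(self, xs):
        self.xs = xs                    # full input sequence, known in advance
        self.sum, self.count = 0.0, 0

    def predict(self, t):
        return self.sum / self.count if self.count else 0.0

    def observe(self, t, y):
        self.sum += y
        self.count += 1

def run_semi_online(learner, xs, ys):
    """ys plays the role of the outcome stream, revealed step by step."""
    learner.receive_inputs(xs)
    loss = 0.0
    for t, x in enumerate(xs):
        gamma = learner.predict(t)      # may depend on the whole of xs
        y = ys[t]                       # outcome revealed only after predicting
        loss += (y - gamma) ** 2        # square loss, as in the regression setting
        learner.observe(t, y)
    return loss

print(run_semi_online(RunningMean(), xs=[1, 2, 3, 4], ys=[0.5, 0.7, 0.6, 0.8]))
```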
The central tool is the Aggregating Algorithm (AA), originally introduced by Vovk for prediction with expert advice. In the present work, each "expert" corresponds to a linear functional \(f\) acting on the Banach lattice, i.e., \(f \in \mathcal{L} = \{\, x \mapsto \langle w, x \rangle : w \in \mathcal{B}^* \,\}\). The algorithm assigns an initial weight \(\pi_0(f)\) to each expert and updates the weights exponentially according to the incurred loss:
\[
\pi_t(f) \;\propto\; \pi_{t-1}(f)\, e^{-\eta\, \lambda\left(y_t,\, f(x_t)\right)},
\]
where \(\eta > 0\) is the learning rate and \(\lambda\) denotes the loss function.
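For intuition, the following Python sketch runs the AA over a *finite* pool of experts under square loss with outcomes in \([-Y, Y]\). The paper aggregates over the continuous class of linear functionals on the Banach lattice, so the sum below becomes an integral there; the finite pool, the mixability constant \(\eta = 1/(2Y^2)\), and Vovk's square-loss substitution function are standard AA ingredients used here purely for illustration.

```python
import numpy as np

# Aggregating Algorithm over a finite expert pool, square loss, y in [-Y, Y].
# The continuous expert class of the paper is replaced by K discrete experts.

def aggregating_algorithm(expert_preds, ys, Y):
    """expert_preds: (T, K) array; row t holds the K experts' predictions at step t.
    ys: length-T outcome sequence in [-Y, Y]. Returns the AA's predictions."""
    T, K = expert_preds.shape
    eta = 1.0 / (2 * Y**2)              # square loss is eta-mixable for eta <= 1/(2Y^2)
    log_w = np.zeros(K)                 # log of the weights pi_t(f); uniform pi_0
    preds = np.zeros(T)
    for t in range(T):
        xi = np.clip(expert_preds[t], -Y, Y)
        # Generalized prediction: g(omega) = -(1/eta) log sum_f pi(f) e^{-eta (omega - xi_f)^2}
        def g(omega):
            s = log_w - eta * (omega - xi) ** 2
            return -(np.logaddexp.reduce(s) - np.logaddexp.reduce(log_w)) / eta
        # Vovk's substitution function for the square-loss game on [-Y, Y]
        preds[t] = (g(-Y) - g(Y)) / (4 * Y)
        # Exponential weight update with the incurred losses, as in the display above
        log_w -= eta * (ys[t] - xi) ** 2
    return preds

# Tiny demo: three constant experts, outcomes in [-1, 1]
T, Y = 5, 1.0
experts = np.tile([[-0.5, 0.0, 0.5]], (T, 1))
ys = np.array([0.4, 0.5, 0.45, 0.5, 0.55])
print(aggregating_algorithm(experts, ys, Y))
```

Working in log-weights avoids numerical underflow after many rounds; one can check that with a single expert the substitution step returns that expert's prediction exactly.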