Variational Tail Bounds for Norms of Random Vectors and Matrices
We propose a variational tail bound for norms of random vectors under moment assumptions on their one-dimensional marginals. We also provide a simplified version of the bound that parametrizes the "aggregating distribution" using a certain pushforward of the Gaussian distribution. We apply the proposed method to reproduce several well-known bounds on norms of Gaussian random vectors and to obtain dimension-free tail bounds for the Euclidean norm of random vectors with arbitrary moment profiles. Furthermore, we reproduce a dimension-free concentration inequality for sums of independent and identically distributed positive semidefinite matrices with sub-exponential marginals, obtain a concentration inequality for the sample covariance matrix of sub-exponential random vectors, and derive a tail bound for the operator norm of a random matrix series whose random coefficients may have arbitrary moment profiles. Finally, we use coupling to formulate an abstraction of the proposed approach that applies more broadly.
💡 Research Summary
The paper introduces a new “variational tail bound” technique for controlling the norms of random vectors and matrices under very mild moment assumptions on their one‑dimensional marginals. The authors start by observing that any norm ‖·‖ can be expressed as the supremum of linear functionals indexed by the unit ball of the dual norm ‖·‖*. By sampling a random direction U from an arbitrary reference distribution P₀, they define M_X(u,p)=E|⟨u,X⟩|^p and a geometric quantity ν(P₀)=sup_{x∈∂B₁}P₀(|⟨U,x⟩|≥1). Lemma 1 shows that for any p≥1, E‖X‖^p ≤ ν(P₀)·E₀
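The two ingredients named above can be checked numerically. The sketch below (an illustration of the general idea, not the paper's method; the helper name `M_X` and the choice of a standard Gaussian for both X and the reference distribution P₀ are assumptions for the demo) verifies the dual representation ‖x‖₂ = sup over the unit ball of ⟨u, x⟩, Monte Carlo estimates the marginal moment M_X(u, p) = E|⟨u, X⟩|^p, and checks that for P₀ = N(0, I_d) the probability P₀(|⟨U, x⟩| ≥ 1) is the same for every unit vector x, so the supremum defining ν(P₀) is attained everywhere on the sphere:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 50  # ambient dimension (illustrative choice)

# Dual representation: for the Euclidean norm, ||x||_2 = sup_{||u||_2 <= 1} <u, x>,
# with the supremum attained at u* = x / ||x||_2.
x = rng.standard_normal(d)
u_star = x / np.linalg.norm(x)
assert np.isclose(u_star @ x, np.linalg.norm(x))

def M_X(u, p, n=200_000):
    """Monte Carlo estimate of M_X(u, p) = E|<u, X>|^p for X ~ N(0, I_d).

    (Hypothetical helper for this demo; here <u, X> ~ N(0, ||u||_2^2),
    so for a unit vector u and p = 2 the true value is 1.)
    """
    X = rng.standard_normal((n, len(u)))
    return np.mean(np.abs(X @ u) ** p)

print(M_X(u_star, 2))  # close to 1.0

# For the reference distribution P0 = N(0, I_d), <U, x> ~ N(0, 1) for every
# unit x, so P0(|<U, x>| >= 1) = 2*(1 - Phi(1)) ~ 0.317 regardless of the
# direction x -- the quantity nu(P0) is constant over the sphere.
U = rng.standard_normal((200_000, d))
for x_dir in (u_star, np.eye(d)[0]):
    print(np.mean(np.abs(U @ x_dir) >= 1.0))  # both close to 0.317
```

Rotational invariance of the Gaussian is what makes ν(P₀) direction-free here; for an anisotropic reference distribution the probability would vary with x and only the supremum over the unit sphere enters the bound.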