SPLD polynomial optimization and bounded degree SOS hierarchies

Notice: This research summary and analysis were automatically generated using AI. For full accuracy, please refer to the original arXiv source.

In this paper, we introduce a new class of structured polynomials, called separable plus lower degree (SPLD) polynomials. The formal definition of an SPLD polynomial, which extends the concept of SPQ polynomials (Ahmadi et al. in Math Oper Res 48:1316–1343, 2023), is provided. A type of bounded degree SOS hierarchy, referred to as BSOS-SPLD, is proposed to efficiently solve optimization problems involving SPLD polynomials. Numerical experiments on several benchmark problems indicate that the proposed method yields better performance than the standard bounded degree SOS hierarchy (Lasserre et al. in EURO J Comput Optim 5:87–117, 2017). An exact SOS relaxation for a class of convex SPLD polynomial optimization problems is proposed. Finally, we present an application of SPLD polynomials to convex polynomial regression problems arising in statistics.


💡 Research Summary

This paper introduces a new class of structured polynomials called SPLD (Separable Plus Lower Degree) polynomials, which generalize the previously studied SPQ (Separable Plus Quadratic) polynomials. An SPLD polynomial is defined as f(x)=s(x)+l(x), where s(x)=∑_{j=1}^n s_j(x_j) is a sum of univariate polynomials (hence fully separable) and l(x) is a multivariate polynomial whose total degree is strictly smaller than the degree of s(x). This construction captures many practical models that contain high‑degree separable terms together with a comparatively low‑degree interaction term, and it often yields a sparsity pattern in the monomial support.
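To make the definition concrete, here is a minimal sketch (our own toy representation, not code from the paper) that tests whether a decomposition f = s + l is SPLD, encoding each polynomial as a dict mapping exponent tuples to coefficients:

```python
# A polynomial in n variables as {exponent_tuple: coefficient}.
# f = s + l is SPLD when every s_j depends only on x_j and
# total_degree(l) < total_degree(s).

def total_degree(poly):
    return max((sum(expo) for expo in poly), default=0)

def is_spld(s_parts, l):
    # each s_j may only involve its own variable x_j
    for j, s_j in enumerate(s_parts):
        for expo in s_j:
            if any(e != 0 for i, e in enumerate(expo) if i != j):
                return False
    deg_s = max(total_degree(s_j) for s_j in s_parts)
    return total_degree(l) < deg_s

# n = 2: s(x) = x1^6 + x2^8, l(x) = x1^2*x2^3 - x1*x2 (degree 5 < 8)
s_parts = [{(6, 0): 1.0}, {(0, 8): 1.0}]
l = {(2, 3): 1.0, (1, 1): -1.0}
print(is_spld(s_parts, l))  # True
```

The check rejects a decomposition either when some s_j touches a foreign variable or when deg l is not strictly below deg s.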

The authors consider polynomial optimization problems (POP) of the form
 min f₀(x) subject to 0 ≤ f_i(x) ≤ 1, i=1,…,m,
where each f_i is an SPLD polynomial. Classical Lasserre hierarchies become computationally prohibitive for such problems because the size of the semidefinite matrices grows combinatorially with both the number of variables and the relaxation degree. Bounded‑degree SOS (BSOS) hierarchies mitigate this growth by fixing the degree of the SOS multipliers, but when the separable part s(x) has a very high degree, the associated Gram matrices can still be huge.
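The growth referred to here is the binomial count s(n, d) = C(n + d, d) of monomials of degree at most d in n variables, which sets the side length of the order-d moment and Gram matrices. A few values illustrate why full hierarchies blow up (the helper name `s` is ours):

```python
# s(n, d) = C(n + d, d): number of monomials of degree <= d in n
# variables, i.e. the side length of the order-d moment/Gram matrix.
from math import comb

def s(n, d):
    return comb(n + d, d)

for n in (5, 10, 20):
    for d in (2, 4, 6):
        print(f"n={n:2d} d={d}: {s(n, d)}")
# e.g. s(10, 4) = 1001 and s(20, 6) = 230230, so even modest relaxation
# orders already force SDP blocks with hundreds of thousands of rows.
```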

To address this, the paper proposes a tailored BSOS hierarchy, called BSOS‑SPLD. The key idea is to exploit the decomposition f_i = s_i + l_i and to bound the SOS multipliers separately for the separable and lower‑degree components. Specifically, for a fixed relaxation order k, the authors introduce the redundant non‑negative constraints
 h_{p,q}(x) = ∏_{i=1}^m f_i(x)^{p_i} (1 − f_i(x))^{q_i} ≥ 0,  |p| + |q| ≤ k,
and then formulate a dual SOS problem in which a global SOS polynomial σ of degree at most r (the maximal degree of the lower‑degree parts) and univariate SOS polynomials σ_j of degree at most 2d_j (with d_j the smallest integer satisfying 2d_j ≥ deg s_j) are used to certify positivity:
 f₀ − ∑_{p,q} c_{p,q} h_{p,q} − μ = σ + ∑_{j=1}^n σ_j(x_j).
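Enumerating the index set {(p, q) : |p| + |q| ≤ k} is straightforward; a brute-force sketch, adequate for the small m and k used in bounded-degree hierarchies (the helper name is ours):

```python
# Enumerate the multi-index pairs (p, q) with |p| + |q| <= k that label
# the products h_{p,q} = prod_i f_i^{p_i} (1 - f_i)^{q_i}.
from itertools import product

def index_pairs(m, k):
    pairs = []
    for pq in product(range(k + 1), repeat=2 * m):
        p, q = pq[:m], pq[m:]
        if sum(p) + sum(q) <= k:
            pairs.append((p, q))
    return pairs

# m = 2 constraints, order k = 1: the all-zero pair plus one unit entry
# in each of the 4 slots -> 5 pairs
print(len(index_pairs(2, 1)))  # 5
```

In general there are C(2m + k, k) such pairs, so the number of scalar coefficients c_{p,q} stays polynomial in m for fixed k.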

Because the σ_j are univariate, their Gram matrices have size s(1,d_j) = d_j + 1 rather than s(n,d_j) = C(n+d_j, d_j), dramatically reducing the overall SDP dimension. The authors prove that, under the usual Archimedean (compactness) assumption, the primal and dual values of BSOS‑SPLD converge monotonically to the global optimum of the original POP. Moreover, they provide sufficient conditions, such as SOS‑convexity of the objective or the existence of an exact Putinar‑type representation, for finite convergence and for extracting a global minimizer from the moment matrix.
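A quick size comparison shows where the savings come from, assuming s(n, d) denotes C(n + d, d), the number of monomials of degree at most d in n variables (the helper names are ours):

```python
# Gram-matrix side lengths for the SOS multipliers: a univariate sigma_j
# of half-degree d_j needs a (d_j + 1) x (d_j + 1) Gram matrix, versus
# C(n + d_j, d_j) x C(n + d_j, d_j) for a full n-variate SOS polynomial.
from math import comb

def univariate_gram(d):
    return d + 1           # monomial basis 1, x, ..., x^d

def multivariate_gram(n, d):
    return comb(n + d, d)  # all monomials of degree <= d in n variables

n, d = 10, 5               # e.g. deg s_j = 10, so d_j = 5
print(univariate_gram(d), multivariate_gram(n, d))  # 6 vs 3003
```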

Implementation details are discussed: polynomial equalities in the SDP are enforced by matching coefficients rather than by sampling, which is feasible because the number of variables in each relaxation remains modest even when the original polynomial degree is large. The paper also analyzes the numerical stability of coefficient matching versus sampling, noting that coefficient matching avoids the ill‑conditioning caused by binomial expansions of (1 − f_i)^{q_i}.
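A toy illustration of the coefficient-matching idea (our own sketch, not the paper's implementation): with polynomials stored as dicts from exponent tuples to coefficients, enforcing the identity p ≡ q reduces to one linear equation per monomial.

```python
# Coefficient matching: a polynomial identity p(x) = q(x) holds iff the
# coefficient of every monomial agrees, giving one linear constraint
# (residual must be zero) per exponent tuple.

def matching_constraints(p, q):
    constraints = []
    for expo in set(p) | set(q):
        residual = p.get(expo, 0.0) - q.get(expo, 0.0)
        constraints.append((expo, residual))  # each residual must be 0
    return constraints

# (1 - x)^2 expanded, compared against 1 - 2x + x^2: all residuals vanish
p = {(0,): 1.0, (1,): -2.0, (2,): 1.0}
q = {(0,): 1.0, (1,): -2.0, (2,): 1.0}
print(all(r == 0 for _, r in matching_constraints(p, q)))  # True
```

In the actual SDP the q-side coefficients are affine in the decision variables, so each residual becomes a linear equality constraint rather than a fixed number.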

Extensive numerical experiments are presented. Test cases include high‑degree benchmark functions (e.g., the Bézier‑type function, six‑hump camel‑back), a portfolio risk‑minimization model where the objective is a sum of high‑degree separable risk terms plus a low‑degree covariance term, and a convex polynomial regression problem. Across all instances, BSOS‑SPLD consistently requires fewer SDP variables (a 30–70% reduction) and solves faster than the standard BSOS hierarchy, while achieving the same or tighter lower bounds. In the portfolio example, the method exploits the fact that the interaction term is low‑degree, leading to a particularly compact SDP.

The paper further studies a subclass of convex SPLD problems. By leveraging the notion of SOS‑convexity (the Hessian being an SOS matrix), the authors construct an exact SOS relaxation that does not rely on any degree bound—essentially a one‑step global certificate. This exact relaxation is then applied to convex polynomial regression, where the regressor is forced to be SOS‑convex and SPLD. Experiments on synthetic and real data show improved prediction accuracy compared with ordinary least‑squares polynomial fitting, while maintaining tractable SDP sizes suitable for moderate‑scale data.
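Certifying SOS-convexity requires an SDP solver, but a cheap numerical sanity check, sampling the Hessian on a grid and testing positive semidefiniteness (a necessary condition for convexity), can screen candidate regressors. The polynomial below is our own example, not one from the paper: f(x1, x2) = x1^4 + x2^4 + (x1 + x2)^2, a separable quartic plus a quadratic interaction term.

```python
# Necessary-condition check for convexity of the SPLD polynomial
# f(x1, x2) = x1^4 + x2^4 + (x1 + x2)^2: its Hessian must be positive
# semidefinite at every sampled point.

def hessian(x1, x2):
    return [[12 * x1**2 + 2, 2.0],
            [2.0, 12 * x2**2 + 2]]

def is_psd_2x2(h):
    # a symmetric 2x2 matrix is PSD iff both diagonal entries and the
    # determinant are nonnegative (small tolerance for rounding)
    det = h[0][0] * h[1][1] - h[0][1] * h[1][0]
    return h[0][0] >= 0 and h[1][1] >= 0 and det >= -1e-9

grid = [i / 2 for i in range(-6, 7)]
print(all(is_psd_2x2(hessian(a, b)) for a in grid for b in grid))  # True
```

A grid check like this can only refute convexity; a global certificate requires the SOS-convexity SDP described in the paper.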

In conclusion, the work demonstrates that recognizing and exploiting the SPLD structure enables the design of bounded‑degree SOS hierarchies that are both theoretically sound and practically efficient. The proposed BSOS‑SPLD hierarchy bridges the gap between the expressive power of high‑degree separable models and the computational limits of existing SOS methods. The authors suggest future research directions, including integration with term‑sparsity or correlative‑sparsity techniques, extensions to non‑convex SPLD problems, and development of specialized solvers that can further capitalize on the separable‑plus‑low‑degree decomposition.

