Efficient Minimization of Decomposable Submodular Functions


Many combinatorial problems arising in machine learning can be reduced to the problem of minimizing a submodular function. Submodular functions are a natural discrete analog of convex functions, and can be minimized in strongly polynomial time. Unfortunately, state-of-the-art algorithms for general submodular minimization are intractable for larger problems. In this paper, we introduce a novel subclass of submodular minimization problems that we call decomposable. Decomposable submodular functions are those that can be represented as sums of concave functions applied to modular functions. We develop an algorithm, SLG, that can efficiently minimize decomposable submodular functions with tens of thousands of variables. Our algorithm exploits recent results in smoothed convex minimization. We apply SLG to synthetic benchmarks and a joint classification-and-segmentation task, and show that it outperforms the state-of-the-art general purpose submodular minimization algorithms by several orders of magnitude.


💡 Research Summary

The paper addresses a fundamental bottleneck in applying submodular function minimization to large‑scale machine‑learning problems. While submodular functions are the discrete analogue of convex functions and admit strongly polynomial‑time algorithms, the best known general‑purpose methods (e.g., Iwata–Fleischer–Fujishige, Queyranne, Min‑Norm‑Point) become impractical when the number of variables reaches tens of thousands due to high per‑iteration costs and large memory footprints. To overcome this, the authors introduce a new subclass called decomposable submodular functions. A function f:2^V→ℝ belongs to this class if it can be written as

f(S) = Σ_{j=1..n} φⱼ(wⱼ(S)),

where each φⱼ: ℝ→ℝ is concave and each wⱼ(S) = Σ_{s∈S} wⱼ(s) is a modular (additive) function with nonnegative weights.

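The decomposable form can be made concrete with a small numerical sketch: evaluate f(S) as a sum of concave functions of modular weights, and verify the diminishing-returns property by brute force. The ground set, weight vectors, and the choice φ(x) = √x below are illustrative, not taken from the paper.

```python
import itertools
import math

# Illustrative decomposable submodular function:
# f(S) = sum_j phi_j(w_j(S)), phi_j concave, w_j modular with nonnegative weights.
V = [0, 1, 2, 3]                       # ground set (hypothetical)
weights = [                            # one nonnegative weight vector per term j
    {0: 1.0, 1: 2.0, 2: 0.5, 3: 1.5},
    {0: 0.3, 1: 0.0, 2: 2.0, 3: 1.0},
]
phis = [math.sqrt, math.sqrt]          # concave functions, one per term

def f(S):
    """Evaluate f(S) = sum_j phi_j(sum_{s in S} w_j(s))."""
    return sum(phi(sum(w[s] for s in S)) for phi, w in zip(phis, weights))

def is_submodular():
    """Brute-force check of diminishing returns:
    f(A ∪ {e}) - f(A) >= f(B ∪ {e}) - f(B) for all A ⊆ B and e ∉ B."""
    subsets = [set(c) for r in range(len(V) + 1)
               for c in itertools.combinations(V, r)]
    for A in subsets:
        for B in subsets:
            if not A <= B:
                continue
            for e in V:
                if e in B:
                    continue
                if f(A | {e}) - f(A) < f(B | {e}) - f(B) - 1e-12:
                    return False
    return True

print(is_submodular())  # prints True: concave-of-modular sums are submodular
```

The check passes for any concave φⱼ and nonnegative weights, which is exactly why this class is submodular; SLG exploits the structure of this sum rather than treating f as a black box.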
