A Proposed Algorithm for Minimum Vertex Cover Problem and its Testing
The paper presents an algorithm for the minimum vertex cover problem, which is NP-Complete. The algorithm computes a minimum vertex cover of each input simple graph. In tests with the attached MATLAB programs, Stage 1 of the algorithm is applicable to, i.e., yields a proved minimum vertex cover for, about 99.99% of the 610,000 tested graphs of order 16 and 99.67% of the 1,200 tested graphs of order 32, and Stage 2 of the algorithm is applicable to all of the above tested graphs. All of the tested graphs are randomly generated with random "edge density," in other words, a random probability for each edge. It is proved that Stage 1 and Stage 2 of the algorithm run in $O(n^{5+\log n})$ and $O(n^{3(5+\log n)/2})$ time respectively, where $n$ is the order of the input graph. Because there is no theoretical proof yet that Stage 2 is applicable to all graphs, further stages of the algorithm are proposed, in a general form consistent with Stages 1 and 2.
💡 Research Summary
The paper tackles the Minimum Vertex Cover (MVC) problem, a classic NP‑Complete task, by proposing a deterministic algorithm organized into two successive stages. Stage 1 is a heuristic‑driven construction that repeatedly selects high‑degree vertices, adds them to a candidate cover set C, and checks a "cover‑verification condition"; when the condition is satisfied, C is guaranteed to be a minimum cover, and the authors claim this guarantee is mathematically proved for exactly the graphs that meet the condition. Stage 2 is invoked only when Stage 1 fails; it performs localized vertex exchanges and exhaustive searches on small induced subgraphs, guided by "vertex‑swap" and "subgraph‑re‑cover" rules. The process iterates until no further improvement is possible, at which point the algorithm declares the current set to be a minimum cover.
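The paper's exact cover‑verification condition is not reproduced here, but the Stage‑1‑style construction (pick a highest‑degree vertex, add it to C, delete the edges it covers, repeat) can be sketched as follows; the function and variable names are illustrative, not the authors' own:

```python
def greedy_cover_sketch(edges):
    """Sketch of a Stage-1-style construction: repeatedly pick a
    highest-degree vertex, add it to the candidate cover C, and delete
    the edges it covers. The paper's cover-verification condition is
    not modeled; this only builds a (not necessarily minimum) cover."""
    remaining = set(edges)
    C = set()
    while remaining:
        # degree of each vertex with respect to the still-uncovered edges
        deg = {}
        for u, v in remaining:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        best = max(deg, key=deg.get)          # a highest-degree vertex
        C.add(best)
        remaining = {e for e in remaining if best not in e}
    return C

# Usage: a 5-cycle with one chord
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]
C = greedy_cover_sketch(edges)
```

Note that plain degree‑greedy selection alone is known to be suboptimal on some graphs, which is presumably why the paper pairs it with a verification condition and a fallback Stage 2.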
Complexity analysis is presented as follows: Stage 1 runs in O(n^{5+log n}) time, while Stage 2 runs in O(n^{3(5+log n)/2}) time, where log denotes the base‑2 logarithm. For n = 32 the exponents are exactly 10 and 15, i.e., the bounds become n^{10} and n^{15}—orders of magnitude beyond practical limits for even modest‑size graphs. Consequently, the algorithm is not polynomial in the conventional sense; bounds of the form n^{O(log n)} are super‑polynomial (quasi‑polynomial), and the method would likely become infeasible well before n = 100.
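The exponents themselves grow with n, which is what makes the bounds quasi‑polynomial rather than polynomial. A quick sanity check of the claimed figures:

```python
import math

def stage1_exponent(n):
    # exponent in the paper's Stage 1 bound, n^(5 + log2 n)
    return 5 + math.log2(n)

def stage2_exponent(n):
    # exponent in the paper's Stage 2 bound, n^(3 * (5 + log2 n) / 2)
    return 3 * (5 + math.log2(n)) / 2

for n in (16, 32, 100):
    print(n, stage1_exponent(n), stage2_exponent(n))
# At n = 32 (log2 32 = 5) the exponents are 10 and 15,
# matching the n^10 and n^15 figures quoted above.
```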
Empirical evaluation is carried out using MATLAB implementations on two randomly generated graph families. The first set consists of 610,000 graphs with 16 vertices each, generated with uniformly random edge probabilities; the second set contains 1,200 graphs with 32 vertices each, also with random densities. Stage 1 alone succeeded in producing a provably minimum cover for 99.99% of the 16‑vertex graphs and 99.67% of the 32‑vertex graphs. For the remaining instances, Stage 2 was applied and reportedly succeeded on every single test case, yielding a 100% success rate across both families. Execution times, memory footprints, and detailed statistics are listed in the appendix.
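The paper describes the generator only as having a "random edge density"; a plausible reading (an assumption, not the paper's stated procedure) is the Erdős–Rényi G(n, p) model with p itself drawn uniformly at random per graph:

```python
import random

def random_graph(n, rng=random.random):
    """Generate an n-vertex simple graph in the spirit of the paper's
    test suite: draw an edge density p uniformly at random, then include
    each of the n(n-1)/2 possible edges independently with probability p.
    (The paper's exact generator is an assumption here.)"""
    p = rng()
    return [(u, v) for u in range(n) for v in range(u + 1, n) if rng() < p]

g16 = random_graph(16)   # one instance from the order-16 family
```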
Despite these promising numbers, the paper acknowledges several critical gaps. First, there is no formal proof that Stage 2 (or any subsequent stage) will always terminate with a minimum cover for arbitrary graphs; the claim rests solely on experimental observation. Second, the test suite is limited to random graphs; worst‑case structures such as dense bipartite graphs, cliques, or specially crafted sparse graphs are absent, leaving the algorithm's behavior on adversarial inputs unknown. Third, the authors do not compare their method against state‑of‑the‑art approaches for MVC, such as fixed‑parameter tractable algorithms with O(1.2738^k·poly(n)) runtime, integer‑programming formulations, or the classic maximal‑matching 2‑approximation. Without such benchmarks, it is impossible to assess whether the proposed method offers any practical advantage.
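For reference, the classic 2‑approximation mentioned above is simple enough to state in a few lines: greedily build a maximal matching and take both endpoints of every matched edge, which yields a cover at most twice the minimum size:

```python
def two_approx_vertex_cover(edges):
    """Classic maximal-matching 2-approximation for vertex cover:
    scan the edges, and whenever an edge has both endpoints uncovered,
    add both endpoints. Every cover must pick at least one endpoint of
    each matched edge, so the result is at most 2x the minimum size."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))      # edge still uncovered: take both ends
    return cover

# Usage: a 5-cycle with one chord (minimum cover size is 3)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]
C = two_approx_vertex_cover(edges)
```

This runs in linear time in the number of edges, which is the baseline any super‑polynomial exact method would need to justify itself against.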
The paper also sketches “additional stages” that would extend the same design philosophy to handle cases where Stage 2 fails, but these extensions are described only in abstract terms and lack both complexity analysis and correctness arguments.
In summary, the work introduces an interesting two‑stage framework that empirically attains near‑perfect success on large samples of random graphs. However, the absence of rigorous correctness proofs, the super‑polynomial time bounds, the narrow experimental scope, and the lack of comparative evaluation significantly limit the contribution. Future research should aim to (1) provide formal guarantees for each stage, (2) conduct worst‑case analyses on diverse graph families, (3) benchmark against established exact and approximation algorithms, and (4) explore algorithmic refinements that reduce the exponent in the runtime bounds. Only with these developments can the proposed approach be considered a substantive advance in solving the Minimum Vertex Cover problem.