Composite likelihood estimation of sparse Gaussian graphical models with symmetry
In this article, we discuss the composite likelihood estimation of sparse Gaussian graphical models. When there are symmetry constraints on the concentration matrix or partial correlation matrix, likelihood estimation can be computationally intensive. The composite likelihood offers an alternative formulation of the objective function and yields consistent estimators. When a sparse model is considered, penalized composite likelihood estimation yields estimates that satisfy both the symmetry and sparsity constraints and possess the oracle property. Application of the proposed method is demonstrated through simulation studies and a network analysis of a biological data set.
💡 Research Summary
This paper addresses the problem of estimating sparse Gaussian graphical models (GGMs) when symmetry constraints are imposed on the concentration (precision) matrix or the partial correlation matrix. Traditional maximum‑likelihood estimation becomes computationally prohibitive under such constraints because the full log‑likelihood must be optimized over a high‑dimensional, constrained parameter space. The authors propose to replace the full likelihood with a composite likelihood, which is constructed by summing the conditional log‑likelihoods of each variable given the others. In the Gaussian case each conditional distribution is again normal, and its mean and variance are simple functions of the precision matrix entries. Consequently, the composite likelihood decomposes into independent sub‑problems, dramatically reducing computational cost and allowing straightforward parallelisation.
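The row-wise structure described above can be checked numerically. The sketch below (illustrative, not the paper's code) compares the conditional moments of one Gaussian coordinate given the rest, computed cheaply from the precision matrix, against the standard covariance-based (Schur-complement) formula:

```python
import numpy as np

# For X ~ N(0, K^{-1}), the conditional X_j | X_{-j} is normal with
# mean -(1/K_jj) * K[j, -j] @ x[-j] and variance 1/K_jj, so each
# conditional log-likelihood involves only row j of the precision matrix.

rng = np.random.default_rng(0)
p = 4
A = rng.standard_normal((p, p))
K = A @ A.T + p * np.eye(p)      # an arbitrary positive-definite precision matrix
Sigma = np.linalg.inv(K)         # the corresponding covariance matrix

x = rng.standard_normal(p)       # an arbitrary observation
j = 1
idx = [k for k in range(p) if k != j]

# Conditional moments read off the precision matrix (row-wise, cheap)
mean_prec = -K[j, idx] @ x[idx] / K[j, j]
var_prec = 1.0 / K[j, j]

# The same moments via the covariance-based conditional-normal formula
S11 = Sigma[np.ix_(idx, idx)]
mean_cov = Sigma[j, idx] @ np.linalg.solve(S11, x[idx])
var_cov = Sigma[j, j] - Sigma[j, idx] @ np.linalg.solve(S11, Sigma[idx, j])

# The two routes agree up to numerical tolerance
print(np.allclose(mean_prec, mean_cov), np.allclose(var_prec, var_cov))
```

This identity is what makes the conditional sub-problems depend on the precision matrix only through single rows, so they can be handled separately or in parallel.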
To enforce sparsity, an ℓ₁ penalty is added to the composite likelihood, yielding a penalized composite likelihood (PCL) objective:

$$\hat{\Theta} = \arg\min_{\Theta}\; -\sum_{j=1}^{p} \ell_j(\Theta) \;+\; \lambda \sum_{j \neq k} |\theta_{jk}|,$$

where $\ell_j(\Theta)$ is the conditional log-likelihood of the $j$-th variable given the remaining ones, $\theta_{jk}$ is an off-diagonal entry of the precision matrix $\Theta$, and $\lambda \geq 0$ controls the sparsity level.
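A minimal sketch of evaluating such a penalized composite likelihood objective is given below; the function name, data layout, and penalty weight are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def neg_pcl(K, X, lam):
    """Negative penalized composite log-likelihood (up to constants).

    K   : (p, p) candidate precision matrix
    X   : (n, p) data matrix, rows are observations
    lam : l1 penalty weight on the off-diagonal entries of K
    """
    n, p = X.shape
    loglik = 0.0
    for j in range(p):
        idx = [k for k in range(p) if k != j]
        var_j = 1.0 / K[j, j]                         # conditional variance
        mean_j = -(X[:, idx] @ K[idx, j]) / K[j, j]   # conditional means, shape (n,)
        resid = X[:, j] - mean_j
        # Gaussian conditional log-likelihood of column j given the rest
        loglik += -0.5 * np.sum(np.log(2 * np.pi * var_j) + resid**2 / var_j)
    # l1 penalty on off-diagonal entries only
    penalty = lam * np.sum(np.abs(K - np.diag(np.diag(K))))
    return -loglik / n + penalty
```

In practice the objective would be minimized over symmetric positive-definite $K$ under the symmetry constraints, e.g. by coordinate-wise or proximal-gradient updates; the function above only evaluates the criterion at a given candidate.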