Latent Multi-group Membership Graph Model


We develop the Latent Multi-group Membership Graph (LMMG) model, a model of networks with rich node feature structure. In the LMMG model, each node belongs to multiple groups and each latent group models the occurrence of links as well as the node feature structure. The LMMG can be used to summarize the network structure, to predict links between the nodes, and to predict missing features of a node. We derive efficient inference and learning algorithms and evaluate the predictive performance of the LMMG on several social and document network datasets.


💡 Research Summary

The paper introduces the Latent Multi‑group Membership Graph (LMMG) model, a probabilistic framework that simultaneously captures network topology and rich node feature information. Unlike traditional mixed‑membership stochastic block models that constrain each node to a single latent distribution over groups, LMMG allows a node to belong to multiple groups independently. For each of K latent groups, a node i draws a Bernoulli indicator z_ik from a Beta‑distributed membership probability φ_ik. This formulation enables true multi‑group membership, reflecting real‑world scenarios where users have several interests or documents belong to multiple topics.
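A minimal sketch of this generative step for the memberships, using NumPy. The sizes N, K and the Beta hyperparameters alpha, beta below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N, K = 5, 3             # number of nodes and latent groups (illustrative)
alpha, beta = 1.0, 1.0  # Beta hyperparameters (assumed values)

# Each node i has a per-group membership probability phi_ik ~ Beta(alpha, beta)
phi = rng.beta(alpha, beta, size=(N, K))

# Membership indicators z_ik ~ Bernoulli(phi_ik); because each group is
# sampled independently, a node may belong to several groups at once
z = rng.binomial(1, phi)
```

Note that `z` is a binary N×K matrix rather than a single group assignment per node, which is exactly what distinguishes multi-group membership from a mixed-membership distribution over groups.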

Node attributes are assumed binary and are modeled with (K+1)-dimensional logistic regressions. The group‑membership probabilities φ_i serve as input features to each logistic model, producing a probability y_il = σ(w_l^T φ_i) for the presence of attribute l on node i. The weight vector w_l encodes how strongly each latent group contributes to a particular attribute, and an L1 penalty encourages sparsity, ensuring that only a few groups are relevant for each attribute.
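The attribute model above can be sketched as follows. Here the (K+1)-th input dimension is assumed to be a constant intercept term (a common reading of the "K-plus-one-dimensional" regression; the sizes are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
N, K, L = 5, 3, 4  # nodes, latent groups, binary attributes (illustrative)

phi = rng.beta(1.0, 1.0, size=(N, K))  # group-membership probabilities
W = rng.normal(size=(L, K + 1))        # one (K+1)-dim weight vector per attribute

# Augment phi with a constant 1, assuming the extra dimension is an intercept
phi_aug = np.hstack([phi, np.ones((N, 1))])

# Y[i, l] = sigma(w_l . phi_i): probability that attribute l is present on node i
Y = sigmoid(phi_aug @ W.T)
```

In learning, an L1 penalty on each w_l would zero out the weights of groups that are irrelevant to that attribute, giving the sparsity described above.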

Link formation is governed by a set of K link‑affinity matrices Θ_k, one per latent group. Each 2×2 matrix Θ_k gives the affinity between a pair of nodes as a function of their binary membership indicators in group k, and the probability of a link between nodes i and j is the product of the corresponding entries, p_ij = ∏_k Θ_k[z_ik, z_jk]. Different choices of Θ_k let a group induce homophily (links within the group) or heterophily (links across the group boundary).
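The multiplicative link model can be sketched directly (memberships and affinity entries below are randomly generated for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 5, 3  # nodes and latent groups (illustrative)

# Binary group memberships z_ik (here sampled uniformly for illustration)
z = rng.binomial(1, 0.5, size=(N, K))

# One 2x2 link-affinity matrix per group, with entries in (0, 1)
Theta = rng.uniform(0.1, 0.9, size=(K, 2, 2))

def link_prob(i, j):
    """Probability of a link between nodes i and j: the product over
    groups of the affinity entry selected by their membership indicators."""
    p = 1.0
    for k in range(K):
        p *= Theta[k, z[i, k], z[j, k]]
    return p

p01 = link_prob(0, 1)
```

Because each group contributes one factor, a single group with a strongly diagonal Θ_k (large entries at [1,1] and [0,0]) pushes the model toward homophily in that group, while an off-diagonal Θ_k encodes heterophily.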

