A Statistical Nonparametric Approach of Face Recognition: Combination of Eigenface & Modified k-Means Clustering
Facial expressions convey non-verbal cues that play an important role in interpersonal relations. Automatic recognition of the human face under varying facial expression can be an important component of a natural human-machine interface, and it may also be used in behavioural science. Although humans can recognize faces practically without effort, reliable face recognition by machine remains a challenge. This paper presents a new approach for recognizing the face of a person given expressions of the same face at different instances of time. The methodology combines the Eigenface method for feature extraction with a modified k-means clustering algorithm for identification of the human face, enabling face recognition without the conventional distance-measure classifiers. Simulation results show that the proposed face recognition based on k-means clustering is useful for face images with different facial expressions.
💡 Research Summary
The paper addresses a fundamental challenge in automated face recognition: the variability introduced by facial expressions. While humans can effortlessly recognize a familiar face across a wide range of expressions, conventional machine‑based systems often rely on distance‑based classifiers (e.g., Euclidean, Mahalanobis, cosine similarity) that are highly sensitive to intra‑class variations caused by changing expressions. To mitigate this problem, the authors propose a hybrid framework that couples a classic statistical feature extractor—Eigenfaces (Principal Component Analysis, PCA)—with a modified k‑means clustering algorithm that serves directly as a non‑parametric classifier, thereby eliminating the need for an explicit distance‑measure decision rule.
Feature Extraction (Eigenfaces).
All training images are first normalized (size, illumination, alignment) and then stacked into a data matrix. PCA is performed on the covariance matrix of this set, yielding a set of eigenvectors ordered by descending eigenvalues. The top‑ranked eigenvectors (typically 30–50) form the “eigenface” basis. Each face image is projected onto this basis, producing a low‑dimensional feature vector that captures the most significant variance of the face space while discarding noise and redundant information. This linear dimensionality reduction is computationally inexpensive and provides a compact representation that is still discriminative for identity.
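The eigenface pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the SVD shortcut (which avoids forming the full covariance matrix), and the toy data are assumptions for demonstration.

```python
import numpy as np

def eigenface_basis(images, n_components=40):
    """Compute an eigenface basis from a stack of flattened face images.

    images: (N, D) array, one flattened grayscale face per row.
    Returns the mean face and the top n_components eigenfaces (as rows).
    """
    mean_face = images.mean(axis=0)
    centered = images - mean_face
    # SVD of the centered data matrix yields the PCA eigenvectors
    # (rows of vt), ordered by descending singular value, without
    # ever forming the D x D covariance matrix explicitly.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_face, vt[:n_components]

def project(images, mean_face, eigenfaces):
    """Project faces onto the eigenface basis -> low-dimensional features."""
    return (images - mean_face) @ eigenfaces.T

# Toy example: 10 random "faces" of 32x32 pixels stand in for
# normalized training images.
rng = np.random.default_rng(0)
faces = rng.random((10, 32 * 32))
mean_face, eigenfaces = eigenface_basis(faces, n_components=5)
features = project(faces, mean_face, eigenfaces)
print(features.shape)  # (10, 5): one compact feature vector per face
```

Each row of `features` is the low-dimensional representation that the clustering stage operates on; in practice the number of retained components (30–50 in the paper's range) trades reconstruction fidelity against noise suppression.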
Modified k‑means for Classification.
Standard k‑means clusters data by minimizing the sum of squared Euclidean distances to cluster centroids. In a face‑recognition context, this approach would treat each identity as a cluster, but the presence of multiple expressions per person inflates intra‑cluster variance, often causing overlap between clusters of different identities. The authors therefore introduce three key modifications:
- Class‑wise Initialization: For each subject, all available expression images are used to compute an initial centroid equal to the mean of that subject’s projected vectors. This ensures that each identity starts with a representative prototype rather than a random point.
- Weighted Assignment Function: Instead of a hard nearest‑centroid rule, the algorithm computes a soft assignment probability for a sample \(x_i\) to cluster \(C_j\) using a Gaussian‑like weighting of the form \(p(C_j \mid x_i) = \exp(-\beta \lVert x_i - \mu_j \rVert^2) \big/ \sum_k \exp(-\beta \lVert x_i - \mu_k \rVert^2)\), where \(\mu_j\) is the centroid of \(C_j\) and \(\beta\) controls how sharply the assignment concentrates on the nearest centroid.
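The class-wise initialization and Gaussian-like soft assignment described above can be sketched as a soft k-means variant. This is an illustrative reconstruction, not the paper's exact algorithm: the sharpness parameter `beta`, the number of iterations, and the toy feature data are assumptions.

```python
import numpy as np

def init_centroids(features, labels):
    """Class-wise initialization: one centroid per subject, equal to the
    mean of that subject's projected feature vectors."""
    classes = np.unique(labels)
    return np.stack([features[labels == c].mean(axis=0) for c in classes])

def soft_assign(features, centroids, beta=1.0):
    """Gaussian-like soft assignment: p_ij proportional to
    exp(-beta * ||x_i - mu_j||^2), normalized over clusters j."""
    d2 = ((features[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-beta * d2)
    return w / w.sum(axis=1, keepdims=True)

def modified_kmeans(features, labels, beta=1.0, n_iter=10):
    """Cluster eigenface features with class-wise init + soft assignment."""
    centroids = init_centroids(features, labels)
    for _ in range(n_iter):
        p = soft_assign(features, centroids, beta)
        # Soft centroid update: each centroid is the probability-weighted
        # mean of all samples, so every sample contributes to every cluster.
        centroids = (p.T @ features) / p.sum(axis=0)[:, None]
    return centroids, soft_assign(features, centroids, beta)

# Toy example: two subjects with three "expressions" each, 4-D features.
rng = np.random.default_rng(1)
feats = np.concatenate([rng.normal(0, 0.1, (3, 4)),
                        rng.normal(3, 0.1, (3, 4))])
labels = np.array([0, 0, 0, 1, 1, 1])
centroids, probs = modified_kmeans(feats, labels, beta=5.0)
print(probs.argmax(axis=1))  # each sample assigned to its subject's cluster
```

At recognition time, a probe face would be projected onto the eigenface basis and assigned to the identity whose cluster receives the highest soft-assignment probability, which is what removes the need for a separate distance-measure classifier.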