Title: Semi-Supervised Learning for Gaussian Mixture Models
Speaker: Jyostna Devi Bodapati (IITM)
Details: Wed, 2 Dec, 2015 2:00 PM @ BSB 361
Abstract: Semi-supervised learning for pattern classification tasks using Gaussian mixture models (GMMs) involves estimating the parameters of the GMMs of multiple classes using a small amount of labeled data for each class and a large amount of unlabeled data. The Expectation-Maximization (EM) method used in supervised learning of GMMs estimates the parameters of each class's GMM so that the likelihood of the labeled data of that class is maximized. The EM method for semi-supervised learning estimates the parameters of the GMMs of all the classes so that an objective function comprising the likelihood of both the labeled data and the unlabeled data is maximized. In our work, we extend the EM method for semi-supervised learning of GMMs from static pattern classification to varying-length pattern classification. The varying-length pattern of an example corresponds to the representation of that example as a set of local feature vectors. As the number of unlabeled examples is significantly higher than the number of labeled examples, a weighted-likelihood-based objective function is considered for semi-supervised learning of GMMs. We present our studies on datasets used in speech and image processing tasks.
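To illustrate the general idea of a weighted-likelihood objective for semi-supervised EM, the following is a minimal sketch, not the speaker's exact formulation: it uses a single Gaussian component per class, hard (one-hot) responsibilities for the labeled examples, soft responsibilities for the unlabeled examples, and a hypothetical weight `lam` that down-weights the unlabeled-data term so the large pool of unlabeled examples does not dominate the small labeled set.

```python
import numpy as np

def semi_supervised_em(X_lab, y_lab, X_unl, n_classes, lam=0.2, n_iter=50):
    """Semi-supervised EM sketch: one Gaussian per class (diagonal
    covariance), with the unlabeled-data contribution scaled by `lam`.
    This is an illustrative simplification, not the method from the talk."""
    # Initialize means/variances from the labeled data of each class
    mu = np.stack([X_lab[y_lab == c].mean(0) for c in range(n_classes)])
    var = np.stack([X_lab[y_lab == c].var(0) + 1e-6 for c in range(n_classes)])
    prior = np.full(n_classes, 1.0 / n_classes)

    def log_gauss(X, m, v):
        # Diagonal-covariance Gaussian log-density per example
        return -0.5 * (np.log(2 * np.pi * v) + (X - m) ** 2 / v).sum(1)

    for _ in range(n_iter):
        # E-step on unlabeled data: soft class responsibilities
        ll = np.stack([np.log(prior[c]) + log_gauss(X_unl, mu[c], var[c])
                       for c in range(n_classes)], axis=1)
        ll -= ll.max(1, keepdims=True)          # numerical stability
        resp = np.exp(ll)
        resp /= resp.sum(1, keepdims=True)
        # M-step: labeled examples count fully; unlabeled ones
        # contribute their responsibility scaled by lam
        for c in range(n_classes):
            Xc = X_lab[y_lab == c]
            w = lam * resp[:, c]
            n_eff = len(Xc) + w.sum()
            mu[c] = (Xc.sum(0) + (w[:, None] * X_unl).sum(0)) / n_eff
            var[c] = (((Xc - mu[c]) ** 2).sum(0)
                      + (w[:, None] * (X_unl - mu[c]) ** 2).sum(0)) / n_eff + 1e-6
            prior[c] = n_eff
        prior /= prior.sum()
    return mu, var, prior
```

Setting `lam = 1` recovers an objective that treats labeled and unlabeled likelihoods equally; smaller values keep the estimates anchored to the labeled data.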