Monday, March 3, 2008

Throwing in more Gaussians

The variability in the appearance of a single part across different training images suggests that a single Gaussian may not be sufficient to capture the underlying data. I decided to try out a mixture of Gaussians for each part (with diagonal covariances). The Netlab software for Matlab turned out to be very useful here, as it has built-in routines for learning and using Gaussian mixture models (the gmm, gmminit, gmmem and gmmprob functions were a big help).

Here are the resulting log probabilities when using 2 mixture components for each part's appearance. In this case, the default EM initialization is used (uniform priors, random means and identity covariances).
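
For reference, here is roughly what the fitting code looks like with Netlab. This is only a sketch, assuming a part's appearance vectors are stacked in an N-by-D matrix X; the variable names and option values are illustrative rather than copied from the actual experiments.

    ncentres = 2;                      % mixture components per part
    dim = size(X, 2);                  % appearance vector dimensionality

    mix = gmm(dim, ncentres, 'diag');  % constructor defaults: uniform priors,
                                       % random means, identity covariances
    options = zeros(1, 18);            % Netlab options vector
    options(14) = 100;                 % cap on EM iterations (illustrative)
    mix = gmmem(mix, X, options);      % fit the mixture by EM

    logp = log(gmmprob(mix, X));       % per-example log probability under the mixture
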
Next, EM was initialized using the gmminit function, which sets the centres and priors using k-means on the data. The covariance matrices are then calculated as the sample covariance of the points closest to the corresponding centres.
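
The k-means initialized variant is essentially the same, with a call to gmminit inserted before EM (again just a sketch, reusing the X, dim, ncentres and options above):

    mix = gmm(dim, ncentres, 'diag');
    kmopts = zeros(1, 18);
    kmopts(14) = 10;                   % a few k-means passes for the init (illustrative)
    mix = gmminit(mix, X, kmopts);     % centres/priors from k-means, covariances
                                       % from the points nearest each centre
    mix = gmmem(mix, X, options);      % run EM from the k-means starting point

    logp = log(gmmprob(mix, X));
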
