Hi, I am working on Gaussian mixture models fitted with EM and I am stuck on a problem. I use a uniform prior over the mixture components, a Gaussian prior on each component mean, and an inverse-gamma prior on each component variance.

If I compute the posterior responsibilities of the components without multiplying by the prior densities of the means and variances, the estimated means are close to the true ones, every component receives some data points, and EM converges very quickly. However, the estimated means come out in the wrong order.

If instead I multiply the responsibilities by the prior densities of the means and variances, the estimated means come out in the correct order, but most of the data points get assigned to only a few components and EM does not converge: the component means oscillate between high and low values in alternating iterations.

To avoid the problem I currently have to use the first case, but is there a way to solve it properly?
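For concreteness, here is a minimal sketch of the setup I am describing, on synthetic 1-D data. All hyperparameters (`mu0`, `tau2`, `a`, `b`), the three true means, and the quantile initialization are just illustrative assumptions, not my real data. The `use_param_priors_in_estep` flag toggles between the two variants: `False` reproduces the first case (responsibilities from the component likelihoods alone), `True` reproduces the second case (responsibilities multiplied by the prior densities of the current means and variances).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data: 3 well-separated components (illustrative example).
true_means = np.array([-4.0, 0.0, 5.0])
x = np.concatenate([rng.normal(m, 1.0, 200) for m in true_means])

K = 3
# Assumed prior hyperparameters: uniform component weights,
# N(mu0, tau2) prior on each mean, inverse-gamma(a, b) on each variance.
mu0, tau2 = 0.0, 25.0
a, b = 2.0, 2.0

def log_normal(x, mu, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def em(x, K, n_iter=100, use_param_priors_in_estep=False):
    mu = np.quantile(x, np.linspace(0.1, 0.9, K))  # deterministic init
    var = np.full(K, np.var(x))
    for _ in range(n_iter):
        # E-step: log responsibilities; uniform weights drop out as a constant.
        log_r = log_normal(x[:, None], mu[None, :], var[None, :])
        if use_param_priors_in_estep:
            # Variant from my question: also multiply each component's
            # responsibility by the prior densities of its mean and variance
            # (added here in log space, constants dropped).
            log_prior = (log_normal(mu, mu0, tau2)
                         - (a + 1) * np.log(var) - b / var)
            log_r = log_r + log_prior[None, :]
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: MAP updates using the Gaussian and inverse-gamma priors.
        nk = r.sum(axis=0) + 1e-12
        mu = (r.T @ x / var + mu0 / tau2) / (nk / var + 1.0 / tau2)
        sq = (r * (x[:, None] - mu[None, :]) ** 2).sum(axis=0)
        var = (b + 0.5 * sq) / (a + 1.0 + 0.5 * nk)
    return np.sort(mu)
```

With `use_param_priors_in_estep=False` this recovers means near the true ones; with `True` it shows the mass-concentration behaviour I described.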