Please use this identifier to cite or link to this item: http://dspace.mediu.edu.my:8181/xmlui/handle/1721.1/7195
Full metadata record
DC Field              Value                                               Language
dc.creator            Jordan, Michael                                     -
dc.creator            Xu, Lei                                             -
dc.date               2004-10-20T20:49:25Z                                -
dc.date               2004-10-20T20:49:25Z                                -
dc.date               1995-04-21                                          -
dc.date.accessioned   2013-10-09T02:48:31Z                                -
dc.date.available     2013-10-09T02:48:31Z                                -
dc.date.issued        2013-10-09                                          -
dc.identifier         AIM-1520                                            -
dc.identifier         CBCL-111                                            -
dc.identifier         http://hdl.handle.net/1721.1/7195                   -
dc.identifier.uri     http://koha.mediu.edu.my:8181/xmlui/handle/1721     -
dc.description"Expectation-Maximization'' (EM) algorithm and gradient-based approaches for maximum likelihood learning of finite Gaussian mixtures. We show that the EM step in parameter space is obtained from the gradient via a projection matrix $P$, and we provide an explicit expression for the matrix. We then analyze the convergence of EM in terms of special properties of $P$ and provide new results analyzing the effect that $P$ has on the likelihood surface. Based on these mathematical results, we present a comparative discussion of the advantages and disadvantages of EM and other algorithms for the learning of Gaussian mixture models.-
dc.format             9 p.                                                -
dc.format             291671 bytes                                        -
dc.format             476864 bytes                                        -
dc.format             application/postscript                              -
dc.format             application/pdf                                     -
dc.language           en_US                                               -
dc.relation           AIM-1520                                            -
dc.relation           CBCL-111                                            -
dc.subject            learning                                            -
dc.subject            neural networks                                     -
dc.subject            EM algorithm                                        -
dc.subject            clustering                                          -
dc.subject            mixture models                                      -
dc.subject            statistics                                          -
dc.title              On Convergence Properties of the EM Algorithm for Gaussian Mixtures   -
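The identity described in the abstract above, that an EM update equals the current parameter value plus the log-likelihood gradient premultiplied by a projection matrix $P$, is easy to check numerically for the component means. Below is a minimal Python sketch, not taken from the memo itself: it assumes a 1-D mixture with known unit variances, in which case the projection matrix for the means reduces to a diagonal matrix with entries $\mathrm{var} / \sum_i h_{ij}$; all names in the code are our own.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: two 1-D Gaussian clusters (60 and 40 points).
X = np.concatenate([rng.normal(-2.0, 1.0, 60), rng.normal(3.0, 1.0, 40)])

# Current guesses for the mixing weights and means; variances fixed at 1.
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = 1.0

# E-step: responsibilities h[i, j] = P(component j | x_i).
dens = np.exp(-((X[:, None] - mu) ** 2) / (2 * var)) / np.sqrt(2 * np.pi * var)
h = w * dens
h /= h.sum(axis=1, keepdims=True)

# M-step for the means: responsibility-weighted sample means.
Nj = h.sum(axis=0)
mu_em = (h * X[:, None]).sum(axis=0) / Nj

# Gradient of the log-likelihood with respect to each mean:
#   dL/dmu_j = (1 / var) * sum_i h[i, j] * (x_i - mu_j)
grad = (h * (X[:, None] - mu)).sum(axis=0) / var

# Projection matrix for the means, diagonal in this setting: P_jj = var / Nj.
P = var / Nj

# The EM step coincides with the projected gradient step mu + P * grad.
print(np.allclose(mu_em, mu + P * grad))   # prints: True

Because each diagonal entry of $P$ is positive here, this step moves the means uphill on the likelihood whenever the gradient is nonzero, which is the kind of property of $P$ that the memo's convergence analysis exploits.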
Appears in Collections: MIT Items

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.