Please use this identifier to cite or link to this item:
http://dspace.mediu.edu.my:8181/xmlui/handle/1721.1/7195
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.creator | Jordan, Michael | - |
dc.creator | Xu, Lei | - |
dc.date | 2004-10-20T20:49:25Z | - |
dc.date | 1995-04-21 | - |
dc.date.accessioned | 2013-10-09T02:48:31Z | - |
dc.date.available | 2013-10-09T02:48:31Z | - |
dc.date.issued | 2013-10-09 | - |
dc.identifier | AIM-1520 | - |
dc.identifier | CBCL-111 | - |
dc.identifier | http://hdl.handle.net/1721.1/7195 | - |
dc.identifier.uri | http://koha.mediu.edu.my:8181/xmlui/handle/1721 | - |
dc.description | "Expectation-Maximization'' (EM) algorithm and gradient-based approaches for maximum likelihood learning of finite Gaussian mixtures. We show that the EM step in parameter space is obtained from the gradient via a projection matrix $P$, and we provide an explicit expression for the matrix. We then analyze the convergence of EM in terms of special properties of $P$ and provide new results analyzing the effect that $P$ has on the likelihood surface. Based on these mathematical results, we present a comparative discussion of the advantages and disadvantages of EM and other algorithms for the learning of Gaussian mixture models. | - |
dc.format | 9 p. | - |
dc.format | 291671 bytes | - |
dc.format | 476864 bytes | - |
dc.format | application/postscript | - |
dc.format | application/pdf | - |
dc.language | en_US | - |
dc.relation | AIM-1520 | - |
dc.relation | CBCL-111 | - |
dc.subject | learning | - |
dc.subject | neural networks | - |
dc.subject | EM algorithm | - |
dc.subject | clustering | - |
dc.subject | mixture models | - |
dc.subject | statistics | - |
dc.title | On Convergence Properties of the EM Algorithm for Gaussian Mixtures | - |
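The abstract's central identity is that the EM step equals a gradient step transformed by the matrix $P$: $\theta^{(t+1)} = \theta^{(t)} + P(\theta^{(t)})\,\partial \ell / \partial \theta \,\big|_{\theta^{(t)}}$, where $\ell$ is the log likelihood. For readers who want the iteration itself in front of them, below is a minimal, self-contained EM loop for a Gaussian mixture in Python/NumPy. It is an illustrative sketch, not code from the memo: the function name `em_gmm`, the spherical-covariance simplification, and the initialization scheme are all assumptions of this sketch.

```python
import numpy as np

def em_gmm(X, K, n_iter=100, seed=0):
    """Minimal EM for a Gaussian mixture with spherical covariances.

    Illustrative sketch of the algorithm the memo analyzes; the
    spherical simplification and initialization are assumptions of
    this sketch, not the memo's formulation.
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    mu = X[rng.choice(N, size=K, replace=False)]   # means: K random data points
    w = np.full(K, 1.0 / K)                        # equal mixing proportions
    var = np.full(K, X.var())                      # shared initial variance

    for _ in range(n_iter):
        # E-step: responsibilities h[n, k] = posterior P(component k | x_n).
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=-1)   # (N, K)
        log_p = (np.log(w)
                 - 0.5 * D * np.log(2.0 * np.pi * var)
                 - 0.5 * sq / var)
        log_p -= log_p.max(axis=1, keepdims=True)  # stabilize the softmax
        h = np.exp(log_p)
        h /= h.sum(axis=1, keepdims=True)

        # M-step: closed-form maximizers of the expected complete-data
        # log likelihood. These are the updates whose relation to the
        # gradient the memo expresses through the projection matrix P.
        Nk = h.sum(axis=0)                         # effective counts, (K,)
        w = Nk / N
        mu = (h.T @ X) / Nk[:, None]
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=-1)
        var = np.maximum((h * sq).sum(axis=0) / (D * Nk), 1e-12)

    return w, mu, var
```

For example, `w, mu, var = em_gmm(np.random.default_rng(1).normal(size=(500, 2)), K=3)` runs the loop on synthetic data. Each iteration of exact EM cannot decrease the log likelihood; the memo's contribution is to characterize this convergence behavior through the properties of $P$.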
Appears in Collections: MIT Items
Files in This Item:
There are no files associated with this item.