Please use this identifier to cite or link to this item: http://dspace.mediu.edu.my:8181/xmlui/handle/1721.1/7206
Full metadata record
dc.creator: Jordan, Michael I.
dc.creator: Jacobs, Robert A.
dc.date: 2004-10-20T20:49:48Z
dc.date: 2004-10-20T20:49:48Z
dc.date: 1993-08-01
dc.date.accessioned: 2013-10-09T02:48:33Z
dc.date.available: 2013-10-09T02:48:33Z
dc.date.issued: 2013-10-09
dc.identifier: AIM-1440
dc.identifier: CBCL-083
dc.identifier: http://hdl.handle.net/1721.1/7206
dc.identifier.uri: http://koha.mediu.edu.my:8181/xmlui/handle/1721
dc.description: We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIM's). Learning is treated as a maximum likelihood problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain. (An illustrative sketch of the EM updates appears after the record below.)
dc.format: 29 p.
dc.format: 190144 bytes
dc.format: 678911 bytes
dc.format: application/octet-stream
dc.format: application/pdf
dc.language: en_US
dc.relation: AIM-1440
dc.relation: CBCL-083
dc.subject: supervised learning
dc.subject: statistics
dc.subject: decision trees
dc.subject: neural networks
dc.title: Hierarchical Mixtures of Experts and the EM Algorithm
Appears in Collections: MIT Items

Files in This Item:
There are no files associated with this item.
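
The abstract describes an architecture in which each expert and each gate is a GLIM whose parameters are fit jointly by EM. As a rough illustration only, not the memo's algorithm verbatim, the following Python/NumPy sketch fits a one-level mixture of linear-Gaussian experts with a single softmax gate: the E-step computes posterior responsibilities for each expert, and the M-step solves a responsibility-weighted least-squares problem per expert. The tree-structured gating of the paper is flattened to one level, the gate's inner maximization is approximated by a few gradient steps rather than the IRLS loop the memo uses, and all names here (moe_em, gate_lr, and so on) are hypothetical.

import numpy as np

def softmax(Z):
    # Row-wise softmax with max-subtraction for numerical stability.
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def moe_em(X, y, n_experts=4, n_iters=50, gate_lr=0.5, seed=0):
    # Fit a one-level mixture of linear-Gaussian experts by EM (sketch).
    rng = np.random.default_rng(seed)
    N, d = X.shape
    U = rng.normal(scale=0.1, size=(n_experts, d))  # expert regression weights
    V = rng.normal(scale=0.1, size=(n_experts, d))  # softmax gating weights
    sigma2 = np.full(n_experts, y.var() + 1e-6)     # per-expert noise variances

    for _ in range(n_iters):
        # E-step: responsibility h[i, j] of expert j for case i, proportional
        # to the gating probability times the expert's Gaussian likelihood.
        mu = X @ U.T                                # (N, J) expert means
        g = softmax(X @ V.T)                        # (N, J) gating probabilities
        lik = np.exp(-0.5 * (y[:, None] - mu) ** 2 / sigma2) \
            / np.sqrt(2.0 * np.pi * sigma2)
        h = g * lik
        h /= h.sum(axis=1, keepdims=True) + 1e-12

        # M-step, experts: each solves a responsibility-weighted least squares.
        for j in range(n_experts):
            w = h[:, j]
            A = X.T @ (w[:, None] * X) + 1e-6 * np.eye(d)
            U[j] = np.linalg.solve(A, X.T @ (w * y))
            resid = y - X @ U[j]
            sigma2[j] = (w @ resid**2) / (w.sum() + 1e-12)

        # M-step, gate: gradient steps pushing the softmax outputs toward the
        # posteriors h (the memo solves this inner problem exactly with IRLS).
        for _ in range(10):
            g = softmax(X @ V.T)
            V += gate_lr * ((h - g).T @ X) / N

    return U, V, sigma2

# Toy usage: piecewise-linear data that no single linear model can fit.
X = np.column_stack([np.linspace(-1, 1, 200), np.ones(200)])  # input + bias
y = np.where(X[:, 0] < 0, -2.0 * X[:, 0], 3.0 * X[:, 0])
U, V, sigma2 = moe_em(X, y)

The E/M split is what makes the architecture tractable: once the responsibilities h are fixed, every expert update is an ordinary weighted GLIM fit, which is why the memo can reuse standard iteratively reweighted least squares at each node of the tree.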


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.