Please use this identifier to cite or link to this item:
http://dspace.mediu.edu.my:8181/xmlui/handle/1721.1/7206

| Title: | Hierarchical Mixtures of Experts and the EM Algorithm |
| Keywords: | supervised learning, statistics, decision trees, neural networks |
| Issue Date: | 9-Oct-2013 |
| Description: | We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIM's). Learning is treated as a maximum likelihood problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain. |
| URI: | http://koha.mediu.edu.my:8181/xmlui/handle/1721 |
| Other Identifiers: | AIM-1440 CBCL-083 http://hdl.handle.net/1721.1/7206 |
| Appears in Collections: | MIT Items |
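The description above summarizes an EM algorithm for mixtures of expert networks with softmax gating. As a rough illustration only (not the paper's code, and flat rather than hierarchical), here is a minimal one-level mixture of two Gaussian-linear experts fit by EM on synthetic piecewise-linear data; the toy data, the learning rates, and all variable names are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two linear regimes, a stand-in for a piecewise regression task
n = 200
x = rng.uniform(-1, 1, size=n)
y = np.where(x < 0, 2.0 * x + 0.5, -1.5 * x) + 0.05 * rng.normal(size=n)
X = np.column_stack([x, np.ones(n)])  # design matrix with a bias column

K = 2                                  # number of experts
W = 0.1 * rng.normal(size=(K, 2))      # expert (linear) weights
V = 0.1 * rng.normal(size=(K, 2))      # gating weights (softmax over experts)
sigma2 = np.ones(K)                    # per-expert noise variances

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

for it in range(50):
    # E-step: posterior responsibility h[i, k] of expert k for point i
    g = softmax(X @ V.T, axis=1)       # gating priors
    mu = X @ W.T                       # expert predictions
    lik = np.exp(-0.5 * (y[:, None] - mu) ** 2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
    h = g * lik
    h /= h.sum(axis=1, keepdims=True)

    # M-step for the experts: responsibility-weighted least squares
    # (exact here because each expert is Gaussian-linear)
    for k in range(K):
        D = np.diag(h[:, k])
        W[k] = np.linalg.solve(X.T @ D @ X, X.T @ D @ y)
        resid = y - X @ W[k]
        sigma2[k] = (h[:, k] * resid ** 2).sum() / h[:, k].sum()

    # M-step for the gate: a few gradient ascent steps on the softmax
    # log-likelihood, pushing gating priors toward the responsibilities
    for _ in range(10):
        g = softmax(X @ V.T, axis=1)
        V += 0.5 * (h - g).T @ X / n

# Mixture prediction and its mean squared error
g = softmax(X @ V.T, axis=1)
pred = (g * (X @ W.T)).sum(axis=1)
mse = float(np.mean((y - pred) ** 2))
```

The hierarchical version in the paper nests such gated mixtures inside one another and uses iteratively reweighted least squares (IRLS) for the inner fits; the weighted-least-squares M-step above is the special case for Gaussian-linear experts.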
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
