Please use this identifier to cite or link to this item: http://dspace.mediu.edu.my:8181/xmlui/handle/1721.1/7257
Full metadata record
DC Field | Value | Language
dc.creator | Meila, Marina | -
dc.creator | Jordan, Michael I. | -
dc.creator | Morris, Quaid | -
dc.date | 2004-10-20T21:04:25Z | -
dc.date | 2004-10-20T21:04:25Z | -
dc.date | 1998-09-01 | -
dc.date.accessioned | 2013-10-09T02:48:49Z | -
dc.date.available | 2013-10-09T02:48:49Z | -
dc.date.issued | 2013-10-09 | -
dc.identifier | AIM-1648 | -
dc.identifier | CBCL-165 | -
dc.identifier | http://hdl.handle.net/1721.1/7257 | -
dc.identifier.uri | http://koha.mediu.edu.my:8181/xmlui/handle/1721 | -
dc.description | This paper introduces a probability model, the mixture of trees, that can account for sparse, dynamically changing dependence relationships. We present a family of efficient algorithms that use EM and the Minimum Spanning Tree algorithm to find the ML and MAP mixture of trees for a variety of priors, including the Dirichlet and MDL priors. We also show that the single-tree classifier acts as an implicit feature selector, making classification performance insensitive to irrelevant attributes. Experimental results demonstrate the excellent performance of the new model in both density estimation and classification. | -
dc.format | 1320254 bytes | -
dc.format | 477415 bytes | -
dc.format | application/postscript | -
dc.format | application/pdf | -
dc.language | en_US | -
dc.relation | AIM-1648 | -
dc.relation | CBCL-165 | -
dc.title | Estimating Dependency Structure as a Hidden Variable | -
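The abstract describes algorithms that combine EM with a Minimum (here, maximum-weight) Spanning Tree step to fit a mixture of trees. The sketch below illustrates only the spanning-tree ingredient, the classic Chow-Liu construction over pairwise mutual information, not the paper's full EM procedure. It assumes unweighted discrete data, and all names (`chow_liu_tree`, `mutual_information`) are illustrative, not from the paper.

```python
# Hedged sketch of the Chow-Liu step: build a maximum-weight spanning
# tree over empirical pairwise mutual information (Kruskal's algorithm).
from collections import Counter
from itertools import combinations
from math import log

def mutual_information(data, i, j):
    """Empirical mutual information between columns i and j of `data`."""
    n = len(data)
    pi = Counter(row[i] for row in data)
    pj = Counter(row[j] for row in data)
    pij = Counter((row[i], row[j]) for row in data)
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        # p(a,b) * log( p(a,b) / (p(a) * p(b)) ), with counts cancelled
        mi += p_ab * log(p_ab * n * n / (pi[a] * pj[b]))
    return mi

def chow_liu_tree(data):
    """Edges of a maximum spanning tree over pairwise MI."""
    d = len(data[0])
    edges = sorted(((mutual_information(data, i, j), i, j)
                    for i, j in combinations(range(d), 2)), reverse=True)
    parent = list(range(d))     # union-find forest for Kruskal's algorithm
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for _, i, j in edges:       # greedily add heaviest non-cycle edges
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy example: variable 2 duplicates variable 0, so the learned tree
# must contain the edge (0, 2); variable 1 is independent of both.
data = [(0, 0, 0), (0, 1, 0), (1, 0, 1), (1, 1, 1)]
print(chow_liu_tree(data))
```

Inside the paper's EM loop this step would run once per mixture component, with mutual information computed from posterior-weighted counts rather than raw counts.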
Appears in Collections: MIT Items

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.