Please use this identifier to cite or link to this item: http://dspace.mediu.edu.my:8181/xmlui/handle/1721.1/7045
Full metadata record
DC Field              Value                                                    Language
dc.creator            Jones, Michael J.                                        -
dc.date               2004-10-20T20:23:37Z                                     -
dc.date               2004-10-20T20:23:37Z                                     -
dc.date               1992-09-01                                               -
dc.date.accessioned   2013-10-09T02:48:05Z                                     -
dc.date.available     2013-10-09T02:48:05Z                                     -
dc.date.issued        2013-10-09                                               -
dc.identifier         AITR-1396                                                -
dc.identifier         http://hdl.handle.net/1721.1/7045                        -
dc.identifier.uri     http://koha.mediu.edu.my:8181/xmlui/handle/1721          -
dc.description        This report explores how recurrent neural networks can be exploited for learning high-dimensional mappings. Since recurrent networks are as powerful as Turing machines, an interesting question is how recurrent networks can be used to simplify the problem of learning from examples. The main problem with learning high-dimensional functions is the curse of dimensionality, which roughly states that the number of examples needed to learn a function increases exponentially with input dimension. This thesis proposes a way of avoiding this problem by using a recurrent network to decompose a high-dimensional function into many lower-dimensional functions connected in a feedback loop.  -
dc.format             2167097 bytes                                            -
dc.format             1325986 bytes                                            -
dc.format             application/postscript                                   -
dc.format             application/pdf                                          -
dc.language           en_US                                                    -
dc.relation           AITR-1396                                                -
dc.title              Using Recurrent Networks for Dimensionality Reduction    -
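The abstract describes decomposing a high-dimensional function into several lower-dimensional functions connected in a feedback loop. A minimal sketch of that general idea, assuming a plain tanh recurrent cell with shared weights that consumes a 16-D input as four 4-D chunks (all dimensions, weights, and the single-cell architecture here are illustrative assumptions, not details taken from the report):

```python
import math
import random

random.seed(0)

# Hypothetical sizes: treat a 16-D input as 4 chunks of 4.
CHUNK, N_CHUNKS, HIDDEN = 4, 4, 8

def mat(rows, cols):
    """Small random weight matrix (illustrative initialization)."""
    return [[random.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]

# One low-dimensional cell; its parameters are reused at every step.
W_in = mat(HIDDEN, CHUNK)
W_rec = mat(HIDDEN, HIDDEN)
w_out = [random.gauss(0, 0.1) for _ in range(HIDDEN)]

def recurrent_map(x):
    """Apply the same small cell to each 4-D chunk of the 16-D input,
    feeding the hidden state back, so one high-dimensional map becomes
    four low-dimensional maps connected in a feedback loop."""
    h = [0.0] * HIDDEN
    for k in range(N_CHUNKS):
        chunk = x[k * CHUNK:(k + 1) * CHUNK]
        h = [math.tanh(sum(W_in[i][j] * chunk[j] for j in range(CHUNK))
                       + sum(W_rec[i][j] * h[j] for j in range(HIDDEN)))
             for i in range(HIDDEN)]
    # Read out a scalar from the final hidden state.
    return sum(w_out[i] * h[i] for i in range(HIDDEN))

y = recurrent_map([random.gauss(0, 1) for _ in range(16)])
```

Each step only has to learn a 4-D-plus-state mapping rather than the full 16-D one, which is the sense in which the feedback loop sidesteps the curse of dimensionality.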
Appears in Collections: MIT Items

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.