Please use this identifier to cite or link to this item: http://dspace.mediu.edu.my:8181/xmlui/handle/1721.1/7111
Full metadata record
DC Field                Value
dc.creator              Felzenszwalb, Pedro F.
dc.date                 2004-10-20T20:32:11Z
dc.date                 2003-08-08
dc.date.accessioned     2013-10-09T02:48:24Z
dc.date.available       2013-10-09T02:48:24Z
dc.date.issued          2013-10-09
dc.identifier           AITR-2003-016
dc.identifier           http://hdl.handle.net/1721.1/7111
dc.identifier.uri       http://koha.mediu.edu.my:8181/xmlui/handle/1721
dc.description          We present a set of techniques that can be used to represent and detect shapes in images. Our methods revolve around a particular shape representation based on the description of objects using triangulated polygons. This representation is similar to the medial axis transform and has important properties from a computational perspective. The first problem we consider is the detection of non-rigid objects in images using deformable models. We present an efficient algorithm to solve this problem in a wide range of situations, and show examples in both natural and medical images. We also consider the problem of learning an accurate non-rigid shape model for a class of objects from examples. We show how to learn good models while constraining them to the form required by the detection algorithm. Finally, we consider the problem of low-level image segmentation and grouping. We describe a stochastic grammar that generates arbitrary triangulated polygons while capturing Gestalt principles of shape regularity. This grammar is used as a prior model over random shapes in a low level algorithm that detects objects in images.
dc.format               80 p.
dc.format               6877524 bytes
dc.format               3132998 bytes
dc.format               application/postscript
dc.format               application/pdf
dc.language             en_US
dc.relation             AITR-2003-016
dc.subject              AI
dc.title                Representation and Detection of Shapes in Images
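The abstract above builds on representing a shape as a triangulated polygon. As an illustration of what such a representation looks like, here is a minimal ear-clipping triangulation sketch. This is not the algorithm from the thesis; the function names and the sample polygon are invented for this example.

```python
# Illustrative sketch only: ear-clipping triangulation of a simple polygon,
# to suggest what a "triangulated polygon" representation looks like.
# NOT the thesis's method; all names below are invented for illustration.

def cross(o, a, b):
    """2D cross product of vectors o->a and o->b (positive = left turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_or_on_triangle(p, a, b, c):
    """True if p lies inside or on the boundary of CCW triangle abc."""
    return (cross(a, b, p) >= 0 and
            cross(b, c, p) >= 0 and
            cross(c, a, p) >= 0)

def triangulate(poly):
    """Ear-clipping triangulation of a simple CCW polygon.

    Returns index triples into poly; an n-vertex polygon
    always yields n - 2 triangles.
    """
    idx = list(range(len(poly)))
    triangles = []
    while len(idx) > 3:
        n = len(idx)
        for k in range(n):
            i, j, l = idx[k - 1], idx[k], idx[(k + 1) % n]
            a, b, c = poly[i], poly[j], poly[l]
            if cross(a, b, c) <= 0:          # reflex corner: not an ear
                continue
            # A valid ear contains no other polygon vertex (boundary
            # included, so a diagonal never passes through a vertex).
            if any(in_or_on_triangle(poly[m], a, b, c)
                   for m in idx if m not in (i, j, l)):
                continue
            triangles.append((i, j, l))
            del idx[k]                        # clip the ear
            break
    triangles.append(tuple(idx))
    return triangles

# A non-convex (L-shaped) polygon, listed counter-clockwise.
poly = [(0, 0), (2, 0), (2, 1), (1, 1), (1, 2), (0, 2)]
tris = triangulate(poly)
print(len(tris))  # → 4 (n - 2 triangles for n = 6)
```

A decomposition of this kind turns a shape into a sequence of triangles glued along shared diagonals, which is presumably the sort of computational property the abstract alludes to: matching or scoring can then be organized triangle by triangle rather than over the whole boundary at once.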
Appears in Collections: MIT Items

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.