2007 IEEE Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvpr.2007.383235

Segmenting Motions of Different Types by Unsupervised Manifold Clustering

Cited by 164 publications (110 citation statements)
References 20 publications
“…In general, the motion trajectories from a perspective camera will lie on a nonlinear manifold instead of a linear subspace [8]. However, it is possible to approximate the manifold locally (over a short period of time) with a linear subspace.…”
Section: Subspace Theory
confidence: 99%
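The local linear-subspace approximation this excerpt refers to can be illustrated with a minimal sketch: fit a low-rank subspace to a short temporal window of tracked points via the SVD and check the reconstruction residual. The function name, data layout, and the rank of 4 (an affine-camera assumption) are illustrative choices, not taken from the cited works.

```python
import numpy as np

def local_subspace_residual(window, dim=4):
    """Rank-`dim` SVD approximation of a short trajectory window.

    window: (2F, P) array of x/y image coordinates of P tracked points
            over F frames (a short temporal window).
    dim:    assumed local subspace dimension (4 for an affine camera).
    Returns the relative reconstruction error; a small value supports
    approximating the motion manifold locally by a linear subspace.
    """
    W = window - window.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    W_hat = (U[:, :dim] * s[:dim]) @ Vt[:dim]  # best rank-dim fit
    return np.linalg.norm(W - W_hat) / max(np.linalg.norm(W), 1e-12)
```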
“…The method has been applied to partition affine camera motions with outlying and corrupted trajectories (Rao et al. 2008). There has also been work applying manifold learning techniques such as ISOMAP and locally linear embedding to cluster data drawn from multiple manifolds (Souvenir and Pless 2007; Goh and Vidal 2007). In the computer vision literature, many effective methods have been developed based on the above clustering techniques or more views (Ozden et al. 2007; Vidal and Hartley 2008).…”
Section: Relations To Previous Work
confidence: 99%
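As a rough illustration of the manifold-clustering idea referenced in this excerpt (not the specific algorithms of Souvenir and Pless 2007 or Goh and Vidal 2007), one could embed the trajectory vectors with locally linear embedding and then cluster the embedding; scikit-learn is assumed here purely for convenience.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.cluster import KMeans

def cluster_motions(X, n_motions, n_neighbors=8, embed_dim=3):
    """Toy multi-manifold clustering of feature trajectories.

    X: (P, D) array, one row per tracked point, holding its stacked
       image coordinates over all frames.
    n_motions: assumed number of independently moving objects.
    Returns one cluster label per trajectory.
    """
    # Nonlinear embedding that preserves local neighbourhood geometry.
    Y = LocallyLinearEmbedding(n_neighbors=n_neighbors,
                               n_components=embed_dim).fit_transform(X)
    # Cluster in the embedded space, one cluster per motion.
    return KMeans(n_clusters=n_motions, n_init=10).fit_predict(Y)
```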
“…This motion is either assumed to occur [6], [7], [8] or can be induced by the robot [9], [10]. Although relative motion is a strong cue for segmentation, generating this motion in an unknown pile is oftentimes dangerous and undesirable.…”
Section: A. Object Segmentation
confidence: 99%