Title:
|
Deformable motion 3D reconstruction by union of regularized subspaces
|
Author(s):
|
Agudo Martínez, Antonio; Moreno-Noguer, Francesc
|
Other authors:
|
Institut de Robòtica i Informàtica Industrial; Universitat Politècnica de Catalunya. ROBiri - Grup de Robòtica de l'IRI |
Abstract:
|
© 20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |
Abstract:
|
This paper presents an approach to jointly retrieve camera pose, time-varying 3D shape, and an automatic clustering based on motion primitives, from incomplete 2D trajectories in a monocular video. We introduce the concept of order-varying temporal regularization to exploit video data, which can be applied equally to the 3D shape evolution and to the similarities between images. This results in a union of regularized subspaces that effectively encodes the 3D shape deformation. All parameters are learned via augmented Lagrange multipliers, in a unified and unsupervised manner that assumes no training data at all. Experimental validation is reported on human motion from sparse to dense shapes, providing more robust and accurate solutions than state-of-the-art approaches in terms of 3D reconstruction, while also obtaining motion grouping results. |
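The order-varying temporal regularization described in the abstract penalizes finite differences of the shape trajectory over time, with the difference order controlling how smooth the evolution is assumed to be. A minimal sketch of such a penalty (illustrative only, not the authors' implementation; the function name and array layout are assumptions):

```python
import numpy as np

def temporal_penalty(shapes, order=2):
    """Sum of squared order-th finite differences along time.

    shapes: (F, 3N) array with one flattened 3D shape per frame.
    order: difference order (1 penalizes velocity, 2 acceleration, ...).
    """
    diffs = np.diff(shapes, n=order, axis=0)  # order-th temporal differences
    return float(np.sum(diffs ** 2))

# A shape translating at constant velocity incurs no second-order
# penalty, but a nonzero first-order one.
linear = np.arange(5.0)[:, None] * np.ones((1, 6))
```

Varying `order` per subspace is one way to read the paper's "order-varying" regularization: different motion primitives can be smoothed to different degrees.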
Abstract:
|
Peer Reviewed |
Subject(s):
|
-Àrees temàtiques de la UPC::Informàtica::Automàtica i control
-Computer vision
-Optimisation
-Non-Rigid Structure from Motion
-Order-Varying Regularization
-Union of Regularized Subspaces
-Classificació INSPEC::Pattern recognition::Computer vision |
Rights:
|
Attribution-NonCommercial-NoDerivs 3.0 Spain
http://creativecommons.org/licenses/by-nc-nd/3.0/es/ |
Document type:
|
Article - Submitted version; Conference object |
Publisher:
|
Institute of Electrical and Electronics Engineers (IEEE)
|