INTAIRACT: Joint hand gesture and fingertip classification for touchless interaction

Other authors

Universitat Politècnica de Catalunya. Departament de Teoria del Senyal i Comunicacions

Universitat Politècnica de Catalunya. GPI - Grup de Processament d'Imatge i Vídeo

Publication date

2012

Abstract

In this demo we present intAIRact, an online hand-based touchless interaction system. Interactions are based on easy-to-learn hand gestures that, combined with translations and rotations, render a user-friendly and highly configurable system. The main advantage over existing approaches is that we are able to robustly locate and identify fingertips. Hence, we can employ a simple but powerful alphabet of gestures, determining not only the number of visible fingers in a gesture but also which fingers are being observed. To achieve such a system we propose a novel method that jointly infers hand gestures and fingertip locations from a single depth image captured by a consumer depth camera. Our approach is based on a novel descriptor for depth data, the Oriented Radial Distribution (ORD) [1]. On the one hand, we exploit the ORD for robust classification of hand gestures by means of efficient k-NN retrieval. On the other hand, maxima of the ORD are used to perform structured inference of fingertip locations. The proposed method outperforms other state-of-the-art approaches in both gesture recognition and fingertip localization. A GPU implementation of the ORD extraction yields a real-time demo running at approximately 17 fps on a single laptop.
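The gesture-classification step described above can be illustrated with a minimal k-NN sketch. This is not the authors' implementation: the ORD extraction is not reproduced here, and descriptors are simply treated as fixed-length feature vectors compared against a labelled gallery of gesture examples; the function and variable names are hypothetical.

```python
import math
from collections import Counter

def knn_classify(descriptor, gallery, k=5):
    """Classify a query descriptor by majority vote among its k nearest
    neighbours (Euclidean distance) in a labelled gallery.

    `gallery` is a list of (descriptor, gesture_label) pairs, where each
    descriptor stands in for an ORD feature vector of the same length.
    """
    # Sort all gallery entries by distance to the query.
    dists = sorted(
        (math.dist(descriptor, d), label) for d, label in gallery
    )
    # Majority vote among the k closest labels.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy usage with 2-D stand-in descriptors and two gesture classes:
gallery = [([0.0, 1.0], "open"), ([0.1, 0.9], "open"), ([1.0, 0.0], "fist")]
print(knn_classify([0.05, 0.95], gallery, k=3))  # → "open"
```

In practice the gallery would hold many ORD descriptors per gesture class, and the retrieval would use an efficient nearest-neighbour index rather than a full sort.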


Peer Reviewed


Postprint (published version)

Document type

Conference lecture

Language

English

Published by

Springer

Related documents

http://link.springer.com/chapter/10.1007/978-3-642-33885-4_62

Recommended citation

This citation was generated automatically.

Rights

http://creativecommons.org/licenses/by-nc-nd/3.0/es/

Restricted access - publisher's policy

Attribution-NonCommercial-NoDerivs 3.0 Spain

This item appears in the following collection(s)

E-prints [73034]