Hand-and-Body Gesture Recognition

Project Lead: Yale Song

Human gestures are often multi-signal: understanding a gesture frequently requires attending to body and hand poses at the same time. A successful gesture recognition system therefore needs to process multi-signal data seamlessly, yet most current systems handle only a single signal. We developed a multi-signal continuous gesture recognition system that attends to both body and hands, allowing a richer gesture vocabulary and more natural human-computer interaction. The system has been tested on NATOPS aircraft handling signals, a real-world scenario in which automatic recognition would benefit unmanned vehicles. As part of this work, we also introduce a novel database of 24 body-and-hand gestures to the gesture recognition community.
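To make the multi-signal idea concrete, here is a minimal, hypothetical sketch of fusing per-frame body-pose and hand-shape features into one observation stream for continuous recognition. It is not the project's actual model or feature set; all names (fuse_frames, classify_windows, the toy gesture templates) are illustrative assumptions, and the nearest-template classifier is a toy stand-in for a real sequence model.

    # Illustrative sketch only: shows the general idea of fusing body-pose and
    # hand-shape signals per frame, not the system's actual features or model.
    import numpy as np

    def fuse_frames(body_pose, hand_shape):
        """Concatenate body-pose and hand-shape features frame by frame.

        body_pose:  (T, D_body) array, e.g. joint angles per frame
        hand_shape: (T, D_hand) array, e.g. hand-shape descriptors per frame
        returns:    (T, D_body + D_hand) fused observation sequence
        """
        return np.concatenate([body_pose, hand_shape], axis=1)

    def classify_windows(fused, templates, window=15):
        """Label each sliding window by its nearest gesture template
        (a toy stand-in for the sequence model used in the real system)."""
        labels = []
        for t in range(fused.shape[0] - window + 1):
            seg = fused[t:t + window].mean(axis=0)        # summarize the window
            dists = {g: np.linalg.norm(seg - c) for g, c in templates.items()}
            labels.append(min(dists, key=dists.get))      # closest template wins
        return labels

    # Toy usage with random data standing in for tracked poses.
    rng = np.random.default_rng(0)
    body = rng.normal(size=(100, 8))    # 100 frames, 8 body-pose features
    hands = rng.normal(size=(100, 4))   # 100 frames, 4 hand-shape features
    fused = fuse_frames(body, hands)
    templates = {"gesture_a": rng.normal(size=12), "gesture_b": rng.normal(size=12)}
    print(classify_windows(fused, templates)[:5])

The point of the sketch is the fusion step: body and hand signals are kept synchronized frame by frame and combined into a single observation vector, so a downstream sequence model can reason over both simultaneously.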
Contact

Please direct questions to Yale Song or MUG.