Project Lead: Yale Song
This project explores gesture-based interaction between humans and unmanned vehicles in the aircraft carrier deck environment. The objective is to extract meaningful gestures from a sequence of camera-captured image frames, based on estimated hand and body poses.
Project Lead: Ying Yin
In this project, we are developing a real-time 3D hand gesture recognition system for large-display (horizontal or vertical) interaction. Using a Kinect, we track the fingertips of bare hands in real time. The gesture-based interactive interface allows users to perform both path and pose gestures seamlessly, and responds promptly and appropriately to both continuous and discrete flow gestures.
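The Kinect's depth stream makes the first step of bare-hand tracking, separating the hand from the background, comparatively simple. As a purely illustrative sketch (the thresholds, function name, and toy data below are our own assumptions, not the project's actual pipeline), the hand can be isolated by keeping only the pixels within a depth band near the sensor:

```python
def segment_hand(depth, near=400, far=900):
    """Return the (row, col) pixels whose depth (in mm) falls inside the
    band where the user's hand is expected. Thresholding the depth map
    like this is a common first step before fingertip tracking; the band
    limits here are made-up values, not the project's parameters."""
    return [(r, c)
            for r, row in enumerate(depth)
            for c, d in enumerate(row)
            if near <= d <= far]

# A toy 3x4 "depth map": larger values are farther away (mm)
depth = [
    [1200, 1200,  600, 1200],
    [1200,  610,  590, 1200],
    [1200, 1200, 1200, 1200],
]
hand = segment_hand(depth)  # pixels in the 400-900 mm band
```

A real system would then find the hand contour in this pixel set and locate fingertips as high-curvature points along it.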
- Real-time Continuous Hand Gesture Recognition on a Tabletop: Gesture recognition with a colored glove and a commercial webcam
Project Lead: Tom Ouyang
ChemInk is a new sketch recognition framework for chemical structure drawings that combines multiple levels of visual features using a jointly trained conditional random field. The result is a recognizer that is better able to handle the wide range of drawing styles found in messy freehand sketches. A preliminary user study also showed that participants were on average over twice as fast using our sketch-based system compared to ChemDraw, a popular CAD-based tool for authoring chemical diagrams.
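To make the conditional random field concrete: once each ink stroke has per-label scores from its visual features, plus scores for label transitions, the best joint labeling can be decoded with the Viterbi algorithm. This is a generic linear-chain sketch, not ChemInk's implementation (ChemInk's graph structure and features are richer); the labels and scores below are hypothetical:

```python
def viterbi(obs_scores, trans_scores, labels):
    """Find the highest-scoring label sequence under a linear-chain CRF.

    obs_scores[t][y]   : score of label y for stroke t (from visual features)
    trans_scores[a][b] : score of label b immediately following label a
    """
    n = len(obs_scores)
    # best[t][y] = score of the best path ending in label y at position t
    best = [{y: obs_scores[0][y] for y in labels}]
    back = [{}]
    for t in range(1, n):
        best.append({})
        back.append({})
        for y in labels:
            prev, score = max(
                ((p, best[t - 1][p] + trans_scores[p][y]) for p in labels),
                key=lambda pair: pair[1])
            best[t][y] = score + obs_scores[t][y]
            back[t][y] = prev
    # Trace the best path backwards from the highest-scoring final label
    y = max(labels, key=lambda l: best[-1][l])
    path = [y]
    for t in range(n - 1, 0, -1):
        y = back[t][y]
        path.append(y)
    return list(reversed(path))
```

Joint decoding is what lets a confident neighbor (say, a clearly drawn bond) pull an ambiguous stroke toward a consistent interpretation, which is why it copes with messy freehand drawing better than classifying each stroke in isolation.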
Project Lead: Jeremy Scott
PhysInk is a physics-enabled sketching app for tablets and digital whiteboards that lets you quickly create animations of physical structure and behavior. You can sketch a device's parts in 2D, then move these parts to demonstrate the device's behavior. PhysInk understands 2D physics with the help of the Box2D physics engine. It also understands causality, so it can generate a simulation of your device in action.
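For a sense of what an engine like Box2D does on PhysInk's behalf each frame, here is a minimal time-stepping sketch (our own illustration, not PhysInk's code or Box2D's API) using semi-implicit Euler integration, the scheme Box2D uses internally:

```python
def step(bodies, gravity=(0.0, -10.0), dt=1.0 / 60.0):
    """Advance all dynamic bodies one timestep with semi-implicit Euler:
    integrate velocity first, then position. The ground test stands in
    for the engine's real collision detection and contact solving."""
    gx, gy = gravity
    for b in bodies:
        if b["static"]:
            continue
        b["vx"] += gx * dt
        b["vy"] += gy * dt
        b["x"] += b["vx"] * dt
        b["y"] += b["vy"] * dt
        # Crude inelastic collision with the ground plane at y = 0
        if b["y"] < 0.0:
            b["y"] = 0.0
            b["vy"] = 0.0

# A sketched part, dropped from 4 m, simulated for two seconds at 60 Hz
box = {"static": False, "x": 0.0, "y": 4.0, "vx": 0.0, "vy": 0.0}
for _ in range(120):
    step([box])
```

Running the simulation forward from the demonstrated configuration like this, rather than replaying recorded strokes, is what makes the animations physically plausible.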