Project Lead: Yale Song
This project explores gesture-based interaction between humans and unmanned vehicles in the aircraft carrier deck environment. The objective is to extract meaningful gestures from a sequence of camera-captured image frames, based on estimated hand and body poses.
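As a toy illustration of extracting a gesture from a frame sequence, the sketch below classifies a trajectory of per-frame hand x-positions (a stand-in for the estimated poses) as a "wave" by counting direction reversals. The function names, threshold, and gesture vocabulary are all hypothetical, not the project's actual recognizer.

```python
def count_reversals(xs):
    """Count sign changes in the frame-to-frame motion of a 1-D trajectory."""
    deltas = [b - a for a, b in zip(xs, xs[1:]) if b != a]
    return sum(1 for d1, d2 in zip(deltas, deltas[1:]) if d1 * d2 < 0)

def classify_gesture(hand_x, min_reversals=3):
    """Label a sequence of per-frame hand x-positions as 'wave' or 'other'.

    min_reversals is an illustrative threshold, not a tuned parameter.
    """
    return "wave" if count_reversals(hand_x) >= min_reversals else "other"

# A side-to-side wave: the hand's x-coordinate oscillates across frames.
wave = [0.1, 0.3, 0.5, 0.3, 0.1, 0.3, 0.5, 0.3, 0.1]
still = [0.2, 0.2, 0.21, 0.2, 0.2]
print(classify_gesture(wave))   # wave
print(classify_gesture(still))  # other
```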
Project Lead: Ying Yin
In this project, we are developing a 3D hand gesture recognition system in the context of large tabletop interaction. Using a Kinect, we are able to track the fingertips of bare hands in real time. The gesture-based interactive interface will allow users to perform both manipulative and communicative gestures without artificial restriction, hence enabling natural interaction.
- Real-time Continuous Hand Gesture on a Tabletop: Gesture recognition with a colored glove and a commercial webcam.
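To give a concrete flavor of fingertip tracking from depth data, here is a minimal sketch: it segments the hand as everything within a fixed depth band of the nearest surface in the frame and takes the topmost hand pixel as a fingertip candidate. This is a crude assumed stand-in, not the project's actual contour-based tracker; the band width and synthetic frame are illustrative.

```python
import numpy as np

def find_fingertip(depth, hand_band=100):
    """Locate a fingertip candidate in a depth frame (mm values, 0 = no reading).

    Segments the hand as all pixels within `hand_band` mm of the nearest
    valid depth, then returns the topmost hand pixel as (row, col).
    Fingertips tend to be extremal points of the hand region.
    """
    valid = depth > 0
    nearest = depth[valid].min()
    hand = valid & (depth <= nearest + hand_band)
    rows, cols = np.nonzero(hand)           # row-major order
    top = rows.argmin()                     # first pixel of the topmost row
    return int(rows[top]), int(cols[top])

# Synthetic 240x320 frame: background at 2000 mm, a hand blob at ~800 mm.
frame = np.full((240, 320), 2000, dtype=np.int32)
frame[100:200, 140:180] = 820   # palm
frame[80:100, 155:160] = 800    # extended finger
print(find_fingertip(frame))    # (80, 155)
```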
Project Lead: Tom Ouyang
ChemInk is a new sketch recognition framework for chemical structure drawings that combines multiple levels of visual features using a jointly trained conditional random field. The result is a recognizer that is better able to handle the wide range of drawing styles found in messy freehand sketches. A preliminary user study also showed that participants were on average over twice as fast using our sketch-based system compared to ChemDraw, a popular CAD-based tool for authoring chemical diagrams.
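The core idea of combining feature levels in a CRF can be sketched with a toy linear-chain example: per-stroke scores from two feature levels are summed into unary potentials, and Viterbi decoding picks the jointly best label sequence under a pairwise compatibility term. The label set, scores, and transition matrix below are invented for illustration; ChemInk's actual model and features are richer.

```python
import numpy as np

def viterbi(unary, pairwise):
    """MAP label sequence for a linear-chain CRF.

    unary: (T, K) per-segment label scores; pairwise: (K, K) transition scores.
    """
    T, K = unary.shape
    score = unary[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + pairwise          # (prev label, current label)
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + unary[t]
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Hypothetical labels for three pen strokes: 0 = atom symbol, 1 = bond.
stroke_level = np.array([[2.0, 0.5], [0.9, 1.0], [1.8, 0.2]])   # local ink features
context_level = np.array([[0.3, 0.0], [0.0, 0.6], [0.2, 0.1]])  # neighborhood features
unary = stroke_level + context_level        # joint combination of feature levels
pairwise = np.array([[0.0, 0.5],            # atoms and bonds tend to alternate
                     [0.5, 0.0]])
print(viterbi(unary, pairwise))             # [0, 1, 0]
```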
Project Lead: Jeremy Scott
PhysInk is a physics-enabled sketching app for tablets and digital whiteboards that lets you quickly create animations of physical structure and behavior. You can sketch a device's parts in 2D, then move these parts to demonstrate the device's behavior. PhysInk understands 2D physics with the help of the Box2D physics engine. It also understands causality, so it can generate a simulation of your device in action.
- ASSIST: A Shrewd Sketch Interpretation and Simulation Tool
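To illustrate the kind of fixed-timestep simulation loop a physics engine like Box2D runs when animating a sketched part, here is a toy analogue in plain Python: a point body dropped under gravity onto a static ground line, integrated step by step. Box2D itself does far more (shapes, contacts, constraint solving); this sketch, with its made-up function name and parameters, only mirrors the stepping idea.

```python
def simulate_drop(y0, ground=0.0, dt=1/60, gravity=-10.0, steps=120):
    """Step a point body under gravity until it rests on a static ground line.

    A toy analogue of the per-frame Step() loop a 2D physics engine runs;
    all parameters here are illustrative defaults, not Box2D's.
    """
    y, vy = y0, 0.0
    for _ in range(steps):
        vy += gravity * dt          # integrate acceleration into velocity
        y += vy * dt                # integrate velocity into position
        if y <= ground:             # inelastic contact with the ground
            y, vy = ground, 0.0
    return y

print(simulate_drop(y0=2.0))        # 0.0 -- the body has come to rest
```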