I participated in the Intel Perceptual Computing Challenge with the Mercury project, a digital sculpting application: voxel-based 3D mesh editing with bare hands.
The prototype is built on the Glow engine and reuses the voxel engine originally written for the Iron Cube game. On top of that I added smooth mesh generation based on Surface Nets (a variation of Marching Cubes).
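The core idea of Surface Nets is simple: for every voxel cell the isosurface passes through, place one vertex at the average of the points where the surface crosses the cell's edges. The sketch below illustrates that per-cell step in isolation; it is a minimal illustration, not code from the Mercury engine.

```python
import itertools

def surface_nets_vertex(cell_corners, iso=0.0):
    """Given the 8 scalar field samples at a cell's corners (keyed by their
    (x, y, z) bit coordinates), return the Surface Nets vertex for that cell:
    the average of the isosurface crossings along the cell's edges, or None
    if the surface does not pass through the cell."""
    corners = list(itertools.product((0, 1), repeat=3))
    # The 12 edges of a unit cube: corner pairs differing in one coordinate.
    edges = [(a, b) for a in corners for b in corners
             if a < b and sum(abs(a[i] - b[i]) for i in range(3)) == 1]
    crossings = []
    for a, b in edges:
        va, vb = cell_corners[a] - iso, cell_corners[b] - iso
        if (va < 0) != (vb < 0):            # sign change => the surface crosses
            t = va / (va - vb)              # linear interpolation along the edge
            crossings.append(tuple(a[i] + t * (b[i] - a[i]) for i in range(3)))
    if not crossings:
        return None
    n = len(crossings)
    return tuple(sum(c[i] for c in crossings) / n for i in range(3))

# Example: only corner (1, 1, 1) is inside the surface (negative value).
field = {c: (-1.0 if c == (1, 1, 1) else 1.0)
         for c in itertools.product((0, 1), repeat=3)}
v = surface_nets_vertex(field)
```

Because vertices are averaged rather than snapped to a lookup table, the resulting mesh is smoother than a raw Marching Cubes output, which is what makes the approach attractive for sculpting.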
The hand-based interface uses the Intel Perceptual Computing SDK and the Creative* Interactive Gesture Camera Developer Kit. I received the camera just one day before the deadline, so I had only one day to integrate it into the engine. Surprisingly, it took only a few hours: the SDK ships with a utility library that makes the camera's features easy to use.
The SDK recognizes several common gestures, such as the victory (V) sign and the thumbs-up, and reports the exact 3D position of every finger. It also reports the hand's state (open or closed) and its position.
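Data like this maps naturally onto sculpting controls. The sketch below shows one plausible way to route such events into application actions; the event structure and the action names are my own illustrative assumptions, not the SDK's API or Mercury's actual bindings.

```python
from dataclasses import dataclass

@dataclass
class HandEvent:
    """Hypothetical stand-in for what the gesture SDK reports each frame."""
    gesture: str        # recognized gesture label, e.g. "victory", or "" if none
    hand_open: bool     # open palm vs. closed fist
    position: tuple     # (x, y, z) hand position in camera space

def interpret(event):
    """Map a hand event to an application action (names are illustrative)."""
    if event.gesture == "victory":
        return ("toggle_tool", None)                  # discrete gesture switches tools
    if not event.hand_open:
        return ("grab_and_rotate", event.position)    # closed fist drags the model
    return ("sculpt_at", event.position)              # open hand sculpts at the cursor

# Usage: a closed fist at some position becomes a grab-and-rotate action.
action = interpret(HandEvent(gesture="", hand_open=False, position=(0.1, 0.2, 0.5)))
```

Keeping the mapping in one small dispatcher like this makes it easy to experiment with different gesture vocabularies without touching the sculpting code itself.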
Mercury currently supports only a restricted set of gestures and manipulations, but I hope to make it feel as natural as real sculpting. The camera provides enough precision for that.