I implemented new CPU-based lighting, which helps with rendering on mobile devices. Lighting is divided into sky lighting and local lighting. Sky light propagates from the top of the voxel world downward; it also creates shadows and darkens caves. Local lights currently cast no shadows, but they take object normals into account. All lighting is calculated in the editor and then saved to a file, which reduces loading time on mobile devices.
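Here is roughly how the sky light pass can work. This is a minimal sketch, not the engine's actual code: I'm assuming a dense voxel grid with y pointing up, and the names (SIZE_*, isOpaque, skyLight) are illustrative. A real implementation would also flood light sideways so cave mouths fade out gradually, and local lights would add a per-light N·L (normal) term on top.

```cpp
#include <cstdint>
#include <vector>

const int SIZE_X = 64, SIZE_Y = 64, SIZE_Z = 64;
const uint8_t MAX_LIGHT = 15;

inline int idx(int x, int y, int z) { return x + SIZE_X * (z + SIZE_Z * y); }

// Top-down sky light: each column is fully lit until the first opaque
// voxel, and dark below it, so caves stay dark. Computed offline in the
// editor, the resulting skyLight array is what gets saved to the file.
void computeSkyLight(const std::vector<bool>& isOpaque,
                     std::vector<uint8_t>& skyLight) {
  skyLight.assign(SIZE_X * SIZE_Y * SIZE_Z, 0);
  for (int z = 0; z < SIZE_Z; ++z) {
    for (int x = 0; x < SIZE_X; ++x) {
      uint8_t light = MAX_LIGHT;          // open sky above the column
      for (int y = SIZE_Y - 1; y >= 0; --y) {
        if (isOpaque[idx(x, y, z)]) light = 0;  // everything below is shadowed
        skyLight[idx(x, y, z)] = light;
      }
    }
  }
}
```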
The previous video was slightly chaotic, plus I improved the gesture handling in the Perceptual SDK integration: instead of raw finger coordinates I now use the coordinate of the LABEL_HAND_FINGERTIP node, which seems more robust.
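For reference, this is roughly what querying that node looks like with the SDK's UtilPipeline helper. It's a sketch based on the 2013 Perceptual Computing SDK; treat the exact signatures and field names as approximate, and feedCursor is just a hypothetical hook.

```cpp
#include "util_pipeline.h"  // UtilPipeline helper shipped with the SDK

UtilPipeline pipeline;
pipeline.EnableGesture();
if (pipeline.Init()) {
  while (pipeline.AcquireFrame(/*wait=*/true)) {
    PXCGesture* gesture = pipeline.QueryGesture();
    PXCGesture::GeoNode node;
    // Ask the SDK for the primary hand's fingertip node instead of
    // tracking individual finger coordinates ourselves.
    if (gesture->QueryNodeData(
            0,
            PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY |
                PXCGesture::GeoNode::LABEL_HAND_FINGERTIP,
            &node) >= PXC_STATUS_NO_ERROR) {
      // node.positionWorld is the 3D fingertip position.
      // feedCursor(node.positionWorld.x, node.positionWorld.y,
      //            node.positionWorld.z);  // hypothetical engine hook
    }
    pipeline.ReleaseFrame();
  }
}
pipeline.Close();
```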
I participated in the Intel Perceptual Computing Challenge with the Mercury project, a digital sculpting application: voxel-based 3D mesh editing with bare hands.
The prototype is based on the Glow engine and uses the voxel engine that was made for the Iron Cube game. In addition, I wrote smooth mesh generation based on Surface Nets (a modification of Marching Cubes).
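The core idea of Surface Nets is to place one vertex per surface-crossing cell at the average of the edge crossings, instead of emitting per-cell triangle patterns like Marching Cubes. A minimal sketch of the vertex placement step, assuming a scalar density field sampled at voxel corners (negative = inside); Vec3 and densityAt are illustrative names, not the engine's:

```cpp
struct Vec3 { float x, y, z; };

// Corner offsets of a unit cell.
static const int C[8][3] = {
  {0,0,0},{1,0,0},{0,1,0},{1,1,0},{0,0,1},{1,0,1},{0,1,1},{1,1,1}};
// The 12 cell edges as pairs of corner indices.
static const int E[12][2] = {
  {0,1},{2,3},{4,5},{6,7},{0,2},{1,3},{4,6},{5,7},{0,4},{1,5},{2,6},{3,7}};

// For the cell at (x,y,z): if the surface crosses it, output one vertex
// at the average of the interpolated edge crossings and return true.
bool surfaceNetVertex(int x, int y, int z,
                      float (*densityAt)(int, int, int), Vec3* out) {
  float d[8];
  for (int i = 0; i < 8; ++i)
    d[i] = densityAt(x + C[i][0], y + C[i][1], z + C[i][2]);

  Vec3 sum = {0, 0, 0};
  int crossings = 0;
  for (int e = 0; e < 12; ++e) {
    float a = d[E[e][0]], b = d[E[e][1]];
    if ((a < 0) == (b < 0)) continue;   // no sign change on this edge
    float t = a / (a - b);              // linear zero crossing along the edge
    sum.x += C[E[e][0]][0] + t * (C[E[e][1]][0] - C[E[e][0]][0]);
    sum.y += C[E[e][0]][1] + t * (C[E[e][1]][1] - C[E[e][0]][1]);
    sum.z += C[E[e][0]][2] + t * (C[E[e][1]][2] - C[E[e][0]][2]);
    ++crossings;
  }
  if (crossings == 0) return false;     // cell fully inside or outside
  out->x = x + sum.x / crossings;
  out->y = y + sum.y / crossings;
  out->z = z + sum.z / crossings;
  return true;
}
```

The meshing step then connects these vertices: every grid edge with a sign change gets a quad spanning the four cells that share it, which is what produces the smooth look.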
The Intel Perceptual Computing SDK and the Creative* Interactive Gesture Camera Developer Kit are used for the hand-based interface. I received the camera just one day before my deadline, so I had only one day to integrate it into the engine. Surprisingly, it was done in just a few hours: the SDK has a special utility library that makes it easy to use the camera's features.
The SDK reports several types of commonly used gestures, such as the victory (V) sign and the thumbs-up signal, and gives exact 3D positions of every finger. It also reports the hand state (open or closed) and position.
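A sketch of what consuming those reports can look like, assuming a PXCGesture* obtained from the frame loop as in the earlier snippet. The label and field names follow the 2013 SDK and may differ between versions:

```cpp
// Recognized poses/gestures for the primary hand.
PXCGesture::Gesture pose;
if (gesture->QueryGestureData(0, PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY,
                              0, &pose) >= PXC_STATUS_NO_ERROR) {
  if (pose.label == PXCGesture::Gesture::LABEL_POSE_PEACE) {
    // victory (V) sign recognized
  } else if (pose.label == PXCGesture::Gesture::LABEL_POSE_THUMB_UP) {
    // thumbs-up recognized
  }
}

// Hand state and position come from the hand's GeoNode.
PXCGesture::GeoNode hand;
if (gesture->QueryNodeData(0, PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY,
                           &hand) >= PXC_STATUS_NO_ERROR) {
  bool open = (hand.opennessState == PXCGesture::GeoNode::LABEL_OPEN);
  // hand.positionWorld holds the palm position.
}
```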
Only a restricted set of gestures and manipulations is currently supported in the Mercury project, but I hope to make it as easy as real sculpting. The camera provides enough precision for that.
I am playing with the rasterization of 3D polygonal models into voxels. The first version of the algorithm is not perfect yet: it leaves cracks and holes.
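A first version like this is often just point sampling: scatter barycentric samples over each triangle and mark the voxels they land in. Below is a minimal sketch under that assumption (Vec3, the key packing, and unit-sized voxels are all illustrative, not my actual code); if the sampling is too coarse relative to the voxel size, you get exactly the cracks and holes mentioned above. A robust fix is conservative voxelization with triangle-box overlap tests instead of sampling.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <unordered_set>

struct Vec3 { float x, y, z; };

// Pack three 21-bit voxel coordinates into one 64-bit set key.
static int64_t voxelKey(int x, int y, int z) {
  auto u = [](int v) { return (int64_t)(v & 0x1FFFFF); };
  return (u(x) << 42) | (u(y) << 21) | u(z);
}

void voxelizeTriangle(const Vec3& a, const Vec3& b, const Vec3& c,
                      std::unordered_set<int64_t>& solid) {
  auto len = [](const Vec3& p, const Vec3& q) {
    float dx = q.x - p.x, dy = q.y - p.y, dz = q.z - p.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
  };
  // Pick the sample count from the longest edge: ~2 samples per voxel.
  // Too large a step here is what leaves holes in the result.
  float longest = std::max({len(a, b), len(b, c), len(a, c)});
  int n = std::max(1, (int)std::ceil(longest * 2.0f));

  for (int i = 0; i <= n; ++i) {
    for (int j = 0; j <= n - i; ++j) {
      // Barycentric sample point on the triangle.
      float u = (float)i / n, v = (float)j / n, w = 1.0f - u - v;
      Vec3 p = {u * a.x + v * b.x + w * c.x,
                u * a.y + v * b.y + w * c.y,
                u * a.z + v * b.z + w * c.z};
      solid.insert(voxelKey((int)std::floor(p.x),
                            (int)std::floor(p.y),
                            (int)std::floor(p.z)));
    }
  }
}
```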
I prepared a video of the Iron Cube game running on a Galaxy Note 2. Movement, shooting, and rendering are done; the remaining parts are sound and AI. Voxels can be fed directly into the Recast navigation library, or navigation meshes can be built from the final optimized mesh (as is done for physics).
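Feeding voxels directly means skipping Recast's triangle rasterization step and filling the heightfield yourself. A sketch of that idea, assuming unit voxels and mapping one voxel column to one Recast cell; isSolid and the SIZE_* bounds are stand-ins for the voxel world, not real engine API:

```cpp
#include "Recast.h"

bool buildHeightfieldFromVoxels(rcContext* ctx,
                                bool (*isSolid)(int, int, int),
                                int SIZE_X, int SIZE_Y, int SIZE_Z,
                                rcHeightfield& hf) {
  const float bmin[3] = {0, 0, 0};
  const float bmax[3] = {(float)SIZE_X, (float)SIZE_Y, (float)SIZE_Z};
  // cs/ch = 1: one Recast cell per voxel in each axis.
  if (!rcCreateHeightfield(ctx, hf, SIZE_X, SIZE_Z, bmin, bmax,
                           /*cs=*/1.0f, /*ch=*/1.0f))
    return false;

  for (int z = 0; z < SIZE_Z; ++z) {
    for (int x = 0; x < SIZE_X; ++x) {
      // Convert each vertical run of solid voxels into one Recast span.
      int y = 0;
      while (y < SIZE_Y) {
        if (!isSolid(x, y, z)) { ++y; continue; }
        int top = y;
        while (top < SIZE_Y && isSolid(x, top, z)) ++top;
        rcAddSpan(ctx, hf, x, z, (unsigned short)y, (unsigned short)top,
                  RC_WALKABLE_AREA, /*flagMergeThr=*/1);
        y = top;
      }
    }
  }
  return true;  // from here the usual Recast pipeline (compact
                // heightfield, regions, contours, polymesh) applies
}
```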
As I wrote earlier, I applied to the Leap Motion Developer program and was selected as a developer. The device arrived today and I ran some tests. It looks responsive and precise; check the video below. I will try to integrate the Leap SDK into the Glow engine in the future. It seems perfect for 3D mesh manipulation or shooters.
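The integration itself should be small. A minimal sketch of pulling fingertip positions via the Leap C++ API's Controller/Listener pair, as in the early SDK; GlowEngineFeed is a hypothetical hook, not a real Glow engine function:

```cpp
#include "Leap.h"

class FingerListener : public Leap::Listener {
 public:
  virtual void onFrame(const Leap::Controller& controller) {
    const Leap::Frame frame = controller.frame();
    const Leap::FingerList fingers = frame.fingers();
    for (int i = 0; i < fingers.count(); ++i) {
      const Leap::Vector tip = fingers[i].tipPosition(); // millimeters
      // GlowEngineFeed(tip.x, tip.y, tip.z);  // hypothetical engine hook
    }
  }
};

int main() {
  Leap::Controller controller;
  FingerListener listener;
  controller.addListener(listener);   // onFrame fires on the Leap's thread
  // ... application main loop ...
  controller.removeListener(listener);
  return 0;
}
```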
OMG, I ported the Glow engine to Android over the last few weekends! (Honestly, the port is based on the Google NaCl port of the engine, and I have spent a lot of time recently making the Glow engine truly cross-platform.) But anyway, more about that later.
By the way, my new Galaxy Note II is awesome; I will write about it some time later as well. It's incredibly fast and loads the voxel scene from my game in a time comparable to the PC.
Finally, I became a console player! Well, actually, without a console. I got very excited when Steam Big Picture mode appeared. I have a big library of games on Steam that I never played: it's just not convenient to play at my workplace.
So I attached my shiny new ultrabook (received from Intel after participating in their 300-ultrabook competition) to a big 46-inch TV, bought a wireless Xbox 360 controller, and launched Big Picture! 2D games like Limbo and Braid play very well, but shooters... Seriously, how do you guys play shooters with a controller?