Chrysaora.com meets Kinect HD
Here is my first experiment with the Microsoft Kinect. I used OpenNI and PrimeSense to get the skeleton data. The picture in the corner is the UserTracker example from the OpenNI source, with modifications by Shigenobu Yukioka (ndruger) that make it send joint locations to a node.js WebSocket server. I added further modifications to reduce bandwidth by sending only 4 joint locations instead of the whole skeleton.

On the server side, I have a physically simulated camera object, and I use limb movements to add or remove forces (velocity, torque and damping). The idea is to have the player move the camera with natural gestures, pushing the air around as if under water. A rough sketch of this server-side loop is included at the end of the post.

The WebGL rendering demo is live here: chrysaora.com. I will share my source code and experience on my blog soon. To get updates about the project, follow me on Twitter @aleksandarrodic or visit my blog: blog.aleksandarrodic.com
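For anyone curious how the gesture-to-camera mapping might look, here is a minimal node.js sketch of the idea: a WebSocket server receives hand joint positions, turns frame-to-frame hand movement into forces and torque on a camera object, and lets damping bleed the motion off. The message format, the `ws` library, the joint names and the gain constants are all my assumptions for illustration, not the actual implementation from the project.

```js
// Minimal sketch, assuming the Kinect client sends JSON frames like
// {"joints": {"left_hand": [x, y, z], "right_hand": [x, y, z]}}.
// The real project may use a different library and protocol.
const { WebSocketServer } = require('ws');

// Physically simulated camera: linear velocity plus angular velocity,
// both damped every tick so the motion dies out like moving through water.
const camera = {
  velocity: [0, 0, 0],
  angularVelocity: 0,
  damping: 0.95, // per-tick damping factor (assumption)
};

let previousJoints = null;

function applyHandForces(joints) {
  if (previousJoints) {
    // Hand movement between frames becomes a push on the camera:
    // faster hand motion -> stronger force.
    for (const name of ['left_hand', 'right_hand']) {
      const now = joints[name];
      const before = previousJoints[name];
      if (now && before) {
        for (let i = 0; i < 3; i++) {
          camera.velocity[i] += (now[i] - before[i]) * 0.01; // force gain (assumption)
        }
        // Horizontal hand motion also adds a small torque around the vertical axis.
        camera.angularVelocity += (now[0] - before[0]) * 0.001;
      }
    }
  }
  previousJoints = joints;
}

// Integrate the camera each tick; damping removes energy over time.
setInterval(() => {
  camera.angularVelocity *= camera.damping;
  for (let i = 0; i < 3; i++) camera.velocity[i] *= camera.damping;
  // (position/orientation update and broadcast to the WebGL client omitted)
}, 16);

const wss = new WebSocketServer({ port: 8080 });
wss.on('connection', (socket) => {
  socket.on('message', (data) => {
    try {
      applyHandForces(JSON.parse(data).joints);
    } catch (e) {
      // Ignore malformed frames from the tracker.
    }
  });
});
```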