Adjusting hand accuracy and unregister skeleton

Apr 14, 2011 at 12:03 PM


Thanks for sharing this project. I'm trying it out in a WPF/Surface demo where we navigate and interact with a mapping control using Kinect. The demo works fine, but it would be great if it were possible to do some tuning.

We're trying to let the user draw a line on the map, but it isn't very accurate since it is the whole hand that shows on top of the map. Is it possible to adjust the depth accuracy in some way, so that the user could point and only the fingertip would decide the point?

Is there a way to "unregister" the skeleton, so a new user can register himself? It would be great to have a button to start all over, instead of having to shut down the application.

Thanks :)

Apr 17, 2011 at 1:32 AM


Glad to hear you were able to get it working.

Accurate drawing is difficult in general because of noise in the skeleton tracking. I'm looking into a smoothing filter to make this a bit better.
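A minimal sketch of what such a smoothing filter could look like -- this is just an exponential moving average over the tracked hand position, not the project's actual filter. The `alpha` parameter is an assumed tuning knob: lower values smooth more jitter but add lag.

```python
# Hypothetical joint smoother: exponential moving average (EMA) over a
# stream of (x, y, z) hand positions from the skeleton tracker.
class JointSmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # 0 < alpha <= 1; lower = smoother, laggier
        self.state = None    # last smoothed (x, y, z), None until first sample

    def update(self, raw):
        """Feed one raw joint position, return the smoothed position."""
        if self.state is None:
            self.state = tuple(raw)
        else:
            self.state = tuple(
                self.alpha * r + (1 - self.alpha) * s
                for r, s in zip(raw, self.state)
            )
        return self.state
```

In practice you would run every incoming hand position through `update` before hit-testing it against the map control, trading a little responsiveness for a much steadier drawn line.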

We don't yet have the capability to detect specific hand poses such as pointing, but that is something I'm also actively working on; when it is available, it will also enable more accurate pointing. For now, watch the small glow near the center of the hand -- that is the actual touch point.

Don't worry about skeleton registration -- there's no need to restart the application for another user. I apply the first calibrated user's skeleton calibration to all other new users, and it works well enough. (It might not work for a full 3D avatar, but for this it's fine.) There is one edge case in the current code: if a second user is visible in the frame while the first user performs the skeleton calibration, the second user will be permanently ignored. For now, just make sure only one user is visible during calibration; once calibration is complete for one user, others can walk in and start using it.
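The bookkeeping described above can be sketched roughly as follows. This is an illustrative Python sketch, not the project's code: `SkeletonApi` and its method names (`request_calibration`, `save_calibration`, etc.) are hypothetical stand-ins for the OpenNI user/skeleton callbacks. The point is the logic: save the first successful calibration, apply it to every later user, and retry failed calibrations instead of ignoring a user forever (the edge case mentioned above).

```python
# Hypothetical stand-in for the real OpenNI skeleton capability; it just
# records which calls were made so the flow can be exercised.
class SkeletonApi:
    def __init__(self):
        self.calls = []

    def request_calibration(self, user_id): self.calls.append(("request", user_id))
    def save_calibration(self, user_id):    self.calls.append(("save", user_id))
    def load_saved_calibration(self, user_id): self.calls.append(("load", user_id))
    def start_tracking(self, user_id):      self.calls.append(("track", user_id))

class CalibrationManager:
    def __init__(self, api):
        self.api = api
        self.calibrated = False  # becomes True after the first success

    def on_new_user(self, user_id):
        if self.calibrated:
            # Later users reuse the saved calibration: no pose needed.
            self.api.load_saved_calibration(user_id)
            self.api.start_tracking(user_id)
        else:
            self.api.request_calibration(user_id)

    def on_calibration_complete(self, user_id, success):
        if success and not self.calibrated:
            self.api.save_calibration(user_id)
            self.calibrated = True
            self.api.start_tracking(user_id)
        elif not success:
            # Retry instead of dropping the user permanently -- avoids the
            # "second user is ignored forever" edge case.
            self.api.request_calibration(user_id)
```

A "start over" button like the one asked about would then just reset `calibrated` to `False` and clear the saved calibration, rather than restarting the whole application.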

When I upgrade to the newer version of OpenNI, and later to the official Kinect SDK, this will be less of a problem.