@Louis > Not sure I understand your question. What do you mean by
"always handup and unstable"?
@Kerem > Like last time, you replied only to me, but the answer might
interest other people, so I'm answering you here :)
Your question was:
"OpenNI has the native gestures like wave, push, steady but when it
comes to the swipes, its not working accurately at all. So I'm
wondering how did you achieve to make it so accurate as seen in the
video? I appreciate if you could share some of your experience about
the algorithms of the accurate interactivity."
As you said, the "native" gesture detection is quite unusable, way too
restrictive for the end user. I tried it first but quickly switched to
a "custom" way of doing the same thing.
I'm actually just using the point (hand) detection feature. It's still
a bit restrictive for the end user, as he has to wave first to be
detected, but once that's done he can move around as he wants.
The first algo used a virtual activation depth: if the hand was past
this "line", I considered the user to be in drag mode. But it's not a
good solution... you need to be positioned perfectly to be able to
make natural moves.
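For what it's worth, that first approach boils down to something like
this (a minimal TypeScript sketch, not my actual code; HandPoint and
the threshold value are made-up names and numbers):

    // Minimal sketch of the "activation depth" idea (hypothetical names/values).
    // Convention assumed here: z is the distance to the sensor in mm, so a
    // smaller z means the hand is further forward.
    interface HandPoint { x: number; y: number; z: number; }

    const ACTIVATION_DEPTH_MM = 1200; // the virtual "line"; depends entirely on where the user stands

    function isDragging(hand: HandPoint): boolean {
      // In drag mode as soon as the hand crosses the fixed depth line.
      return hand.z < ACTIVATION_DEPTH_MM;
    }

You can see why it breaks: the line is fixed in space, so the user has
to stand at exactly the right distance.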
So I switched to a simple average-speed computation to base everything
on.
Instead of having a virtual limit to cross, I just detect whether the
hand moves forward above a minimum average speed (the tricky part to
tune), computed, from what I remember, over something like the last 4
or 5 steps. (Those are the values you can see at the top left, under
the stats panel.)
To detect the "release" I do the same thing but with a lower minimum
average speed: if the hand moves backward above that threshold, I
release.
The tricky part is defining those "minimum" values so that the result
feels natural. And of course it varies from user to user...
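To make that concrete, here's a minimal TypeScript sketch of this
press/release logic (not my actual code; the window size and both
thresholds are placeholder values you'd have to tune):

    // Sketch of press/release detection based on average Z speed.
    // Positive speed = hand moving toward the sensor (z decreasing).
    const WINDOW = 5;         // average over the last 4-5 steps
    const PRESS_SPEED = 40;   // mm per step toward the sensor -> "press"
    const RELEASE_SPEED = 20; // lower threshold, away from the sensor -> "release"

    class PressDetector {
      private zHistory: number[] = [];
      private pressed = false;

      update(z: number): boolean {
        this.zHistory.push(z);
        if (this.zHistory.length > WINDOW) this.zHistory.shift();
        if (this.zHistory.length < WINDOW) return this.pressed;

        // Average speed over the window: (oldest - newest) / steps.
        // Positive when the hand moves forward (z shrinking).
        const steps = this.zHistory.length - 1;
        const avgSpeed = (this.zHistory[0] - this.zHistory[steps]) / steps;

        if (!this.pressed && avgSpeed > PRESS_SPEED) {
          this.pressed = true;   // fast enough forward move -> press
        } else if (this.pressed && avgSpeed < -RELEASE_SPEED) {
          this.pressed = false;  // backward move, lower threshold -> release
        }
        return this.pressed;
      }
    }

You'd feed update() the hand's Z every frame; PRESS_SPEED and
RELEASE_SPEED are exactly those two tricky "minimum" values.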
I had another idea I wanted to test but haven't yet: instead of a
minimum speed for the release, a kind of hysteresis. When the user
enters "press" mode, I'd record the Z position and consider that the
release happens at Z - X.
I'm not really sure it would be better, but it's hard to say without
testing!
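If I were to sketch it (again hypothetical TypeScript; note that the
sign of "Z - X" depends on your coordinate convention, and here
forward = z decreasing, so the release fires once the hand pulls back
past the press depth plus X):

    // Sketch of the untested hysteresis idea (hypothetical values).
    const PULL_BACK_MM = 100; // "X": how far the hand must pull back to release

    class HysteresisDetector {
      private pressed = false;
      private pressZ = 0;

      // movingForwardFast would come from the same average-speed check as above.
      update(z: number, movingForwardFast: boolean): boolean {
        if (!this.pressed && movingForwardFast) {
          this.pressed = true;
          this.pressZ = z; // remember the Z position at press time
        } else if (this.pressed && z > this.pressZ + PULL_BACK_MM) {
          this.pressed = false; // released once the hand is back past the hysteresis band
        }
        return this.pressed;
      }
    }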
Anyway, it looks magical in my video, but only because I'm used to the
gesture. Some colleagues tried it and had trouble using it. It's far
from perfectly natural.
Here are the EXE files if you want to play with them:
http://dl.dropbox.com/u/20758492/flash/kinect/coverflow.exe (86 MB,
MP3 inside :D)
http://dl.dropbox.com/u/20758492/flash/kinect/face3d.exe (1.3 MB, the
Han Solo in carbonite thing; you can click to change the texture)
http://dl.dropbox.com/u/20758492/flash/kinect/water.exe (1.1 MB, just
a useless water toy)
Install AIR 3 first if you haven't already ;)
http://get.adobe.com/fr/air/
Durss