Hi VRgeeks !
First thing: it's amazing! I believe this is the first time I've experienced real presence :)
Don't expect something perfect: it has a lot of limitations, but it's really promising.
** Interference with positional tracking
Since the Leap uses IR LEDs, I thought it could be a problem for the Rift's positional tracking. Turns out it's OK, because the Leap LEDs flash in pulses (like the Oculus LEDs).
Sometimes simultaneous flashes can happen, but thanks to sensor fusion, the Rift's positional tracking remains good.
Oculus tracking is *really* solid.
** Hand tracking
Hand tracking is quite good, but you need to keep your hands really close to the HMD. The range is small!
Don't expect to do fine manipulation; it's not robust enough.
The latency is actually really good.
** Video Passthrough
Another benefit is that you can see a B&W video of your immediate surroundings.
It feels a lot safer. No risk of bumping into your desk ;)
Easier to dive in, and easier to return to reality!
** AR
I was really worried about the latency of the video stream. It's actually quite good for passthrough, but not good enough for AR.
Virtual objects don't move at the same speed as your environment.
However, I believe there's a bug in the current implementation of all the demos.
I'm quite sure we could improve the results by using timewarp correctly: first a timewarp pass for the video source, then the regular Oculus timewarp.
Has anybody tried this?
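To show what I mean by the first pass, here's a rough C++ sketch of the idea. None of this is actual Oculus or Leap SDK code: getHeadPose() is a hypothetical placeholder for whatever head-pose history the application keeps, and the quaternion helpers are written out just for clarity. The point is to compute the rotation between the head pose at the moment the camera frame was captured and the head pose at render time, then use it to reproject the video before the regular Oculus timewarp runs on the composited frame.

```cpp
// Rough sketch only: getHeadPose() and the 90 deg/s test motion are made up,
// this is not Oculus or Leap SDK code.
#include <cmath>
#include <cstdio>

struct Quat { float w, x, y, z; };

Quat conjugate(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

Quat multiply(const Quat& a, const Quat& b)
{
    return {
        a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
        a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
        a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
        a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w
    };
}

// Placeholder head-pose history: pretend the head yaws at 90 deg/s so the
// example has data. A real app would query buffered tracking poses here.
Quat getHeadPose(double timeSeconds)
{
    const double kPi = 3.14159265358979323846;
    float half = static_cast<float>(timeSeconds * (kPi / 2.0) * 0.5);
    return { std::cos(half), 0.0f, std::sin(half), 0.0f }; // rotation about Y
}

// Stage 1: rotation that reprojects the camera image from the head pose at
// capture time to the head pose at render time (compensates video latency).
Quat videoTimewarpDelta(double captureTime, double renderTime)
{
    Quat atCapture = getHeadPose(captureTime);
    Quat atRender  = getHeadPose(renderTime);
    return multiply(atRender, conjugate(atCapture));
}

int main()
{
    // Camera frame captured ~50 ms before the composited frame is rendered.
    Quat delta = videoTimewarpDelta(0.000, 0.050);
    std::printf("video reprojection delta: w=%.4f x=%.4f y=%.4f z=%.4f\n",
                delta.w, delta.x, delta.y, delta.z);
    // Stage 2: the regular Oculus timewarp then runs on the composited frame
    // (video + virtual objects) just before scan-out, as usual.
    return 0;
}
```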
Unfortunately, the Leap is not a depth camera. You can't obtain a point cloud like with the Kinect.
So it works well for video passthrough, but you can't cull virtual objects that should be hidden by the real world. This can be really disturbing.
I believe we'll need some kind of depth camera in order to use an HMD for AR.
Video passthrough is not enough! :/
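Just to make the occlusion problem concrete, here is a tiny hypothetical sketch (the buffer layout and names are made up, and nothing here is Leap API; the Leap only gives you the stereo IR images). With a per-pixel depth map of the real scene you could composite like this, letting the real world hide virtual objects; without it, virtual objects are always drawn on top of the video.

```cpp
// Hypothetical depth-based compositing: assumes a depth camera providing
// per-pixel distances for the real scene. Not Leap or Oculus SDK code.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b; };

// Composite virtual objects over the passthrough video, letting the real
// world occlude them wherever it is closer to the eye. Pixels with no
// virtual object should carry virtualDepth = +infinity so the video wins.
void compositeWithOcclusion(const std::vector<Pixel>& videoFrame,   // passthrough image
                            const std::vector<float>& realDepth,    // metres, from a depth camera
                            const std::vector<Pixel>& virtualFrame, // rendered virtual objects
                            const std::vector<float>& virtualDepth, // metres, from the renderer
                            std::vector<Pixel>&       out)
{
    for (std::size_t i = 0; i < out.size(); ++i)
    {
        // Keep the virtual pixel only if nothing real is closer to the eye.
        // Without realDepth, virtual objects always end up on top of the
        // video, which is exactly what feels so disturbing.
        bool virtualVisible = virtualDepth[i] < realDepth[i];
        out[i] = virtualVisible ? virtualFrame[i] : videoFrame[i];
    }
}
```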
Cheers !
Alexandre.