Leap Motion with Oculus review


Alexandre Avenel

Sep 18, 2014, 6:50:14 PM
to vr-g...@googlegroups.com
Hi VRgeeks !


I've just tested a Leap Motion mounted on an Oculus Rift, and I thought I could give you a quick review (https://developer.leapmotion.com/vr).


First thing: it's amazing! I believe this is the first time I experienced real presence :)
Don't expect something perfect: it has a lot of limitations, but it's really promising.


** Interference with positional tracking

Since the Leap uses IR LEDs, I thought it could interfere with the Rift's positional tracking. It turns out to be OK, because the Leap's LEDs flash (like the Oculus LEDs).
Sometimes simultaneous flashes can happen, but thanks to sensor fusion the Rift's positional tracking remains good.
Oculus tracking is *really* solid.


** Hand tracking

Hand tracking is quite good, but you need to keep your hands really close to the HMD. The range is small!
Don't expect to do fine manipulation; it is not robust enough.
The latency is actually really good.

** Video Passthrough

Another benefit is that you see a B&W video feed of your immediate environment.
It feels a lot more secure. No risk of bumping into your desk ;)
Easier to dive in, and easier to return to reality!


** AR

I was really worried about the latency of the video stream. In fact it's quite good for passthrough, but not good enough for AR.
Virtual objects don't move at the same speed as your environment.
However, I believe there is a bug in the current implementation of all the demos.
I'm quite sure we could improve the results by using timewarp correctly: first a timewarp for the video source, then the regular Oculus timewarp.
Has anybody tried this?
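
For reference, the "timewarp for the video source" I have in mind is essentially a rotation-only reprojection of the camera frame. A minimal sketch of the idea (nothing from the existing demos; it assumes we know the camera intrinsics K and the camera-from-world head rotations at capture time and display time, e.g. from the Rift IMU, and it uses OpenCV - all the names are made up):

import cv2
import numpy as np

def rewarp_camera_frame(frame, K, R_capture, R_now):
    # Reproject the passthrough frame as if it had been captured at the current
    # head orientation. R_capture / R_now are camera-from-world rotation matrices
    # (hypothetical inputs, not Leap or Oculus SDK calls).
    R_delta = R_now @ R_capture.T                 # head rotation since capture
    H = K @ R_delta @ np.linalg.inv(K)            # induced image homography
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))

Of course this can only hide the latency caused by head rotation; anything that moves on its own (your hands!) would still lag.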

Unfortunately, the Leap is not a depth camera. You can't obtain a point cloud like with a Kinect.
So it works well as video passthrough, but you can't cull virtual objects hidden by the real world. This can be really disturbing.
I believe we'll need some kind of depth camera in order to use an HMD for AR.
Video passthrough is not enough! :/


Cheers !


Alexandre.

Jan Ciger

Sep 19, 2014, 4:12:17 AM
to vr-g...@googlegroups.com
Hello
 
> Unfortunately, the Leap is not a depth camera. You can't obtain a point cloud like with a Kinect.
> So it works well as video passthrough, but you can't cull virtual objects hidden by the real world. This can be really disturbing.
> I believe we'll need some kind of depth camera in order to use an HMD for AR.
> Video passthrough is not enough! :/

Well, if they allow image access in the SDK now, it is not that hard to generate depth maps, e.g. with OpenCV. The Leap is nothing but a stereo camera rig.
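
For instance, something along these lines should already give a rough depth map from the two IR images (just a sketch - the file names, focal length and baseline are made-up placeholders, not real Leap values, and it assumes the images are rectified):

import cv2
import numpy as np

# Two rectified grayscale frames from the stereo rig (placeholder files).
left  = cv2.imread("leap_left.png",  cv2.IMREAD_GRAYSCALE)
right = cv2.imread("leap_right.png", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoSGBM_create(minDisparity=0,
                                numDisparities=64,   # must be a multiple of 16
                                blockSize=9)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed point -> pixels

focal_px, baseline_m = 400.0, 0.04      # placeholders, not the real Leap specs
with np.errstate(divide="ignore"):
    depth_m = (focal_px * baseline_m) / disparity   # depth = f * B / d; invalid where d <= 0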

On the other hand, the utility of the setup is questionable, because the range of the sensor is atrocious.

> However, I believe there is a bug in the current implementation of all the demos.
> I'm quite sure we could improve the results by using timewarp correctly: first a timewarp for the video source, then the regular Oculus timewarp.
> Has anybody tried this?

That would be hard. Oculus can implement the timewarp feature because they have extremely tight control over the entire chain from the sensor to the display pixels and know exactly how much time each part needs. There is no such information about the Leap cameras, and on top of that you have undetermined latency in the driver and in the software itself.

The way the timewarp in the Oculus SDK works is that they sample the IMU, render the image, and then, just before the frame is sent to the display, they sample the IMU again and distort/warp the rendered image to match. There is no "time travel" there to get the sensor data faster; it only reduces the perceived lag of the rendered image behind the sensor data.

I am not sure how you would implement that with a camera which runs one or two orders of magnitude slower than a 1000 Hz IMU, where getting an image from it, together with the required processing, can take comparable (or even longer) time than rendering a typical frame (a typical optical tracker runs at 30-120 Hz; anything more gets very expensive very fast). And you still have to do the image correction (the "warp" part) once you get the new image data. Warping to account for a bit of head (rendering camera) rotation is relatively easy (you shift the rendered image a little), but you cannot do that for two hands that are moving independently - remember, you must work on the entire image at once.
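
To put some rough numbers on that (purely illustrative figures, not measured Leap or Rift values):

# Back-of-the-envelope data age, ignoring driver and processing latency entirely.
imu_rate_hz    = 1000.0   # typical HMD IMU
camera_rate_hz = 60.0     # typical optical tracker / camera (30-120 Hz range)
render_ms      = 13.0     # assumed render time for one stereo frame at 75 Hz

imu_age_ms    = 1000.0 / imu_rate_hz      # worst case ~1 ms old
camera_age_ms = 1000.0 / camera_rate_hz   # worst case ~16.7 ms old before any processing

print(f"IMU sample age   <= {imu_age_ms:.1f} ms")
print(f"Camera frame age <= {camera_age_ms:.1f} ms (plus exposure, readout, driver, CV)")
print(f"Rendering a frame ~ {render_ms:.1f} ms, i.e. the camera data alone can be "
      f"older than an entire rendered frame")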

Regards,

J.

Jan Ciger

Sep 19, 2014, 11:11:26 AM
to vr-g...@googlegroups.com
Hello,

I have tested the setup with the DK2 and Leap myself - if someone wants to play, the demo is here: http://leapmotion.us6.list-manage.com/track/click?u=b6db0e3e157a48749e1048615&id=5fbad1b97f&e=02d363f3d1

They did a clever hack - extracting the hands of the user and colourising them (the Leap cameras are only grayscale because they are IR!) and using a depth map to allow proper object occlusion in the 3D scene.
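
The occlusion part is conceptually just a per-pixel depth test between the real-world depth map and the virtual scene's depth buffer, something like this toy numpy sketch (all the arrays are synthetic placeholders, not the actual demo code):

import numpy as np

h, w = 480, 640
passthrough = np.zeros((h, w, 3), dtype=np.uint8)         # camera image (black here)
real_depth  = np.full((h, w), 2.0, dtype=np.float32)      # metres to the real world
real_depth[200:300, 250:400] = 0.3                        # a "hand" 30 cm away

virtual_rgb   = np.zeros((h, w, 3), dtype=np.uint8)
virtual_rgb[..., 2] = 255                                 # a blue virtual quad
virtual_depth = np.full((h, w), 0.8, dtype=np.float32)    # quad sits 80 cm away

visible   = virtual_depth < real_depth                    # per-pixel depth test
composite = passthrough.copy()
composite[visible] = virtual_rgb[visible]                 # the "hand" occludes the quad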

Unfortunately, the Leap SDK tracking is very short range - it will refuse to track the hands beyond some 30-40 cm from your face, and the tracking drops out very often. This seems to be a hardwired limitation in the SDK (perhaps they fail to fit the hand model correctly?), because the live video in the Leap testing app shows the hands still clearly visible.

All in all, it is a fun setup, and coupled with some global head tracking (e.g. the DK2 camera or a Hydra/STEM) you could have really immersive hand tracking. What really helps is the low latency of the camera - everyone immediately identifies the hand in front of them as their own. The bad thing is that, unfortunately, the Leap is not really practical for this: it really needs a larger stereo base and a more robust tracker. Probably bolting two fast webcams onto the Rift and doing a bit of image processing in OpenCV would give even better results.

Regards,

J.


Lorne Covington

Sep 19, 2014, 11:55:36 AM
to vr-g...@googlegroups.com

Thanks guys, I had been planning to try it out this week, and I think you two have saved me the trouble. I had tried this a while back and found the interaction area too small to be very useful, but since they had improved the back-angle hand tracking I thought I'd give it another go and see if things were better.

Guess the Intel Senz3D is still the best of what's available, but I was curious to see the hand pass-through with the Leap.

The StructureIO sensor is out now, which looks like it could fill the bill - there's this video, but it has terrible lag on the hands. I had good success using a PrimeSense on a Rift (with MUCH less lag), and the Structure just looks like a much more expensive version of one, so it should be workable. (I've seen one recently, but only doing Skanect, certainly none of the stuff they have in their marketing video!)

Oh well, ever onward!

- Lorne


http://noirflux.com

Dave Buchhofer

Sep 19, 2014, 12:04:53 PM
to vr-g...@googlegroups.com
I may be wrong (as I'm sitting on a plane and can't verify), but the Occipital sensor uses the front-facing camera of the iPad it is attached to for the color, and they are just now announcing a calibration system to register the color with the depth (per a recent announcement).

We did get a short (live, running) demo of the next Leap prototype at a recent meetup, and while they asked us not to share many details as things are still changing, it does have a more standard stereo base of 64 vs. 34 mm(?), giving a much better depth range and a much larger image.

All in all, it seems much more exciting for our use cases, albeit in the future.

Jan Ciger

Sep 19, 2014, 1:17:01 PM
to vr-g...@googlegroups.com

On Fri, Sep 19, 2014 at 6:04 PM, Dave Buchhofer <dbuch...@gmail.com> wrote:
> We did get a short (live, running) demo of the next Leap prototype at a recent meetup, and while they asked us not to share many details as things are still changing, it does have a more standard stereo base of 64 vs. 34 mm(?), giving a much better depth range and a much larger image.


Well, that sounds like something more usable than the current Leap Motion gimmick. On the other hand, they are then going to compete directly with devices like the Kinect, Asus Xtion, the Senz3D, etc. It seems that their product designers are finally getting some engineering input as well, and not only from their marketing and art departments.

On the other hand, if they don't hurry, it is likely that HMDs with built-in cameras (like the Totem that is on Kickstarter now) will snatch this market from them.

J.



Lorne Covington

Sep 19, 2014, 1:42:22 PM
to vr-g...@googlegroups.com

Speaking of which, did anyone ever try out the OVRVision 1?  I didn't get one when it was on Amazon:

    http://www.amazon.com/gp/product/B00IKBDAW6

but it seems they are still kicking and working on the DK2 version, available mid-October.

Cheers!

- Lorne


http://noirflux.com

Jan Ciger

Sep 19, 2014, 3:34:24 PM
to vr-g...@googlegroups.com

On 09/19/2014 07:42 PM, Lorne Covington wrote:
> Speaking of which, did anyone ever try out the OVRVision 1? I didn't get one when it was on Amazon:
>
> http://www.amazon.com/gp/product/B00IKBDAW6
>
> but it seems they are still kicking <http://translate.google.com/translate?hl=en&sl=ja&u=http://ovrvision.com/&prev=/search%3Fq%3Dovrvision%26client%3Dfirefox-a%26hs%3DVHj%26rls%3Dorg.mozilla:en-US:official%26channel%3Dsb> and working on the DK2 version, available mid-October.

Interesting - so it should apparently be released in October.

However, $160 for a pair of 640x480 webcams in a custom housing is a bit steep.

Do you know any specs of the device? What sort of cameras are in there? Are they synchronized (probably, as they deliver a single side-by-side image) and global shutter? If not, it is not going to be useful for tracking.

J.