multiple Kinect multiplexing issue solved with polarized light filters?


Sean Kean

Nov 12, 2010, 1:33:44 PM
to OpenKinect
just a thought, but perhaps you could have the IR light differently
polarized for each kinect unit, making each one individually
addressable at full frame rate?

in most theatre stereoscopic 3d projection systems this is how it is
done - to my layperson's knowledge - there are two projectors, one for
each eye. the projector displaying the image for the right eye has a
polarizing filter rotated to one angle (linear systems typically use
45 degrees), which corresponds to a "decoding" polarized lens over the
right eye on the 3d glasses you get. the other projector, which
displays the image for the left eye, is rotated 90 degrees from that
(typically 135 degrees), and the plastic lens over the left eye in the
glasses 'decodes' at the same angle. it's two-channel video encoded
and decoded at full bandwidth per channel, so to speak. perhaps you
could add a third channel at yet another rotation... that's somehow
vaguely how it works..

wikipedia explains it better: http://en.wikipedia.org/wiki/Polarized_3D_glasses

so for kinect - put one polarizing lens over the ir transmitter, and
one over the depth camera. for each kinect the two lenses are at the
same angle; across the kinect units in the system, each must have its
lenses rotated to a different angle. then they can all send out the ir
at full speed without multiplexing (?) and each only captures back the
signal it sends individually... polarizing filters do absorb a fair
amount of light, making things dimmer... but the glasses you get in
Jackass 3D for example would probably be a good start - they've tried
to reduce the opacity of them commercially so people would go to the
theatre and have less eye strain.
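to put rough numbers on how much one unit would leak into another (a
back-of-the-envelope sketch using Malus's law, nothing Kinect-specific):

```python
import math

def leak_fraction(tx_angle_deg, rx_angle_deg):
    """Malus's law: the fraction of linearly polarized light emitted
    through a polarizer at tx_angle that passes an analyzer (the
    receiving unit's polarizer) at rx_angle is cos^2 of the angle
    between them."""
    delta = math.radians(tx_angle_deg - rx_angle_deg)
    return math.cos(delta) ** 2

# Two kinects with orthogonal polarizer pairs:
own = leak_fraction(0, 0)      # own channel passes fully (1.0)
other = leak_fraction(0, 90)   # other unit's channel (~0.0)

# A third *linear* channel can't be orthogonal to both existing ones;
# spreading three channels 60 degrees apart leaves ~25% crosstalk each.
third = leak_fraction(0, 60)
```

so linear polarization only buys two clean channels; a third unit would
need something else (circular polarization, a different wavelength, or
time multiplexing).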

just a thought! don't know if any of that actually applies. keep
chugging along folks!

btw, kinect is a time-of-flight camera, right?
http://en.wikipedia.org/wiki/Time-of-flight_camera - strange that it's
not linked on that page.. or is it a structured light camera?

Sean

Murilo Saraiva de Queiroz

Nov 12, 2010, 1:41:07 PM
to openk...@googlegroups.com
On Fri, Nov 12, 2010 at 4:33 PM, Sean Kean <sean....@gmail.com> wrote:
> btw, kinect is a time-of-flight camera, right?
> http://en.wikipedia.org/wiki/Time-of-flight_camera - strange that its
> not linked in on that page.. or is it a structured light camera?

Kinect uses structured light; it's NOT a time-of-flight camera.

It's funny because I wrote this long, didactic article (in Portuguese) explaining that Kinect is a TOF camera, and then Alex Kipman (who's Brazilian, BTW) himself told me I was wrong... It was quite embarrassing. :-)

The original, wrong article: 

Kipman's correction, on Twitter:

My second article, with the right explanation:

Joshua Blake

Nov 12, 2010, 1:44:32 PM
to openk...@googlegroups.com
Excellent idea about the polarized lenses. That would probably work as long as the double pass through polarizers (one on the projector, one on the camera) doesn't make the image too dark to work.

Felix

Nov 12, 2010, 2:38:12 PM
to OpenKinect
The polarization can change when the light reflects off some
materials; that's why special screens that preserve polarization are
used as 3d projection surfaces.
As another idea, we could use different wavelengths of the infrared
spectrum. Assuming the kinect emits a broad spectrum of ir light, four
ir bandpass filters could maybe be used with two kinects, covering
each projector and ir camera. But accurate filters are expensive and
absorb a lot of light.

Hector Martin

Nov 12, 2010, 3:08:38 PM
to openk...@googlegroups.com
The Kinect uses an IR laser sent through a diffraction grating to
produce the IR constellation, so chances are the IR light is almost
entirely monochromatic. If I had to take a guess, it's probably 808nm.


--
Hector Martin (hec...@marcansoft.com)
Public Key: http://www.marcansoft.com/marcan.asc

David Sweeney

Nov 12, 2010, 3:25:24 PM
to openk...@googlegroups.com
But you can see them!

Nink

Nov 12, 2010, 3:29:58 PM
to openk...@googlegroups.com
Is interference from 2 kinects that big a problem? Since the kinect has the ability to align itself through the servo and accelerometer, could a calibration be carried out automatically between the two kinects to ensure we reduce any interference, or are at least aware of it? Turn one kinect's IR projector on, calibrate, turn it off, then turn on the other kinect's IR projector and calibrate. Turn them both on, account for the deltas, and now align to reduce interference.

So if you have 2 kinects 90 degrees from each other and place an object between them, then compare what both kinects could see before and after the object was placed, could we still calculate what is actually there?
Sent from my BlackBerry
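For what it's worth, that on/calibrate/off sequence could be sketched roughly like this. To be clear, set_projector() and grab_depth() are hypothetical stand-ins, not real driver calls - nothing in the current code can toggle the projector yet, as far as anyone knows:

```python
def find_interference(dev_a, dev_b, set_projector, grab_depth):
    """Time-multiplexed baseline capture for two Kinects, following the
    on/calibrate/off sequence described above. set_projector(dev, on)
    and grab_depth(dev) are hypothetical driver hooks; grab_depth
    returns a flat list of depth values."""
    # Step 1: only A's projector on, so A sees a clean scene.
    set_projector(dev_a, True)
    set_projector(dev_b, False)
    clean_a = grab_depth(dev_a)

    # Step 2: only B's projector on.
    set_projector(dev_a, False)
    set_projector(dev_b, True)
    clean_b = grab_depth(dev_b)

    # Step 3: both projectors on; any pixel that changed versus its
    # clean baseline is suspected interference from the other unit.
    set_projector(dev_a, True)
    noisy_a = grab_depth(dev_a)
    noisy_b = grab_depth(dev_b)

    bad_a = [i for i, (c, n) in enumerate(zip(clean_a, noisy_a)) if c != n]
    bad_b = [i for i, (c, n) in enumerate(zip(clean_b, noisy_b)) if c != n]
    return bad_a, bad_b
```

The deltas (bad_a, bad_b) would tell you which pixels to distrust once both units run simultaneously.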

Hector Martin

Nov 12, 2010, 3:49:38 PM
to openk...@googlegroups.com
Sufficiently bright near-IR looks deep red to the eye ;)

Adam Sheesley

Nov 12, 2010, 4:53:53 PM
to openk...@googlegroups.com
I think to tackle this problem I'm going to have to get two kinects as soon as possible and start experimenting with what works and what's possible.

The use case I'm rolling around in my head is to have two, three, maybe four (crazy, I know) kinects set up at different vantage points, then combine their depth information in software to get a real-time, decent-accuracy 3D scanner.  For brain-splosion, add pico projectors.
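The merging step is basically: get each unit's pose (extrinsics) from calibration, transform every depth-derived point cloud into one common frame, and concatenate. A toy sketch with a made-up yaw-only pose - the numbers are illustrative, not from any real rig:

```python
import math

def to_common_frame(points, yaw_deg, translation):
    """Rotate a point cloud about the vertical (y) axis by yaw_deg and
    translate, expressing a second Kinect's points in the first one's
    coordinate frame. The extrinsics would come from calibration."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    tx, ty, tz = translation
    return [(c * x + s * z + tx, y + ty, -s * x + c * z + tz)
            for (x, y, z) in points]

# Toy example: a second Kinect rotated -90 degrees about vertical,
# sharing the first unit's origin (a real rig would have a translation too).
cloud_a = [(0.0, 0.0, 1.0)]   # point seen by Kinect A, in A's frame
cloud_b = [(1.0, 0.0, 0.0)]   # the same point as seen by Kinect B
merged = cloud_a + to_common_frame(cloud_b, -90.0, (0.0, 0.0, 0.0))
```

With real data you'd also want to drop duplicate points and deal with the occlusion shadows each unit leaves.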

Taha

Nov 12, 2010, 5:04:00 PM
to OpenKinect
There are 2 ways I see stitching 2 kinects together might work. 1)
Offset the frame capture so the kinects are out of sync; this would
also mean that each IR projector pulses in step with its own frames.
This isn't ideal, since doing this we would effectively cut the frame
rate by 50%. 2) Disable the IR projector on one of the devices and
sync all 4 cameras together using an external trigger pulse. The issue
that may arise with this method is syncing the cameras themselves,
which depends on the camera shutter; if either of the cameras uses a
rolling shutter we may see issues there. Nevertheless the 2nd method
is probably what we should be aiming for, IMO, since it's far more
robust.
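The frame-rate cost of the first method is easy to see with a toy round-robin schedule (my own sketch, nothing Kinect-specific):

```python
def projector_schedule(n_units, n_frames):
    """Time-division multiplexing: exactly one unit's IR projector
    (and depth capture) is active per global frame tick, round-robin."""
    return [frame % n_units for frame in range(n_frames)]

def per_unit_fps(native_fps, n_units):
    """Each unit only gets depth frames during its own slots."""
    return native_fps / n_units

schedule = projector_schedule(2, 6)  # [0, 1, 0, 1, 0, 1]
fps = per_unit_fps(30, 2)            # 15.0 per unit at the native 30 fps
```

With three or four units this drops to 10 or 7.5 fps per unit, which is why the external-trigger approach looks more attractive.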

Adam Sheesley

Nov 12, 2010, 5:19:13 PM
to openk...@googlegroups.com
You make a good point about the reduced frame rate.  An obstacle that comes to mind in the case of using only one IR source is the occlusion caused by objects in the path of the IR dot matrix.  For example, the back surface of an object <http://www.youtube.com/watch?v=nvvQJxgykcU#t=0m36s> has no dot matrix on it, so the IR camera of an offset kinect would have no dot matrix to read.

This brings up another question: does anyone know if the dot matrix is a fixed pattern, or can it be changed?

William Cox

Nov 12, 2010, 5:26:22 PM
to openk...@googlegroups.com
If the Kinect is actually using a laser source instead of an LED, then the output will already be quite polarized - unless the projection medium destroys this.
A simple way of testing: view the dot projection and place a sheet polarizer in front of your lens. Rotate the polarizer. If the intensity varies as you rotate, then the light is (linearly) polarized. Otherwise it's a) circularly polarized or b) non-polarized.

If it turns out to be polarized, then try placing a sheet polarizer in front of the camera. If the unit still works, then there's a good chance we could use two units that are cross-polarized.

It'd be really useful if we could determine the exact wavelength of the light used. Anyone have a spectrometer? One clue: does the IR projection show up on a normal CCD or CMOS camera sensor?

Thanks.
-William

Taha

Nov 12, 2010, 6:34:41 PM
to OpenKinect
@Adam
I believe the dot matrix comes from a diffraction grating, and I have
no idea if it can be changed. It would depend on how the diffraction
grating is attached to the emitter - whether it actually sits in front
of it as a lens of sorts, or is coated onto the emitter itself.
Someone would have to open the projector module to find that out.

@William Yes, the projection can be viewed by any camera that's
sensitive to IR or has had its IR filter removed.



Kihnect

Nov 13, 2010, 1:18:19 AM
to OpenKinect
On Nov 12, 12:29 pm, "Nink" <nink...@gmail.com> wrote:
> Is interference from 2 kinects that big a problem?

My understanding from past discussions with Microsoft engineers is
that they were already using multiple Kinects for some of their
projects, and they didn't make it sound like there was a lot of extra
effort involved.

Kihnect

Nov 13, 2010, 1:26:21 AM
to OpenKinect

> This brings up another question: does anyone know if the dot matrix
> is a fixed pattern, or can it be changed?

Just looking at the projection portion and the patents, I'm guessing
it's some form of fixed holographic diffraction pattern in front of an
infrared laser. Based on the wiring in one of the teardowns, I didn't
see enough for any actual data beyond some basic switching, so I'm
pretty sure it's a fixed/preprinted style pattern.