hardware hacking...


Mike Harrison

Nov 18, 2010, 8:52:58 AM
to openk...@googlegroups.com
Just got a Kinect with the intention of ripping it apart and investigating the details of the
hardware, in particular the illuminator and depth sensor. I have extensive experience in hardware
reverse engineering, so I hope to be able to get some useful information!

I know zero Linux and would rather not spend ages getting it to turn on - I'd be grateful if
someone could point me to the quickest and easiest way to get a display of depth data on a
Windows PC.

Christopher Jowett

Nov 18, 2010, 8:59:00 AM
to openk...@googlegroups.com
In the Git repository there are Windows drivers and example code...
we are doing all of this cross-platform for the big 3 OSes (Linux, Mac
OS X and Windows).

https://github.com/openkinect/libfreenect

after you git clone it, take a look in the platform folder.

Zsolt Ero

Nov 18, 2010, 9:14:56 AM
to openk...@googlegroups.com
Instead of ripping it apart why don't you just read this at iFixit?

Mike Harrison

Nov 18, 2010, 9:23:07 AM
to openk...@googlegroups.com
On Thu, 18 Nov 2010 14:14:56 +0000, you wrote:

>Instead of ripping it apart why don't you just read this at iFixit?
>http://www.ifixit.com/Teardown/Microsoft-Kinect-Teardown/4066/1
>

><http://www.ifixit.com/Teardown/Microsoft-Kinect-Teardown/4066/1>
>

Because I can't connect a scope to their pictures, or measure the optical output of the illuminator.

And I like taking things apart..

>In the Git repository, there are windows drivers and example code...
>we are doing all of this cross platform for the big 3 OS's (Linux, Mac
>OSX and Windows)

>https://github.com/openkinect/libfreenect

>after you git clone it, take a look in the platform folder.

Is there a newbie guide to using GitHub anywhere? I can't see any obvious documentation at
github.com.

I was kinda hoping someone had some precompiled USB driver / .exe demos available as a
simple download by now...!

Nink

Nov 18, 2010, 9:26:20 AM
to openk...@googlegroups.com
>And I like taking things apart..

That sounds like a good enough reason to me. iFixit = dream job.
Sent from my BlackBerry

johncalgary

Nov 18, 2010, 9:34:57 AM
to OpenKinect
easiest one just to get a display on windows is here:
http://groups.google.com/group/openkinect/browse_thread/thread/ab150d174e105794#

William Cox

Nov 18, 2010, 9:40:20 AM
to openk...@googlegroups.com
Mike,
Glad to hear there's some hardware hackers here.
Things I want to know:
1) laser wavelength
2) optical power
3) whether it's pulsed or continuous
4) output polarization

We'll start there :)
-William

Mike Harrison

Nov 18, 2010, 9:44:07 AM
to openk...@googlegroups.com
On Thu, 18 Nov 2010 09:40:20 -0500, you wrote:

>Mike,
>Glad to hear there's some hardware hackers here.
>Things I want to know:
>1) laser wavelength
>2) optical power
>3) whether it's pulsed or continuous
>4) output polarization
>
>We'll start there :)
>-William

Me too, as soon as I can get it fired up, which is why I'm looking for a quick-start to get the
thing running without learning a ton of software stuff first!!

I have a monochromator to determine wavelength and a laser power meter, and should be able to determine
the IR sensor resolution by looking at the data from the sensors.

Mike Harrison

Nov 18, 2010, 1:10:08 PM
to openk...@googlegroups.com
On Thu, 18 Nov 2010 06:34:57 -0800 (PST), you wrote:

>easiest one just to get a display on windows is here:
>http://groups.google.com/group/openkinect/browse_thread/thread/ab150d174e105794#

Thanks - that's exactly what I was after!


A few findings :

Wavelength is 830nm, no modulation. Camera probably has a narrowband filter, as it seems pretty
immune to light from other sources, e.g. blasting a 950nm IR remote directly at it.

Power measures about 60mW at the aperture. It's likely the patterning element introduces significant
loss, so raw laser power could easily be in the 100-200mW range - certainly an eye hazard if the
diffuser is removed!

The peltier element behind the laser is NOT for cooling, but for temperature stabilisation - this is
not uncommon in lasers, to stabilise the wavelength. I suspect there may be some optical feedback to
stabilise the power level.
Measuring the current into the peltier, it stabilises at around 120mA at room temperature. Spraying
freezer spray on the backplate causes this to increase to over 500mA. Applying heat to the backplate,
the current decreases through zero, changes polarity, then increases.
At room temperature, the peltier is actually heating the laser, not cooling it.
There must be something important about a stable wavelength to justify the cost of active control -
either the pattern generator is wavelength-sensitive, or it's to keep the output within the camera
filter's passband.

There's a thermal fuse in contact with the illuminator housing - clearly a safety-critical
protection device as it's wired in series with the 12v supply.
I wonder if this is to protect against a damaged optical assembly causing the laser to hit the
housing wall and start melting it...

Or it could be to do with the peltier power if the sensing element fails - or both.

Only the IR cam has a shield over it, and it also has a ferrite-loaded sleeve on the flex - my guess is
this is to minimise noise pickup on the camera's analogue section.

Based on sync and pixel data waveforms, the distance camera appears to be 1200x960.
Pixel clock 48MHz; frame period 33.33ms, of which 31ms active; line period 31.8µs, of which about
25.2µs active.
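As a sanity check, those timing figures hang together - a quick bit of arithmetic on the numbers quoted above:

```python
pixel_clock = 48e6        # Hz
line_period = 31.8e-6     # s, of which ~25.2 µs is active
active_line = 25.2e-6     # s
frame_period = 33.33e-3   # s, of which ~31 ms is active
active_frame = 31e-3      # s

pixels_per_line = active_line * pixel_clock    # ~1210, matching ~1200 wide
lines_per_frame = active_frame / line_period   # ~975, matching ~960 tall
frame_rate = 1 / frame_period                  # ~30 fps

print(round(pixels_per_line), round(lines_per_frame), round(frame_rate))
# -> 1210 975 30
```

The extra ~10 pixels per line and ~15 lines per frame would plausibly be blanking or dark-reference overhead.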


Florian Echtler

Nov 18, 2010, 1:44:13 PM
to openk...@googlegroups.com
Awesome. I bow to your oscilloscope skills. :-)
I'm surprised that the laser isn't pulsed - I would have expected that.
Did anyone happen to try a Kinect in direct sunlight so far? I doubt
that an unpulsed laser would be able to cope with that...

Florian

Arthur Wolf

Nov 18, 2010, 6:11:10 PM
to OpenKinect

>
> Wavelength is 830nm, no modulation. Camera probably has a narrowband filter, as it seems pretty
> immune to light from other sources, e.g. blasting a 950nm IR remote directly at it.
>
>

Do you think that means it can be used in water ?
( http://groups.google.com/group/openkinect/browse_thread/thread/3321fd34fc051ce0/f173de32103bcae6?lnk=gst&q=under+water#f173de32103bcae6
and http://en.wikipedia.org/wiki/File:Water_absorption_spectrum.png )

Mike Harrison

Nov 18, 2010, 6:20:33 PM
to openk...@googlegroups.com

Probably, however if there's an air gap between the water and camera/illuminator lenses, the
refractive index at the boundaries may mess things up.
You may need to use something like a clear silicone compound to fill the gap to avoid this, making
an air-free path between the lens surfaces and the water.

Mathieu Bosi

Nov 18, 2010, 6:32:38 PM
to openk...@googlegroups.com
Hi,
some interesting links appear in google with the following keywords:

 "near infrared" water

best

Mathieu


Nink

Nov 18, 2010, 6:47:02 PM
to openk...@googlegroups.com

>Do you think that means it can be used in water ?

Why don't you test it and find out!

http://www.wikihow.com/Make-Your-Own-Vacuum-Sealed-Storage-Bags
Sent from my BlackBerry

-----Original Message-----
From: Arthur Wolf <wolf....@gmail.com>
Sender: openk...@googlegroups.com
Date: Thu, 18 Nov 2010 15:11:10
To: OpenKinect<openk...@googlegroups.com>
Reply-To: openk...@googlegroups.com

vinot

Nov 18, 2010, 7:06:32 PM
to OpenKinect
Wow, no modulation - no TOF!!!

Florian, you are right. So it means the cameras and dot pattern must be
calibrated?? They did a good job.

> Wavelength is 830nm, no modulation. Camera probably has a narrowband filter, as it seems pretty
> immune to light from other sources, e.g. blasting a 950nm IR remote directly at it.

interesting, this weekend I'll do some tests outside in sunlight

> Power measures about 60mW at the aperture. It's likely the patterning element introduces significant
> loss, so raw laser power could easily be in the 100-200mW range - certainly  an eye hazard if the
> diffuser is removed!

sure, and strange because there's no warning label on the outside! I guessed it was
powerful, but not in this range.

> Based on sync and pixel data waveforms, the distance camera appears to be 1200x960
> Pixel clock 48MHz, Frame period 33.33ms of which 31ms active, line period 31.8uS of which about
> 25.2uS active

sorry, what exactly does that mean? 1200x960 pixel resolution at 30
fps? That is a very good sensor - enough to build a very good DIY motion
capture system....

Toni

Mike Harrison

Nov 18, 2010, 7:09:46 PM
to openk...@googlegroups.com

A thought on multiple kinects - as the illuminator is not modulated, it ought to be possible to
sequentially switch multiple illuminators and select the corresponding frames from the appropriate
camera.

You'd need to sync the cameras together, and switch the illuminators accordingly, but this shouldn't
be too hard, by using a phase-locked loop on the 12MHz crystal, and the frame sync signal from the
depth camera sensor.

This assumes there is no frame-to-frame processing going on.
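The frame-selection half of that scheme is simple in software - a toy sketch, assuming N synchronized streams with illuminators strobed round-robin (an illustration of the idea, not anything the Kinect implements):

```python
def frames_for_camera(cam, n_cams, frames):
    # With illuminators strobed round-robin, camera `cam` keeps only the
    # frames captured while its own illuminator was lit.
    return [f for i, f in enumerate(frames) if i % n_cams == cam]

frames = list(range(12))                  # stand-in for 12 synchronized frame slots
print(frames_for_camera(0, 2, frames))    # -> [0, 2, 4, 6, 8, 10]
print(frames_for_camera(1, 2, frames))    # -> [1, 3, 5, 7, 9, 11]
# Each camera's effective depth rate drops to 30 / n_cams fps.
```

The hard part is entirely in the hardware sync; once the frames are interleaved deterministically, demultiplexing them is a one-liner.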

Incidentally I did see the camera module receiving 2 bytes of data via (presumably) I2C every frame,
but didn't notice any differences in the data value - it's conceivable this may be something like
gain control for auto-exposure.

Another thing I noticed was that the relative positions of the illuminator and sensor are very sensitive
to change, which may explain the metal frame - even a slight bend of the frame makes a noticeable
difference to the depth image.

Mike Harrison

Nov 18, 2010, 7:21:21 PM
to openk...@googlegroups.com

>> Power measures about 60mW at the aperture. It's likely the patterning element introduces significant
>> loss, so raw laser power could easily be in the 100-200mW range - certainly  an eye hazard if the
>> diffuser is removed!
>
>sure, and strange because no advertising label outside! I guess it was
>powered, but not as this range.

It's a Class 1 device (mostly harmless) because the power is spread out over a wide area.
A gazillion-watt laser product is still Class 1 if the beam can't get out of the box.
I suspect it should really be Class 1M (unsafe with optical aids), as it may be possible to refocus
the pattern into a small area.

>> Based on sync and pixel data waveforms, the distance camera appears to be 1200x960
>> Pixel clock 48MHz, Frame period 33.33ms of which 31ms active, line period 31.8uS of which about
>> 25.2uS active
>
>sorry, what does it exactly means? 1200x960 pixel resolution at 30
>fps?

Yes.

But the depth resolution is less as it needs to resolve the dot pattern - depth image resolution is
widely quoted as 640x480, i.e. about 1.5 image pixels per depth pixel, which seems plausible.
For images comprising known-size dots, it could conceivably be doing centroid fitting to get
sub-pixel resolution on dot position.
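Centroid fitting for sub-pixel dot position is a standard trick - a minimal sketch of the general technique (an illustration only, not PrimeSense's actual algorithm):

```python
import numpy as np

def dot_centroid(patch):
    """Intensity-weighted centroid of an image patch, in (x, y) pixels."""
    patch = np.asarray(patch, dtype=float)
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    total = patch.sum()
    return (xs * patch).sum() / total, (ys * patch).sum() / total

# A dot whose true centre sits halfway between pixel columns 1 and 2:
patch = [[0, 1, 1, 0],
         [0, 3, 3, 0],
         [0, 1, 1, 0]]
x, y = dot_centroid(patch)
print(x, y)   # -> 1.5 1.0 : sub-pixel position from whole-pixel samples
```

Because the dots have a known size and roughly known shape, a fit like this can localise them considerably better than one pixel, which is consistent with a 1200x960 sensor feeding a 640x480 depth map.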

Shaun Husain

Nov 18, 2010, 7:50:43 PM
to openk...@googlegroups.com
Think the only way you'll be able to get someone to pre-compile something for you at this point is by giving a full description of your setup, so someone can compile the code on a similar system and send it off to ya. If you have an extra PC sitting around (even a piece of crap), Ubuntu 10.10 is very, very easy to get running (depending on internet connection it can be up and running in about an hour, including downloading the ISO and burning). From that point I could walk you through a simple step-by-step to download the files you need and get it running. Basically this is it:

Open a terminal in Ubuntu - you can find this under the Applications menu in the top-left panel, under Accessories -> Terminal.

Type or copy this to get the dependencies for building:

sudo apt-get install build-essential freeglut3-dev libusb-1.0-dev cmake

Put in your password when prompted and hit Y when it asks if you'd like to install these packages.

Download the tar.gz from the git site. That link will get you the tarball (like a zip in Windows terms; Ubuntu has an app installed to open it) of the source as it is - no need for git if you're not trying to stay on the latest software or contribute code. Then extract that folder somewhere - drag and drop from the archive manager is easiest - to, say, your home directory. Go back to the terminal and go into the OpenKinect folder you just dropped (cd is change directory; ls lists a directory's contents). Navigate into there in the terminal, then type:

mkdir build
cd build
cmake ..
make
sudo ./glview

This will work and isn't all that difficult. If you run into issues you can let me know; otherwise I would suggest posting details about the system you're attempting to hook it to, including OS version and hardware (particularly whether you're 32- or 64-bit).

Shaun

Kihnect

Nov 18, 2010, 8:47:28 PM
to OpenKinect
While you're poking around, the PrimeSense patent says there should be
a JTAG interface in there somewhere. It would be interesting to know
what information may be available from that.

Dan Small

Nov 18, 2010, 11:23:23 PM
to openk...@googlegroups.com
Florian,

I have tried the system in direct sunlight coming in through a window with really bad results.  The IR in the sunlight kills the signal to noise ratio.  It is basically black (undefined depth values).  It would be very cool to try it at night outside though. I think the range in a non-sunlit area goes out to about 7m!  Also, it seems to do fine with glass in front.

--Dan

Arthur Wolf

Nov 19, 2010, 3:29:46 AM
to OpenKinect

On Nov 19, 12:47 am, "Nink" <nink...@gmail.com> wrote:
> >Do you think that means it can be used in water ?
>
> Why don't you test it and find out!  

Because I don't have my kinect yet.
I'm also waiting for an old digital camera to modify for seeing
infrared.

Anyone with an aquarium and a kinect around here ? :)

About the diffraction, my instinct tells me it may still give usable
data even if it's transformed a bit.
Has anyone tried putting lenses in front of the IR projector to see if
they can change the range of the Kinect (see closer / see further)?


> http://www.wikihow.com/Make-Your-Own-Vacuum-Sealed-Storage-Bags
> Sent from my BlackBerry
>
> -----Original Message-----
> From: Arthur Wolf <wolf.art...@gmail.com>
> Sender: openk...@googlegroups.com
> Date: Thu, 18 Nov 2010 15:11:10
> To: OpenKinect <openk...@googlegroups.com>
> Reply-To: openk...@googlegroups.com
> Subject: Re: IR camera resolution, illuminator info (was Re: hardware hacking...)
>
> > Wavelength is 830nm, no modulation. Camera probably has a narrowband filter, as it seems pretty
> > immune to light from other sources, e.g. blasting a 950nm IR remote directly at it.
>
> Do you think that means it can be used in water ?
> ( http://groups.google.com/group/openkinect/browse_thread/thread/3321fd...
> and http://en.wikipedia.org/wiki/File:Water_absorption_spectrum.png )

Chriss

Nov 19, 2010, 4:20:15 AM
to OpenKinect
On 19 Nov., 05:23, Dan Small <dansmal...@gmail.com> wrote:
> Florian,
>
>... Also, it seems to do fine with glass in front.
>
> --Dan

Thanks for the info. This is great. I was wondering if it is possible
to use the Kinect under a glass table to rebuild a system like the MS
Surface.
As soon as I have the Kinect I'll try it with libtisch.

Mike Harrison

Nov 19, 2010, 9:56:03 AM
to openk...@googlegroups.com

Nothing obvious - there is an unpopulated FFC connector, but it aligns with a hole on the faceplate,
so it's probably for an optional feature.

There are tons of testpoints though, so JTAG is probably there somewhere - you'd need to get the
pinout of the Primesense chip to find it though.

Joshua Blake

Nov 19, 2010, 9:59:58 AM
to openk...@googlegroups.com

What would the JTAG let us do?


Zsolt Ero

Nov 19, 2010, 12:09:11 PM
to openk...@googlegroups.com
The problem with this is that we don't know exactly what calibration routine
the PrimeSense chip runs at every startup,
or how long it takes. Maybe if it sees a blank picture, and then again
some real data, it triggers a calibration, and we have no idea
how much time is needed to finish that. Even if it takes, say, just 0.1
seconds, then you couldn't do more than 5 fps with two cameras.
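The arithmetic behind that 5 fps figure, assuming each illuminator hand-off costs a full 0.1 s recalibration:

```python
recal = 0.1             # s - assumed recalibration time after each hand-off
n_cams = 2

# A full cycle lights each illuminator once, paying the recalibration
# penalty at every hand-off; the ~33 ms capture time is small beside it.
cycle = n_cams * recal  # 0.2 s per full cycle
rate = 1 / cycle        # full depth updates per second, per camera
print(rate)             # -> 5.0
```

So even a modest recalibration delay dominates the multi-Kinect frame budget, which is why measuring it matters.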

Can someone measure how long it takes for the PrimeSense chip
to start sending point cloud data after the IR source is turned off and
then on again? Maybe it doesn't trigger a calibration.

Shaun Husain

Nov 19, 2010, 2:19:04 PM
to openk...@googlegroups.com
Arthur,

I tried throwing one under my multi-touch table to see if it got any depth beyond the surface; unfortunately, in my case it doesn't. I do have a stretched piece of rear-projection material below the acrylic on my setup, so that's likely reflecting the IR - although the IR my table puts through the acrylic seems to get through to a modified MS HD cam without much trouble, so I'm not sure what the deal is there.

Kihnect

Nov 20, 2010, 12:47:29 AM
to OpenKinect

> There are tons of testpoints though, so JTAG is probably there somewhere - you'd need to get the
> pinout of the Primesense chip to find it though.

This looked like it might make the process easier:

http://elinux.org/JTAG_Finder

> What would the JTAG let us do?

It's hard to tell without finding it and doing some reverse
engineering on it, but it could provide useful debug information on
things like the calibration process and might be useful for more
direct robotic interfacing projects.

Kihnect

Nov 20, 2010, 12:51:47 AM
to OpenKinect
On Nov 19, 11:19 am, Shaun Husain <shaun.hus...@gmail.com> wrote:
> Arthur,
>
> I tried throwing one under my multi-touch table to see if it got any depth
> beyond the surface, unfortunately in my case it doesn't, I do have a
> stretched piece of rear projection material below the acrylic on my setup so
> that's likely reflecting the IR, although the IR my table puts through the
> acrylic seems to get through to a modified MS HD cam without much trouble,
> so not sure what the deal is there.

Kinect performs poorly with reflective surfaces. Try holding a
mirror in front of one.

Adam Crow

Nov 20, 2010, 1:19:33 AM
to openk...@googlegroups.com
Perhaps you can somehow polarise the rear-projection material and
filter it out on the Kinect?
I'm unsure how this could be done.
Sunlight reflecting off water is polarised. Fishermen wear
polarised glasses to be able to peer into the water and overcome the
glare from the Sun.

--
Adam Crow BEng (hons) MEngSc MIEEE
Technical Director
DC123 Pty Ltd
Suite 10, Level 2
13 Corporate Drive
HEATHERTON VIC 3202
http://dc123.com
phone: 1300 88 04 07
fax: +61 3  9923 6590
int: +61 3 8689 9798

Mike Harrison

Nov 20, 2010, 6:47:34 AM
to openk...@googlegroups.com
Something that has niggled at me since first opening this thing is how few connections there are on
the connector that links the front board to the rear one. There just don't seem to be enough pins for
there to be a sufficiently high-bandwidth link from the sensor board to the Marvell chip for it to have
much to do with the sensing.

So I started wondering what that big Marvell chip is actually for...

Turns out it's for the audio, and has nothing to do with the depth sensor system.

I followed its connection to the USB hub, and shorted the data lines together.
The cameras still work - the only difference is the "NUI Audio" device doesn't appear.
There are tracks connecting it to the audio ADC chips.

Now what are they doing with that much processing power just for audio?
Can't find the exact part on the Marvell site, but looking at their range it looks like one of their
Armada series, with a clock of the order of 400MHz.
There is a SPI flash memory attached, which I suspect contains a bootloader to allow the Xbox to
load code into the SDRAM.

Do any of the current Xbox games do cool stuff with audio?

I suspect this may also account for the discrepancy between the power draw stated on the label and
that actually measured when connected to a PC - the Marvell chip isn't actually doing anything!
Running on a PC, the heatsink certainly doesn't get hot enough to justify its presence.


Joshua Blake

Nov 20, 2010, 9:57:08 AM
to openk...@googlegroups.com

Eventually with a custom firmware or loaded instructions we could make the Marvell do our bidding. If there are hidden hooks in the USB protocol, we could offload computations to it.


Mike Harrison

Nov 20, 2010, 10:49:32 AM
to openk...@googlegroups.com
I've been tinkering around the connector between the sensor and rear boards...
http://electricstuff.co.uk/kinhac1.jpg

I now have it running with only power and USB pins connected.
This means that with an external supply (+5V, +3.3V and +1.8V), the image sensor board should be
usable on its own, which would reduce power draw and size significantly for those wanting to use the
sensor on small/mobile platforms.

Unfortunately, the overall power draw is still too high to run from a single USB port, due to the
initial power draw of the peltier. Looks like the figures in the Primesense ref design didn't
include this.... There might be some ways round this.

I've updated the hardware page with pinouts etc.
http://openkinect.org/wiki/Hardware_info

Unfortunately I can't currently find the connector in stock anywhere..

One suggestion arising from this for those doing drivers and other software: don't assume that
the USB hub or audio or motor devices are present, and don't throw an error if they aren't
found.

Mike Harrison

Nov 20, 2010, 11:34:56 AM
to openk...@googlegroups.com
I have now confirmed that connecting the USB pins on the camera board directly to a PC USB port
works:
http://electricstuff.co.uk/kinhac3.jpg

The only thing coming from the rear PCB is power.

Nink

Nov 20, 2010, 12:51:02 PM
to openk...@googlegroups.com
The audio appears to be very sophisticated and has background noise cancellation built in. You can sit at the other end of the room and issue voice commands to the console, such as "Xbox, Kinect, play disc". I suspect the filtering out of the background noise must be happening on the Marvell chip, as I can't imagine they are doing the voice recognition on-chip (or are they?).

I have not seen any games using high-end audio, and voice rec is only available in US and Canada English.
Sent from my BlackBerry


Jesús Torres

Nov 20, 2010, 8:00:44 PM
to openk...@googlegroups.com
Interesting detail. The system can be slimmed down.

Regards.

Diarmuid Wrenne

Nov 20, 2010, 9:13:02 PM
to OpenKinect
Hi Mike,

I got it working on Windows with very little drama based on this
link:

http://openkinect.org/wiki/Getting_Started_Windows

The INF files for the 3 USB devices are included.

Good Luck

Diarmuid

cloverstreet

Feb 23, 2011, 1:13:33 AM
to openk...@googlegroups.com
I went to an Audio Engineering Society presentation at Microsoft Research Labs, in which the head dude of the audio processing stuff gave a presentation on the microphone array algorithms. It was way over my head, but basically there is a whole lot of untapped potential there. The system has way better background-noise subtraction, but is also able to detect where the sound is coming from. I played an actual Kinect game for the first time today, and when you win a trophy you can record you and your fellow player. The avatars' lips moving matched the person who was making sound.

The four-channel mic array would be very fun to play with, but it is hard to come up with many creative uses. I keep imagining some sort of Dune-like voice-weapon/projectile game, but I don't think the tech is there yet.
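Locating a sound source with a mic array classically starts from inter-microphone time delays - a toy sketch of delay estimation via cross-correlation (an illustration of the general idea, not Microsoft's actual algorithm):

```python
import numpy as np

def estimate_delay(sig_a, sig_b):
    """Integer-sample delay of sig_b relative to sig_a via cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    return int(np.argmax(corr)) - (len(sig_a) - 1)

rng = np.random.default_rng(0)
src = rng.standard_normal(256)
mic_a = src
mic_b = np.roll(src, 3)        # source arrives 3 samples later at mic B
print(estimate_delay(mic_a, mic_b))   # -> 3
```

With the mic spacing and sample rate known, each pairwise delay constrains the bearing; four mics give enough pairs to localise the speaker, which is presumably part of what that Marvell chip is doing.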

Christopher Overstreet