Converting depth map into an editable 3D point cloud


coredump9

Nov 24, 2010, 5:48:18 AM
to OpenKinect

Hi all,

I would like to share my notes on how I dump Kinect depth maps and
export them into Meshlab, so you can manipulate and visualize the 3D
point cloud. I looked around but did not find a complete processing
pipeline, so I put one together.

http://www.borglabs.com/blog/create-point-clouds-from-kinect

I'm sure others have their own tools/methods. I'd like to exchange
ideas.
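
For the archive, here is the gist of such a pipeline as a minimal
sketch (hypothetical code, not the script from the post; the
intrinsics and the raw-to-meters formula are rough approximations
circulated in the OpenKinect community, not calibrated values):

# depth2ply sketch (hypothetical): convert one raw 11-bit Kinect depth
# frame into an ASCII .ply that Meshlab can open.
import numpy as np

FX, FY = 594.21, 591.04     # assumed depth-camera focal lengths (pixels)
CX, CY = 339.5, 242.7       # assumed principal point

def raw_to_meters(raw):
    # approximate conversion of raw 11-bit readings to depth in meters
    return 1.0 / (raw * -0.0030711016 + 3.3309495161)

def depth_to_ply(depth, path):
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = raw_to_meters(depth.astype(np.float64))
    valid = depth < 2047                      # 2047 means "no reading"
    x = (u - CX) * z / FX                     # pinhole back-projection
    y = (v - CY) * z / FY
    pts = np.column_stack([x[valid], y[valid], z[valid]])
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write("element vertex %d\n" % len(pts))
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        np.savetxt(f, pts, fmt="%.4f")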

rgds,
Ben Bongalon

jonas Biensack

Nov 24, 2010, 6:23:16 AM
to OpenKinect
Wow! When did you steal my thoughts? :)

I came up with something very similar to your "tool chain". Please
excuse my English...
I am new to Meshlab, but from what I've seen so far it looks very good
in terms of flexibility and so on...
Like you, I would also like to first record some depth data (maybe
into one or more .bin files) and then do some exporting / importing
into Meshlab. Meshlab has several nice features like hole-closing
algorithms, point-cloud merging algorithms, etc. The best part: once
you have your data in Meshlab, all tests can be done with it, since
all the algorithms expect the same kind of data...

Keep up the good work. When I am at home, I will test your code and
maybe we can start doing something together...

greetz jones

whatnick

Nov 24, 2010, 8:38:19 AM
to OpenKinect
There was a demo which adds points on the fly to Blender. It takes
advantage of the libfreenect Python bindings and Blender's support for
Python. You can grab the depth frame off the device and create a
corresponding Blender mesh. Instead of redrawing and refreshing it,
you can simply pause the script and save the mesh to PLY or any other
Blender-supported format using the export routines.

The live point preview can also help you identify an area of interest
or put a tableau together before investing in a mesh export.
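
The core of the idea fits in a few lines; a rough sketch against the
Blender 2.4x Python API (not the demo's actual code, and the scale
factors are arbitrary):

import Blender
import freenect

depth, _ = freenect.sync_get_depth()      # 480x640 array of raw depth values
coords = []
for v in range(0, 480, 4):                # subsample so Blender stays responsive
    for u in range(0, 640, 4):
        raw = depth[v, u]
        if raw < 2047:                    # skip "no reading" pixels
            coords.append((u * 0.01, v * 0.01, raw * 0.005))  # arbitrary scale

me = Blender.Mesh.New('kinect_cloud')     # vertices only, no faces
me.verts.extend(coords)
Blender.Scene.GetCurrent().objects.new(me)
Blender.Redraw()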

Cheers,

Tisham.

Nink

Nov 24, 2010, 8:49:31 AM
to openk...@googlegroups.com
Hi Tisham.

I only ever saw the video for the Blender Kinect demo. Do you know if anyone released any code?
Sent from my BlackBerry

Nink

Nov 24, 2010, 8:57:17 AM
to openk...@googlegroups.com
http://www.youtube.com/watch?v=yZSXXFwsyhc
Sent from my BlackBerry

-----Original Message-----
From: Florian Lier <f...@icram.de>
Sender: openk...@googlegroups.com
Date: Wed, 24 Nov 2010 14:51:44
To: <openk...@googlegroups.com>
Reply-To: openk...@googlegroups.com
Subject: Re: Converting depth map into an editable 3D point cloud

Hey all,

Can you (re)post the Blender demo link? I'm
basically implementing the same thing right now :/

Cheers,
Florian

Mikkel Staunsholm

Nov 24, 2010, 9:03:22 AM
to openk...@googlegroups.com
Really nice. Only a "few" steps away from doing this in realtime: http://www.youtube.com/watch?v=5qF_qbaWt3Q

Mkkl.

Florian Lier

Nov 24, 2010, 9:11:02 AM
to openk...@googlegroups.com
This video contains content from WMG. It is not available in your country.  :/

jonas Biensack

Nov 24, 2010, 9:33:43 AM
to OpenKinect
Sounds good. But I want to process the depth data from the Kinect. As
long as my own cluster at home contains only 2 PCs (lol), the
processing power is limited, and real-time processing seems far away.
So I first want to record the depth data to raw, .obj, .bin files or
something else, then get it into a data format that can be read by the
Meshlab software, then play with the data, run calculations, etc.
Something like the sketch below is what I have in mind.
I know and love Blender, but for reconstructing point cloud data...?!
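
A recording step along these lines would do it (a hypothetical sketch
built on the libfreenect Python bindings; the file name and frame
count are arbitrary):

import numpy as np
import freenect

N_FRAMES = 30
with open("depth_frames.bin", "wb") as f:
    for _ in range(N_FRAMES):
        depth, _ts = freenect.sync_get_depth()        # one 480x640 raw frame
        np.asarray(depth, dtype=np.uint16).tofile(f)  # append as flat uint16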

greetz jones

Mikkel Staunsholm

Nov 24, 2010, 9:40:12 AM
to openk...@googlegroups.com
Ah. No biggie. Newest video from Linkin Park: "Waiting for the End".

Mkkl.

Mikkel Staunsholm

Nov 24, 2010, 9:51:32 AM
to openk...@googlegroups.com
agreed. Sorry.

Mkkl.

On 24/11/2010, at 15.45, jonas Biensack <jonas.b...@googlemail.com> wrote:

> ???
>
> very constructive...! Tagged as spam



coredump9

Nov 24, 2010, 11:24:54 AM
to OpenKinect

Yes, I'd appreciate your feedback. I'll be doing more work on the
code: cleanup (right now it's kinda hacky) and command options for
grabbing sequences of images with adjustable intervals, an initial
countdown delay, etc.

For the images I'm planning to add an option to save in PGM and PPM
formats (see http://en.wikipedia.org/wiki/Netpbm_format) so the images
can be quickly viewed using Gimp, Xv or other image viewers.
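
The format is simple enough that no library is needed; a minimal
sketch for saving a 16-bit depth frame as binary PGM (an illustration
of the Netpbm spec, not the planned code):

import numpy as np

def save_pgm(depth, path):
    # P5 = binary PGM; maxval > 255 means two bytes per sample, big-endian
    h, w = depth.shape
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n65535\n" % (w, h))
        depth.astype(">u2").tofile(f)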

rgds,
Ben


Nicolas Burrus

Nov 24, 2010, 11:26:49 AM
to openk...@googlegroups.com
The rgbdemo I released yesterday has point cloud export to .ply files
that can be imported into Meshlab / Blender. It can also export a
triangulated mesh with texture coords, useful to combine with the
color grab and load a UV-textured mesh. It can also grab color images
and depth output to .png files.

Maybe we could merge the code if you're implementing some additional features :)

coredump9

Nov 24, 2010, 11:30:11 AM
to OpenKinect
Nice vids, and thanks Florian for the Blender idea. It could also
prove handy.

Ben


whatnick

Nov 24, 2010, 11:43:13 AM
to OpenKinect
The Blender integration code uses the Python Wrappers and is very
simple indeed:

https://github.com/geckoslayer/blenderkinect/blob/master/

Cheers,

Tisham.

vinot

Nov 24, 2010, 1:29:37 PM
to OpenKinect

Hello Nicolas,

thanks for this release, I'm running it OK. It works very well (I
started to scan a motorbike) but it becomes too slow when using the 3D
viewer. So even though it is very amazing to see the 3D output through
a viewer, I think it would be better to save the .ply files without
losing lots of frames. I say this thinking about the possibility of
adding a prefix ($M$S$MS ?) to every .ply saved and making several
scans at once (many combinations could be useful here), as in the
sketch below. Maybe a command-line option? After all, once calibrated,
you're only interested in the scans, aren't you?
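
The prefix idea could be as simple as stamping each file name (a guess
at the intent; the helper is hypothetical, not rgbdemo code):

import time

def next_ply_name(prefix="scan"):
    # e.g. scan_2937.ply for minute 29, second 37
    return "%s_%s.ply" % (prefix, time.strftime("%M%S"))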

Amazing, this real-time scanning in Blender; I will try to install it...

If you're interested in performance figures or results, I can post them here.

yours,

toni ventura

Robert Walter

Nov 24, 2010, 3:42:21 PM
to openk...@googlegroups.com
hey guys,

did someone already manage to make an accurate calibration of the depth sensor? This includes the intrinsic parameters of the sensor, plus lens distortions, plus depth calibration, and would enable us to calculate the actual metric 3D positions of the measured points in relation to the sensor. It could be used for measuring distances and stuff. Additionally, it could be useful for finding the projection of these points into the RGB image.

so if someone has done an accurate calibration, it would be cool to post all the parameters here.
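
For reference, once those parameters exist, the depth-to-RGB
projection is just a rigid transform followed by the RGB intrinsics; a
sketch with placeholder values (none of these numbers are measured):

import numpy as np

K_rgb = np.array([[525.0,   0.0, 320.0],   # placeholder RGB intrinsics (pixels)
                  [  0.0, 525.0, 240.0],
                  [  0.0,   0.0,   1.0]])
R = np.eye(3)                              # placeholder rotation, depth -> RGB
T = np.array([-0.025, 0.0, 0.0])           # placeholder baseline (meters)

def depth_point_to_rgb_pixel(p):
    # project a metric 3D point (depth-camera frame) to RGB pixel coords
    uvw = np.dot(K_rgb, np.dot(R, p) + T)
    return uvw[0] / uvw[2], uvw[1] / uvw[2]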

cheers
robert


whatnick

Nov 24, 2010, 11:01:16 PM
to OpenKinect
Hi Robert,

The Kinect sensors are not finely machined, i.e. a calibration from
one unit does not apply to all devices. You will have to calibrate
your own. Willow Garage and others have done a fair bit of work on
establishing calibration procedures.

Cheers,

Tisham.

coredump9

Nov 25, 2010, 11:00:12 AM
to OpenKinect

Hi Nicolas,

Your rgbdemo seems impressive. It is much more feature-rich than what
I have been working on. Had I known about it, I might not have built
my own tool chain.

But I tried to install it on Mac OS X (10.5) and ran into compiler
issues. Has anyone been able to get the RGB Demo working on the Mac?

/Ben

ps - HAPPY THANKSGIVING to everyone celebrating in the US!



Sigmac

Nov 25, 2010, 4:50:24 PM
to OpenKinect
I am trying to get your python script working, however, I am getting
this in the terminal:

Traceback (most recent call last):
  File "depth2cloud.py", line 9, in <module>
    from pylab import *
ImportError: No module named pylab

Any ideas? I am running OSX 10.6.5

coredump9

Nov 26, 2010, 12:49:53 PM
to OpenKinect
You will need to install pylab, which ships as part of matplotlib.
Install the matplotlib module and you should be in business. You can
find the instructions here:
http://matplotlib.sourceforge.net/users/installing.html

You can also find more detailed information at
http://www.mtheory.co.uk/support/index.php?title=Installing_Python_-_iPython%2C_Numpy%2C_Scipy_and_Matplotlib_on_OS_X
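
Once it's installed, a quick check from the Python prompt will tell
you pylab is available:

import matplotlib
print(matplotlib.__version__)   # any version string means the install worked
from pylab import plot          # this import should now succeed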

Failing that, feel free to email me and I will help.

rgds,
Ben Bongalon

yair reshef

Nov 26, 2010, 1:23:24 PM
to openk...@googlegroups.com, b...@enablix.com
thanks Ben for the .ply dump.
I see in the screen capture you posted that Meshlab states 307200 points in a frame.
As I understand it, every depth pixel is calculated based on 8 points,
so with this many points (640x480 = 307200) it looks like there is some interpolation happening.
Is it so? Is there a raw raw dump? :)
--
yair99@gmail
050-6301212
tlv, israel

coredump9

Nov 26, 2010, 9:33:57 PM
to OpenKinect
Yair,
It's been discussed on a previous thread that even though the depth
images output by the Kinect are 640x480, the actual spatial resolution
may be lower and some interpolation must be taking place. While the
argument is reasonable, there was no rigorous data to back it up.

Having said that, the ROS team just released new code that provides
raw access to the IR image:
http://www.ros.org/news/2010/11/some-ros-treats-for-thanksgiving.html
and Radu has added it to the libfreenect GitHub:
https://github.com/ros-pkg-git/libfreenect

Thanks Radu!

With this new capability in hand, one can now do a quantitative
analysis on the number of IR dots projected on the image scene, the
diffraction pattern, etc.
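
A rough sketch of what that dot counting could look like (this assumes
the new IR support is exposed through the Python bindings as an 8-bit
video format; the threshold is arbitrary and untested):

import numpy as np
import freenect
from scipy import ndimage

ir, _ = freenect.sync_get_video(format=freenect.VIDEO_IR_8BIT)
bright = np.asarray(ir) > 128            # crude threshold for the speckle dots
_, n_dots = ndimage.label(bright)        # count connected bright blobs
print("approx. %d IR dots visible" % n_dots)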

Plus I think one could make a cool night vision camera. :)

Ben

