Announcing OpenNI


Joshua Blake

Dec 8, 2010, 12:18:14 PM
to OpenKinect
Hello OpenKinect!
 
A few weeks ago, I was contacted by PrimeSense co-founder Tamir Berliner, who wanted to discuss their plans and ways PrimeSense could engage with the community. (Microsoft licensed PrimeSense's technology and based Kinect on the PrimeSense reference design.) After seeing the dozens of Kinect videos and the amazing enthusiasm from our community, PrimeSense decided to move up their plans: today they announced the OpenNI initiative and open sourced their middleware for natural interaction and the drivers for their depth cameras.
 
OpenNI is being set up as a standards consortium for natural interaction, including motion tracking, voice, and more. As part of today's announcement, PrimeSense has released three things:
 
1) Open source drivers for the PrimeSensor reference device http://github.com/PrimeSense/Sensor
 
This is based on the same SoC as the Kinect. There is probably some useful knowledge in that code!
 
2) The open source OpenNI framework https://github.com/OpenNI/OpenNI
 
This framework supports pluggable nodes, letting us create applications that work against any depth sensor that supports OpenNI and take advantage of any node (for example, skeleton tracking) that is available. A minimal sketch of the node API follows after this list.
 
3) NITE (See http://openni.org/?q=node/2 for download link and license key)
 
This is a binary-only release for Linux and Windows that includes a PrimeSense implementation of skeleton tracking and gesture recognition, plus sample source code. NITE is an OpenNI module.
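 
To give a rough idea of what programming against the framework (item 2 above) looks like, here is a minimal sketch using the OpenNI 1.x C++ wrapper (XnCppWrapper.h). Treat it as illustrative; the exact calls may differ slightly in the released headers:
 
#include <XnCppWrapper.h>
#include <cstdio>

int main()
{
    // Initialize the OpenNI context; this discovers the registered modules (nodes).
    xn::Context context;
    XnStatus rc = context.Init();
    if (rc != XN_STATUS_OK) { std::printf("Init failed: %s\n", xnGetStatusString(rc)); return 1; }

    // Ask for *any* depth-generating node. The application does not care whether
    // the hardware behind it is a PrimeSensor, a Kinect, or something else.
    xn::DepthGenerator depth;
    rc = depth.Create(context);
    if (rc != XN_STATUS_OK) { std::printf("No depth node: %s\n", xnGetStatusString(rc)); return 1; }

    context.StartGeneratingAll();
    for (int i = 0; i < 30; ++i)
    {
        context.WaitOneUpdateAll(depth);
        const XnDepthPixel* map = depth.GetDepthMap();   // 16-bit depth values, in mm
        std::printf("Center pixel depth: %u mm\n", map[240 * 640 + 320]);
    }
    context.Shutdown();
    return 0;
}
 
Skeleton tracking and gesture recognition from NITE are exposed the same way, by creating the corresponding generator types (user, hands, gesture) instead of DepthGenerator.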
 
I'll be posting more information later about how some of this works. I have already started an OpenNI module interface for libfreenect that will allow us to plug the Kinect into OpenNI and take advantage of the capabilities of OpenNI and NITE, as shown in these videos:
 
 
Skeleton Extraction:
 http://www.youtube.com/watch?v=nr8vgCnb9_0
 
Scene Analyzer (user separation):
 http://www.youtube.com/watch?v=-_j7BzSmlfU
 
Thanks!
Josh

---
Joshua Blake
Microsoft Surface MVP
OpenKinect Community Founder http://openkinect.org

(cell) 703-946-7176
Twitter: http://twitter.com/joshblake
Blog: http://nui.joshland.org
Multitouch on Windows book: http://manning.com/blake


Eidur Arnason

Dec 8, 2010, 1:16:35 PM
to openk...@googlegroups.com
Is it possible to dump the vector points through OSC to other programs? (I am thinking of Pure Data.)

Best regards, Eidur

GUNNM

Dec 8, 2010, 1:26:21 PM
to OpenKinect
Congratulations, the results look very good judging from the videos.
I will follow your project.


Patrick Bouffard

Dec 8, 2010, 2:14:24 PM
to OpenKinect
From their docs, under "Capabilities":
"""
Cropping: Enables a map generator (depth, image, IR) to output a
selected area of the frame as opposed to the entire frame. When
cropping is enabled, the size of the generated map is reduced to fit a
lower resolution (less pixels). For example, if the map generator is
working in VGA resolution (640x480) and the application chooses to
crop at 300x200, the next pixel row will begin after 300 pixels.
Cropping can be very useful for performance boosting.
"""

Does this happen on-board the device itself, so that the extraneous
information is never even sent over USB? I'm hoping the answer is
'yes'... :)

Pat


Chris OShea

Dec 8, 2010, 2:57:13 PM
to OpenKinect
Great!

Does the driver (1) only work with the PrimeSense dev camera, or will it
work with the Kinect on Windows, or are there limitations imposed by
Microsoft that prevent this?

I've only used the Kinect on a Mac so far. I have installed the "PrimeSensor
PSDK 5.0 Driver build 24 for Windows", but with the Kinect plugged in it
doesn't seem to do anything. Just curious.

Thanks

Kyle Machulis

Dec 8, 2010, 3:03:41 PM
to openk...@googlegroups.com
On Wed, Dec 8, 2010 at 10:16 AM, Eidur Arnason <sprot...@googlemail.com> wrote:
Is it possible to dump the vector points through OSC to other programs ? (I am thinking Pure Data)


Yup, should be possible. We need to analyze what all NITE is capable of, what it outputs, and then build an OSC server from there.
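 
Something like this would be the core of that bridge (a rough sketch, assuming OpenNI/NITE skeleton tracking for the joint data and liblo for the OSC side; the port and message paths are made up):
 
#include <XnCppWrapper.h>
#include <lo/lo.h>   // liblo OSC client

// Forward one tracked joint of one user to Pure Data as an OSC message.
void sendJoint(lo_address pd, xn::UserGenerator& users,
               XnUserID id, XnSkeletonJoint joint, const char* path)
{
    XnSkeletonJointPosition pos;
    users.GetSkeletonCap().GetSkeletonJointPosition(id, joint, pos);
    if (pos.fConfidence > 0.5f)
        lo_send(pd, path, "fff", pos.position.X, pos.position.Y, pos.position.Z);
}

// In the main loop, after the context updates and a user is calibrated:
//   lo_address pd = lo_address_new(NULL, "9000");   // Pure Data listening on UDP 9000
//   sendJoint(pd, users, userId, XN_SKEL_HEAD,       "/skeleton/head");
//   sendJoint(pd, users, userId, XN_SKEL_LEFT_HAND,  "/skeleton/left_hand");
//   sendJoint(pd, users, userId, XN_SKEL_RIGHT_HAND, "/skeleton/right_hand");
 
On the Pure Data side those messages can be picked up with whatever OSC objects you prefer (e.g. the mrpeach [udpreceive]/[unpackOSC] pair).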

On Wed, Dec 8, 2010 at 11:57 AM, Chris OShea <pixe...@gmail.com> wrote:
Great!

Does the driver (1) only work with PrimseSense dev camera, or will it
work with Kinect on Windows, or is there limitations by Microsoft to
do this?

Their driver only works with the PrimeSense reference implementation, but the framework is written so that new "nodes" supporting other hardware (Kinect or otherwise) can be implemented. JoshBlake is working on a libfreenect wrapper currently, and there's also a quick hack version at


I've only used Kinect on Mac so far. Have installed the "PrimeSensor
PSDK 5.0 Driver build 24 for windows" but with Kinect plugged in,
doesn't seem to do anything. Just curious.

Yeah, it's looking for a completely different VID/PID, so it won't find the Kinect. The best bet for the quickest implementation of the skeletal tracking stuff is probably the above solution of someone getting an OSC server running on top of the NITE layer. It takes two OSes at that point to do anything, which kinda blows, but since we're dependent on a binary layer for the fun stuff, we'll need OS X support from PS for NITE before it can go direct. OSC seems viable for this though, since it's a lot less data than the whole image (though at this point it also sounds like having a local image server so multiple sources could share the camera output might be a good idea).

holger

Dec 8, 2010, 3:09:40 PM
to OpenKinect
Can't wait to try this with the Kinect. When will you have the module
ready so that we can try it?
Isn't it possible to use their driver directly with the Kinect?
Thanks!


Chris OShea

Dec 8, 2010, 3:41:35 PM
to OpenKinect
@holger: Kyle Machulis already replied about this; see the email before
yours.

Radu Bogdan Rusu

Dec 8, 2010, 6:00:42 PM
to openk...@googlegroups.com
The driver works with the Kinect too. You can try our branch, with full ROS integration, at https://github.com/ros-pkg-git/ni

Cheers,
Radu.
--
http://pointclouds.org


Sebastian Ortiz

Dec 10, 2010, 7:21:39 PM
to openk...@googlegroups.com
Does anyone have detailed instructions on how to use libfreenect with OpenNI and NITE? I did the following with limited success (after switching to SSE2, since I have an older 3.6 GHz P4):

(1) Cloned OpenNI; make && make install completed without any errors (as far as I can tell).
(2) Grabbed the kinect branch of github.com/boilerbots/Sensor; make && make install completed without any errors.
(3) Ran OpenNI/Platform/Linux-x86/Bin/Release/NiViewer. This runs, but at less than 1 frame/sec.

I also built the NITE samples but none of them run due to this error: 
"Error initializing: Device Protocol: Bad Parameter sent!"

Has anyone been able to use NITE with freenect without the enormous dependency on ROS? I'm running Gentoo, and after a considerable amount of time trying to get the NI branch of ROS working, I'm stuck on this step:
rosinstall ~/ni /opt/ros/cturtle /tmp/ni.ri
I assume this means:
 rosinstall /NI_install_destination /ros_install_path/ /tmp/ni.ri
but after running that, the command fails with:

other config element is not a ros path /projects/ros/stacks
other config element is not a ros path /projects/ros/ros/core
other config element is a ros path /projects/ros/ros
<snip>
other config element is not a ros path /projects/ros/stacks/vision_opencv
other config element is not a ros path /projects/ros/stacks/visualization
other config element is not a ros path /projects/ros/stacks/visualization_common
other config element is not a ros path /projects/ros/stacks/visualization_tutorials
Bootstrapping ROS build
Rospack failed to build
Traceback (most recent call last):
  File "/usr/bin/rosinstall", line 5, in <module>
    pkg_resources.run_script('rosinstall==0.5.9', 'rosinstall')
  File "/usr/lib/python2.6/site-packages/pkg_resources.py", line 468, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/usr/lib/python2.6/site-packages/pkg_resources.py", line 1201, in run_script
    execfile(script_filename, namespace, namespace)
  File "/usr/lib/python2.6/site-packages/rosinstall-0.5.9-py2.6.egg/EGG-INFO/scripts/rosinstall", line 506, in <module>
    sys.exit(not rosinstall_main(sys.argv))
  File "/usr/lib/python2.6/site-packages/rosinstall-0.5.9-py2.6.egg/EGG-INFO/scripts/rosinstall", line 497, in rosinstall_main
    subprocess.check_call("source %s && rosmake ros --rosdep-install" % (os.path.join(options.path, 'setup.sh')), shell=True, executable='/bin/bash')
  File "/usr/lib/python2.6/subprocess.py", line 488, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'source /projects/kinect/NI/setup.sh && rosmake ros --rosdep-install' returned non-zero exit status 255


If anyone has detailed instructions on using NITE with ROS, I suppose I'd also be interested. Thanks for any help.