From their docs, under "Capabilities":
"""
Cropping: Enables a map generator (depth, image, IR) to output a
selected area of the frame as opposed to the entire frame. When
cropping is enabled, the size of the generated map is reduced to fit a
lower resolution (less pixels). For example, if the map generator is
working in VGA resolution (640x480) and the application chooses to
crop at 300x200, the next pixel row will begin after 300 pixels.
Cropping can be very useful for performance boosting.
"""
Does this happen on-board the device itself, so that the extraneous
information is never even sent over USB? I'm hoping the answer is
'yes'... :)
Pat
On Dec 8, 9:18 am, Joshua Blake <joshbl...@gmail.com> wrote:
> Hello OpenKinect!
>
> A few weeks ago, I was contacted by PrimeSense co-founder Tamir Berliner,
> who wanted to discuss their plans and ways PrimeSense could engage with the
> community. (Microsoft licensed PrimeSense’s technology and based Kinect on
> the PrimeSense reference
> design <http://www.wired.co.uk/magazine/archive/2010/11/features/the-game-cha...>.)
> After seeing the dozens of Kinect videos and amazing enthusiasm from our
> community, PrimeSense decided to move up their plans and today announced the
> OpenNI <http://www.openni.org/> initiative and have open sourced their
> middleware for natural interaction and drivers for their depth cameras.
>
> OpenNI is being set up as a standards consortium
> for natural interaction, including motion tracking, voice, and others. As a
> part of today's announcement, they have released three things:
>
> 1) Open source drivers for the PrimeSensor reference device:
> http://github.com/PrimeSense/Sensor
>
> This is based upon the same SoC as Kinect. There is probably some useful
> knowledge in that code!
>
> 2) The open source OpenNI framework:
> https://github.com/OpenNI/OpenNI
>
> This framework supports pluggable nodes that let us create
> applications that work against any depth sensor that supports OpenNI and
> take advantage of any node (for example, skeleton tracking) that is
> available.
>
> 3) NITE (see http://openni.org/?q=node/2 for download link and license key)
>
> This is a binary-only release for Linux and Windows that includes a
> PrimeSense implementation of skeleton tracking and gesture recognition, plus
> sample source code. NITE is an OpenNI module.
>
> I'll be posting more information later about how some of this works. I
> already have started an OpenNI module interface for libfreenect that will
> allow us to plug Kinect in with OpenNI and take advantage of the
> capabilities of OpenNI and NITE, as shown in these videos:
>
> Skeleton Extraction:
>
> http://www.youtube.com/watch?v=nr8vgCnb9_0
>
> Scene Analyzer (user separation):
>
> http://www.youtube.com/watch?v=-_j7BzSmlfU
>
> Thanks!
> Josh
>
> ---
> Joshua Blake
> Microsoft Surface MVP
> OpenKinect Community Founder
> http://openkinect.org