DepthGenerator: Module or Node?


Philip

Dec 15, 2010, 5:18:16 AM12/15/10
to OpenNI
Hi there,

a short question about how to integrate further 3d image sensors. For
first tests, I'd like to implement the DepthGenerator only. It's clear
that it's a ProductionNode -> Generator -> MapGenerator ->
DepthGenerator, but I'm not sure about which class to use:
xn::ModuleDepthGenerator or xn::DepthGenerator?

At the moment, I'm not sure about the difference between these two
classes and their purpose. If someone could shed some light on this,
I'd be happy :-)

Regards,

Philip

P.S.: Same question goes for the other related classes like
ImageGenerator, ..., actually up to the difference between
ProductionNode and ModuleProductionNode.

Alon Neubach

Dec 15, 2010, 6:38:42 AM12/15/10
to openn...@googlegroups.com
Hi Philip,

If you'd like to integrate a new module into OpenNI that provides depth, you would have to write a class that inherits from xn::ModuleDepthGenerator and put that in a dll/shared library. You can have a look at the sample called NiSampleModule under OpenNI's Samples directory to see how to do that. The same goes for any other type of generator in OpenNI (scene analyzer, skeleton, etc).

The xn::DepthGenerator class is what an application can use to get the depth map. The OpenNI framework will know how to give it data from either the depth module you wrote or from our (PrimeSense's) depth generator module (But our depth generator is the best! ;).

Hope that answers your question,

Alon

--
You received this message because you are subscribed to the Google Groups "OpenNI" group.
To post to this group, send email to openn...@googlegroups.com.
To unsubscribe from this group, send email to openni-dev+...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/openni-dev?hl=en.

Philip

Dec 15, 2010, 7:13:06 AM12/15/10
to OpenNI
Hi Alon,

thanks for your quick response! Actually I had a vague idea that it
might work this way, but your detailed explanation is really helpful.

I had the NiSampleModule in mind and hoped to be able to build upon
that, so that's what I'll do next.

> (But our depth generator is the best! ;).
Fact is, we have quite a few ToF cameras already integrated in our
lab. Nevertheless I've already signed up for a PrimeSensor SDK and am
looking forward to testing it and fact-checking you ;-)

Regards,

Philip


---

theowl84

Dec 21, 2010, 9:21:23 AM12/21/10
to OpenNI
Hi!

I have a similar problem where I want to write my own
ModuleDepthGenerator. I wrote a DLL similar to NiSampleModule and can
successfully send depth data to OpenNI (NiViewer successfully displays
the depth image).
Now I want to have NiUserTracker to fit a Skeleton into that depth
data. It works until the line

"nRetVal = g_UserGenerator.Create(g_Context, 0, &errors);"

which returns error 65537, and the console reads "Couldn't get
maxShift" twice after [Creating node "User1" of type "User..."].

I have PrimeSense/XnVSkeletonGenerator/1.3.0.17 installed. Executing
the same application with a Kinect sensor and PrimeSense/SensorV2 works
fine.

Is there any special interface my DepthGenerator needs to implement, or
calculations it needs to perform, that are not included in the NiSampleModule?

Regards,
Matthias

Alon Neubach

Dec 21, 2010, 9:59:34 AM12/21/10
to openn...@googlegroups.com
Hi Matthias,

NITE can only work with PrimeSense's Sensor, not with any depth generator in general.

Alon


Joshua Blake

Dec 21, 2010, 10:57:24 AM12/21/10
to openn...@googlegroups.com

Surely that is not true, otherwise what is the point of OpenNI?

CARMELO VELARDO

Dec 21, 2010, 11:03:16 AM12/21/10
to openn...@googlegroups.com
As from the Documentation of OpenNI:

OpenNI is a multi-language, cross-platform framework that defines APIs for writing applications utilizing Natural Interaction. OpenNI APIs are composed of a set of interfaces for writing NI applications. The main purpose of OpenNI is to form a standard API [...]


I read "standard" as standard also for device manufacturers.

Best,


CarmeloVELARDO

www.velardo.org

Joshua Blake

Dec 21, 2010, 11:17:55 AM12/21/10
to openn...@googlegroups.com

Exactly. If NITE (an OpenNI module) only works with the Sensor implementation of the Depth generator, then something is seriously wrong with either NITE, OpenNI, or both, and it hurts the credibility of OpenNI.

Especially if this is the case, I'd rather start fresh with a community-driven technical standard rather than use a de facto standard of a single API implementation.

Josh


Tamir Berliner

Dec 21, 2010, 12:39:27 PM12/21/10
to openn...@googlegroups.com

Hi guys,

 

This is a great discussion, and I totally believe we should continue this thread and make sure everyone agrees and is aligned with the results of this discussion!

 

OpenNI:

OpenNI’s goal is to make sure that ANY application developed on top of OpenNI does not break when being used with ANY OpenNI compliant device and middleware.

Devices / middlewares that are supported today:

  Skeleton (as user)
  HandPoint
  Gestures
  SceneAnalyzer
  Depth
  Image
  Audio

 

As far as the application is concerned, it doesn't matter whether the skeleton was created with a depth map or an RGB camera or, for that matter, a motion capture device. As long as the skeleton is OpenNI compliant, the application will run.


The same goes for applications that use depth, or depth and image: as long as they use OpenNI, any OpenNI compliant device that supports image and depth will enable that application to work.

 

The standard does not define the separation between the device and the driver/middleware – for example it does not manage any USB protocol (such as UVC or UAC). So a possible OpenNI compliant middleware can be an OpenCV module that creates depth from two RGB cameras.

 

Summary: the relationship between the different modules of OpenNI was intentionally left undefined in order to be able to support any constellation of devices / middleware as the goal of the standard is to create Natural Interaction APIs.

 

One of the goals of OpenNI IS TO PROMOTE competition on all the different layers and make sure that any vendor can implement any interface without dependencies. This does not mean that any vendor must make any part of their solution a general module that supports all other devices (think of PS Move, for example: they can implement an OpenNI skeleton without having depth! We don't want to limit such a possible solution!)

 

NITE / PrimeSense’s PSDK:


The PSDK of PrimeSense implements the Depth and Image portions of OpenNI using the hardware and implements the Skeleton, SceneAnalyzer, HandPoint and Gestures portion using the NITE middleware.

 

Any application that uses OpenNI and uses any of these modules can use PrimeSense's implementation of OpenNI. It doesn't mean that any device that implements depth will have PrimeSense's skeleton work out of the box (most likely a ToF depth will not be compatible due to the nature of the data, and of course a PS Move driver cannot be used to make PrimeSense's skeleton work).

 

Having said that, I've already asked PrimeSense to help articulate how NITE uses the PSDK to extract the extra juice and get better performance (up to the level of IP regarding the skeletal extraction algorithms).

 

Please note that PrimeSense did mention from the beginning that NITE is designed and meant to run solely on PrimeSense's SoC based devices.

From NITE’s license agreement: copy and distribute the end-user version of the NITE SDK in object code format to be used with Applications (“NITE”) on hardware produced, licensed by, or available under authority of PrimeSense or containing the chip validly associated with PrimeSense  technology (“Authorized Hardware”) pursuant to an online end-user license agreement with PrimeSense (“EULA”);

 

One quick note - I personally, both as the OpenNI community manager and as the founder of PrimeSense, promote the creation of an alternative skeleton that can run on all devices. I've even approached people who discussed creating an alternative skeleton and expressed the fact that we can support them.

 

PrimeSense is a commercial entity with a business model of selling hardware components (chips). I've been working very hard on getting the company to open its middleware to the community, and I am still working on making sure OpenNI gets a consortium ASAP. NITE is PrimeSense's implementation of some of the OpenNI capabilities, and I believe that most people will agree that it does a good job there.

 

I would like to open the following subject to discussion, though: do you believe that the OpenNI certification needs to also cover the protocol between the different middlewares and the devices? If so, does OpenNI also need to define a depth protocol over USB (like UVC and UAC)?

 

So far my goal with OpenNI has been to lead a standard so that applications can be deployed everywhere, with every device that supports the OpenNI implementation of the required modules. In the longer term I would like to make sure all vendors will want to support OpenNI because every application out there will run with their HW, and every application developer will want to work with OpenNI because they know it will run on the largest installed base possible.

 

The business models I had in mind that could stimulate each of the different vendors:

1. HW - don't worry about middleware and applications; you don't need to create your own content, as it's already out there.

2. Middleware - create device agnostic middleware and provide the best performance; this will make people want to buy a license to your middleware, as it provides the best solution in the market.

3. Applications - anyone developing an application on OpenNI knows it will work with any device, but can still choose preferred middleware / HW configurations (hence the production chains in OpenNI).

 

I think this can be a great discussion, and I look forward to hearing your opinions!

 

Thanks!

TamirB.

Joshua Blake

Dec 21, 2010, 2:11:14 PM12/21/10
to openn...@googlegroups.com
Tamir,
 
Thanks for the extensive response.
 
I understand that skeleton extraction middleware may require a certain range of qualities from a depth map generator. We wouldn't expect the NITE skeleton to work with low spatial resolution devices such as LIDAR or geospatial depth maps (except to perhaps look for extraterrestrial constructs). Perhaps the OpenNI standard needs to require metadata describing the expected spatial resolution, uncertainty/noise, or other parameters that generators have, so that middleware that consumes the data can say "You know, I probably won't be able to generate hand points/a skeleton from a 1m resolution depth map." The same metadata should exist for other modalities such as audio, etc.
 
I also understand that PrimeSense is sharing NITE with the license restriction that it should only be used with PrimeSense chips ("PrimeSense Inside"?). It is a significant gift to the community and industry that PrimeSense has made NITE available for free.
 
That aside, if there is an unbreakable coupling between Sensor and NITE then I think NITE would be better as a (binary) plug-in to Sensor, not OpenNI. In general if a generator requires explicit knowledge or out-of-band (non-standard) communication with the hardware, then it would make sense that all of that is exposed as depth, skeleton, etc. generators from a single module, not multiple modules.
 
My impression of the current situation was that NITE could read any OpenNI Depth generator and produce gestures/skeleton/hand points etc. The projects that record and play back data imply that should work, but I might be mis-remembering the project associations.
 
Another point of clarification -- I was messing around with managedNITE, which is a wrapper for NITE. In that case am I even using OpenNI at all, or is NITE exposing a separate API/SDK? It is unclear what API applications should use in order to be independent of a specific middleware.
 
Here is what it should look like, IMHO:
 
Application
^^^^^^^^^^^^^^^^^
OpenNI
^^^^^^^^^^^^^^^^^
Sensor (open source)   <===== NITE plugin (binary or eventually open source)
^^^^^^^^^^^^^^^^^
PrimeSensor
 
where Sensor exposes Depth, Image, Skeleton/User, etc. generators within a single OpenNI module.
 
In this case it would be clear that you have to use Sensor to get NITE, and thus use Sensor compatible hardware (or the modified SensorKinect with Kinect perhaps.) It also eliminates some of the licensing risk regarding enforcing NITE only being used with PS hardware.
 
I do think that OpenNI should also account for hardware independent middleware as you described (but not any kind of depth over USB standard). For that to work the standard would have to include the metadata I described above.
 
Thanks,
Josh

---
Joshua Blake
Microsoft Surface MVP
OpenKinect Community Founder http://openkinect.org

(cell) 703-946-7176
Twitter: http://twitter.com/joshblake
Blog: http://nui.joshland.org
Multitouch on Windows book: http://manning.com/blake




cadet project

Jan 3, 2011, 9:31:12 AM1/3/11
to OpenNI
hi all,

we here @ CADET wanted to make something like OpenNI: a modern open
source plugin architecture for input, output and transform nodes,
mainly to support Fullbody Interaction, Immersive Experiences and
Affective Computing. To be honest, my life would have been much easier
if PrimeSense hadn't released OpenNI and NITE.
For us it's quite hard to decide whether to continue our old plans or
start developing within the framework of OpenNI. In terms of skeleton
segmentation we decided to go our own way; sure, the NITE solution is
quite sophisticated, but nevertheless it has its flaws, and as it is
closed source there won't be scientific exchange, which is what we are
actually interested in the most.
So far I wasn't aware that NITE is just intended to be used with PS
sensors. I find that a little disturbing, as the whole thing was
released in the hype of KinectHacks. To my mind it undermined some
community efforts like ours or the NUI guys: show the masses something
fancy working almost perfectly, so everyone imagines a bright future
and stops working and thinking about their own solutions. At the end of
the day, I am still puzzled by PrimeSense's strategy behind that move.
Sure, they are selling their sensors now, but they don't even have the
capacity to fulfill demand anyway; I used that form three times and
haven't even got a message back. To keep a long story short, I am
still skeptical in terms of OpenNI; maybe I am just paranoid.
However, as far as I have evaluated OpenNI right now, it feels quite
monolithic and not too modern in its design principles: interface
classes, inheritance, a basic property management system,
parallelization and synchronization interfaces. Sure, it serves the
purpose, but in the end it feels like a quick and specialized shot at
quite a complex problem regarding all the different vendors of
middleware and available sensors. Don't get me wrong, it is sort of a
good shot, but not if it should become a real open source standard
like OpenGL, which is what I gather when I read about it on the
webpage or in the manuals.
Maybe I'll just flip a coin to decide if we will develop our own framework ;-)
best,
Robert

CADET - Center for Advances in Digital Entertainment Technologies
Robert Praxmarer, Scientific Head
http://www.cadet.at




SteveElbows

Jan 3, 2011, 10:42:11 AM1/3/11
to OpenNI
I cannot really comment on the technical strengths and weaknesses of
OpenNI as a framework, due to lack of best-practice knowledge on my
part.

But I do have thoughts about some of the other stuff you mention. I
don't think PrimeSense's decision to release this stuff is at all
strange. They are a hardware company who want to work with other
companies to bring implementations of their sensor technology to market.
Microsoft was the first example of this with the Kinect, and of course
other people managed to hack together some Kinect drivers and start to
play around. It makes complete sense that PrimeSense would take a
great interest in these developments and would want to be involved,
considering it is their technology. Not least, this is because everyone
seemed to be talking about Microsoft as the inventor of this stuff,
and just using the term 'Kinect' to cover everything, which is
inaccurate. Kinect is not just the sensor; it is the software
Microsoft made to deliver a robust consumer experience, and that
software is only for the Xbox 360 platform.

PrimeSense have a large stake in the future of this technology when
used on computing platforms, and all the things that developers around
the globe could do with such technology. Faced with what was happening
on the 'Kinect' hacking scene, they could have taken a bad approach
that tried to excessively control what was happening. Instead they did
the right thing by releasing tools to help, and by doing it in the
form of a framework that can be used with other hardware and software
solutions that are not from PrimeSense in future. This is very
sensible and gives the framework some chance of success without
creating a messy and fragmented start to this new world of
interaction, a mess that would not benefit the consumer and may reduce
the chances of such technologies taking off in a big way. As a
developer I will want to write code once that will work with a range
of hardware in future, without too many headaches. As a consumer I
will want choice too. So I want OpenNI to succeed, and for most
solutions to make use of it. If it has flaws then hopefully these will
be addressed, it will evolve to meet new requirements etc. Time will
tell.

As for NITE, well, in many ways it is the carrot that makes people take
notice of OpenNI in the first place. In an ideal world it would be
open, but I presume there are intellectual property issues with this
stuff, or other business reasons to keep it closed. I am just glad
that it has been released to us at all. It's all very well to ponder
what other motivated developers could create in the realm of skeletal
tracking, but before the OpenNI & NITE announcement I was prepared for
a very long and frustrating wait. I think the hope is that others will
still develop their own middleware over time; I don't think NITE is
seen as the perfect and ultimate skeletal tracking, so there is
still room for others to develop their own solutions in this space,
and they will plug into OpenNI. But in the meantime, developers who
want to get on with the app side of things don't have to wait an
unknown length of time to get some skeletal tracking working for them.
And this makes sense for PrimeSense, because they can surely get more
deals with other companies to use their hardware technology if there
are a load of apps that work with this stuff and great ideas floating
around. I've forgotten whether your statement that NITE only works with
PrimeSense's hardware (which includes the Kinect sensor) is true or
not. If it is true then it's a shame and will cause further confusion
about what exactly OpenNI is, as with such a framework people would
expect both hardware drivers and middleware modules to be
interchangeable. However, it is up to middleware producers to define
the circumstances under which their middleware works, and for
PrimeSense both the origins of NITE (as an example of what can be done
with their hardware tech) and their business model do lend themselves
to making NITE not completely free and open. All the more reason for
other middleware developers to come along who have different
objectives and business models, and again I think OpenNI is an
opportunity for them rather than a hindrance.

When I look to the future, the big unknowns for me are more about the
hardware. Their reference design for developers is just that; they
don't want to be in the business of selling direct to consumers
themselves, and want partners. This seems fairly straightforward for
spaces such as the 'set top box' for the living room, where they can
deal directly with the usual companies to get their stuff integrated
into the device. But for the mass computer market I am more confused
about what will happen. We can expect that consumers will only want to
buy a sensor once for use with their computer, so who will be the
manufacturer(s) that gets the deal to bring this tech to the computer,
that manufactures and markets the device? We could imagine that
PrimeSense expects to do deals with individual hardware manufacturers
to incorporate the tech directly into some models of computers, but
what about the mass market for buying it as a USB peripheral that
works with all modern computers? Right now people can buy a Kinect
sensor and get it to work, but it's a very grey area, isn't it? I
certainly don't think it would be a good idea to try to offer software
for computers to the wider consumer market without this question being
answered, as I assume Microsoft would be unhappy with the term
'Kinect' being associated with this stuff. We don't know what their
deal with PrimeSense is either; they may well only have the rights to
use the technology in the games console market for all I know. Now
that there are wrappers for things like Unity, I think this question
will become relevant and urgent quite early in 2011. I certainly want
to set up a business that uses these technologies, but as I am not
planning on bringing hardware to market myself, I need this stuff to
be somewhat resolved before I can create a business with confidence
and a high degree of certainty about how consumers with Windows &
OSX computers will be able to get the right hardware, and how I can
market my stuff without legal woes. One only has to read PrimeSense's
Twitter feed to see that they are walking a fine line between
benefitting from all this 'Kinect hacking' and distancing themselves
from the term Kinect and the idea that OpenNI etc. work with the Kinect
sensor.

Tamir Berliner

Jan 3, 2011, 11:50:59 AM1/3/11
to openn...@googlegroups.com
Hi Robert,

Thank you for the insights and for the comments.

I'll try to answer both of your general questions:
1. Where is PrimeSense heading with this / what's the strategy.
2. OpenNI - is it what you were looking for or not.

I'll start with the second as it's more relevant to this forum:
OpenNI is still in its alpha stages, and I believe we will see modifications made to it in the future due to requirements and updates both given by the community and provided by the parties that work on this standard. It's not a secret that it was released a bit earlier than expected, and thus is not as well cooked as one would expect, but on the other hand that leaves it more open to opinions / modifications that you (personally, and the cadet project) can influence. I can't tell you not to choose your own path with your own framework; I can tell you that the goal of OpenNI is to make sure that the market won't be segmented/fragmented, and that's why it's open source. We are also working very hard to make sure it will be managed in an open source format (we are not there yet).

With regards to PrimeSense - one must remember that the business model for PrimeSense is selling HW components to companies that build a 3D sensor using the PrimeSense reference design (basically selling chips and optical elements). In order to push the market forward, PrimeSense is selling HW to developers / partners to extend the general understanding of what can be done with the technology and to help grow the market faster.

The PSDKs are being sold in two general channels:
1. To OpenNI community developers - for the price of $200 + shipping and handling; this does not include any support packages from PrimeSense.
2. To PrimeSense's partners - these are commercial entities that show a roadmap to developing an application for the living room usage*

*if you are asking why that is interesting for PrimeSense: PrimeSense is working with more companies to bring this technology to consumers in the forms of gesture controlled media centers.

I know that OpenNI might suffer a bit from lack of documentation (being worked on) and some maturity / communication problems. OpenNI is mainly driven these days by PrimeSense and Willow Garage... I can't speak for our partners at Willow Garage (who have been doing amazing work as far as I can tell), but for PrimeSense it's a very busy time with CES just around the corner. I believe you will see a far greater effort being made on OpenNI in the coming few months once CES has passed.

Please don't hesitate to email me here or directly for any questions / collaborations... I think joining forces on OpenNI is better than having two different efforts.

I hope this helps
TamirB
ta...@openni.org


cadet project

Jan 6, 2011, 11:31:33 AM1/6/11
to OpenNI
thx for your quick answer. After some consideration, we will start to
develop different device types and middleware components for OpenNI
and see how that works out. And yes, it would be great to collaborate
in the process of making things better; an open DirectInput, just
better and platform independent, would be pretty cool...
best,
Robert