KinectSDK + OpenNI/NITE at the same time


Amir Hirsch

Jun 19, 2011, 6:04:45 PM
to openn...@googlegroups.com

Joshua Blake

Jun 19, 2011, 6:21:15 PM
to openn...@googlegroups.com

Interesting... Any details on the implementation? I'm not really sure if this is showing two Kinects or an OpenNI plug-in for the Kinect SDK.


Amir Hirsch

Jun 19, 2011, 6:27:40 PM
to openn...@googlegroups.com
Two Kinects, but I'm actively working on the OpenNI plugin for the Kinect SDK and hope to also provide the skeleton as a pass-through library.

Of interest will be how Microsoft responds to already commercially available OpenNI software benefiting from an adapter like this...

Amir

Amir Hirsch

Jun 20, 2011, 1:16:28 AM
to openn...@googlegroups.com
So I've made a little progress on this.

I've been able to build the NiSampleModule example in OpenNI with some
modifications that cause it to initialize the Kinect SDK.

I can also register it with niReg as a Depth node. The only difficult
part was that you have to run niReg as administrator (I spent 2 hours on
that part and about 5 minutes on the rest).

Almost there....

Amir

MichaelK

Jun 20, 2011, 4:27:53 AM
to OpenNI
Any hints on how this works?

Ibrahim

Jun 20, 2011, 6:45:56 AM
to OpenNI
I have installed the Microsoft SDK and my OpenNI programs stopped
working. Is there any solution to this problem? I tried reinstalling
the PrimeSense drivers and uninstalling the Microsoft drivers, but
somehow the Microsoft drivers still automatically reinstall.

Sam Muscroft

Jun 20, 2011, 6:59:31 AM
to OpenNI
On Win 7 you need to uninstall the Microsoft SDK and then re-install
avin's SensorKinect drivers for OpenNI (no need to uninstall the
NITE/OpenNI libraries). This worked for me yesterday with the Kinect
plugged in throughout.

Cheers,

S.

Ibrahim

Jun 20, 2011, 7:25:53 AM
to OpenNI
So then you basically have to uninstall and reinstall every time you
want to switch from one to the other...

noisecrime

Jun 20, 2011, 7:26:57 AM
to OpenNI


Nice work; keep going and it will be done in no time ;)

Will this mean it is usable directly in Unity, or does the need
for .NET 4 mean we'll have to find some other means of transferring
the data? That's the only bit that concerns me; fair enough for
tracking data, but not for image data.

BTW - I've been working on improvements to the new Unity wrapper, adding
support for image and label, as well as optimisation of setPixels. I did
have a few queries about certain approaches in the wrapper, but I'll
discuss them on UnityKinect once I release this update.

Sam Muscroft

Jun 20, 2011, 7:30:00 AM
to OpenNI
No, only uninstall the Microsoft SDK and then re-install avin's Kinect
sensor driver (not OpenNI and NITE). This is until there is a plugin/
workaround, which it seems is not far off judging by Amir's previous
posts.

MichaelK

Jun 20, 2011, 9:26:38 AM
to OpenNI
But when you uninstall the MS SDK, you can't use it anymore!? The
question is: how can OpenNI and the MS SDK co-exist, without
uninstalling one to use the other?

Sam Muscroft

Jun 20, 2011, 9:31:36 AM
to OpenNI
Michael - I was replying to Ibrahim, who had asked how to re-instate
OpenNI after installing/uninstalling the Microsoft SDK.

Amir Hirsch

Jun 21, 2011, 6:24:34 AM
to openn...@googlegroups.com
Latest update: I can get the Kinect to turn on using the Kinect SDK
invoked through an OpenNI module.

I am modifying NiSampleModule to get depth from the Microsoft Kinect SDK
and feed it to OpenNI.

First, the DLL-hell stuff in VS2010:
Under Project Properties -> Linker -> Input, add MSRKinectNUI.lib to
"Additional Dependencies":
MSRKinectNUI.lib;openNI.lib;%(AdditionalDependencies)

I dragged the MSRKinectNUI.lib into my project too, for good measure.

In the VC++ Directories settings, add:
$(MSRKINECTSDK)\inc to the include directories
$(MSRKINECTSDK)\lib to the library directories
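For reference, the same settings can be expressed directly in the .vcxproj file. This is a sketch assuming the standard VS2010 MSBuild item metadata names; your configuration/platform conditions may differ:

```xml
<ItemDefinitionGroup>
  <ClCompile>
    <!-- header search path for MSR_NuiApi.h -->
    <AdditionalIncludeDirectories>$(MSRKINECTSDK)\inc;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
  </ClCompile>
  <Link>
    <!-- library search path and the Kinect SDK import library -->
    <AdditionalLibraryDirectories>$(MSRKINECTSDK)\lib;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
    <AdditionalDependencies>MSRKinectNUI.lib;openNI.lib;%(AdditionalDependencies)</AdditionalDependencies>
  </Link>
</ItemDefinitionGroup>
```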


In SampleDepth.cpp I now initialize the Kinect with the Kinect SDK. I
needed to include some headers:


#include "wtypes.h"
#include "MSR_NuiApi.h"

... and modify Init():

XnStatus SampleDepth::Init()
{
	m_pDepthMap = new XnDepthPixel[SUPPORTED_X_RES * SUPPORTED_Y_RES];
	if (m_pDepthMap == NULL)
	{
		return XN_STATUS_ALLOC_FAILED;
	}

	m_hNextDepthFrameEvent = CreateEvent( NULL, TRUE, FALSE, NULL );
	HRESULT hr;

	hr = NuiInitialize(
		NUI_INITIALIZE_FLAG_USES_DEPTH_AND_PLAYER_INDEX |
		NUI_INITIALIZE_FLAG_USES_SKELETON | NUI_INITIALIZE_FLAG_USES_COLOR);
	if( FAILED( hr ) )
	{
		// MessageBoxResource(m_hWnd,IDS_ERROR_NUIINIT,MB_OK | MB_ICONHAND);
		return XN_STATUS_ALLOC_FAILED;
	}

	hr = NuiImageStreamOpen(
		NUI_IMAGE_TYPE_DEPTH_AND_PLAYER_INDEX,
		NUI_IMAGE_RESOLUTION_320x240,
		0,
		2,
		m_hNextDepthFrameEvent,
		&m_pDepthStreamHandle );
	if( FAILED( hr ) )
	{
		// MessageBoxResource(m_hWnd,IDS_ERROR_DEPTHSTREAM,MB_OK | MB_ICONHAND);
		return XN_STATUS_ILLEGAL_POSITION;
	}

	return (XN_STATUS_OK);
}

The XnStatus return types for the hr errors are just me being ridiculous.


Also, in SampleDepth.h:

	XnBool m_bMirror;
	ChangeEvent m_generatingEvent;
	ChangeEvent m_dataAvailableEvent;
	ChangeEvent m_mirrorEvent;

	// ADDED BY AMIR FOR KinectSDK binding
	HANDLE m_hThNuiProcess;
	HANDLE m_hEvNuiProcessStop;
	HANDLE m_pDepthStreamHandle;
	HANDLE m_hNextDepthFrameEvent;

};


Now I build it and then register it with: nireg -r NiSampleModule.dll
(don't forget to run this step in admin mode).
nireg -l now lists my sample module:
F:\Hacking\OpenNI\OpenNI\Platform\Win32\Bin\Debug\NiSampleModule.dll (compiled with OpenNI 1.1.0.39):
	Depth: OpenNI/SampleDepth/1.1.0.39


I was getting problems that looked like this:

26920 [WARNING] Failed to load library 'F:\Hacking\OpenNI\OpenNI\Platform\Win32\Bin\Debug\NiSampleModule.dll'. Error code: 87
27160 [WARNING] Failed to load 'F:\Hacking\OpenNI\OpenNI\Platform\Win32\Bin\Debug\NiSampleModule.dll' - missing dependencies?

This error occurred until I included MSRKinectNUI.lib in the additional
dependencies, but then after several hours I discovered my folly.


OK, so then I modified samplesconfig.xml in the OpenNI/data directory
to look only for a depth node (I'm testing with UserTracker, which only
consumes depth):

<OpenNI>
	<Licenses>
		<!-- Add licenses here
		<License vendor="vendor" key="key"/>
		-->
	</Licenses>
	<Log writeToConsole="false" writeToFile="false">
		<!-- 0 - Verbose, 1 - Info, 2 - Warning, 3 - Error (default) -->
		<LogLevel value="3"/>
		<Masks>
			<Mask name="ALL" on="true"/>
		</Masks>
		<Dumps>
		</Dumps>
	</Log>
	<ProductionNodes>
		<Node type="Depth" name="Depth1">
			<Configuration>
				<Mirror on="true"/>
			</Configuration>
		</Node>
	</ProductionNodes>
</OpenNI>

Now, I haven't started the NUI thread and mapped the depth frames to
OpenNI yet, but when I run UserTracker my Kinect turns on, implying the
connection between OpenNI and the Kinect SDK is working.

So now someone else can avoid my errors and probably get it done
tomorrow before I even wake up.

Amir

alex summer

Jun 22, 2011, 10:10:52 AM
to OpenNI
I hope this works! Keep us informed of the progress, you're almost
done!!!
This will be very useful to everyone! Cheers :D

Alex

Amir Hirsch

Jun 22, 2011, 1:14:17 PM
to openn...@googlegroups.com
So NiViewer can now draw the data from our depth node that produces
frames through the KinectSDK.

But when I try to use the UserTracker example I get some nonsense
about maxShift:

C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
Attempting to open \\?\USB#VID_045E&PID_02AE#B00363600976047B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00363600976047B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE01)
KinectCamera_OpenStreamEndpoint Opened successfully.
Couldn't get maxShift.
Couldn't get maxShift.
Find user generator failed: Error!

working on it..

Amir

Tomoto

Jun 22, 2011, 7:43:28 PM
to OpenNI
Hi Amir and everyone,

I got the same "maxShift" issue, and so far it looks nontrivial to
get through it...

It seems NITE's skeleton tracker only works with PrimeSense devices,
according to the following discussion:
http://groups.google.com/group/openni-dev/browse_thread/thread/f9dbfd4069722f9d/eddba6aa76c53fbd?lnk=gst&q=maxshift#eddba6aa76c53fbd
(Note the Kinect could count as a "3rd party device" to PrimeSense when
covered by Microsoft's driver.) Also, I found the XnDepthStream#GetMaxShift
method in the source code of SensorKinect. I guess NITE depends on this
method, and apparently Microsoft's driver does not have it.

To avoid this issue, we probably need to wrap Microsoft's skeleton
tracker for OpenNI so that we can (unwillingly) skip NITE altogether.
Or it may be possible to wrap Microsoft's Kinect driver in the way
NITE's skeleton tracker expects. Either way, it looks like nontrivial
and painful work. :-/

Anyway, I will share my source code in some public source code
repository.

Thanks,
Tomoto

Tomoto

Jun 23, 2011, 12:51:25 AM
to OpenNI
Hi,

I made the code public at: https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes/

Only the Depth and Image nodes work; others (e.g. the User node) do not.
It is not that cool.

Thanks,
Tomoto



Amir Hirsch

Jun 23, 2011, 3:42:46 AM
to openn...@googlegroups.com
Thanks Tomoto for producing a good project structure for this! I've
extended your work, fixed the MaxShift issue, and am now dealing with
other stuff.

For everyone's reference, if you install Tomoto's DLL and run
UserTracker (which uses NITE), you'll see that NITE looks for a
property called maxShift:

C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE01)
KinectCamera_OpenStreamEndpoint Opened successfully.
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE00)
KinectCamera_OpenStreamEndpoint Opened successfully.
Couldn't get maxShift.
Couldn't get maxShift.
Find user generator failed: Error!


I've gone down the rabbit hole of the Sensor code inheritance (someone
at PrimeSense really likes extending classes, huh?). I've discovered
that MaxShift is a property requested through a GetIntProperty
call on XnSensorProductionNode.

I added this little work-around to MSRKinectDepthGenerator, which
just reports 2047 when asked for the IntProperty "MaxShift".
(Frankly, I don't think this property actually matters, and I wonder
why NITE has dependencies on hidden properties.)

XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar* strName, XnUInt64& nValue) const
{
	if (strName == "MaxShift") // N.B. pointer comparison; corrected to strcmp() later in the thread
	{
		nValue = 2047;
	}
	return XN_STATUS_OK;
}

So now it gets a MaxShift, and the error now says it can't find the
shift2Depth table:

C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE01)
KinectCamera_OpenStreamEndpoint Opened successfully.
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE00)
KinectCamera_OpenStreamEndpoint Opened successfully.
Couldn't get shift2Depth table.
Couldn't get shift2Depth table.

ONWARD.



Amir Hirsch

Jun 23, 2011, 4:09:47 AM
to openn...@googlegroups.com
You don't actually need to return a value for MaxShift; you can
just return XN_STATUS_OK when NITE asks for it. The shift2Depth table
is retrieved through the GetGeneralProperty call:

XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar* strName, XnUInt64& nValue) const
{
	return XN_STATUS_OK;
}

XnStatus MSRKinectDepthGenerator::GetGeneralProperty(const XnChar* strName, XnUInt32 nBufferSize, void* pBuffer) const
{
	return XN_STATUS_OK;
}


Tomoto

Jun 23, 2011, 4:32:53 AM
to OpenNI
Interesting. I figured out that NITE asks for the following values.
Eventually I got a "division by zero" error when I just returned
XN_STATUS_OK for everything. Perhaps proper values need to be given...

GetIntProperty:
#define XN_STREAM_PROPERTY_MAX_SHIFT "MaxShift"
#define XN_STREAM_PROPERTY_CONST_SHIFT "ConstShift"
#define XN_STREAM_PROPERTY_ZERO_PLANE_DISTANCE "ZPD"

GetRealProperty:
#define XN_STREAM_PROPERTY_ZERO_PLANE_PIXEL_SIZE "ZPPS"
#define XN_STREAM_PROPERTY_EMITTER_DCMOS_DISTANCE "LDDIS"

GetGeneralProperty:
/** XN_DEPTH_TYPE[] */
#define XN_STREAM_PROPERTY_S2D_TABLE "S2D"
/** XnUInt16[] */
#define XN_STREAM_PROPERTY_D2S_TABLE "D2S"



Amir Hirsch

Jun 23, 2011, 4:46:58 AM
to openn...@googlegroups.com
UserTracker runs after I put in values for:

dValue in GetRealProperty
nValue in GetIntProperty

Unfortunately it doesn't track the user... but it runs.

XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar* strName, XnUInt64& nValue) const
{
	nValue = 2047;
	return XN_STATUS_OK;
}

XnStatus MSRKinectDepthGenerator::GetGeneralProperty(const XnChar* strName, XnUInt32 nBufferSize, void* pBuffer) const
{
	return XN_STATUS_OK;
}

XnStatus MSRKinectDepthGenerator::GetStringProperty(const XnChar* strName, XnChar* csValue, XnUInt32 nBufSize) const
{
	return XN_STATUS_OK;
}

XnStatus MSRKinectDepthGenerator::GetRealProperty(const XnChar* strName, XnDouble& dValue) const
{
	dValue = 1.0;
	return XN_STATUS_OK;
}

Amir Hirsch

Jun 23, 2011, 5:39:51 AM
to openn...@googlegroups.com
It seems it's definitely looking for more Int and Real properties than
just those. If I return 0 for anything but those properties it
still freezes, but if I always assign a default then it keeps
running.


Tomoto

Jun 23, 2011, 6:33:42 AM
to OpenNI
Great, it runs! (Although the user tracking does not work, as you say.)
I pushed the fix into the repository.

I think the properties NITE looks for are limited to these, because I
wrote strict code as below and it still ran. It seems the values must
be "correct" to get the user tracking working.

--- a/src/MSRKinectDepthGenerator.cpp
+++ b/src/MSRKinectDepthGenerator.cpp
@@ -125,3 +125,34 @@ void MSRKinectDepthGenerator::UnregisterFromViewPointChange(XnCallbackHandle hCallback)
 	m_viewPointChangeEvent.Unregister(hCallback);
 }
 
+XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar* strName, XnUInt64& nValue) const
+{
+	// todo -- dummy values to allow UserGenerator to run
+	if (strcmp(strName, "MaxShift") == 0 || strcmp(strName, "ZPD") == 0 || strcmp(strName, "ConstShift") == 0) {
+		nValue = 2047;
+		return XN_STATUS_OK;
+	}
+	return XN_STATUS_ERROR;
+}
+
+XnStatus MSRKinectDepthGenerator::GetRealProperty(const XnChar* strName, XnDouble& dValue) const
+{
+	// todo -- dummy values to allow UserGenerator to run
+	if (strcmp(strName, "ZPPS") == 0 || strcmp(strName, "LDDIS") == 0) {
+		dValue = 1.0;
+		return XN_STATUS_OK;
+	}
+	return XN_STATUS_ERROR;
+}
+
+XnStatus MSRKinectDepthGenerator::GetGeneralProperty(const XnChar* strName, XnUInt32 nBufferSize, void* pBuffer) const
+{
+	// todo -- dummy values to allow UserGenerator to run
+	if (strcmp(strName, "S2D") == 0 || strcmp(strName, "D2S") == 0) {
+		return XN_STATUS_OK;
+	}
+	return XN_STATUS_ERROR;
+}
+



Amir Hirsch

Jun 23, 2011, 6:36:57 AM
to openn...@googlegroups.com
I went and tweaked Sensor to print a dump. Here's a log of the
GetIntProperty and GetRealProperty calls along with valid numbers. I'm
going to put these into your code now as return values, and if that
works... that's just epic. Otherwise I'll send a disappointing
email in 10 minutes.

GetProperty (INT): SupportedModesCount : 15
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 536870912
GetProperty (INT): SupportedModesCount : 11
GetProperty (INT): MaxShift : 2047
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 536870912
GetProperty (REAL): LDDIS : 0
GetProperty (REAL): ZPPS : 536870912
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 536870912
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 536870912
GetProperty (INT): ZPD : 120
GetProperty (INT): ConstShift : 200
GetProperty (INT): ConstShift : 200



Amir Hirsch

Jun 23, 2011, 6:38:58 AM
to openn...@googlegroups.com
Oops, forgot to make it %f:

GetProperty (INT): SupportedModesCount : 15
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): SupportedModesCount : 11
GetProperty (INT): MaxShift : 2047
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (REAL): LDDIS : 7.500000
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200

Amir Hirsch

Jun 23, 2011, 7:01:27 AM
to openn...@googlegroups.com
So the issue is probably not related to the Int or Real values; it's
probably something to do with the table. Here's the output from my
code faking it (btw, I needed to change strName == "foo" to
strcmp(strName, "foo") == 0):

GetProperty (INT): MaxShift : 2047
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (REAL): LDDIS : 7.500000
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (INT): ConstShift : 200
GetProperty (INT): ConstShift : 200

Amir Hirsch

Jun 23, 2011, 7:51:49 AM
to openn...@googlegroups.com
OK, I've been logging all sorts of communication between NITE and
SensorKinect, but I'm too tired to examine it. It's hard to understand
what NITE is consuming directly from the Sensor that isn't obviously
exposed by OpenNI.

This all came up back in December; this thread is great:
http://groups.google.com/group/openni-dev/browse_thread/thread/f9dbfd4069722f9d

I'm nearly certain I can make a module that tricks NITE into thinking
our KinectSDK depth node is a SensorKinect equivalent.

But it's more interesting to get the user map and the skeleton data
into OpenNI. We can generate the rotation matrices from the Position
data and also add in some pose detection capabilities. We'll need to
establish a user acquisition method that takes advantage of not
needing any calibration.
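[Editor's sketch] One way to generate a rotation matrix from two joint positions, as suggested above, is to take the bone direction as one axis and complete an orthonormal basis against a world up vector. This is my own illustration, not code from either SDK; the names and the matrix layout are hypothetical:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
static Vec3 cross(Vec3 a, Vec3 b) { Vec3 r = { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; return r; }
static Vec3 normalize(Vec3 v)
{
    double n = sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    Vec3 r = { v.x / n, v.y / n, v.z / n };
    return r;
}

// Build a 3x3 rotation matrix (row-major; columns are the x/y/z axes)
// whose y axis points along the bone from 'parent' to 'child'.
void BoneOrientation(Vec3 parent, Vec3 child, double m[9])
{
    Vec3 y = normalize(sub(child, parent));           // bone axis
    Vec3 up = { 0.0, 1.0, 0.0 };
    if (fabs(y.y) > 0.99) { up.x = 1.0; up.y = 0.0; } // avoid a near-parallel up vector
    Vec3 x = normalize(cross(up, y));
    Vec3 z = cross(x, y);                             // unit length: x and y are orthonormal
    m[0] = x.x; m[1] = y.x; m[2] = z.x;
    m[3] = x.y; m[4] = y.y; m[5] = z.y;
    m[6] = x.z; m[7] = y.z; m[8] = z.z;
}
```

The roll around the bone axis is not observable from two positions alone, which is why the up vector has to be chosen by convention.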


Amir



Tomoto

Jun 23, 2011, 3:07:47 PM
to OpenNI
I encoded these values but still had no luck. So D2S and S2D might
be the key.

Tomoto
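[Editor's sketch] One guess at satisfying the S2D/D2S requests: since the Microsoft driver already reports depth in millimeters, an identity-style mapping over the shift range might suffice instead of the PrimeSense disparity formula. The builder functions and the identity assumption are mine, not from either SDK:

```cpp
#include <vector>
#include <cstdint>

static const int MAX_SHIFT = 2047; // matches the MaxShift value fed to NITE in the thread

// Identity-style shift-to-depth table: each "shift" maps straight to a
// depth in millimeters, because the MS driver already provides real depth.
std::vector<uint16_t> BuildS2D()
{
    std::vector<uint16_t> s2d(MAX_SHIFT + 1);
    for (int s = 0; s <= MAX_SHIFT; ++s)
        s2d[s] = static_cast<uint16_t>(s);
    return s2d;
}

// Inverse table: depth (mm) back to the shift that produced it.
std::vector<uint16_t> BuildD2S(const std::vector<uint16_t>& s2d, int maxDepth)
{
    std::vector<uint16_t> d2s(maxDepth + 1, 0);
    for (int s = 0; s < static_cast<int>(s2d.size()); ++s)
        if (s2d[s] <= maxDepth)
            d2s[s2d[s]] = static_cast<uint16_t>(s);
    return d2s;
}
```

These buffers would then be copied into pBuffer when GetGeneralProperty is asked for "S2D" or "D2S". Whether NITE tolerates an identity mapping is untested here.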

On 6月23日, 午前3:38, Amir Hirsch <a...@tinkerheavy.com> wrote:
> oops forgot to make it %f:
>
> GetProperty (INT): SupportedModesCount : 15
> GetProperty (INT): ZPD : 120
> GetProperty (REAL): ZPPS : 0.105200
> GetProperty (INT): SupportedModesCount : 11
> GetProperty (INT): MaxShift : 2047
> GetProperty (INT): ZPD : 120
> GetProperty (REAL): ZPPS : 0.105200
> GetProperty (REAL): LDDIS : 7.500000
> GetProperty (REAL): ZPPS : 0.105200
> GetProperty (INT): ZPD : 120
> GetProperty (REAL): ZPPS : 0.105200
> GetProperty (INT): ZPD : 120
> GetProperty (REAL): ZPPS : 0.105200
> GetProperty (INT): ZPD : 120
> GetProperty (INT): ConstShift : 200
> GetProperty (INT): ConstShift : 200
>
> 2011/6/23AmirHirsch <a...@tinkerheavy.com>:
>
>
>
> > I went and tweaked Sensor to print a dump. Here's a log of the
> > GetIntProperty and GetRealProperty calls along with valid numbers. I'm
> > going to put these into your code now as return values and if that
> > works well... that's just epic.. otherwise i'll send a disappointing
> > email in 10 minutes.
>
> > GetProperty (INT): SupportedModesCount : 15
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 536870912
> > GetProperty (INT): SupportedModesCount : 11
> > GetProperty (INT): MaxShift : 2047
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 536870912
> > GetProperty (REAL): LDDIS : 0
> > GetProperty (REAL): ZPPS : 536870912
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 536870912
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 536870912
> > GetProperty (INT): ZPD : 120
> > GetProperty (INT): ConstShift : 200
> > GetProperty (INT): ConstShift : 200
>
> > 2011/6/23AmirHirsch <a...@tinkerheavy.com>:
> >> It's definitely looking for more Int and Real properties than just
> >> those it seems. If I return 0 for anything but those properties it
> >> still freezes, but if I always assign a default then it's still
> >> running.
>
> >> On Thu, Jun 23, 2011 at 1:32 AM, Tomoto <tom...@gmail.com> wrote:
> >>> Interesting. I figured out NITE asked the following values. Eventually
> >>> I got a "division by zero" error when just returned XN_STATUS_OK for
> >>> everything. Perhaps some proper value needs to be given..
>
> >>> GetIntProperty:
> >>> #define XN_STREAM_PROPERTY_MAX_SHIFT                            "MaxShift"
> >>> #define XN_STREAM_PROPERTY_CONST_SHIFT                          "ConstShift"
> >>> #define XN_STREAM_PROPERTY_ZERO_PLANE_DISTANCE          "ZPD"
>
> >>> GetRealProperty:
> >>> #define XN_STREAM_PROPERTY_ZERO_PLANE_PIXEL_SIZE        "ZPPS"
> >>> #define XN_STREAM_PROPERTY_EMITTER_DCMOS_DISTANCE       "LDDIS"
>
> >>> GetGeneralProperty:
> >>> /** XN_DEPTH_TYPE[] */
> >>> #define XN_STREAM_PROPERTY_S2D_TABLE                            "S2D"
> >>> /** XnUInt16[] */
> >>> #define XN_STREAM_PROPERTY_D2S_TABLE                            "D2S"
>
> >>> On 6月23日, 午前1:09,AmirHirsch <a...@tinkerheavy.com> wrote:
> >>>> you don't actually need to return a value for the MaxShift and can
> >>>> just return XN_STATUS_OK when NITE asks for it. The shift2Depth table
> >>>> is gotten through the GetGeneralProperty call:
>
> >>>> XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar*
> >>>> strName, XnUInt64& nValue) const
> >>>> {
> >>>>         return XN_STATUS_OK;}
>
> >>>> XnStatus MSRKinectDepthGenerator::GetGeneralProperty(const XnChar*
> >>>> strName, XnUInt32 nBufferSize, void* pBuffer) const
> >>>> {
> >>>>         return XN_STATUS_OK;
>
> >>>> }
>
> >>>> 2011/6/23AmirHirsch <a...@tinkerheavy.com>:
> >>>> >>> HiAmirand everyone,
>
> >>>> >>> I got the same "maxShift" issue, and, so far, it looks nontrivial to
> >>>> >>> get through it...
>
> >>>> >>> It seems NITE's skeleton tracker only works with PrimeSense devices
> >>>> >>> according to the following discussion:http://groups.google.com/group/openni-dev/browse_thread/thread/f9dbfd...
> >>>> >>> . (Note Kinect could be a "3rd party device" for PrimeSense if covered
> >>>> >>> by Microsoft's driver.) Also, I found XnDepthStream#GetMaxShift method
> >>>> >>> in the source code of SensorKinect. I guess NITE depends on this
> >>>> >>> method and apparently Microsoft's driver does not have it.
>
> >>>> >>> To avoid this issue, probably we need to wrap Microsoft's skeleton
> >>>> >>> tracker for OpenNI so that we could (unwillingly) skip NITE at all.
> >>>> >>> Or, it may be possible to wrap Microsoft's Kinect driver in a way
> >>>> >>> NITE's skeleton tracker expects. In any ways, it looks nontrivial and
> >>>> >>> painful work. :-/
>
> >>>> >>> Anyway, I will share my source code in some public source code
> >>>> >>> repository.
>
> >>>> >>> Thanks,
> >>>> >>> Tomoto
>
> >>>> >>> On 6月22日, 午前10:14,AmirHirsch <a...@tinkerheavy.com> wrote:
>
> >>>> >>> > So NiViewer can draw the data from our depth node that produces frames
> >>>> >>> > through the KinectSDK now.
>
> >>>> >>> > But when I try to use the usertracker example I get some nonsense
> >>>> >>> > aboutmaxShift:
>
> >>>> >>> > C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
> >>>> >>> > Attempting to open \\?\USB#VID_045E&PID_02AE#B00363600976047B#{00873FDF-61A8-11D
> >>>> >>> > 1-EF5E-00C04F2D728B}\00
> >>>> >>> > KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00363600976047B#{00873FDF-61A8-11
> >>>> >>> > D1-EF5E-00C04F2D728B}\00\PIPE01)
> >>>> >>> > KinectCamera_OpenStreamEndpoint Opened successfully.
> >>>> >>> > Couldn't getmaxShift.
> >>>> >>> > Couldn't getmaxShift.
> >>>> >>> > Find user generator failed: Error!
>
> >>>> >>> > working on it..
>
> >>>> >>> >Amir
>
> >>>> >>> > On Wed, Jun 22, 2011 at 7:10 AM, alex summer
>
> >>>> >>> > <ezekielninju...@hotmail.com> wrote:
> >>>> >>> > > I hope this works! Keep us informed of the progress, you're almost
> >>>> >>> > > done!!!
> >>>> >>> > > This will be very usefull to everyone! Cheers :D
>
> >>>> >>> > > Alex
>
> >>>> >>> > > On Jun 21, 6:24 am,AmirHirsch <a...@tinkerheavy.com> wrote:
> >>>> >>> > >> Latest update: I can get the Kinect to turn on using the Kinect SDK
> >>>> >>> > >> invoked through OpenNI module.
>
> >>>> >>> > >> I am modifying NiSampleModule to get depth from msft Kinect SDK and
> >>>> >>> > >> feed it to OpenNI.
>
> >>>> >>> > >> First the DLL hell stuff in VS2010:
> >>>> >>> > >> your "additional dependencies" in the project properties -> linker ->
> >>>> >>> > >> input add MSRKinectNUI.lib
> >>>> >>> > >> MSRKinectNUI.lib;openNI.lib;%(AdditionalDependencies)
>
> >>>> >>> > >> i dragged the msrkinectnui.lib into my project too for good measure
>
> >>>> >>> > >> in the VC++ directories settings:
> >>>> >>> > >> $(MSRKINECTSDK)\inc to include
> >>>> >>> > >> $(MSRKINECTSDK)\lib to libraries
>
> >>>> >>> > >> in SampleDepth.cpp i now initialize the kinect with the Kinect SDK. I
> >>>> >>> > >> needed to include some headers:
>
> >>>> >>> > >> #include "wtypes.h"
> >>>> >>> > >> #include "MSR_NuiApi.h"
>
> >>>> >>> > >> ..
> >>>> >>> > >> and modify init:
>
> >>>> >>> > >> XnStatus
>
> ...
>

Tomoto

unread,
Jun 23, 2011, 4:44:50 PM6/23/11
to OpenNI
I agree with "it's more interesting to get the user-map and their
skeleton data
into OpenNI". Actually I have a prototype of UserGenerator over MS SDK
that runs but returns nothing.

Tomoto

On Jun 23, 4:51 am, Amir Hirsch <a...@tinkerheavy.com> wrote:
> ok. i've been logging all sorts of communication between NITE and
> sensorkinect, but i'm too tired to examine it. It's hard to understand
> what NITE is consuming directly from the Sensor that isn't obviously
> exposed by OpenNI.
>
> This all came up back in December, this thread is great:http://groups.google.com/group/openni-dev/browse_thread/thread/f9dbfd...
>
> I'm nearly certain I can make a module that tricks NITE into thinking
> our KinectSDK depth node is a sensorkinect equivalent.
>
> But it's more interesting to get the user-map and their skeleton data
> into OpenNI. We can generate the rotation matrices from the Position
> data and also add in some pose detection capabilities. We'll need to
> establish a user acquisition method that takes advantage of not
> needing any calibration.
>
> Amir
>
> 2011/6/23 Amir Hirsch <a...@tinkerheavy.com>:
>
> > So the issue is probably not related to the Int or Real values and is
> > probably something to do with the table. here's the output from my
> > code faking it (btw, needed to change strName == "foo" to
> > strcmp(strName, "foo") == 0)
>
> > GetProperty (INT): MaxShift : 2047
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (REAL): LDDIS : 7.500000
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (INT): ZPD : 120
> > GetProperty (INT): ConstShift : 200
> > GetProperty (INT): ConstShift : 200
>
> > On Thu, Jun 23, 2011 at 3:38 AM, Amir Hirsch <a...@tinkerheavy.com> wrote:
> >> oops forgot to make it %f:
>
> >> GetProperty (INT): SupportedModesCount : 15
> >> GetProperty (INT): ZPD : 120
> >> GetProperty (REAL): ZPPS : 0.105200
> >> GetProperty (INT): SupportedModesCount : 11
> >> GetProperty (INT): MaxShift : 2047
> >> GetProperty (INT): ZPD : 120
> >> GetProperty (REAL): ZPPS : 0.105200
> >> GetProperty (REAL): LDDIS : 7.500000
> >> GetProperty (REAL): ZPPS : 0.105200
> >> GetProperty (INT): ZPD : 120
> >> GetProperty (REAL): ZPPS : 0.105200
> >> GetProperty (INT): ZPD : 120
> >> GetProperty (REAL): ZPPS : 0.105200
> >> GetProperty (INT): ZPD : 120
> >> GetProperty (INT): ConstShift : 200
> >> GetProperty (INT): ConstShift : 200
>
> >> 2011/6/23 Amir Hirsch <a...@tinkerheavy.com>:
> >>> I went and tweaked Sensor to print a dump. Here's a log of the
> >>> GetIntProperty and GetRealProperty calls along with valid numbers. I'm
> >>> going to put these into your code now as return values and if that
> >>> works well... that's just epic.. otherwise i'll send a disappointing
> >>> email in 10 minutes.
>
> >>> GetProperty (INT): SupportedModesCount : 15
> >>> GetProperty (INT): ZPD : 120
> >>> GetProperty (REAL): ZPPS : 536870912
> >>> GetProperty (INT): SupportedModesCount : 11
> >>> GetProperty (INT): MaxShift : 2047
> >>> GetProperty (INT): ZPD : 120
> >>> GetProperty (REAL): ZPPS : 536870912
> >>> GetProperty (REAL): LDDIS : 0
> >>> GetProperty (REAL): ZPPS : 536870912
> >>> GetProperty (INT): ZPD : 120
> >>> GetProperty (REAL): ZPPS : 536870912
> >>> GetProperty (INT): ZPD : 120
> >>> GetProperty (REAL): ZPPS : 536870912
> >>> GetProperty (INT): ZPD : 120
> >>> GetProperty (INT): ConstShift : 200
> >>> GetProperty (INT): ConstShift : 200
>
> >>> 2011/6/23 Amir Hirsch <a...@tinkerheavy.com>:
> ...
>
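Incidentally, the strcmp fix Amir mentions is essential in any module that dispatches on property names: comparing a const XnChar* against a string literal with == compares pointer addresses, not string contents. A minimal illustration (a hypothetical handler with the values from the logs above, not the bridge's actual code):

```cpp
#include <cstring>
#include <cstdint>

typedef uint64_t XnUInt64;  // stand-in for OpenNI's typedef

// Hypothetical property handler: the strcmp form is required because
// '==' on char pointers compares addresses, so a name passed in from
// another module would never match a local string literal.
bool getIntProperty(const char* strName, XnUInt64& nValue)
{
    if (std::strcmp(strName, "MaxShift") == 0) {
        nValue = 2047;  // the value NITE expects, per the dump above
        return true;
    }
    if (std::strcmp(strName, "ConstShift") == 0) {
        nValue = 200;
        return true;
    }
    return false;  // unknown property
}
```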

Tomoto

unread,
Jun 26, 2011, 3:59:24 PM6/26/11
to OpenNI
Hello,

Finally I have implemented a UserGenerator that works over the MS Kinect
SDK!
https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes/

Now you can use Depth, Image, and User nodes. This UserGenerator
implementation directly passes through the user and skeleton data
recognized by Kinect SDK to the application, so the recognition
process is all done by the Kinect SDK without NITE. You can see that user
tracking starts without the psi pose, along with other behaviors where
the Kinect SDK differs from NITE.

I tested this module with NiUserTracker and my Hacks (kinect-ultra and
kinect-kamehameha), and they were sort of working. It is not
perfect, though. For example, slight differences in joint
positions can make the applications harder to use and require some
tuning.

Enjoy!
Tomoto
> > >>>>>> you don't actually need to return a value for the MaxShift and can
> > >>>>>> just return XN_STATUS_OK when NITE asks for it. The shift2Depth table
> > >>>>>> is retrieved through the GetGeneralProperty call:
>
> > >>>>>> XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar*
> > >>>>>> strName, XnUInt64& nValue) const
> > >>>>>> {
> > >>>>>> return XN_STATUS_OK;}
>
> > >>>>>> XnStatus MSRKinectDepthGenerator::GetGeneralProperty(const XnChar*
> > >>>>>> strName, XnUInt32 nBufferSize, void* pBuffer) const
> > >>>>>> {
> > >>>>>> return XN_STATUS_OK;
>
> > >>>>>> }
>
> > >>>>>> 2011/6/23 Amir Hirsch <a...@tinkerheavy.com>:
>
> > >>>>>> > Thanks Tomoto for producing a good project structure for this! I've
> > >>>>>> > extended your work and I've fixed the MaxShift issue and am now dealing
> > >>>>>> > with other stuff.
>
> > >>>>>> > For everyone's reference, if you install Tomoto's dll and run
> > >>>>>> > UserTracker (which uses NITE) you'll see that NITE looks for some
> > >>>>>> > property called maxShift:
>
> > >>>>>> > C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
> > >>>>>> > Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D
> > >>>>>> > 1-EF5E-00C04F2D728B}\00
> > >>>>>> > KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11
> > >>>>>> > D1-EF5E-00C04F2D728B}\00\PIPE01)
> > >>>>>> > KinectCamera_OpenStreamEndpoint Opened successfully.
> > >>>>>> > KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11
> > >>>>>> > D1-EF5E-00C04F2D728B}\00\PIPE00)
> > >>>>>> > KinectCamera_OpenStreamEndpoint Opened successfully.
> > >>>>>> > Couldn't get maxShift.
> > >>>>>> > Couldn't get maxShift.
> > >>>>>> > Find user generator failed: Error!
>
> > >>>>>> > I've gone down the rabbit hole of the Sensor code inheritance (someone
> > >>>>>> > at PrimeSense really likes extending classes, huh?) I've discovered
> > >>>>>> > that the MaxShift is a property requested through a GetIntProperty
> > >>>>>> > request in the XnSensorProductionNode:
>
> > >>>>>> > I added this little work-around to the MSRKinectDepthGenerator which
> > >>>>>> > just reports 2047 when asked for the IntProperty "MaxShift"
> > >>>>>> > (frankly, i don't think this property actually matters and I wonder
> > >>>>>> > why NITE has dependencies on hidden properties)
>
> > >>>>>> > XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar*
> > >>>>>> > strName, XnUInt64& nValue) const
> > >>>>>> > {
> > >>>>>> > if (strName== "MaxShift")
> > >>>>>> > {
> > >>>>>> > nValue = 2047;
> > >>>>>> > }
> > >>>>>> > return XN_STATUS_OK;
> > >>>>>> > }
>
> > >>>>>> > so now it gets a MaxShift and the error looks like this, saying it
> ...
>

Amir Hirsch

unread,
Jun 26, 2011, 9:00:29 PM6/26/11
to openn...@googlegroups.com
even though you can't shoot fireballs, you're a real hero! :)

Amir

2011/6/26 Tomoto <tom...@gmail.com>:

Joshua Blake

unread,
Jun 26, 2011, 9:41:40 PM6/26/11
to openn...@googlegroups.com
Tomoto,
 
That's great news. I'll try it out sometime soon and let you know how it works with my apps.
 
Josh

---
Joshua Blake
Microsoft Surface MVP
OpenKinect Community Founder http://openkinect.org

(cell) 703-946-7176
Twitter: http://twitter.com/joshblake
Blog: http://nui.joshland.org
Natural User Interfaces in .NET book: http://bit.ly/NUIbook





2011/6/26 Tomoto <tom...@gmail.com>

Tomoto

unread,
Jun 27, 2011, 2:02:13 AM6/27/11
to OpenNI
Amir, Josh,

This is a quick hack done in a couple of days, and I have skipped
implementing a number of details. Let me know your experience. Also I
think I could add you as team members so that you could directly
modify the code in the repository.

Thanks,
Tomoto

On Jun 26, 6:41 pm, Joshua Blake <joshbl...@gmail.com> wrote:
> Tomoto,
>
> That's great news. I'll try it out sometime soon and let you know how it
> works with my apps.
>
> Josh
>
> ---
> Joshua Blake
> Microsoft Surface MVP
> OpenKinect Community Founderhttp://openkinect.org
> ...
>

Joshua Blake

unread,
Jun 27, 2011, 2:14:27 PM6/27/11
to openn...@googlegroups.com
Tomoto:
 
LOL @:
	virtual XnBool NeedPoseForCalibration()
	{
		return FALSE; // yay!
	}
I tested the bridge with InfoStrat.MotionFx (using OpenNI.Net) and it works after I changed these:
 
* Added to the openni.xml file the User node with query (per Readme)
* Removed from openni.xml Gesture and Hands nodes
* Commented out in my code where it subscribes to the PoseDetected event. This caused a not supported error. (I suppose I could have checked whether the capability is supported before subscribing but I didn't.)
* Changed some code that requires NewUser to be called before CalibrationEnd.
 
That last point illustrates a bug - CalibrationEnd event is called for a user before the NewUser event.
 
I also noticed that the skeleton has a lot more jitter than expected. I'm looking through your code to see whether you set up the skeleton smoothing parameters.
 
Side note should probably go on a separate thread:
Seeing a new implementation of the various generators and comparing the OpenNI and Kinect SDK APIs makes me think we need a discussion as a community about iterating the OpenNI APIs to be more flexible for different types of generators. One example is that the OpenNI UserGenerator API is basically a direct wrapper for NITE, including its limitations and requirements about pose detection and calibration. We should figure out how to have a standard, generic API but also elegantly allow access to new or special features that a Generator exposes.
 
Josh

 
2011/6/27 Tomoto <tom...@gmail.com>
> ...
>

Amir Hirsch

unread,
Jun 27, 2011, 3:03:40 PM6/27/11
to openn...@googlegroups.com
To your last point, I've brought this up before since calibration-free skeleton tracking still requires some method of player determination. If we only get to track two skeletons but there are 5 users, how do we determine who is player 1 and player 2?

One option is to look for a pose: cycle the skeleton tracker through each user for a fixed amount of time until you find one standing in the pose.
Another option is location based: select a user based on where they are standing. Yet another option is to use speech recognition with mic-array localization to determine which player to skeleton-track.
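The location-based option could be sketched as follows; the User struct and the distance-to-sensor metric are illustrative stand-ins, not part of OpenNI or the Kinect SDK:

```cpp
#include <vector>
#include <algorithm>

// Hypothetical "location based" player selection: given the center of
// mass of each detected user, actively track the users closest to the
// sensor, up to the tracker's limit (two skeletons on the Kinect).
struct User { int id; float x, y, z; };  // z = distance from sensor, meters

std::vector<int> pickActivePlayers(std::vector<User> users,
                                   size_t maxActive = 2)
{
    // Sort by distance; nearest users become the active players.
    std::sort(users.begin(), users.end(),
              [](const User& a, const User& b) { return a.z < b.z; });
    std::vector<int> ids;
    for (size_t i = 0; i < users.size() && i < maxActive; ++i)
        ids.push_back(users[i].id);
    return ids;
}
```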




Amir


2011/6/27 Joshua Blake <josh...@gmail.com>

Joshua Blake

unread,
Jun 27, 2011, 3:59:52 PM6/27/11
to openn...@googlegroups.com
Amir,
 
I don't think the Kinect SDK allows us to determine which players are actively tracked (with skeletons). It passively tracks just the player indexes and center positions of up to six players, and as far as I can tell it just picks two of them to track actively.
 
In general though, you're right that applications still need a way for users to engage or disengage interaction. This is why on Xbox, for example, you must wave your hand to get its attention. Unlike OpenNI, the Kinect is already tracking your hand actively, and the wave gesture just tells it you want to interact. You can tell this because the animation starts to track your hand after only half a wave or one wave.
 
Josh
2011/6/27 Amir Hirsch <am...@catalystac.com>

Amir Hirsch

unread,
Jun 27, 2011, 11:19:39 PM6/27/11
to openn...@googlegroups.com
Hi! I got Tomoto's Kinect SDK bindings to OpenNI working. Totally awesome.

here are the gotchas:

1) Orientation isn't supported yet (we can get that working soon!)
2) The image and depth data generator is still not compatible with
NITE (I'm working on it, I'm gonna print out the shift->depth data
tables, but I can't tell if it wants something in the frame metadata?)
3) I needed to go to the XnVFeatures 1.3.0 and 1.3.1 directories under
NITE and run niReg -u XnVFeatures.dll so OpenNI would use your user
tracker instead of NITE.

Amir


2011/6/26 Tomoto <tom...@gmail.com>:

Tomoto

unread,
Jun 28, 2011, 4:41:45 AM6/28/11
to OpenNI
Hi Josh,

Appreciate your feedback!

> That last point illustrates a bug - CalibrationEnd event is called for a user before the NewUser event.

Thanks for pointing out. This was a bug and I fixed it.

> I also noticed that the skeleton has a lot more jitter than expected.

The skeleton smoothing should be turned on when you give a non-zero
value to SkeletonCapability::SetSmoothing. Does it not work as you
expect?

As for your side note -- yes, this always happens when you design a
framework. It is very difficult to foresee every future requirement
upfront, and you need to change the design as you see new needs. I feel the
event set defined by OpenNI today is basically fairly good. (As a
matter of fact, I like that OpenNI already allowed the application to
switch its logic according to whether the generator requires pose
detection or not; otherwise, it would have been more trouble to
support the Kinect SDK.) It may be useful to add more events with more
granular, more generic, and clearer semantics (like a TrackingStart event,
distinct from both NewUser and CalibrationEnd).

Thanks,
Tomoto
> ...
>

Tomoto

unread,
Jun 28, 2011, 4:53:24 AM6/28/11
to OpenNI
Hi Amir,

1) Sounds great!

3) You do not need to niReg -u XnVFeatures; just add a <Query>
element to the configuration XML (as noted in the README) to "match" only
our generator:

...
<Node type="User" name="User1">
  <Query>
    <Name>MSRKinectUserSkeletonGenerator</Name>
  </Query>
</Node>
...

Thanks,
Tomoto
> ...
>

Felix

unread,
Jun 28, 2011, 8:19:09 AM6/28/11
to OpenNI
Hi,

got it running, too.

My only obstacles were all in the config.xml:

* Besides removing the Gesture and Hands nodes, I also had to remove the
Scene nodes. No problem, as I haven't used any of them recently.
* The MapOutputMode for the Image node with 1280x1024@15fps didn't
work, so I needed to switch back to 640x480@30fps.
Kind of a pity, as I don't need an RGB video stream; I only want
to take hi-res photos at specific points.

Anyway, Thanks Tomoto, great work!

Tomoto

unread,
Jun 28, 2011, 2:49:33 PM6/28/11
to OpenNI
Hi Felix,

Thanks for your interest and feedback!

* It was possible to implement the scene generator, but I thought
nobody would use it.

* OK, give me some time to support 1280x1024 resolution.

Tomoto.

Joshua Blake

unread,
Jun 28, 2011, 3:08:28 PM6/28/11
to openn...@googlegroups.com
Tomoto,

> The skeleton smoothing should be turned on when you give a non-zero
> value to SkeletonCapability::SetSmoothing. Does it not work as you
> expect?

I tried the SetSmoothing call but it didn't change much. I also changed your implementation slightly to set different Kinect SDK smoothing parameters, and it helped a bit.
 
What I was seeing is when I move ScatterView images around the screen using the Hand joints and I pause, there is a bit of jitter that I don't believe was there with NITE. It is probably just an artifact of how Kinect SDK does skeleton tracking as opposed to NITE.
 
I think it might be interesting to add another clause to the ones the Kinect SDK requires that says something along the lines of "If this code is used with a version of the Kinect SDK that does not require the above clauses, then this license can be replaced with an unmodified [Apache 2 | BSD] license." Basically meaning that if they change the license on future versions, the code we distribute reverts to a normal open source license.
 
(Not quite as extensive as Kevin Connolly's clause 4, near the bottom: http://kinectnui.codeplex.com/SourceControl/changeset/view/67930#1224657)
 
Josh
 

Tomoto

unread,
Jun 29, 2011, 5:16:54 AM6/29/11
to OpenNI
Hi Felix,

Now 1280x1024 is supported.

Thanks,
Tomoto

Felix

unread,
Jun 29, 2011, 6:13:36 AM6/29/11
to OpenNI
Hi Tomoto,

thanks for the fast implementation, now I can run my applications with
both drivers and the same functionality :-)

I noticed that the minimum distance is 85 cm as opposed to about
30 cm with NITE, but I guess that is a limitation of the Microsoft
SDK?

Thanks again,
Felix

Tomoto

unread,
Jun 29, 2011, 6:14:48 AM6/29/11
to OpenNI
Hi Josh,

> It is probably just an artifact of how Kinect SDK does skeleton tracking as opposed to NITE.

I guess so too. I would appreciate it if you could check in or send me
the smoothing parameters you tried. (Did you find some reasonable
formula to map the parameter of SetSmoothing in OpenNI to
NUI_TRANSFORM_SMOOTH_PARAMETERS in Kinect SDK?)
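For reference, one conceivable shape for such a mapping from OpenNI's single smoothing factor to the five fields of NUI_TRANSFORM_SMOOTH_PARAMETERS is sketched below. The struct is mirrored locally and the interpolation formula is purely a guess at a "reasonable formula", not anything defined by either SDK:

```cpp
// Local mirror of the Kinect SDK's NUI_TRANSFORM_SMOOTH_PARAMETERS
// fields, so this sketch compiles without the SDK headers.
struct SmoothParams {
    float fSmoothing, fCorrection, fPrediction,
          fJitterRadius, fMaxDeviationRadius;
};

// Map OpenNI's SetSmoothing factor (0..1) to the five Kinect SDK
// parameters. The linear interpolation below is illustrative only.
SmoothParams mapSmoothing(float s)
{
    if (s < 0.0f) s = 0.0f;   // clamp to the documented OpenNI range
    if (s > 1.0f) s = 1.0f;
    SmoothParams p;
    p.fSmoothing          = s;                  // more weight on history
    p.fCorrection         = 0.5f * (1.0f - s);  // correct less when smoothing more
    p.fPrediction         = 0.5f * s;
    p.fJitterRadius       = 0.05f + 0.25f * s;  // meters
    p.fMaxDeviationRadius = 0.04f + 0.4f  * s;
    return p;
}
```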

I added a statement to the license terms as you suggested. In fact, I
originally thought to distribute this module under the BSD license (as
well as my other hacks) if allowed. It makes sense to state this
intention and the reason why I cannot do so.

Thanks,
Tomoto

Tomoto

unread,
Jun 29, 2011, 8:17:24 AM6/29/11
to OpenNI
Hi Felix,

Glad to hear it worked!

> I noticed that the minimum distance is 85 cm in opposite to the about
> 30 cm with NITE, but I guess that is the limitation of the Microsoft
> SDK?

I think so. I notice Microsoft's SDK generally has more "blank
area" (I mean pixels whose depth is unknown in the depth map)
than PrimeSense's. Possibly it is a sort of optimization to
better isolate a player's motion. Just a guess.

Tomoto

Amir Hirsch

unread,
Jun 29, 2011, 12:47:14 PM6/29/11
to openn...@googlegroups.com
I think the MS SDK just arbitrarily cuts out anything in front of 85
centimeters.

Amir

Amir Hirsch

unread,
Jun 29, 2011, 2:21:50 PM6/29/11
to openn...@googlegroups.com
Here are the bin dumps for the "GetGeneralProperty" function.

I'll see if this combined with the GetRealProperty GetIntProperty and
GetStrProperty overrides can get NITE working with the MSR Kinect
Depth Generator module.


Amir

CmosBlankingUnits.bin
D2S.bin
InstancePointer.bin
S2D.bin
SupportedModes.bin

Amir Hirsch

unread,
Jun 29, 2011, 2:26:26 PM6/29/11
to openn...@googlegroups.com
list didn't like my attachments:

here: zigfu.com/Sensorbins.rar

Joshua Blake

unread,
Jun 29, 2011, 11:11:05 PM6/29/11
to openn...@googlegroups.com
Amir,
 
Can you figure out a way to use the depth2shift table with the KinectSDK OpenNI bridge and enable the alternate depth sensor viewpoint, which as I understand is not implemented in the Kinect SDK yet?
 
If so, epic win.
 
Thanks,
Josh

Lance Drake

unread,
Jun 29, 2011, 11:11:37 PM6/29/11
to openn...@googlegroups.com
Hi OpenNI folks - 

Can anyone provide any info on whether it's possible to use the Kinect device on a Mac Pro (Snow Leopard) with VMware Fusion v3.1.3?

The "NuiInitialize failed" alert shows up and I am not sure why. The device runs great with the same complement of software on an HP PC. I would love not to have two computers running - I would like to do all my development on my Mac - but the Kinect and this virtual PC seem to have a problem.

Any clues?  THANKS!



BACKGROUND: I have installed Windows 7 Pro (64-bit), the Visual Studio 2010 Ultimate package - the Kinect SDK 64 package - the DirectX SDK June 2010 package - and everything else they point you at.

Amir Hirsch

unread,
Jun 29, 2011, 11:27:55 PM6/29/11
to openn...@googlegroups.com
What alternate viewpoint are you talking about?

And the answer is probably no since we do not get to send messages to
the Kinect via the Kinect SDK and I'm not sure if i can send it data
through some other mechanism while the kinect sdk is loaded in
windows.

This is only useful for convincing NITE to work with whatever depth
data you feed it. Might be useful for auto-labeling a skeleton if you
wanted to implement your own.

It would be really useful if we could choose what to input to
Microsoft's skeleton tracker and feed it our own depth data. Something
we really want to do is filter the depth image for a specific user so
we can select which user to pass to skeleton tracker.

Amir

Tomoto

unread,
Jun 29, 2011, 11:56:53 PM6/29/11
to OpenNI
Hi Josh,

Alternate depth sensor viewpoint is already implemented by the bridge.
(In fact, it was a "must" feature for my AR hacks.) It works by
applying a KinectSDK API that transforms the depth sensor coords to
the RGB camera coords pixel by pixel. I guess this is what you want.

Thanks,
Tomoto


On Jun 29, 8:11 pm, Joshua Blake <joshbl...@gmail.com> wrote:
> Amir,
>
> Can you figure out a way to use the depth2shift table with the KinectSDK
> OpenNI bridge and enable the alternate depth sensor viewpoint, which as I
> understand is not implemented in the Kinect SDK yet?
>
> If so, epic win.
>
> Thanks,
> Josh
>
> On Wed, Jun 29, 2011 at 2:26 PM, Amir Hirsch <a...@tinkerheavy.com> wrote:
> > list didn't like my attachments:
>
> > here: zigfu.com/Sensorbins.rar
>
> > On Wed, Jun 29, 2011 at 11:21 AM, Amir Hirsch <a...@tinkerheavy.com>
> > wrote:
> > > Here are the bin dumps for the "GetGeneralProperty" function.
>
> > > I'll see if this combined with the GetRealProperty GetIntProperty and
> > > GetStrProperty overrides can get NITE working with the MSR Kinect
> > > Depth Generator module.
>
> > > Amir
>
> > > On Wed, Jun 29, 2011 at 9:47 AM, Amir Hirsch <a...@tinkerheavy.com>
> >>>> > > > * The MapOutputMode for the Image node with 1280x1024@15fps didn't

Joshua Blake

unread,
Jun 29, 2011, 11:58:22 PM6/29/11
to openn...@googlegroups.com
Oh OK, I didn't realize that was implemented. (The app in which I use that feature crashed the last time I tried it with the bridge.) I'll get the latest revision and test again.
 
Thanks!

Tomoto

unread,
Jun 30, 2011, 12:06:49 AM6/30/11
to OpenNI
In case I was not clear:

I mean, the "usual" way to turn on the alternative viewpoint, i.e.
depthGenerator.GetAlternativeViewPointCap().SetViewPoint(imageGenerator),
should work as expected. It will turn on the coordinate transformation
within the bridge that uses KinectSDK API. Depth map and user ID map
are transformed pixel by pixel. Skeleton coordinates are also mapped
by using some dirty hack.
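The pixel-by-pixel transformation Tomoto describes can be sketched as follows. A toy linear offset stands in for the Kinect SDK's calibration-based depth-to-color mapping; this only illustrates the data flow, not the real geometry:

```cpp
#include <vector>
#include <cstdint>

// Remap a depth frame into RGB camera coordinates, pixel by pixel.
// (cx, cy) would really come from the SDK's per-pixel coordinate
// mapping; here a constant (dx, dy) offset stands in for it.
void remapDepthToColor(const std::vector<uint16_t>& depth,
                       std::vector<uint16_t>& out,
                       int w, int h, int dx, int dy)
{
    out.assign(w * h, 0);  // 0 = no depth data at this color pixel
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            int cx = x + dx, cy = y + dy;  // stand-in for the SDK call
            if (cx >= 0 && cx < w && cy >= 0 && cy < h)
                out[cy * w + cx] = depth[y * w + x];
        }
}
```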

Tomoto

Tomoto

unread,
Jun 30, 2011, 5:02:44 AM6/30/11
to OpenNI
Amir,

I imported your dump, but had no luck -- I got an access violation.
The code is in the primesense-compatibility branch
(https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes?rev=primesense-compatibility),
so take a look if you are interested.

I observed that the requested data size for the D2S property at runtime was
different from the size of your dump. It might illustrate that the
issue is not so easy that we could solve it simply by "record and replay"
of the data. We may have a chance if we go through Avin's SensorKinect
source code and understand what these properties really are.

Thanks,
Tomoto


On Jun 29, 11:26 am, Amir Hirsch <a...@tinkerheavy.com> wrote:
> list didn't like my attachments:
>
> here: zigfu.com/Sensorbins.rar
>
>
>
>
>
>
>
> On Wed, Jun 29, 2011 at 11:21 AM, Amir Hirsch <a...@tinkerheavy.com> wrote:
> > Here are the bin dumps for the "GetGeneralProperty" function.
>
> > I'll see if this combined with the GetRealProperty GetIntProperty and
> > GetStrProperty overrides can get NITE working with the MSR Kinect
> > Depth Generator module.
>
> > Amir
>

Amir Hirsch

unread,
Jun 30, 2011, 7:44:36 AM6/30/11
to openn...@googlegroups.com, OpenNI
You should try dumping the data yourself. I think I might be losing a large portion of one of the tables because it looks like it is being dumped each frame.

People love us on github.com/tinkerer

Joshua Blake

unread,
Jun 30, 2011, 11:11:03 AM6/30/11
to openn...@googlegroups.com
Tomoto,
In one of my OpenNI apps, I call
var scene = userGenerator.GetUserPixels(userId);

The scene returned has an XRes and YRes of 0.
 
Any idea why?
 
I tried changing the DepthNode to 320x240 but it gave me a not supported exception.
 
Thanks,
Josh

Tomoto

unread,
Jun 30, 2011, 2:35:09 PM6/30/11
to OpenNI
Hi Josh,

I may not have taken care of everything in XnSceneMetaData.

Oh, do you need 320x240?

OK, give me some time...

Thanks,
Tomoto


On Jun 30, 8:11 am, Joshua Blake <joshbl...@gmail.com> wrote:
> Tomoto,
> In one of my OpenNI apps, I call
>
> var scene = userGenerator.GetUserPixels(userId);
>
> The scene returned has an XRes and YRes of 0.
>
> Any idea why?
>
> I tried changing the DepthNode to 320x240 but it gave me a not supported
> exception.
>
> Thanks,
> Josh
>
>
>
>
>
>
>
> On Thu, Jun 30, 2011 at 7:44 AM, Amir Hirsch <a...@tinkerheavy.com> wrote:
> > You should try dumping the data yourself. I think I might be losing a large
> > portion of one of the tables because it looks like it is being dumped each
> > frame.
>
> > People love us on github.com/tinkerer
>
> > On Jun 30, 2011, at 2:02 AM, Tomoto <tom...@gmail.com> wrote:
>
> > > Amir,
>
> > > I imported your dump, but had no luck -- I got an access violation.
> > > The code is in primesense-compatibility branch (https://
>
> >www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes?rev=primes...
> > )
> > >>>>>>>>> * The MapOutputMode for the Image node with 1280x1024@15fps didn't

Joshua Blake

unread,
Jun 30, 2011, 3:04:39 PM6/30/11
to openn...@googlegroups.com

I don't need 320x240 specifically except that I think player indexes are only supported in that mode and not 640x480 in Kinect SDK.

Tomoto

unread,
Jun 30, 2011, 3:37:09 PM6/30/11
to OpenNI
Josh,

Check out the latest code. XnSceneMetaData is now populated by
GetUserPixels.

> I don't need 320x240 specifically except that I think player indexes are only supported in that mode and not 640x480 in Kinect SDK.

Don't worry about this. This bridge gets the data from the Kinect SDK at
320x240 (with player indexes) and then converts it to 640x480. This was
necessary for me to get my applications working without any code
changes (except configuration files).
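The 320x240-to-640x480 conversion can be sketched as a nearest-neighbor upscale that replicates each source pixel (depth value plus player index) into a 2x2 block; the bridge's actual conversion may differ:

```cpp
#include <vector>
#include <cstdint>

// Nearest-neighbor 2x upscale: each source pixel fills a 2x2 block in
// the destination, doubling both dimensions (e.g. 320x240 -> 640x480).
std::vector<uint16_t> upscale2x(const std::vector<uint16_t>& src,
                                int srcW, int srcH)
{
    std::vector<uint16_t> dst(srcW * 2 * srcH * 2);
    for (int y = 0; y < srcH * 2; ++y)
        for (int x = 0; x < srcW * 2; ++x)
            dst[y * srcW * 2 + x] = src[(y / 2) * srcW + (x / 2)];
    return dst;
}
```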

One drawback of this implementation is that you must use 320x240 (= cannot
use 640x480) even if you don't need player indexes. I have left this
restriction so far because there is no elegant way for a Depth node to
know whether the application needs player indexes or not at the time
of opening the image stream.

Thanks,
Tomoto

Tomoto

unread,
Jun 30, 2011, 3:45:15 PM6/30/11
to OpenNI
Hi Amir,

Actually your data was "too sufficient" (larger than the requested
size -- it was 20002 bytes while 8002 were requested), which suggests the data
depends on some parameters we do not know of yet. That is why
I think we should take a look at the SensorKinect source code to see
which parameters the data depends on.

Glad to hear that our work helps other people!

Thanks,
Tomoto


On Jun 30, 4:44 am, Amir Hirsch <a...@tinkerheavy.com> wrote:
> You should try dumping the data yourself. I think I might be losing a large portion of one of the tables because it looks like it is being dumped each frame.
>
> People love us on github.com/tinkerer
>
> On Jun 30, 2011, at 2:02 AM, Tomoto <tom...@gmail.com> wrote:
>
>
>
>
>
>
>
> > Amir,
>
> > I imported your dump, but had no luck -- I got an access violation.
> > The code is in primesense-compatibility branch (https://
> >www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes?rev=primes...)

Joshua Blake

unread,
Jul 2, 2011, 1:47:19 PM7/2/11
to openn...@googlegroups.com
Thanks Tomoto, I'll test it out.
 
On Thu, Jun 30, 2011 at 3:37 PM, Tomoto <tom...@gmail.com> wrote:


One drawback of this implementation is you must use 320x240 (= cannot
use 640x480) even if you don't need player indexes. I have left this
restriction so far because there is no elegant way for a Depth node to
know whether the application needs player indexes or not at the time
of opening the image stream.

 
Another example of why OpenNI's API needs some updating (or de-NITEing ;))
 
Josh

maxOh_

unread,
Jul 3, 2011, 2:21:29 PM7/3/11
to OpenNI
Hi guys,

first of all, thanks a lot for the detailed work you've done. This
thread helped me quite a bit.
I had the same problem with the 'Couldn't get maxShift', but I don't
use the Microsoft SDK. I stream the depth map in real time over the
network to another computer. On the other computer I don't have a
camera connected; I just receive the depth map and feed it to a
ModuleDepthGenerator.
I didn't have the time to try the UserGenerator, but the SceneAnalyser
works! I use the same properties you do for the int/string properties,
but my GeneralProperty byte dumps are a bit different. Maybe this is
because I use the Asus Xtion and not the Kinect, but I will try this
with the Kinect as well.
The GeneralProperty sizes seem to be right; the only thing I did was
connect a camera and dump the GeneralProperties:

// write the bin blocks to disk
_depthGen.GetGeneralProperty("SupportedModes",66,buf);
writeBin("SupportedModes.bin",66,buf);

_depthGen.GetGeneralProperty("D2S",20002,buf);
writeBin("D2S.bin",20002,buf);

_depthGen.GetGeneralProperty("CmosBlankingUnits",8,buf);
writeBin("CmosBlankingUnits.bin",8,buf);

_depthGen.GetGeneralProperty("InstancePointer",4,buf);
writeBin("InstancePointer.bin",4,buf);

_depthGen.GetGeneralProperty("S2D",4096,buf);
writeBin("S2D.bin",4096,buf);

Then I used those bin dumps in the ModuleDepthGenerator, like you did.
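The writeBin helper used above is the poster's own; a minimal implementation might look like this:

```cpp
#include <cstdio>

// Write 'size' bytes from 'buf' to 'path'; returns true on success.
// Matches the writeBin(name, size, buf) call shape used above.
bool writeBin(const char* path, unsigned size, const void* buf)
{
    FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    size_t written = std::fwrite(buf, 1, size, f);
    std::fclose(f);
    return written == size;
}
```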

thanks,
max

RSAbg

unread,
Jul 4, 2011, 2:19:09 AM7/4/11
to OpenNI
Hi,

First of all, my deepest respect to all of you developing these hacks
here!
I will be trying them asap!

My problem was that the NITE skeleton tracking was not good enough for
my purposes. I tried the Microsoft SDK and its skeleton tracking algorithm
really kicks ass. Now I have a big database of .oni files which I need
to analyze, and the Microsoft SDK won't let me use them.

I was thrilled to read that you can use the Microsoft Tracking in
openNI. Is it possible as well to read oni files and pass them to the
microsoft skeleton algorithm?

thank you very much

Benedikt

Tomoto

unread,
Jul 4, 2011, 3:49:36 AM7/4/11
to OpenNI
Hi Josh and all,

I added support for the Depth generator at 640x480. Though it might
not be very useful, I wanted to solve this puzzle as a showcase of how
to deal with the "gap" between the programming models of the
underlying data source and the OpenNI framework. The OpenNI community
might be able to use this as a sort of benchmark to evaluate the
generality of the OpenNI framework.

This implementation automatically chooses the resolution of the
underlying depth stream (640x480 or 320x240) in a way to best satisfy
the application's requirement. This choice is transparent so that the
application sees the same 640x480 data in any cases.

This is how it works:
1. There is a single "manager" for all the underlying data stream.
2. Each generator notifies the manager of its requirement (e.g. "hey,
I am a Depth generator and need 640x480 resolution") upon
initialization. The manager just keeps these requirements, and
suspends the actual initialization of Kinect SDK.
3. At the first call of StartGenerating of any generator, the manager
initializes the Kinect SDK and opens the data streams with the most
desirable parameters as per the requirements collected by that moment.
(e.g. It opens the depth stream with
NUI_IMAGE_TYPE_DEPTH_AND_PLAYER_INDEX@320x240 if both Depth and User
generators are requested, or NUI_IMAGE_TYPE_DEPTH@640x480 if only
Depth generator is requested. The parameters to NuiInitialize are also
chosen in the similar way.)
4. Each generator knows the manager's decision and is responsible for
converting the data from the underlying stream to the format requested
by the application.

It is a hack, but looks kind of OK for me. Someone ("manager" in above
case) eventually needs to know all the characteristics and
restrictions of the underlying data source, and act as an interpreter
with OpenNI's programming model.

> Another example of why OpenNI's API needs some updating (or de-NITEing ;))

I think the "de-NITEing" part is essential. If NITE's analyzers could work
with any OpenNI-conformant Depth generator, I would not need to
come up with these tricks.

Hope it helps,
Tomoto

Tomoto

unread,
Jul 4, 2011, 4:19:08 PM7/4/11
to OpenNI
Hi Benedikt,

> Is it possible as well to read oni files and pass them to the microsoft skeleton algorithm?

I do not think so. AFAIK, the Microsoft skeleton algorithm is
monolithically integrated into the pipeline within the Kinect SDK, and
there is no way to feed your own depth data into it. My User generator
just "wraps" the final result of their skeleton algorithm, which means
it works "by itself" and does not accept any other Depth generator to
feed in the depth data. I am sorry to disappoint you, but that is the
architecture of the Kinect SDK.

Thanks,
Tomoto

Tomoto

unread,
Jul 4, 2011, 6:04:20 PM7/4/11
to OpenNI
Hi Max,

Great to hear that your SceneGenerator worked. I am still getting a
crash from a null pointer access with NITE's Scene or User generator.

Interestingly, the length of the D2S property requested by the NITE
generators is not the same for SensorKinect's depth generator and my
depth generator. It was 20002 bytes when I used SensorKinect's depth
generator, but it changed to 8002 once I replaced the depth generator
with mine. Apparently the "record and play back" approach would not work
in this situation. Anyway, there is no way for me to know if it is
the direct cause of the crash.

Also, I noticed it would not make sense to record and play back the
InstancePointer property (because it looks like a pointer value in
memory), but, fortunately, this property seems unused for
non-PrimeSense depth generators.

Thanks,
Tomoto

RSAbg

unread,
Jul 5, 2011, 1:58:31 AM7/5/11
to OpenNI
Hi Tomoto,
thank you very much for your answer... I feared that you would answer
that.
So I guess the only possible way to do it would be to write a new
driver that pretends to be the Kinect sensor for the Microsoft SDK
and then loads and passes the oni data... am I right?
Unfortunately, that's beyond my programming skills...

maxOh_

unread,
Jul 5, 2011, 11:16:35 AM7/5/11
to OpenNI
Hi tomoto,

> Great to hear that your SceneGenerator worked. I am still getting a
> crash by null pointer access with NITE's Scene or User generator.

I also had these crashes in the beginning, but since I used the
captured data, they disappeared.

> Interestingly, the length of D2S property requested by NITE generators
> is not the same with SensorKinect's depth generator and my depth
> generator. It was 20002 bytes when I used SensorKinect's depth
> generator, but it changed to 8002 once I replaced the depth generator
> with mine. Apparently "record and play back" approach would not work
> in this situation. Anyways, there is no way for me to know if it is
> the direct cause of the crash.

That's strange; when openNI calls GetGeneralProperty for "D2S", I
still get 20002 for the bufferSize.
I also checked the minimum GeneralProperties I need; it works with
"D2S" and "S2D", and for the others I give back XN_STATUS_ERROR.
If I leave out "D2S" or "S2D" I get crashes too.

Finally I tested the userGenerator and it works with the skeletons.
I also tested the same code with the Kinect and it works (before, I used
the Asus Xtion).
All the tests I did under Linux with the 64-bit version, but I don't
think this would be any different on Windows (at least I hope
not ;) ).

max

GGG

unread,
Jul 6, 2011, 6:24:26 PM7/6/11
to OpenNI
Has anyone made any more progress understanding what the UserGenerator
needs from a depth generator to avoid crashes? I have enjoyed reading
through this thread and made it as far as creating a module that works
to show range images in NiViewer, but NiUserTracker will crash when it
tries g_UserGenerator.Create and prints the error:
Couldn't alloc depthToShift buffer

Gaile

Amir Hirsch

unread,
Jul 8, 2011, 5:58:16 AM7/8/11
to openn...@googlegroups.com
I'm working on it. I've discovered that the Sensor driver contains an
ONI device file driver that records all the parameters I had been
writing to .bin files. I've noticed what Tomoto has, that when
GetGeneralProperty asks for

I've also noticed that it asks for things in a totally different order
so it might be important to discover what other things are
communicated between NITE and Sensor. I'll examine the ONI file stuff
tomorrow. My mod of the Sensor driver reports this:

C:\Program Files (x86)\PrimeSense\NITE\Samples\Bin\Release>Sample-Boxes.exe
GetProperty (GENERAL): InstancePointer
Writing 4 to InstancePointer1.bin
GetProperty (INT): SupportedModesCount : 15
GetProperty (GENERAL): SupportedModes
Writing 90 to SupportedModes1.bin
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): MaxShift : 2047
GetProperty (GENERAL): D2S
Writing 20002 to D2S2.bin
GetProperty (GENERAL): S2D
Writing 4096 to S2D3.bin
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (INT): MaxShift : 2047
GetProperty (GENERAL): S2D
Writing 4096 to S2D4.bin
GetProperty (GENERAL): D2S
Writing 20002 to D2S5.bin
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (REAL): LDDIS : 7.500000
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (INT): ConstShift : 200
GetProperty (INT): MaxShift : 2047
GetProperty (GENERAL): D2S
Writing 20002 to D2S6.bin
GetProperty (GENERAL): S2D
Writing 4096 to S2D7.bin
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
Setting resolution to QVGA
GetProperty (INT): ConstShift : 200

Then when I install KinectSDK and attempt to get the depth generator
to look the same as the PrimeSense sensor it asks for totally
different things (MaxShift first instead of InstancePointer).

C:\Program Files (x86)\PrimeSense\NITE\Samples\Bin\Release>Sample-Boxes.exe
Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D
1-EF5E-00C04F2D728B}\00
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11
D1-EF5E-00C04F2D728B}\00\PIPE01)
KinectCamera_OpenStreamEndpoint Opened successfully.
GetIntProperty, returning: MaxShift , 2047
GetGeneralProperty: D2S , 8002
GetGeneralProperty: S2D , 4096
GetRealProperty, returning: ZPPS , 0.105200
GetIntProperty, returning: ZPD , 120
GetIntProperty, returning: MaxShift , 2047
GetGeneralProperty: S2D , 4096
GetGeneralProperty: D2S , 8002
GetIntProperty, returning: ZPD , 120
GetRealProperty, returning: ZPPS , 0.105200
GetRealProperty, returning: LDDIS , 7.500000
GetRealProperty, returning: ZPPS , 0.105200
GetIntProperty, returning: ZPD , 120

Then the NITE boxes sample doesn't work. Booooooooooooo

OK, so now I'm going to go down a deeper rabbit hole and figure out
how ONI playback works.

It's interesting that the NITE algorithms are dependent on
DepthToShift and ShiftToDepth tables pulled from the Sensor driver.


amir

Amir Hirsch

unread,
Jul 8, 2011, 3:27:21 PM7/8/11
to openn...@googlegroups.com
I just discovered that if you report a MaxDepth of 10,000 instead of the
artificially low value that you get from the MSRKinectSDK, then the D2S
table request becomes 20002 bytes instead of 8002. It still doesn't seem
to like the values I report from the bin files.

Tomoto

unread,
Jul 12, 2011, 7:04:03 PM7/12/11
to OpenNI
Hi Amir,

It looks like the size of D2S table = (MaxDepth + 1) * sizeof(short).
That makes sense!
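That relationship also explains the 8002-byte requests seen earlier: if the Kinect SDK reports an artificially low MaxDepth of 4000 mm (an assumption consistent with the numbers in this thread), then (4000 + 1) * 2 = 8002, while a MaxDepth of 10000 gives 20002. A one-function sketch:

```cpp
#include <cstddef>
#include <cstdint>

// Size in bytes of the "D2S" depth-to-shift table requested by NITE:
// one 16-bit entry per depth value from 0 to MaxDepth inclusive.
std::size_t D2STableSize(std::uint32_t maxDepth) {
    return (maxDepth + 1) * sizeof(std::uint16_t);
}
```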

Tomoto

Tomoto

unread,
Jul 17, 2011, 8:09:00 AM7/17/11
to OpenNI
Hi All,

Now the KinectSDK-based DepthGenerator is working with NITE's
UserGenerator! Thanks to helpful information from Max, Amir, and
others. Check it out.

Binary: https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes/release/kinect-mssdk-openni-bridge-0.0.zip
Source: https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes

Tomoto

noisecrime

unread,
Jul 21, 2011, 8:03:56 AM7/21/11
to OpenNI
Fantastic work on this, looks like a great deal of progress has been
made.

A couple of questions if someone could be so kind to answer.

1. Are any openNI/NITE features not usable/accessible?

From the replies, it sounds like many features are working, but what
about stuff like openNI hand tracking and gestures?
Looking at my xml files, I use Image, Depth, User, Gesture, and Hands nodes
in most of my projects, so I would need continued support for them.
I would just like a heads up if something is currently missing, before I
go through the process of uninstalling and reinstalling everything.

2. What features are used from the Kinect SDK?
I can see the most important one, psi-pose-less skeleton tracking, is
supported, but is there anything else of interest? I haven't looked at
the MS SDK much, but I do remember it being rather lacking in features
compared to openNI, so I guess it may not have any other features that
we are interested in?

Mind you, one thing that would be nice is to finally get control of the
Kinect motor. Is that possible using this bridge? If so, how? Is it a
case of simply using both the openNI library and the Kinect SDK library
at the same time?


3. What exactly are the steps to install everything?
In the readme, it just says to install openNI; however, it's unclear to
me if that means just openNI or the whole openNI suite (i.e. NITE) as
well, or what to do about drivers.

So would this be correct?

1. Uninstall openNI, NITE, Avin2 sensor kinect, delete drivers
permanently from system etc.
2. Install MS Kinect SDK, connect Kinect and get official drivers
installed
2a. Run any MS SDK examples to confirm everything is working.
3. Install openNI
4. Install NITE?
5. Run the Kinect Bridge install.bat
6. Run NiReg to confirm the bridge installed


thanks

Joshua Blake

unread,
Jul 21, 2011, 12:40:19 PM7/21/11
to openn...@googlegroups.com
On Thu, Jul 21, 2011 at 8:03 AM, noisecrime <no...@noisecrime.com> wrote:
3. What exactly are the steps to install everything?
In the readme, it just says to install openNI, however its unclear to
me if that means just openNI or the whole openNI suite ( i.e. NITE) as
well or what to do about drivers.

 
To switch between OpenNI/NITE/SensorKinect and OpenNI/Bridge/KinectSDK, you need to do this, assuming you already have NITE/SensorKinect installed:
 
1) Plug in Kinect.
2) Open Device Manager, find the Kinect Camera entry, right click and uninstall, click the checkbox that asks about deleting the software, then click OK.
3) Uninstall SensorKinect from Add/Remove Programs.
4) Install or repair Kinect SDK.
 
You'll also need to register the OpenNI-KinectSDK bridge using the batch file provided, of course, but you do not need to unregister it when switching between backends. You do need to change your openni.xml init file though.
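For reference, a minimal openni.xml production-nodes section looks something like the sketch below; the exact node list depends on your application, and OpenNI resolves each node type against whichever registered module (SensorKinect or the bridge) can satisfy it:

```xml
<OpenNI>
  <Licenses>
    <!-- a NITE license entry goes here if you use NITE nodes -->
  </Licenses>
  <ProductionNodes>
    <Node type="Depth"/>
    <Node type="Image"/>
    <Node type="User"/>
  </ProductionNodes>
</OpenNI>
```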
 
To switch back from Kinect SDK to SensorKinect.
1) Plug in Kinect
2) Open Device Manager, find the Kinect Camera entry, right click and uninstall, click the checkbox that asks about deleting the software, then click OK.
3) Install SensorKinect
 
You don't need to uninstall the KinectSDK, and then if you switch back you just need to do a repair install so it will reinstall the driver. We uninstall SensorKinect only because its repair install does not seem to reinstall the driver.
 
Also, it's important to click the checkbox when uninstalling the driver in Device Manager. That removes the driver from the driver cache. If you just install KinectSDK over top of SensorKinect without uninstalling, it might work, but it's pure chance which driver the Kinect loads. Before I figured out this procedure I went for several hours using the Kinect SDK just fine, then suddenly it started crashing because the device reinitialized at one point and happened to pick the SensorKinect driver for the camera. We need to make sure there is only one driver installed at a time for this reason.
 
Hope this helps.
 
Thanks,
Josh

noisecrime

unread,
Jul 21, 2011, 1:04:31 PM7/21/11
to OpenNI


On Jul 21, 5:40 pm, Joshua Blake <joshbl...@gmail.com> wrote:
>
> To switch between OpenNI/NITE/SensorKinect and OpenNI/Bridge/KinectSDK, you
> need to do this, assuming you already have NITE/SensorKinect installed:
>

Thanks that's a great help, good clear instructions.

Cheers

Amir Hirsch

unread,
Jul 21, 2011, 4:26:46 PM7/21/11
to openn...@googlegroups.com
The "Gesture Generator" component of NITE is the only one that does not work with the depth data from the Kinect SDK. The code includes a mock Depth-to-Shift table which tricks NITE into working with the Kinect SDK. We have no idea why the Gesture Generator does not also "just work."

To work around needing a Gesture generator for initiating a hand session, there are a number of options. Since you get a skeleton calibration-free, you can get information from the MSR skeleton about where to start the hand generator session, and even make your own pose or gesture for initiating a hand session.

Unfortunately this doesn't "just work" all the time the way the gesture generator would when you are occluded under a blanket.
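A mock table of the shape NITE asks for can be sketched like this. The linear depth-to-shift mapping below is purely illustrative (the real PrimeSense tables come from the sensor's projective model, which this code does not reproduce); only the table sizes follow the values observed in this thread (MaxDepth+1 16-bit D2S entries, MaxShift+1 S2D entries with MaxShift = 2047):

```cpp
#include <cstdint>
#include <vector>

// Illustrative mock of the two lookup tables NITE requests from a depth
// generator via GetGeneralProperty ("D2S" and "S2D"). The linear mapping is
// made up for this sketch; only the table shapes match the thread's numbers.
struct MockShiftTables {
    static const std::uint32_t kMaxDepth = 10000; // mm, as reported by the generator
    static const std::uint32_t kMaxShift = 2047;  // "MaxShift" int property

    std::vector<std::uint16_t> depthToShift; // "D2S", indexed by depth value
    std::vector<std::uint16_t> shiftToDepth; // "S2D", indexed by shift value

    MockShiftTables()
        : depthToShift(kMaxDepth + 1), shiftToDepth(kMaxShift + 1) {
        for (std::uint32_t d = 0; d <= kMaxDepth; ++d)
            depthToShift[d] = static_cast<std::uint16_t>(d * kMaxShift / kMaxDepth);
        for (std::uint32_t s = 0; s <= kMaxShift; ++s)
            shiftToDepth[s] = static_cast<std::uint16_t>(s * kMaxDepth / kMaxShift);
    }
};
```

With these sizes, the D2S buffer comes out at 20002 bytes and the S2D buffer at 4096 bytes, matching the GetGeneralProperty requests logged earlier in the thread.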


Amir

James Walsh

unread,
Jul 21, 2011, 7:16:44 PM7/21/11
to openn...@googlegroups.com
When this thread started a while back, one of the big limitations was that no joint orientations were being reported when using the bridge. Has that feature been added, or are we still getting only position data?

My project currently relies on the orientation data for most joints and I'm not ready to calculate that on my own. :-)


From: Amir Hirsch <am...@zigfu.com>
To: openn...@googlegroups.com
Sent: Thu, July 21, 2011 4:26:46 PM
Subject: Re: [OpenNI-dev] Re: kinectsdk + openni/nite at the same time

noisecrime

unread,
Jul 21, 2011, 9:24:51 PM7/21/11
to OpenNI


On Jul 21, 9:26 pm, Amir Hirsch <a...@zigfu.com> wrote:
> The "Gesture Generator" component of NITE is the only one that does not work
> with the depth data from the Kinect SDK. The code includes a mock
> Depth-to-Shift data which tricks NITE into working with the Kinect SDK. We
> have no idea why the Gesture Generator does not also "just work."

Thanks for the info.

Shame that NITE's gestures aren't yet supported; sounds like it's a bit
of a puzzle as to why.

It's not particularly the gestures that I'm interested in, but more the
hand/point tracking. From what you say, can I infer that hand/point
tracking is also not supported? I understand I could use the Kinect
SDK skeleton and write my own, but I like the results NITE provides
and suspect it's not completely straightforward to implement (i.e. how
NITE relates a hand point relative to the screen).

Is anyone still working on integrating aspects such as gestures into
the bridge, or is it considered 'feature complete' at this time?

Anyway the process for installation seems nice and straightforward.
I'll probably update to the latest openNI/NITE/Sensor over the next
few days, then try out the sdk sometime.



Amir Hirsch

unread,
Jul 21, 2011, 10:00:04 PM7/21/11
to openn...@googlegroups.com
On Thu, Jul 21, 2011 at 6:24 PM, noisecrime <no...@noisecrime.com> wrote:


On Jul 21, 9:26 pm, Amir Hirsch <a...@zigfu.com> wrote:
> The "Gesture Generator" component of NITE is the only one that does not work
> with the depth data from the Kinect SDK. The code includes a mock
> Depth-to-Shift data which tricks NITE into working with the Kinect SDK. We
> have no idea why the Gesture Generator does not also "just work."

Thanks for the info.

Shame that NITES gestures aren't yet supported, sounds like its a bit
of a puzzle as to why it isn't.

Its not particularly the gestures that i'm interested in, but more the
hand/point tracking. From what you say can I infer that hand/point
tracking is also not supported? I understand I could use the kinect
SDK skeleton and write my own, but I like the results NITE provides
and suspect its not completely straightforward to implement (i.e how
NITE relates a hand point realtively to the screen).


Hand Point Tracking *IS* supported if you provide the hand generator with a point to start tracking at. This starting point is normally produced by a gesture generator, but you can use the calibration-free skeleton for it.
 
Is anyone still working on integrating aspects such as gestures into
the bridge, or is it considered 'feature complete' at this time?


No idea. I think the gesture system demonstrated by MichaelK would work very well with a calibration-free skeleton. Also, someone making a gesture system based on raw depth would be extremely useful.

Anyway the process for installation seems nice and straightforward.
I'll probably update to the latest openNI/NITE/Sensor over the next
few days, then try out the sdk sometime.


Prepare to be disappointed. The Kinect skeleton seems pretty unstable and certainly has more kinetic energy than the pixels coming from the depth cam.

Amir Hirsch

unread,
Jul 21, 2011, 10:02:26 PM7/21/11
to openn...@googlegroups.com
Orientation is there, even wrists and ankles, but the wrists don't really discern between bending forward and backward, and the hands aren't symmetric. Best advice is that it's worth playing with it.

noisecrime

unread,
Jul 21, 2011, 10:20:49 PM7/21/11
to OpenNI


On Jul 22, 3:00 am, Amir Hirsch <a...@zigfu.com> wrote:

> Hand Point Tracking *IS* supported if you provide the hand generator with a
> point to start tracking at. This starting point is normally produced by a
> gesture generator, but you can use the calibration-free skeleton for it.

Ah interesting, thanks for that, I misread that bit in your original
reply.


> > Is anyone still working on integrating aspects such as gestures into
> > the bridge, or is it considered 'feature complete' at this time?
>
> no idea. i think the gesture system demonstrated by MichaelK would work very
> well with a calibration-free skeleton. also someone making a gesture system
> based on raw depth would be extremely useful.

Sorry, I specifically meant getting the NITE gestures (push, wave) working
so you can avoid having to pass in a hand starting point.

> Prepare to be disappointed. The kinect skeleton seems pretty unstable and
> certainly has more kinetic energy than  the pixels coming from the depth
> cam..

Oh, that doesn't sound good. Half the point of using the Kinect bridge
was to gain access to what I thought was a better skeleton tracking
system. Is there any reason for this? I don't remember any demos I've
seen using the Kinect SDK having unstable skeletons, though I don't
recall seeing one using 3D joint data, just 2D projected line
drawings.

thanks for the info.



Tomoto

unread,
Jul 22, 2011, 5:21:24 AM7/22/11
to OpenNI
Hi noisecrime,

Josh and Amir already answered your questions, so I have only a few
things to add:

(1) NITE generators

I have not gotten the NITE Hand generator to work, although Amir
did. It does not crash, but it never recognizes the hand. I do not know
what makes the difference, and I cannot tell if it will work with your
configuration.

WRT the Gesture generator, nobody seems to have gotten it to work, as Amir
says. We do not know what the NITE Gesture generator expects of the
underlying Depth generator...

(2) Kinect SDK features

You can just call other Kinect SDK functions (such as the motor) from your
application. But note you can use these functions only after the
generators start generating data, because the bridge initializes
the Kinect SDK at the time of StartGenerating.

The bridge itself has a fun toy that uses the motor API. By using
MSRKinectUserSkeletonGeneratorWithAutoElevation instead of
MSRKinectUserSkeletonGenerator (as noted in the readme), the camera
angle will be automatically adjusted to capture as much of your full
body as possible. This is only a demonstration and may not be very useful.

Thanks,
Tomoto

noisecrime

unread,
Jul 22, 2011, 9:57:00 AM7/22/11
to OpenNI


On Jul 22, 10:21 am, Tomoto <tom...@gmail.com> wrote:

Thanks for replying.

With regard to the Hand and Gesture Generators not working, is this with
the new release of openNI?
I was hoping that PrimeSense would make some effort to help out,
especially as the difficulties encountered would seem to make a bit of
a mockery of having an open platform with user-created modules if it's
this hard to implement them.


> MSRKinectUserSkeletonGeneratorWithAutoElevation

I saw that in the read me and sounds like a great feature. ;)

So I can control the motor via the SDK, but only after the bridge
initializes; sounds fair enough.


Still in two minds about installing this. It's very tempting, especially
to have official (beta) drivers and access to features such as the motor
and, I presume, audio via the SDK. It's good to see that most of openNI/NITE
can work with the camera data via the SDK, but disappointing that some
things currently don't, and that the user map results in losing depth
resolution.

Tomoto

unread,
Jul 22, 2011, 4:17:26 PM7/22/11
to OpenNI
> With regard to Hand and Gesture Generators not working is this with
> the new release of openNI?
> I was hoping that PrimeSense would make some efforts to help out,

I have not tried it yet. And I totally agree on your "hope".

> user map results in losing depth resolution.

There is a workaround for this -- use NITE's UserGenerator or
SceneGenerator (i.e. do not use MSRKinectUserSkeletonGenerator). Then
the bridge automatically disables the Kinect SDK's user map + skeleton
tracking and configures the depth generator in 640x480 mode. This
way you will get higher-resolution depth data, although you will
lose the calibration-less skeleton tracking.

Thanks,
Tomoto

Tomoto

unread,
Jul 22, 2011, 4:22:24 PM7/22/11
to OpenNI
Hi Josh,

> I tried the set smoothing but it didn't change much. I also changed your
> implementation slightly to set different Kinect SDK smoothing
> parameters, and it helped a bit.

If you could let me know what change you made to the smoothing
parameters, I would be happy to incorporate it into the code.

Thanks!
Tomoto


On Jun 28, 12:08 pm, Joshua Blake <joshbl...@gmail.com> wrote:
> Tomoto,
>
> > The skeleton smoothing should be turned on when you give a non-zero
> > value to SkeletonCapability::SetSmoothing. Does it not work as you
> > expect?
>
> I tried the set smoothing but it didn't change much. I also changed your
> implementation slightly to set different Kinect SDK smoothing
> parameters, and it helped a bit.
>
> What I was seeing is when I move ScatterView images around the screen using
> the Hand joints and I pause, there is a bit of jitter that I don't believe
> was there with NITE. It is probably just an artifact of how Kinect SDK does
> skeleton tracking as opposed to NITE.
>
> I think it might be interesting to add another clause to the ones the Kinect
> SDK requires that says something along the lines of "If this code is used
> with a version of the Kinect SDK that does not require the above clauses,
> then this license can be replaced with an unmodified [Apache 2 | BSD]
> license." Basically meaning that if they change the license on future
> versions, the code we distribute reverts to a normal open source license.
>
> (Not quite as extensive as Kevin Connolly's clause 4, near the bottom: http://kinectnui.codeplex.com/SourceControl/changeset/view/67930#1224657)
>
> Josh

Patrock

unread,
Aug 3, 2011, 9:14:41 AM8/3/11
to OpenNI
Hello,

at first i wanna thank you all for your work here.

I think I found a bug in the KinectUserGenerator. When I load an XML
file with the MSR UserGenerator into the NiUserTracker sample, I get a
crash in the function GetUserPixels from MSRKinectUserGenerator.

It seems that the member m_pBuffer isn't initialized properly, because the
pointer is invalid for me. Can someone reproduce this? Or did I
miss something?

I want to get all user labels, so my user parameter is 0.

PopulateMapMetaData(pScene->pMap);
if (user == 0) {
    pScene->pData = m_pBuffer; // <---- CRASH
}
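One defensive sketch for this kind of crash (hypothetical; the class and field follow the snippet above, not necessarily the real bridge source) is to allocate the label buffer lazily before handing it out, instead of assuming another code path already did:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical guard against the crash above: ensure the user-label buffer
// exists (and is zeroed) before exposing a pointer to it, rather than
// assuming an earlier callback allocated it.
class UserLabelBuffer {
public:
    const std::uint16_t* Pixels(std::size_t width, std::size_t height) {
        if (m_buffer.size() != width * height) {
            m_buffer.assign(width * height, 0); // allocate and zero on demand
        }
        return m_buffer.data();
    }
private:
    std::vector<std::uint16_t> m_buffer;
};
```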

thank you.

On Jun 23, 9:07 pm, Tomoto <tom...@gmail.com> wrote:
> I encoded these values but still had no luck. Then D2S and S2D might
> be the key.
>
> Tomoto
>
> On Jun 23, 3:38 AM, Amir Hirsch <a...@tinkerheavy.com> wrote:
>
>
>
>
>
>
>
> > oops forgot to make it %f:
>
> > GetProperty (INT): SupportedModesCount : 15
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (INT): SupportedModesCount : 11
> > GetProperty (INT): MaxShift : 2047
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (REAL): LDDIS : 7.500000
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (INT): ZPD : 120
> > GetProperty (REAL): ZPPS : 0.105200
> > GetProperty (INT): ZPD : 120
> > GetProperty (INT): ConstShift : 200
> > GetProperty (INT): ConstShift : 200
>
> > 2011/6/23 Amir Hirsch <a...@tinkerheavy.com>:
>
> > > I went and tweaked Sensor to print a dump. Here's a log of the
> > > GetIntProperty and GetRealProperty calls along with valid numbers. I'm
> > > going to put these into your code now as return values and if that
> > > works well... that's just epic.. otherwise i'll send a disappointing
> > > email in 10 minutes.
>
> > > GetProperty (INT): SupportedModesCount : 15
> > > GetProperty (INT): ZPD : 120
> > > GetProperty (REAL): ZPPS : 536870912
> > > GetProperty (INT): SupportedModesCount : 11
> > > GetProperty (INT): MaxShift : 2047
> > > GetProperty (INT): ZPD : 120
> > > GetProperty (REAL): ZPPS : 536870912
> > > GetProperty (REAL): LDDIS : 0
> > > GetProperty (REAL): ZPPS : 536870912
> > > GetProperty (INT): ZPD : 120
> > > GetProperty (REAL): ZPPS : 536870912
> > > GetProperty (INT): ZPD : 120
> > > GetProperty (REAL): ZPPS : 536870912
> > > GetProperty (INT): ZPD : 120
> > > GetProperty (INT): ConstShift : 200
> > > GetProperty (INT): ConstShift : 200
>
> > > 2011/6/23 Amir Hirsch <a...@tinkerheavy.com>:
> > >> It's definitely looking for more Int and Real properties than just
> > >> those it seems. If I return 0 for anything but those properties it
> > >> still freezes, but if I always assign a default then it's still
> > >> running.
>
> > >> On Thu, Jun 23, 2011 at 1:32 AM, Tomoto <tom...@gmail.com> wrote:
> > >>> Interesting. I figured out NITE asked the following values. Eventually
> > >>> I got a "division by zero" error when just returned XN_STATUS_OK for
> > >>> everything. Perhaps some proper value needs to be given..
>
> > >>> GetIntProperty:
> > >>> #define XN_STREAM_PROPERTY_MAX_SHIFT                            "MaxShift"
> > >>> #define XN_STREAM_PROPERTY_CONST_SHIFT                          "ConstShift"
> > >>> #define XN_STREAM_PROPERTY_ZERO_PLANE_DISTANCE          "ZPD"
>
> > >>> GetRealProperty:
> > >>> #define XN_STREAM_PROPERTY_ZERO_PLANE_PIXEL_SIZE        "ZPPS"
> > >>> #define XN_STREAM_PROPERTY_EMITTER_DCMOS_DISTANCE       "LDDIS"
>
> > >>> GetGeneralProperty:
> > >>> /** XN_DEPTH_TYPE[] */
> > >>> #define XN_STREAM_PROPERTY_S2D_TABLE                            "S2D"
> > >>> /** XnUInt16[] */
> > >>> #define XN_STREAM_PROPERTY_D2S_TABLE                            "D2S"
>
> > >>> On Jun 23, 1:09 AM, Amir Hirsch <a...@tinkerheavy.com> wrote:
> > >>>> you don't actually need to return a value for the MaxShift and can
> > >>>> just return XN_STATUS_OK when NITE asks for it. The shift2Depth table
> > >>>> is gotten through the GetGeneralProperty call:
>
> > >>>> XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar*
> > >>>> strName, XnUInt64& nValue) const
> > >>>> {
> > >>>>         return XN_STATUS_OK;
> > >>>> }
>
> > >>>> XnStatus MSRKinectDepthGenerator::GetGeneralProperty(const XnChar*
> > >>>> strName, XnUInt32 nBufferSize, void* pBuffer) const
> > >>>> {
> > >>>>         return XN_STATUS_OK;
>
> > >>>> }
>
> > >>>> 2011/6/23 Amir Hirsch <a...@tinkerheavy.com>:
>
> > >>>> > Thanks Tomoto for producing a good project structure for this! I've
> > >>>> > extended your work and I've fixed the MaxShift issue and am now
> > >>>> > dealing with other stuff.
>
> > >>>> > For everyone's reference, if you install Tomoto's dll and run
> > >>>> > UserTracker (which uses NITE) you'll see that NITE looks for some
> > >>>> > property called maxShift:
>
> > >>>> > C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
> > >>>> > Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D
> > >>>> > 1-EF5E-00C04F2D728B}\00
> > >>>> > KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11
> > >>>> > D1-EF5E-00C04F2D728B}\00\PIPE01)
> > >>>> > KinectCamera_OpenStreamEndpoint Opened successfully.
> > >>>> > KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11
> > >>>> > D1-EF5E-00C04F2D728B}\00\PIPE00)
> > >>>> > KinectCamera_OpenStreamEndpoint Opened successfully.
> > >>>> > Couldn't get maxShift.
> > >>>> > Couldn't get maxShift.
> > >>>> > Find user generator failed: Error!
>
> > >>>> > I've gone down the rabbit hole of the Sensor code inheritance (someone
> > >>>> > at PrimeSense really likes extending classes, huh?) I've discovered
> > >>>> > that the MaxShift is a property requested  through a GetIntProperty
> > >>>> > request in the XnSensorProductionNode:
>
> > >>>> > I added this little work-around to the MSRKinectDepthGenerator which
> > >>>> > just reports 2047 when asked for the IntProperty "MaxShift"
> > >>>> > (frankly, i don't think this property actually matters and I wonder
> > >>>> > why NITE has dependencies on hidden properties)
>
> > >>>> > XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar*
> > >>>> > strName, XnUInt64& nValue) const
> > >>>> > {
> > >>>> >        if (strcmp(strName, "MaxShift") == 0)
> > >>>> >        {
> > >>>> >                nValue = 2047;
> > >>>> >        }
> > >>>> >        return XN_STATUS_OK;
> > >>>> > }
>
> > >>>> > so now it gets a MaxShift and the error looks like this saying it
> > >>>> > can't find the shift2Depth table:
>
> > >>>> > C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
> > >>>> > Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D
> > >>>> > 1-EF5E-00C04F2D728B}\00
> > >>>> > KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11
> > >>>> > D1-EF5E-00C04F2D728B}\00\PIPE01)
> > >>>> > KinectCamera_OpenStreamEndpoint Opened successfully.
> > >>>> > KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11
> > >>>> > D1-EF5E-00C04F2D728B}\00\PIPE00)
> > >>>> > KinectCamera_OpenStreamEndpoint Opened successfully.
> > >>>> > Couldn't get shift2Depth table.
> > >>>> > Couldn't get shift2Depth table.
>
> > >>>> > ONWARD.
>
> > >>>> > 2011/6/22 Tomoto <tom...@gmail.com>:
> > >>>> >> Hi,
>
> > >>>> >> I made the code public at: https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes/
>
> > >>>> >> Only Depth and Image nodes work, but others (e.g. User node) do not.
> > >>>> >> It is not that cool.
>
> > >>>> >> Thanks,
> > >>>> >> Tomoto
>
> > >>>> >>> On Jun 22, 4:43 PM, Tomoto <tom...@gmail.com> wrote:
> > >>>> >>> Hi Amir and everyone,
>
> > >>>> >>> I got the same "maxShift" issue, and, so far, it looks nontrivial to
> > >>>> >>> get through it...
>
> > >>>> >>> It seems NITE's skeleton tracker only works with PrimeSense devices
> > >>>> >>> according to the following discussion: http://groups.google.com/group/openni-dev/browse_thread/thread/f9dbfd...
> > >>>> >>> . (Note Kinect could be a "3rd party device" for PrimeSense if covered
> > >>>> >>> by Microsoft's driver.) Also, I found XnDepthStream#GetMaxShift method
> > >>>> >>> in the source code of SensorKinect. I guess NITE depends on this
> > >>>> >>> method and apparently Microsoft's driver does not have it.
>
> > >>>> >>> To avoid this issue, probably we need to wrap Microsoft's skeleton
> > >>>> >>> tracker for OpenNI so that we could (unwillingly) skip NITE entirely.
> > >>>> >>> Or, it may be possible to wrap Microsoft's Kinect driver in a way
> > >>>> >>> NITE's skeleton tracker expects. In any ways, it looks nontrivial and
> > >>>> >>> painful work. :-/
>
> > >>>> >>> Anyway, I will share my source code in some public source code
> > >>>> >>> repository.
>
> > >>>> >>> Thanks,
> > >>>> >>> Tomoto
>
> > >>>> >>> On Jun 22, 10:14 AM, Amir Hirsch <a...@tinkerheavy.com> wrote:
>
> > >>>> >>> > So NiViewer can draw the data from our depth node that produces frames
> > >>>> >>> > through the KinectSDK now.
>
> > >>>> >>> > But when I try to use the usertracker example I get some nonsense
> > >>>> >>> > about maxShift:
>
> > >>>> >>> > C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
> > >>>> >>> > Attempting to open \\?\USB#VID_045E&PID_02AE#B00363600976047B#{00873FDF-61A8-11D
> > >>>> >>> > 1-EF5E-00C04F2D728B}\00
> > >>>> >>> > KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00363600976047B#{00873FDF-61A8-11
> > >>>> >>> > D1-EF5E-00C04F2D728B}\00\PIPE01)
> > >>>> >>> > KinectCamera_OpenStreamEndpoint Opened successfully.
> > >>>> >>> > Couldn't get maxShift.
> > >>>> >>> > Couldn't get maxShift.
> > >>>> >>> > Find user generator failed: Error!
>
> > >>>> >>> > working on it..
>
> > >>>> >>> > Amir
>
> > >>>> >>> > On Wed, Jun 22, 2011 at 7:10 AM, alex summer
>
> > >>>> >>> > <ezekielninju...@hotmail.com> wrote:
> > >>>> >>> > > I hope this works! Keep us informed of the progress, you're almost
> > >>>> >>> > > done!!!
> > >>>> >>> > > This will be very useful to everyone! Cheers :D
>
> > >>>> >>> > > Alex
>
> > >>>> >>> > > On Jun 21, 6:24 am, Amir Hirsch <a...@tinkerheavy.com> wrote:
> > >>>> >>> > >> Latest update: I can get the Kinect to turn on using the Kinect SDK
> > >>>> >>> > >> invoked through OpenNI module.
>
> > >>>> >>> > >> I am modifying NiSampleModule to get depth from msft Kinect SDK and
> > >>>> >>> > >> feed it to OpenNI.
>
> > >>>> >>> > >> First the DLL hell stuff in VS2010:
> > >>>> >>> > >> your "additional dependencies" in the project properties -> linker ->
> > >>>> >>> > >> input add MSRKinectNUI.lib...
>

Tomoto

unread,
Aug 14, 2011, 3:39:35 AM
to OpenNI
Hi Patrick,

Sorry I have not checked this thread for a while and missed your
comment.

I have not experienced this issue with NiUserTracker. If you are using
the same NiUserTracker (with no modifications), it may depend on the
OpenNI version. Can you let me know (1) whether you made any changes to
NiUserTracker and (2) which versions of OpenNI and NITE you are using?

m_pBuffer is initialized for the first time when StartGenerating is
invoked. If the application invokes GetUserPixels before invoking
context.StartGeneratingAll, m_pBuffer will still be NULL and the
application may crash. I would like to eliminate this possibility
first, which is why I am asking whether you modified the
NiUserTracker sample.

Thanks,
Tomoto

Patrock

unread,
Aug 16, 2011, 4:48:26 AM
to OpenNI
Hello Tomoto,

I rechecked everything and found out that there was some kind of
mixup between OpenNI versions. I've recompiled the newest unstable
version and now everything works fine...

Thanks,

Patrock

Tomoto

unread,
Aug 17, 2011, 4:08:35 AM
to OpenNI
Hi Patrock,

Sounds good. Have a happy hacking!

Tomoto

Haolin

unread,
Sep 2, 2011, 4:56:48 PM
to OpenNI
Hi Tomoto,

Does the bridge work with the latest unstable versions of
OpenNI (1.3.2.3) and NITE (1.4.1.2)? I got an error when I tried to run
the NiUserTracker sample with the MSRKinectUserSkeletonGenerator. It
says "Register to pose in progress failed: This operation is invalid!"
Also, when I try to compile the bridge from source, it says
MSRKinectUserSkeletonGenerator: cannot instantiate abstract class.
Any ideas on how to fix this?

Thanks a million,
Haolin

Dale Phurrough

unread,
Sep 10, 2011, 9:13:39 PM
to openn...@googlegroups.com
I believe there are significant problems running Tomoto's July 17 release with OpenNI's major release 1.3.2.3.

I've found three issues so far:
1) The precompiled bridge DLL seems to have been compiled against OpenNI 1.1.0.41, a very old version with unknown dependency and deprecation issues.
2) The 1.3.2.3 release of OpenNI deprecated several APIs and introduced new ones. This changed some behavior, which I believe could be related to the app crashes I'm having with his precompiled DLL.
3) It is not possible to compile Tomoto's July 17 release with the more current OpenNI 1.3.2.3, due to the changed and added APIs. The new APIs require additional work in the bridge code to complete the bridge as a new User Generator. Examples of APIs/methods that need to be implemented in the bridge are RegisterToUserExit(), UnregisterFromUserExit(), RegisterToCalibrationStart(), etc.

I have the skills to write the needed code, but I don't have the time at the moment. I wish I did, because I would love to use the MSFT poseless skeletons in an art project; it was while testing the bridge for that project that I found these issues.

Tomoto or someone already familiar with the codebase can likely complete the missing APIs within about 8 hours of work.

Joshua Blake

unread,
Sep 10, 2011, 10:02:32 PM
to openn...@googlegroups.com

I agree on the issues raised by Dale. We've found the bridge to be very useful, but we need to update to the latest OpenNI version due to other dependencies, and to the latest Kinect SDK (which didn't change any APIs that were working before, so that part shouldn't be an issue).
 
I tried looking through the code but couldn't figure out how to properly debug any changes I was making.
 
The other issue we had with the bridge was that the user pixels significantly lagged behind the RGB data, so if you did a background removal app you were "chasing" your cutout. I know the frames are not guaranteed to be synced in the Kinect SDK, but the problem through the bridge is significantly worse than with either the OpenNI/NITE/SensorKinect combo or the Kinect SDK alone.
 
Tomoto if you could update to the latest of everything it'd be much appreciated!
 
Thanks,
Josh


Tomoto

unread,
Sep 15, 2011, 8:25:36 PM
to OpenNI
Hi All,

I have been busy and away from this thread these days. Let me find
some time to take a look.

Cheers,
Tomoto

Joshua Blake

unread,
Sep 15, 2011, 8:38:46 PM
to openn...@googlegroups.com
For what it's worth, I was writing some apps directly against the Kinect SDK and found that its RGB and depth streams get out of sync much more than we're used to with OpenNI. So Tomoto, sorry, this problem may not have been your fault! The solution is to buffer a few frames and then match them up by timestamp, though that may or may not be workable here.
 
Thanks,
Josh

Haolin

unread,
Sep 17, 2011, 11:10:12 AM
to OpenNI
Hi Everyone,

First, I think the bridge is awesome; I have learned a lot from the
source code. I'm currently trying to modify the bridge to extract
skeleton data from a series of grayscale images (0-255) using NITE. I
use OpenCV to get the image data into a byte array and convert it to a
depth map by simply multiplying by 1000/255 and adding some offset,
say 1000. When I run NiViewer the depth image is displayed, but when I
run NiUserTracker the application doesn't recognize the user, so I
cannot get the skeleton. Currently I simply change the UpdateDepthData
method in MSRKinectDepthGeneratorBase.h as follows.

XnStatus UpdateDepthData(DepthPixelProcessor& proc,
    const NUI_IMAGE_FRAME* pFrame, const USHORT* data,
    const KINECT_LOCKED_RECT& lockedRect)
{
    XnDepthPixel* pPixel = m_pBuffer;

    // Build the file name of the next frame, wrapping the counter
    std::stringstream imageName;
    if (n > 1000)
        n = 0;
    imageName << "D:\\Data\\" << n << ".png";
    n += 3;

    IplImage* img = cvLoadImage(imageName.str().c_str());
    int i = 0;
    for (XnUInt y = 0; y < 480; ++y)
    {
        for (XnUInt x = 0; x < 640; ++x, ++pPixel)
        {
            // imageData is signed char: cast to unsigned char, and
            // scale in floating point (1000/255 truncates to 3 as ints)
            *pPixel = (XnDepthPixel)(
                (unsigned char)img->imageData[i] * (1000.0 / 255.0) + 1500);
            i += 3; // 3-channel image: step over B, G, R
        }
    }
    cvReleaseImage(&img);
    return XN_STATUS_OK;
}

Does anyone have suggestions on how to get NITE working here? Also, a
quick follow-up question: is there any general step-by-step guide to
getting NITE working with a normal TOF camera by modifying
NiSampleModule?

Any suggestions would be appreciated. Thanks in advance!


On Sep 16, 1:25 am, Tomoto <tom...@gmail.com> wrote:
> Hi All,
>
> I have been busy and away from this thread these days. Let me find
> some time to take a look.
>
> Cheers,
> Tomoto