http://www.youtube.com/watch?v=fK7wLhdLDpU
Amir
Interesting... Any details on implementation? I'm not really sure if this is showing two Kinects or an OpenNI plug-in for the Kinect SDK.
I've been able to build the NiSampleModule example in OpenNI with some
modifications that cause it to initialize the Kinect SDK.
I can also register it with niReg as a Depth node. The only difficult
part was that you have to run niReg as administrator (I spent two hours
on that part and about five minutes on the rest).
Almost there....
Amir
I am modifying NiSampleModule to get depth from the Microsoft Kinect
SDK and feed it to OpenNI.
First, the DLL-hell stuff in VS2010:

In Project Properties -> Linker -> Input, add MSRKinectNUI.lib to
Additional Dependencies:
MSRKinectNUI.lib;openNI.lib;%(AdditionalDependencies)
I dragged msrkinectnui.lib into my project too, for good measure.

In the VC++ Directories settings, add:
$(MSRKINECTSDK)\inc to Include Directories
$(MSRKINECTSDK)\lib to Library Directories
In SampleDepth.cpp I now initialize the Kinect with the Kinect SDK. I
needed to include some headers:
#include "wtypes.h"
#include "MSR_NuiApi.h"
...
and modify Init():
XnStatus SampleDepth::Init()
{
	m_pDepthMap = new XnDepthPixel[SUPPORTED_X_RES * SUPPORTED_Y_RES];
	if (m_pDepthMap == NULL)
	{
		return XN_STATUS_ALLOC_FAILED;
	}

	m_hNextDepthFrameEvent = CreateEvent(NULL, TRUE, FALSE, NULL);

	HRESULT hr = NuiInitialize(
		NUI_INITIALIZE_FLAG_USES_DEPTH_AND_PLAYER_INDEX |
		NUI_INITIALIZE_FLAG_USES_SKELETON |
		NUI_INITIALIZE_FLAG_USES_COLOR);
	if (FAILED(hr))
	{
		// MessageBoxResource(m_hWnd, IDS_ERROR_NUIINIT, MB_OK | MB_ICONHAND);
		return XN_STATUS_ALLOC_FAILED;
	}

	hr = NuiImageStreamOpen(
		NUI_IMAGE_TYPE_DEPTH_AND_PLAYER_INDEX,
		NUI_IMAGE_RESOLUTION_320x240,
		0,
		2,
		m_hNextDepthFrameEvent,
		&m_pDepthStreamHandle);
	if (FAILED(hr))
	{
		// MessageBoxResource(m_hWnd, IDS_ERROR_DEPTHSTREAM, MB_OK | MB_ICONHAND);
		return XN_STATUS_ILLEGAL_POSITION;
	}

	return XN_STATUS_OK;
}
The XnStatus return codes I picked for the HRESULT failures are just me being ridiculous.
Also in SampleDepth.h:

	XnBool m_bMirror;
	ChangeEvent m_generatingEvent;
	ChangeEvent m_dataAvailableEvent;
	ChangeEvent m_mirrorEvent;

	// ADDED BY AMIR FOR KinectSDK binding
	HANDLE m_hThNuiProcess;
	HANDLE m_hEvNuiProcessStop;
	HANDLE m_pDepthStreamHandle;
	HANDLE m_hNextDepthFrameEvent;
};
Now I build it and then register it with: nireg -r NiSampleModule.dll
(don't forget to run this step in admin mode).
nireg -l now lists my NiSampleModule.dll:
F:\Hacking\OpenNI\OpenNI\Platform\Win32\Bin\Debug\NiSampleModule.dll (compiled with OpenNI 1.1.0.39):
	Depth: OpenNI/SampleDepth/1.1.0.39
I was getting problems that looked like this:
26920 [WARNING] Failed to load library 'F:\Hacking\OpenNI\OpenNI\Platform\Win32\Bin\Debug\NiSampleModule.dll'. Error code: 87
27160 [WARNING] Failed to load 'F:\Hacking\OpenNI\OpenNI\Platform\Win32\Bin\Debug\NiSampleModule.dll' - missing dependencies?
This error occurred until I included MSRKinectNUI.lib in the
additional dependencies, but it took several hours before I discovered
my folly.
OK, so then I modified SamplesConfig.xml in the OpenNI/Data directory
to look only for a Depth node (I'm testing with UserTracker, which only
consumes depth):
<OpenNI>
  <Licenses>
    <!-- Add licenses here
    <License vendor="vendor" key="key"/>
    -->
  </Licenses>
  <Log writeToConsole="false" writeToFile="false">
    <!-- 0 - Verbose, 1 - Info, 2 - Warning, 3 - Error (default) -->
    <LogLevel value="3"/>
    <Masks>
      <Mask name="ALL" on="true"/>
    </Masks>
    <Dumps>
    </Dumps>
  </Log>
  <ProductionNodes>
    <Node type="Depth" name="Depth1">
      <Configuration>
        <Mirror on="true"/>
      </Configuration>
    </Node>
  </ProductionNodes>
</OpenNI>
Now, I haven't started the NUI thread and mapped the depth frames to
OpenNI yet, but when I run UserTracker my Kinect turns on, implying
the connection between OpenNI and the Kinect SDK is working.
So now someone else can avoid my errors and probably get it done
tomorrow before I even wake up.
Amir
But when I try to use the UserTracker example I get some nonsense
about maxShift:
C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
Attempting to open \\?\USB#VID_045E&PID_02AE#B00363600976047B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00363600976047B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE01)
KinectCamera_OpenStreamEndpoint Opened successfully.
Couldn't get maxShift.
Couldn't get maxShift.
Find user generator failed: Error!
working on it..
Amir
For everyone's reference, if you install Tomoto's DLL and run
UserTracker (which uses NITE), you'll see that NITE looks for a
property called maxShift:
C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE01)
KinectCamera_OpenStreamEndpoint Opened successfully.
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE00)
KinectCamera_OpenStreamEndpoint Opened successfully.
Couldn't get maxShift.
Couldn't get maxShift.
Find user generator failed: Error!
I've gone down the rabbit hole of the Sensor code inheritance (someone
at PrimeSense really likes extending classes, huh?) and discovered
that MaxShift is a property requested through a GetIntProperty call
on the XnSensorProductionNode.
I added this little workaround to the MSRKinectDepthGenerator, which
just reports 2047 when asked for the IntProperty "MaxShift".
(Frankly, I don't think this property actually matters, and I wonder
why NITE has dependencies on hidden properties.)
XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar* strName, XnUInt64& nValue) const
{
	// strcmp, not ==: comparing an XnChar* with == tests pointer
	// identity, not string contents
	if (strcmp(strName, "MaxShift") == 0)
	{
		nValue = 2047;
	}
	return XN_STATUS_OK;
}
So now it gets a MaxShift, and the error changes to say it can't find
the shift2Depth table:
C:\Program Files (x86)\OpenNI\Samples\Bin\Release>NiUserTracker.exe
Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE01)
KinectCamera_OpenStreamEndpoint Opened successfully.
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE00)
KinectCamera_OpenStreamEndpoint Opened successfully.
Couldn't get shift2Depth table.
Couldn't get shift2Depth table.
ONWARD.
2011/6/22 Tomoto <tom...@gmail.com>:
XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar* strName, XnUInt64& nValue) const
{
	return XN_STATUS_OK;
}

XnStatus MSRKinectDepthGenerator::GetGeneralProperty(const XnChar* strName, XnUInt32 nBufferSize, void* pBuffer) const
{
	return XN_STATUS_OK;
}
2011/6/23 Amir Hirsch <am...@tinkerheavy.com>:
I also set dValue in GetRealProperty and nValue in GetIntProperty.
Unfortunately it doesn't track the user... but it runs.
XnStatus MSRKinectDepthGenerator::GetIntProperty(const XnChar* strName, XnUInt64& nValue) const
{
	nValue = 2047;
	return XN_STATUS_OK;
}

XnStatus MSRKinectDepthGenerator::GetGeneralProperty(const XnChar* strName, XnUInt32 nBufferSize, void* pBuffer) const
{
	return XN_STATUS_OK;
}

XnStatus MSRKinectDepthGenerator::GetStringProperty(const XnChar* strName, XnChar* csValue, XnUInt32 nBufSize) const
{
	return XN_STATUS_OK;
}

XnStatus MSRKinectDepthGenerator::GetRealProperty(const XnChar* strName, XnDouble& dValue) const
{
	dValue = 1.0;
	return XN_STATUS_OK;
}
On Thu, Jun 23, 2011 at 1:32 AM, Tomoto <tom...@gmail.com> wrote:
GetProperty (INT): SupportedModesCount : 15
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 536870912
GetProperty (INT): SupportedModesCount : 11
GetProperty (INT): MaxShift : 2047
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 536870912
GetProperty (REAL): LDDIS : 0
GetProperty (REAL): ZPPS : 536870912
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 536870912
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 536870912
GetProperty (INT): ZPD : 120
GetProperty (INT): ConstShift : 200
GetProperty (INT): ConstShift : 200
2011/6/23 Amir Hirsch <am...@tinkerheavy.com>:
GetProperty (INT): SupportedModesCount : 15
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): SupportedModesCount : 11
GetProperty (INT): MaxShift : 2047
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (REAL): LDDIS : 7.500000
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): MaxShift : 2047
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (REAL): LDDIS : 7.500000
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (INT): ConstShift : 200
GetProperty (INT): ConstShift : 200
This all came up back in December; this thread is great:
http://groups.google.com/group/openni-dev/browse_thread/thread/f9dbfd4069722f9d
I'm nearly certain I can make a module that tricks NITE into thinking
our KinectSDK depth node is a SensorKinect equivalent.
But it's more interesting to get the user-map and their skeleton data
into OpenNI. We can generate the rotation matrices from the Position
data and also add in some pose detection capabilities. We'll need to
establish a user acquisition method that takes advantage of not
needing any calibration.
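On generating rotation matrices from the Position data: one standard way is to build an orthonormal basis per bone — the normalized parent-to-child direction as one axis, then the other two axes completed with cross products against world-up. This is a sketch of that math, not code from the bridge; the struct and function names are mine, and joint positions are assumed to be plain xyz triples:

```cpp
#include <cmath>

// Hypothetical sketch: derive a 3x3 bone rotation from two joint positions.
struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { Vec3 r = {a.x-b.x, a.y-b.y, a.z-b.z}; return r; }
static Vec3 cross(Vec3 a, Vec3 b) { Vec3 r = {a.y*b.z-a.z*b.y, a.z*b.x-a.x*b.z, a.x*b.y-a.y*b.x}; return r; }
static Vec3 norm(Vec3 a)
{
    double n = std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z);
    Vec3 r = {a.x/n, a.y/n, a.z/n};
    return r;
}

// Fills m (row-major 3x3) whose columns are the basis: bone direction as Y,
// X and Z completed via cross products against an assumed world-up (0,1,0).
void boneRotation(Vec3 parent, Vec3 child, double m[9])
{
    Vec3 y = norm(sub(child, parent));
    Vec3 up = {0.0, 1.0, 0.0};
    Vec3 x = cross(up, y);
    double n = std::sqrt(x.x*x.x + x.y*x.y + x.z*x.z);
    if (n < 1e-6) { Vec3 alt = {1.0, 0.0, 0.0}; x = cross(alt, y); } // bone parallel to up
    x = norm(x);
    Vec3 z = cross(x, y); // right-handed: x cross y = z
    m[0]=x.x; m[1]=y.x; m[2]=z.x;
    m[3]=x.y; m[4]=y.y; m[5]=z.y;
    m[6]=x.z; m[7]=y.z; m[8]=z.z;
}
```

The choice of world-up fixes the roll about the bone axis arbitrarily; a real implementation would propagate a twist reference down the skeleton instead.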
Amir
2011/6/23 Amir Hirsch <am...@tinkerheavy.com>:
Amir
2011/6/26 Tomoto <tom...@gmail.com>:
virtual XnBool NeedPoseForCalibration() { return FALSE; } // yay!
Here are the gotchas:
1) Orientation isn't supported yet (we can get that working soon!)
2) The image and depth data generator is still not compatible with
NITE (I'm working on it; I'm going to print out the shift->depth data
tables, but I can't tell if it wants something in the frame metadata?)
3) I needed to go to the XnVFeatures 1.3.0 and 1.3.1 directories under
NITE and run nireg -u XnVFeatures.dll so OpenNI would use your user
tracker instead of NITE's.
Amir
2011/6/26 Tomoto <tom...@gmail.com>:
The skeleton smoothing should be turned on when you give a non-zero
value to SkeletonCapability::SetSmoothing. Does it not work as you
expect?
Amir
I'll see if this, combined with the GetRealProperty, GetIntProperty,
and GetStringProperty overrides, can get NITE working with the MSR
Kinect Depth Generator module.
Amir
here: zigfu.com/Sensorbins.rar
And the answer is probably no, since we do not get to send messages to
the Kinect via the Kinect SDK, and I'm not sure if I can send it data
through some other mechanism while the Kinect SDK is loaded in Windows.
This is only useful for convincing NITE to work with whatever depth
data you feed it. It might be useful for auto-labeling a skeleton if
you wanted to implement your own.
It would be really useful if we could choose what to input to
Microsoft's skeleton tracker and feed it our own depth data. Something
we really want to do is filter the depth image for a specific user, so
we can select which user to pass to the skeleton tracker.
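That per-user filter is simple to sketch once you have a player/user index map alongside the depth map: keep depth only where the label matches the chosen user, zero everywhere else. Illustrative code, not part of the bridge:

```cpp
#include <cstddef>
#include <cstdint>

// Illustrative sketch: zero out every depth pixel that does not belong to
// the selected user, so only that user's silhouette reaches the skeleton
// tracker. 'labels' is a per-pixel player/user index map (0 = background).
void filterDepthForUser(const uint16_t* depth, const uint8_t* labels,
                        uint16_t* out, size_t nPixels, uint8_t userId)
{
    for (size_t i = 0; i < nPixels; ++i)
        out[i] = (labels[i] == userId) ? depth[i] : 0;
}
```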
Amir
People love us on github.com/tinkerer
var scene = userGenerator.GetUserPixels(userId);
I don't need 320x240 specifically except that I think player indexes are only supported in that mode and not 640x480 in Kinect SDK.
One drawback of this implementation is you must use 320x240 (= cannot
use 640x480) even if you don't need player indexes. I have left this
restriction so far because there is no elegant way for a Depth node to
know whether the application needs player indexes or not at the time
of opening the image stream.
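For reference, in NUI_IMAGE_TYPE_DEPTH_AND_PLAYER_INDEX mode the Kinect SDK packs both values into each 16-bit pixel: the player index sits in the low 3 bits and the depth in millimeters in the bits above it. A minimal sketch of the unpacking a Depth node has to do before handing pixels to OpenNI (the helper names are mine; only the bit layout comes from the SDK):

```cpp
#include <cstdint>

// In depth-and-player-index mode, each 16-bit pixel is (depth_mm << 3) | playerIndex.
inline uint16_t unpackDepthMm(uint16_t packed)
{
    return packed >> 3; // depth in millimeters
}

inline uint8_t unpackPlayerIndex(uint16_t packed)
{
    return packed & 0x7; // 0 = no player at this pixel
}
```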
I've also noticed that it asks for things in a totally different order,
so it might be important to discover what else is communicated between
NITE and the Sensor. I'll examine the ONI file stuff tomorrow. My mod
of the Sensor driver reports this:
C:\Program Files (x86)\PrimeSense\NITE\Samples\Bin\Release>Sample-Boxes.exe
GetProperty (GENERAL): InstancePointer
Writing 4 to InstancePointer1.bin
GetProperty (INT): SupportedModesCount : 15
GetProperty (GENERAL): SupportedModes
Writing 90 to SupportedModes1.bin
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): MaxShift : 2047
GetProperty (GENERAL): D2S
Writing 20002 to D2S2.bin
GetProperty (GENERAL): S2D
Writing 4096 to S2D3.bin
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (INT): MaxShift : 2047
GetProperty (GENERAL): S2D
Writing 4096 to S2D4.bin
GetProperty (GENERAL): D2S
Writing 20002 to D2S5.bin
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (REAL): LDDIS : 7.500000
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (INT): ConstShift : 200
GetProperty (INT): MaxShift : 2047
GetProperty (GENERAL): D2S
Writing 20002 to D2S6.bin
GetProperty (GENERAL): S2D
Writing 4096 to S2D7.bin
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
GetProperty (REAL): ZPPS : 0.105200
GetProperty (INT): ZPD : 120
Setting resolution to QVGA
GetProperty (INT): ConstShift : 200
Then, when I install the KinectSDK and attempt to make the depth
generator look the same as the PrimeSense sensor, it asks for totally
different things (MaxShift first instead of InstancePointer).
C:\Program Files (x86)\PrimeSense\NITE\Samples\Bin\Release>Sample-Boxes.exe
Attempting to open \\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00
KinWinDeviceName = (\\?\USB#VID_045E&PID_02AE#B00366707565050B#{00873FDF-61A8-11D1-EF5E-00C04F2D728B}\00\PIPE01)
KinectCamera_OpenStreamEndpoint Opened successfully.
GetIntProperty, returning: MaxShift , 2047
GetGeneralProperty: D2S , 8002
GetGeneralProperty: S2D , 4096
GetRealProperty, returning: ZPPS , 0.105200
GetIntProperty, returning: ZPD , 120
GetIntProperty, returning: MaxShift , 2047
GetGeneralProperty: S2D , 4096
GetGeneralProperty: D2S , 8002
GetIntProperty, returning: ZPD , 120
GetRealProperty, returning: ZPPS , 0.105200
GetRealProperty, returning: LDDIS , 7.500000
GetRealProperty, returning: ZPPS , 0.105200
GetIntProperty, returning: ZPD , 120
Then the NITE Boxes sample doesn't work. Booooooooooooo.
OK, so now I'm going to go down a deeper rabbit hole and figure out
how ONI playback works.
It's interesting that the NITE algorithms are dependent on
DepthToShift and ShiftToDepth tables pulled from the Sensor driver.
Amir
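On those tables, for anyone poking at the same thing: the byte counts in the property dumps (S2D = 4096 bytes, i.e. 2048 16-bit entries — one per shift value 0..MaxShift; D2S = 20002 bytes, i.e. 10001 16-bit entries — one per millimeter up to 10 m) suggest they are plain uint16 lookup arrays. Here's a hedged sketch of fabricating a placeholder S2D table of the expected size just to keep NITE answered; the real sensor's shift-to-depth mapping is nonlinear, so the linear ramp here is purely a guess, and `makeMockS2D` is a made-up name:

```cpp
#include <cstdint>
#include <vector>

// Fabricate a placeholder shift-to-depth table of the size NITE asks for
// (4096 bytes = 2048 uint16 entries, one per shift value 0..MaxShift).
// The real mapping is nonlinear; a linear ramp to 10000 mm is only a
// stand-in to satisfy the GetGeneralProperty request.
std::vector<uint16_t> makeMockS2D(int maxShift = 2047, uint16_t maxDepthMm = 10000)
{
    std::vector<uint16_t> s2d(maxShift + 1);
    for (int shift = 0; shift <= maxShift; ++shift)
        s2d[shift] = static_cast<uint16_t>(
            (static_cast<uint32_t>(shift) * maxDepthMm) / maxShift);
    return s2d;
}
```

GetGeneralProperty("S2D", ...) could then memcpy this into pBuffer when nBufferSize matches.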
3. What exactly are the steps to install everything?
In the readme, it just says to install OpenNI; however, it's unclear to
me if that means just OpenNI or the whole OpenNI suite (i.e. NITE) as
well, or what to do about drivers.
Thanks for the info.
On Jul 21, 9:26 pm, Amir Hirsch <a...@zigfu.com> wrote:
> The "Gesture Generator" component of NITE is the only one that does not work
> with the depth data from the Kinect SDK. The code includes a mock
> Depth-to-Shift data which tricks NITE into working with the Kinect SDK. We
> have no idea why the Gesture Generator does not also "just work."
Shame that NITE's gestures aren't yet supported; sounds like it's a bit
of a puzzle as to why.
It's not particularly the gestures that I'm interested in, but more the
hand/point tracking. From what you say, can I infer that hand/point
tracking is also not supported? I understand I could use the Kinect
SDK skeleton and write my own, but I like the results NITE provides
and suspect it's not completely straightforward to implement (i.e. how
NITE relates a hand point relatively to the screen).
Is anyone still working on integrating aspects such as gestures into
the bridge, or is it considered 'feature complete' at this time?
Anyway, the process for installation seems nice and straightforward.
I'll probably update to the latest OpenNI/NITE/Sensor over the next
few days, then try out the SDK sometime.
--
You received this message because you are subscribed to the Google Groups "OpenNI" group.
To view this discussion on the web visit https://groups.google.com/d/msg/openni-dev/-/wkuizKRvl-4J.