Multi Hand Cursor with Kinect


Iker Saint

Dec 17, 2010, 3:36:39 PM
to OpenNI
My first approach to Kinect:

http://www.youtube.com/watch?v=CE206hP7ryg

Regards.

Nip

Dec 18, 2010, 4:10:33 AM
to OpenNI
Hi, cool,

any hints on how you managed to track multiple hands? In the Sample-PointViewer code there is support for multiple hands, but when I run the application I can only use one. As soon as a focus gesture is recognized the state goes to "In session" and it stops looking for additional hands.

Nip

Douglas Teófilo

Dec 18, 2010, 9:43:32 AM
to OpenNI
Hi,

Great job, man. I was impressed with the precision of the cursor movement and the clicking with your other hand.
Would you make the source code available?
Thanks!

Peter Krakow

Dec 18, 2010, 10:42:08 AM
to OpenNI
+1 for Source Code :)

Iker Saint

Dec 18, 2010, 1:35:55 PM
to OpenNI
You have to enable the option in the file "..\NITE\Hands\Data\Nite.ini": just modify Nite.ini and uncomment the relevant lines.
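For reference, the relevant entries look roughly like this. The AllowMultipleHands key is confirmed later in this thread; the section name and the TrackAdditionalHands key are from my own notes on a NITE 1.3 install, so treat them as assumptions if your version differs:

```ini
; Nite.ini -- uncomment these lines to enable multi-hand tracking
[HandTrackerManager]
AllowMultipleHands=1
TrackAdditionalHands=1
```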

Iker

Douglas Teófilo

Dec 18, 2010, 2:38:57 PM
to OpenNI
Ok,

Would you show us an example of exactly how you did it?
I tried it and it did not work.

Thanks again!

Darien

Dec 18, 2010, 5:28:29 PM
to OpenNI
/****************************************************************************
 *
 *  Nite 1.3 - Point Viewer
 *
 *  Author: Oz Magal
 *
 ****************************************************************************/

/****************************************************************************
 *
 *  Nite 1.3
 *  Copyright (C) 2006 PrimeSense Ltd. All Rights Reserved.
 *
 *  This file has been provided pursuant to a License Agreement containing
 *  restrictions on its use. This data contains valuable trade secrets
 *  and proprietary information of PrimeSense Ltd. and is protected by law.
 *
 ****************************************************************************/

#include "PointDrawer.h"
#include "XnVDepthMessage.h"
#include <XnVHandPointContext.h>
#include <conio.h>
#define WIN32_LEAN_AND_MEAN // Exclude Extra Windows Crap
#define WIN32_EXTRA_LEAN // Exclude More Windows Crap
#include <windows.h>


#ifdef USE_GLUT
#include <GL/glut.h>
#else
#include "opengles.h"
#endif

// Constructor. Receives the number of previous positions to store per hand,
// and a source for depth map
XnVPointDrawer::XnVPointDrawer(XnUInt32 nHistory, xn::DepthGenerator depthGenerator) :
XnVPointControl("XnVPointDrawer"),
m_nHistorySize(nHistory), m_DepthGenerator(depthGenerator),
m_bDrawDM(false), m_bFrameID(false)
{
m_pfPositionBuffer = new XnFloat[nHistory*3];
}

// Destructor. Clear all data structures
XnVPointDrawer::~XnVPointDrawer()
{
std::map<XnUInt32, std::list<XnPoint3D> >::iterator iter;
for (iter = m_History.begin(); iter != m_History.end(); ++iter)
{
iter->second.clear();
}
m_History.clear();

delete []m_pfPositionBuffer;
}

// Change whether or not to draw the depth map
void XnVPointDrawer::SetDepthMap(XnBool bDrawDM)
{
m_bDrawDM = bDrawDM;
}
// Change whether or not to print the frame ID
void XnVPointDrawer::SetFrameID(XnBool bFrameID)
{
m_bFrameID = bFrameID;
}

// Handle creation of a new hand
static XnBool bShouldPrint = false;
void XnVPointDrawer::OnPointCreate(const XnVHandPointContext* cxt)
{
printf("** %d\n", cxt->nID);
// Create entry for the hand
m_History[cxt->nID].clear();
bShouldPrint = true;
OnPointUpdate(cxt);
bShouldPrint = true;
}
// Handle new position of an existing hand
void XnVPointDrawer::OnPointUpdate(const XnVHandPointContext* cxt)
{
// positions are kept in projective coordinates, since they are only used for drawing
XnPoint3D ptProjective(cxt->ptPosition);

if (bShouldPrint) printf("Point (%f,%f,%f)", ptProjective.X, ptProjective.Y, ptProjective.Z);
m_DepthGenerator.ConvertRealWorldToProjective(1, &ptProjective, &ptProjective);
if (bShouldPrint) printf(" -> (%f,%f,%f)\n", ptProjective.X, ptProjective.Y, ptProjective.Z);

// Add new position to the history buffer
m_History[cxt->nID].push_front(ptProjective);
// Keep size of history buffer
if (m_History[cxt->nID].size() > m_nHistorySize)
m_History[cxt->nID].pop_back();
bShouldPrint = false;
}

// Handle destruction of an existing hand
void XnVPointDrawer::OnPointDestroy(XnUInt32 nID)
{
// No need for the history buffer
m_History.erase(nID);
}

#define MAX_DEPTH 10000
float g_pDepthHist[MAX_DEPTH];
unsigned int getClosestPowerOfTwo(unsigned int n)
{
unsigned int m = 2;
while(m < n) m<<=1;

return m;
}
GLuint initTexture(void** buf, int& width, int& height)
{
GLuint texID = 0;
glGenTextures(1,&texID);

width = getClosestPowerOfTwo(width);
height = getClosestPowerOfTwo(height);
*buf = new unsigned char[width*height*4];
glBindTexture(GL_TEXTURE_2D,texID);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

return texID;
}

GLfloat texcoords[8];
void DrawRectangle(float topLeftX, float topLeftY, float bottomRightX, float bottomRightY)
{
GLfloat verts[8] = { topLeftX, topLeftY,
topLeftX, bottomRightY,
bottomRightX, bottomRightY,
bottomRightX, topLeftY
};
glVertexPointer(2, GL_FLOAT, 0, verts);
glDrawArrays(GL_TRIANGLE_FAN, 0, 4);

glFlush();
}
void DrawTexture(float topLeftX, float topLeftY, float bottomRightX, float bottomRightY)
{
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords);

DrawRectangle(topLeftX, topLeftY, bottomRightX, bottomRightY);

glDisableClientState(GL_TEXTURE_COORD_ARRAY);
}

void DrawDepthMap(const xn::DepthMetaData& dm)
{
static bool bInitialized = false;
static GLuint depthTexID;
static unsigned char* pDepthTexBuf;
static int texWidth, texHeight;

float topLeftX;
float topLeftY;
float bottomRightY;
float bottomRightX;
float texXpos;
float texYpos;

if(!bInitialized)
{
XnUInt16 nXRes = dm.XRes();
XnUInt16 nYRes = dm.YRes();
texWidth = getClosestPowerOfTwo(nXRes);
texHeight = getClosestPowerOfTwo(nYRes);

depthTexID = initTexture((void**)&pDepthTexBuf, texWidth, texHeight);

bInitialized = true;

topLeftX = nXRes;
topLeftY = 0;
bottomRightY = nYRes;
bottomRightX = 0;
texXpos =(float)nXRes/texWidth;
texYpos =(float)nYRes/texHeight;

memset(texcoords, 0, 8*sizeof(float));
texcoords[0] = texXpos, texcoords[1] = texYpos, texcoords[2] = texXpos, texcoords[7] = texYpos;

}
unsigned int nValue = 0;
unsigned int nHistValue = 0;
unsigned int nIndex = 0;
unsigned int nX = 0;
unsigned int nY = 0;
unsigned int nNumberOfPoints = 0;
XnUInt16 g_nXRes = dm.XRes();
XnUInt16 g_nYRes = dm.YRes();

unsigned char* pDestImage = pDepthTexBuf;

const XnUInt16* pDepth = dm.Data();

// Calculate the accumulative histogram
memset(g_pDepthHist, 0, MAX_DEPTH*sizeof(float));
for (nY=0; nY<g_nYRes; nY++)
{
for (nX=0; nX<g_nXRes; nX++)
{
nValue = *pDepth;

if (nValue != 0)
{
g_pDepthHist[nValue]++;
nNumberOfPoints++;
}

pDepth++;
}
}

for (nIndex=1; nIndex<MAX_DEPTH; nIndex++)
{
g_pDepthHist[nIndex] += g_pDepthHist[nIndex-1];
}
if (nNumberOfPoints)
{
for (nIndex=1; nIndex<MAX_DEPTH; nIndex++)
{
g_pDepthHist[nIndex] = (unsigned int)(256 * (1.0f - (g_pDepthHist[nIndex] / nNumberOfPoints)));
}
}

pDepth = dm.Data();
{
XnUInt32 nIndex = 0;
// Prepare the texture map
for (nY=0; nY<g_nYRes; nY++)
{
for (nX=0; nX < g_nXRes; nX++, nIndex++)
{
nValue = *pDepth;

if (nValue != 0)
{
nHistValue = g_pDepthHist[nValue];

pDestImage[0] = nHistValue;
pDestImage[1] = nHistValue;
pDestImage[2] = nHistValue;
}
else
{
pDestImage[0] = 0;
pDestImage[1] = 0;
pDestImage[2] = 0;
}

pDepth++;
pDestImage+=3;
}

pDestImage += (texWidth - g_nXRes) *3;
}
}
glBindTexture(GL_TEXTURE_2D, depthTexID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texWidth, texHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, pDepthTexBuf);

// Display the OpenGL texture map
glColor4f(0.5,0.5,0.5,1);

glEnable(GL_TEXTURE_2D);
DrawTexture(dm.XRes(),dm.YRes(),0,0);
glDisable(GL_TEXTURE_2D);
}

void glPrintString(void *font, char *str)
{
int i,l = strlen(str);

for(i=0; i<l; i++)
{
glutBitmapCharacter(font,*str++);
}
}

void DrawFrameID(XnUInt32 nFrameID)
{
glColor4f(1,0,0,1);
glRasterPos2i(20, 50);
XnChar strLabel[20];
sprintf(strLabel, "%d", nFrameID);
glPrintString(GLUT_BITMAP_HELVETICA_18, strLabel);
}

// Colors for the points
XnFloat Colors[][4] =
{
{1,0,0}, // Red
{0,1,0}, // Green
{0,0.5,1}, // Light blue
{1,1,0}, // Yellow
{1,0.5,0}, // Orange
{1,0,1}, // Purple
{1,1,1} // White. reserved for the primary point
};
XnUInt32 nColors = 6;

void XnVPointDrawer::Draw() const
{
std::map<XnUInt32, std::list<XnPoint3D> >::const_iterator PointIterator;

// Go over each existing hand
for (PointIterator = m_History.begin();
PointIterator != m_History.end();
++PointIterator)
{
// Clear buffer
XnUInt32 nPoints = 0;
XnUInt32 i = 0;
XnUInt32 Id = PointIterator->first;

// Go over all previous positions of current hand
std::list<XnPoint3D>::const_iterator PositionIterator;
for (PositionIterator = PointIterator->second.begin();
PositionIterator != PointIterator->second.end();
++PositionIterator, ++i)
{
// Add position to buffer
XnPoint3D pt(*PositionIterator);
m_pfPositionBuffer[3*i] = pt.X;
m_pfPositionBuffer[3*i + 1] = pt.Y;
m_pfPositionBuffer[3*i + 2] = 0;//pt.Z();
printf("Session test: (%f,%f)\n", pt.X, pt.Y);
SetCursorPos((int)(pt.X*5.5),(int)(pt.Y*5.5));
}

// Set color
XnUInt32 nColor = Id % nColors;
if (Id == GetPrimaryID())
nColor = 6;
// Draw buffer:
glColor4f(Colors[nColor][0],
Colors[nColor][1],
Colors[nColor][2],
1.0f);
glPointSize(2);
glVertexPointer(3, GL_FLOAT, 0, m_pfPositionBuffer);
glDrawArrays(GL_LINE_STRIP, 0, i);

glPointSize(8);
glDrawArrays(GL_POINTS, 0, 1);
glFlush();

}
}

// Handle a new Message
void XnVPointDrawer::Update(XnVMessage* pMessage)
{
// PointControl's Update calls all callbacks for each hand
XnVPointControl::Update(pMessage);

if (m_bDrawDM)
{
// Draw depth map
xn::DepthMetaData depthMD;
m_DepthGenerator.GetMetaData(depthMD);
DrawDepthMap(depthMD);
}
if (m_bFrameID)
{
// Print out frame ID
xn::DepthMetaData depthMD;
m_DepthGenerator.GetMetaData(depthMD);
DrawFrameID(depthMD.FrameID());
}

// Draw hands
Draw();
}

void PrintSessionState(SessionState eState)
{
glColor4f(1,0,1,1);
glRasterPos2i(20, 20);
XnChar strLabel[200];

switch (eState)
{
case IN_SESSION:
sprintf(strLabel, "Tracking hands"); break;
case NOT_IN_SESSION:
sprintf(strLabel, "Perform click or wave gestures to track hand");
break;
case QUICK_REFOCUS:
sprintf(strLabel, "Raise your hand for it to be identified, or perform click or wave gestures"); break;
}

glPrintString(GLUT_BITMAP_HELVETICA_18, strLabel);
}

Darien

Dec 18, 2010, 5:33:10 PM
to OpenNI
The PointDrawer.cpp code I posted will support 4 points, but they will all fight over the cursor. There is also a lag between hand movement and cursor movement.

If you want to add more input points, you need to modify the line "XnFloat Colors[][4] =", but they are all still linked to the same function. I haven't had the time to figure out how to add the push event on a second input.

But maybe Iker Saint will share his implementation.

-Darien


Iker Saint

Dec 18, 2010, 5:41:40 PM
to OpenNI
The class has a lot more fun functions that aren't implemented in the PointViewer example, like:

OnPrimaryPointCreate
OnPrimaryPointReplace
OnPrimaryPointDestroy
OnPrimaryPointUpdate

The problem you face is that these also generate events like OnPointUpdate, but that is not a big deal once you have the primary hand's ID.

I process the point updates and compare them with the current primary ID (the first hand) to move the cursor; if a new hand arrives, I just generate a left-click mouse event. Now I'm working on gestures for scrolling and other things.

Regards.

Iker Saint

Dec 18, 2010, 5:44:31 PM
to OpenNI
I'm so sorry, but my job doesn't allow me to share the source code; for the moment, just videos :/.

Iker

Peter Krakow

Dec 19, 2010, 6:13:18 AM
to OpenNI
I just downloaded the Windows SDK and indeed, there is a file called
Nite.ini with an option called "AllowMultipleHands".

Unfortunately I haven't found an equivalent in the Linux Version.

Peter

Peter Krakow

Dec 19, 2010, 6:21:39 AM
to OpenNI
Oh, actually it's here: XnVHandGenerator/Data/Nite.ini

Let's check it out :-)

Peter

Peter Krakow

Dec 19, 2010, 6:52:33 AM
to OpenNI
OK, I got it working by editing this file:

/usr/etc/primesense/XnVHandGenerator/Nite.ini

Has anyone figured out how to do push recognition for secondary
cursors?

Peter

Moshe Blitz

Dec 19, 2010, 6:55:05 AM
to openn...@googlegroups.com
All you need to do is bring the second hand into the vicinity of the first hand.

Regards,
Moshe Blitz
Support Manager
Prime Sense LTD


--
You received this message because you are subscribed to the Google Groups "OpenNI" group.
To post to this group, send email to openn...@googlegroups.com.
To unsubscribe from this group, send email to openni-dev+...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/openni-dev?hl=en.

Douglas Teófilo

Dec 19, 2010, 11:09:58 AM
to OpenNI
Hi guys,

Has anyone managed to develop something like what member Iker Saint showed?

Iker Saint

Dec 19, 2010, 12:25:04 PM
to OpenNI
Well, with secondary points we have a tricky problem. XnVPointControl has a primary point ID associated with it, and so do the other controls; all the controls recognize all the points being tracked, but they only catch gestures for the primary point. At the moment I can't find a method that allows me to override the primary point; all of those methods are private, so a child class is not the solution. In the method that XnVPointControl implements for "Update( const XnVMultipleHands &hands )" it would be possible, but the object is "const" and doesn't allow the use of some functions like "ReassignPrimary( XnUInt32 )". I checked the members of the other classes, but none of them implements a member of type XnVMultipleHands, with the exception of "XnVPointMessage", so I think that one is created for the array of points that the control classes maintain.

At this moment I can't figure it out, and the only solution is to write your own XnVPointControl. I'm still reading all the API objects, and I think that sooner or later I'll figure it out ;).

Peter Krakow

Dec 20, 2010, 4:28:23 AM
to openn...@googlegroups.com
Well, reassigning the primary cursor wouldn't help much, as we want to be able to recognize gestures for all hands simultaneously, right?

More specifically what I want: Track multiple hands and allow all of them to do the "Push" gesture.

P

Iker Saint

Dec 20, 2010, 1:50:14 PM
to OpenNI
Okay, I feel silly: the solution was in the question ¬¬. To process gestures on multiple points you have to change the primary ID; it is only necessary to overload "Update( const XnVMultipleHands &hands )", just like this:

void MyXnVControl::Update( const XnVMultipleHands &hands )
{
XnVMultipleHands newhands( hands );
XnUInt32 uiID = GetSecondPoint();

if (m_nPointsCount <= 1 || (hands.GetPrimaryContext())->nID == uiID ) {
XnVControl::Update( hands );
return;
}

newhands.ReassignPrimary( uiID );

XnVControl::Update( newhands );
}

And that is all: our current primary ID is overwritten and gets the focus for the gestures. If you want push gestures on both hands, you need 2 XnVPushDetector instances, each one associated with the ID whose gestures you want to catch.

Iker

Douglas Teófilo

Dec 20, 2010, 2:35:23 PM
to OpenNI
Would you provide a release of your code?
Thanks!

Videomap

Dec 21, 2010, 5:17:43 PM
to OpenNI
Look at this video: http://www.youtube.com/watch?v=CE206hP7ryg

In the description you can get the URL for the source code. This is not my work, but I have compiled it and it runs very well. Now we have blobs from multiple hands in motion.
The next step is to output the multiple hands' motion to TUIO. If a good developer can add this function to the code, we can use Multitouch-Vista to have a beautiful multitouch "air" device for Windows 7.

Multitouch-Vista can accept a TUIO provider.

Everyone wants OpenNI skeleton-to-TUIO.

Andreas Binder

Dec 21, 2010, 5:26:25 PM
to openn...@googlegroups.com
The URL for the video is not working; could you please check it? Thanks in advance!

Videomap

Dec 21, 2010, 5:26:40 PM
to OpenNI
Sorry the video is
http://www.youtube.com/watch?v=kipvbVTAmWk


Peter Krakow

Dec 21, 2010, 6:06:03 PM
to OpenNI
This hack works for me:

#include "MyXnVPushDetector.h"
#include <XnVMultipleHands.h>

void MyXnVPushDetector::Update(const XnVMultipleHands &hands)
{
if(hands.ActiveEntries() > 1)
{
XnVMultipleHands newhands(hands);
XnUInt32 primaryHandID = (newhands.GetPrimaryContext())->nID;

for(XnVMultipleHands::Iterator it = newhands.begin(); it != newhands.end(); it++)
{
if(newhands.IsActive((*it)->nID) && (*it)->nID != primaryHandID)
{
newhands.ReassignPrimary((*it)->nID);
XnVPushDetector::Update(newhands);
return;
}
}
}
else
{
XnVPushDetector::Update(hands);
}
}

Iker Saint

Dec 21, 2010, 6:41:01 PM
to OpenNI
Not the most elegant solution, but it works and proves my point; well done ;).

For all the people asking whether I will release the code, the answer is no :/, sorry.

Iker

Iker Saint

Dec 21, 2010, 6:46:06 PM
to OpenNI
Okay, this video shows you how to detect the skeleton. The problem is that gestures on single points (hands) are a little more complex this way: the UI control is done by depth (at a predefined distance, I think) to execute a click, so the precision is reduced and you may have problems with the control.

Sorry for my English :P.

Iker


Peter Krakow

Dec 22, 2010, 5:12:13 AM
to OpenNI
It is probably better to use the OnPointCreate callbacks to find out the IDs of the non-primary points. My code is only meant to serve as a proof of concept :)

Douglas Teófilo

Dec 22, 2010, 12:22:26 PM
to OpenNI
I do not understand: this is an open-source development group, yet you would not share your code with the other members.
A pity; then I do not understand why you publicized your video.
Thanks!

Darien

Dec 22, 2010, 12:33:36 PM
to OpenNI
You have a point, but I think Iker is nice enough to help you if you have specific technical questions, and will be glad to point you in the right direction. Teach a man to fish and he'll fish for life, eh?

-Darien


Iker Saint

Dec 22, 2010, 12:41:12 PM
to OpenNI
1. For my work I'm only allowed to disclose videos, not code.
2. This is a forum, not an "open source" group; read the group description.
3. I don't think I have to release code to help you with your questions; in fact, Peter figured out how to detect gestures on multiple points, and I didn't have to write more than a pseudo-code implementation.

Iker

Douglas Teófilo

Dec 22, 2010, 1:21:59 PM
to OpenNI
1. If you work for a mega-corporation and cannot reveal your code, I'm sorry, I did not know.

2. This group is dedicated to OpenNI, which is written and distributed under the GNU Lesser General Public License, which means that its source code is freely distributed and available to the general public, i.e. you can redistribute it and/or modify it under the terms of the GNU LGPL as published by the Free Software Foundation. It was from this description that I imagined it to be an open-source group, but you seem to go against that idea.

3. You have every right not to release more code, but maybe working together with others who want to grow could produce a better job.
Thanks for your help!

Iker Saint

Dec 22, 2010, 2:46:16 PM
to OpenNI
I don't base my work on the open-source code of OpenNI; I only use the binaries and SDK that are provided, so I don't see the point about the license of the code.

The code doesn't give you a better job; your mind does.

The topic here is something else, and returning to it would be appreciated.

Iker

George Toledo

Dec 22, 2010, 2:54:24 PM
to openn...@googlegroups.com
The companies that released "free" source stand to make monetary gain because there are physical products being sold. If there is an SDK, more people can do more stuff, and buy more units.

For someone like Iker, he has put in hours of work in the personal domain, and his product *is that work*. No one should be indignant about not getting it for free, let alone make multiple whiny posts about it.

I'm sorry for getting involved; I just found the line of reasoning highly bothersome.

-GT





--
George Toledo


Zephyr87

Dec 26, 2010, 12:02:33 PM
to OpenNI
Where are the sources of http://www.youtube.com/watch?v=kipvbVTAmWk?

P.S.
I tried to compile Darien's code, but I get an error:

1>..\Res\NITE.rc(10): fatal error RC1015: cannot open include file 'afxres.h'.

If I change afxres.h to windows.h, I get many other errors.

Darien

Dec 26, 2010, 1:09:55 PM
to OpenNI
Zephyr87,

Are you able to compile and execute the original PointViewer sample without errors? The reason I ask is that the code I posted is not fundamentally different from the original sample code. The only line that is really different is "SetCursorPos((int)(pt.X*5.5),(int)(pt.Y*5.5));" plus the appropriate headers to call that function. My code also doesn't support mouse clicks or multitouch, so it's way behind what I'm seeing in the YouTube video you posted.

-Darien


Darien

Dec 26, 2010, 1:13:03 PM
to OpenNI
It also seems to me that the source code for the video you posted is located on these two pages:

http://sd-tech-blog.blogspot.com/2010/12/kinectopenniflashflash.html
http://sd-tech-blog.blogspot.com/2010/12/kinectopenniflash.html

I would recommend using Google Translate on those pages and playing
around with the source.

-Darien

Mike Slinn

Dec 26, 2010, 3:56:41 PM
to OpenNI
Same problem here, using the free VC++ Express 2010

Mike

Videomap

Dec 27, 2010, 3:05:02 AM
to OpenNI
I modified the source from sd-tech-blog.blogspot.com and added the TUIO protocol to it.

The relevant code is:
tuio = new TuioServer(false);
starttime = TuioTime::getSessionTime();
tuio->initFrame(starttime);

g_DepthGenerator.ConvertRealWorldToProjective(1, pt, pt);

double mult_x = 1;
double mult_y = 1;
double mult_z = 1;
double off_x = 0.0;
double off_y = 0.0;
double off_z = 0.0;

float pX = off_x + (mult_x * (pt[0].X) / 640); // Normalize coords to 0..1 interval
float pY = off_y + (mult_y * (pt[0].Y) / 480); // Normalize coords to 0..1 interval
float pZ = off_z + (mult_z * pt[0].Z * 7.8125 / 10000); // Normalize coords to 0..7.8125 interval

tuio->stopUntouchedMovingCursors();
tuio->removeUntouchedStoppedCursors();
tuio->commitFrame();

This works very well with many TUIO programs like TuioMouse, Multitouch-Vista, and others.

pandereto

Dec 31, 2010, 4:00:47 AM
to OpenNI
Did you implement a 3D cursor in TUIO to get the Z position?

I'm trying to do the same, but I'm programming in openFrameworks, and the TUIO wrapper doesn't have 3D support.

Can you share your code, or show me how you did it?

Regards







sthelen

Jan 5, 2011, 10:27:16 AM
to OpenNI
I want to identify the push gestures of both hands of a single user
(later I also want to deal with multiple users) at the same time. I'm
basically applying Peter's hack:

void MyXnVPushDetector::Update(const XnVMultipleHands &hands)
{
if(hands.ActiveEntries() > 1)
{
XnVMultipleHands newhands(hands);
XnUInt32 primaryHandID = (newhands.GetPrimaryContext())->nID;
for(XnVMultipleHands::Iterator it = newhands.begin(); it != newhands.end(); it++)
{
if(newhands.IsActive((*it)->nID) && (*it)->nID != primaryHandID)
{
newhands.ReassignPrimary((*it)->nID);
XnVPushDetector::Update(newhands);
return;
}
}
} else
{
XnVPushDetector::Update(hands);
}
}


The problem is that only the push gestures of one hand are identified. If one hand is being tracked, then all push gestures of that hand are identified. If two hands are being tracked, then only the pushes of the second hand are recognized. I don't see how this is supposed to work for two hands at the same time. I know Iker suggested 2 MyXnVPushDetector objects tracking different IDs, but the problem I see is that IDs constantly change as hands enter or leave the scene. If I register a specific ID with one detector object, the ID becomes invalid when that hand is no longer being tracked. Any ideas?

Sebastian

Iker Saint

Jan 5, 2011, 2:05:20 PM
to OpenNI
The IDs change constantly, but you always get a notification. I actually have multiple gestures (swipe, push, steady, etc.) on all points. How? Basically by overloading Update. You can create a base class for all the gestures (derived from XnVPointControl) and use this:

BaseControl::BaseControl( XnUInt32 uiPrimaryID ) :
XnVPointControl("XnVPointDrawer")
{
m_uiMyPrimaryID = uiPrimaryID;
}

BaseControl::~BaseControl()
{
}

XnUInt32 BaseControl::GetMyPrimaryID( void )
{
return m_uiMyPrimaryID;
}

void BaseControl::SetMyPrimaryID( XnUInt32 uiNewID )
{
m_uiMyPrimaryID = uiNewID;
}

// declaration "class MyPushDetector : public BaseControl, public XnVPushDetector"

void MyPushDetector::Update( const XnVMultipleHands &hands )
{
XnUInt32 nID = GetMyPrimaryID();

// Actually -1 on no ID
if (nID == CDUI_VOID_POINT_ID) {
XnVPushDetector::Update( hands );
return;
}

XnVMultipleHands NewHands( hands );

NewHands.ReassignPrimary( nID );

XnVPushDetector::Update( NewHands );
}

Now, how do you implement this on all points?

void XN_CALLBACK_TYPE OnPointCreateCllBck( const XnVHandPointContext *pContext, void *lpvParam)
{
MyPushDetector *pNew = new MyPushDetector( pContext->nID );

// Register your callbacks, configure the control, add it to a generator, etc.
AddControlToList( (BaseControl *)pNew ); // My own list of controls
}

void XN_CALLBACK_TYPE OnPointDestroyCllBck( XnUInt32 nID, void *lpvParam)
{
// Just an idea: FindControlForPoint( XnUInt32 nID, bool bDelete = false )
BaseControl *pControl = FindControlForPoint( nID, true );
// Just unregister it from the listener, remove whatever you want and..... the magic happens
delete pControl;
}

And that is a simple way: create your own callbacks to receive the calls, call GetMyPrimaryID, and process it.

Best Regards.

Sebastian Thelen

Jan 6, 2011, 10:02:59 AM
to openn...@googlegroups.com
That looks promising. Thanks for sharing your code. I'll give it a try tomorrow and let you know.

Best,
Sebastian

sthelen

Jan 10, 2011, 2:53:30 AM
to OpenNI
Detecting push gestures now seems to work for multiple hands. Thanks a lot!

Sebastian

sthelen

Jan 10, 2011, 8:48:10 AM
to OpenNI
Here is a little follow-up question.

Is there a way to tell which MyPushDetector object identified a specific gesture (like "Push performed by point #1...")?
In my code all detectors are connected to an XnVBroadcaster, and their references are stored in a vector of type std::vector<MyXnVPushDetector*>.

I've been digging through the API for a while now but couldn't find anything useful.


Sebastian

Iker Saint

Jan 10, 2011, 9:15:01 AM
to OpenNI
Sorry, but I don't understand your question.

Iker

sthelen

Jan 10, 2011, 10:53:52 AM
to OpenNI
Sorry for not being clear enough.

I want to be able to tell which hand in a scene triggered a push event. So, for example, if there are three hands labeled 1, 2, and 3, and hand #2 performs a push gesture, I want an output like "Push performed by point 2...".
At the moment I don't see how to get that information. Since there are three points in the scene, there are three objects of type MyPushDetector created by OnPointCreateCllBck. Each of them is connected to an XnVBroadcaster, and I separately store their references in a std::vector.

Sebastian

Lior Cohen

Jan 11, 2011, 10:04:28 AM
to OpenNI
Dear Sebastian and others,

I wanted to offer a new alternative way for addressing the Multiple Hands issue that might assist you.

Attached please find a sample (Windows-based) containing the code of XnVSecondaryFilter. The sample contains the class itself and a program that uses it, all in the main.cpp file.

The class inherits from XnVPointFilter class. This means it receives PointMessages, and sends PointMessages.
It sends all the points it receives, but changes the primary point.
Expected behavior:
1. When there is only one point in the system, it will be the primary in the main tree, and also in the SecondaryFilter's subtree.
2. If there are multiple points in the system, the primary point in the SecondaryFilter's subtree will always be different than the primary point in the main tree.
3. This means that if the primary point from the main tree has disappeared, and the SecondaryFilter's point is selected as the primary point of the main tree, it will no longer be the SecondaryFilter's point (if there are additional points available to switch to)

In the sample, the new XnVSecondaryFilter and XnVSessionManager are connected. There are 2 XnVPushDetectors: one is connected directly to the XnVSessionManager, and will therefore work on the Primary Point; the other is connected to the XnVSecondaryFilter, and will therefore work on the Secondary Point. When there is only one point in the system, both will work on it.

Please note that this is a simple example of how one could create such a filter. If you need a different algorithm for choosing the primary point, or want to extend its capabilities to support more than two points, you are more than welcome to alter the code as you see fit.

REMINDER NOTE: OFFICIALLY MULTIPLE HANDS IS NOT A SUPPORTED FEATURE IN NITE!

Best Regards,
Lior Cohen

Sebastian



SecondaryPrimaryPoint.zip

Iker Saint

unread,
Jan 11, 2011, 11:52:47 AM1/11/11
to OpenNI
Great Job Lior.

Well, I think how to reassign the primary point is clear, but a bit of
code is better (and at this point I can release almost all of
it :) ).


class CMyBaseControl : public XnVPointControl {
public:
    CMyBaseControl( XnUInt32 uiPrimaryID = VOID_POINT_ID /* -1 */ );
    virtual ~CMyBaseControl();

    XnUInt32 GetMyPrimaryID( void ) const;
    virtual void SetMyPrimaryID( XnUInt32 uiNewID );
protected:
    XnUInt32 m_uiMyPrimaryID;
};

As you can see, you have an internal primary ID in your base class. This
means that each control is independent of the generator (broadcaster,
denoiser, etc.) and has its own primary ID that can be assigned.
Now for the callbacks: you solve the problem by generating your own
callbacks (or overriding, as you wish). In code it looks like:

typedef void(__stdcall *MYPUSHDETECTOR_ONPUSH_CLLBCK)(XnFloat fVelocity, XnFloat fAngle, XnUInt32 nPoint, void *lpvParam);
typedef void(__stdcall *MYPUSHDETECTOR_ONESTABILIZED_CLLBCK)(XnFloat fVelocity, XnUInt32 nPoint, void *lpvParam);

class CMyPushDetector : public XnVPushDetector, public CMyBaseControl {
public:
    CMyPushDetector( MYPUSHDETECTOR_ONPUSH_CLLBCK pPushCllBck,
        MYPUSHDETECTOR_ONESTABILIZED_CLLBCK pStabilizedCllBck, void *lpvParam );
    ~CMyPushDetector();

    static void XN_CALLBACK_TYPE OnPushCllBck(XnFloat fVelocity, XnFloat fAngle, void *lpvParam);
    static void XN_CALLBACK_TYPE OnStabilizedCllBck(XnFloat fVelocity, void *lpvParam);
private:
    void Update( const XnVMultipleHands &hands );

    void *m_lpvCllBckParam;
    MYPUSHDETECTOR_ONPUSH_CLLBCK m_OnPushCllBck;
    MYPUSHDETECTOR_ONESTABILIZED_CLLBCK m_OnStabilizedCllBck;
};

CMyPushDetector::CMyPushDetector( MYPUSHDETECTOR_ONPUSH_CLLBCK pPushCllBck,
    MYPUSHDETECTOR_ONESTABILIZED_CLLBCK pStabilizedCllBck, void *lpvParam )
    : XnVPushDetector( "XnVPushDetector" ), CMyBaseControl()
{
    m_lpvCllBckParam = lpvParam;
    m_OnPushCllBck = pPushCllBck;
    m_OnStabilizedCllBck = pStabilizedCllBck;

    RegisterPush( this, OnPushCllBck );
    RegisterStabilized( this, OnStabilizedCllBck );
}


void XN_CALLBACK_TYPE CMyPushDetector::OnPushCllBck( XnFloat fVelocity, XnFloat fAngle, void *lpvParam )
{
    CMyPushDetector *pThis = (CMyPushDetector *)lpvParam;
    XnUInt32 nID;

    if (pThis->m_OnPushCllBck == NULL)
        return;

    nID = pThis->GetMyPrimaryID();

    if (nID == VOID_POINT_ID)
        nID = pThis->XnVPushDetector::GetPrimaryID();

    pThis->m_OnPushCllBck( fVelocity, fAngle, nID, pThis->m_lpvCllBckParam );
}

//etc.. callbacks

void CMyPushDetector::Update( const XnVMultipleHands &hands )
{
    XnUInt32 nID = GetMyPrimaryID();

    if (nID == VOID_POINT_ID) {
        XnVPushDetector::Update( hands );
        return;
    }

    XnVMultipleHands NewHands( hands );

    NewHands.ReassignPrimary( nID );

    XnVPushDetector::Update( NewHands );
}

I think this code is enough to implement an independent tracking
path for all points and detect gestures on each one.

Best Regards.

Iker.

sthelen

unread,
Jan 12, 2011, 5:18:29 AM1/12/11
to OpenNI
Hello Lior, hello Iker,

thx for sharing ideas. I'll check out your code as soon as possible to
see if it helps me with my multitouch user interface implementation.

Since I had some trouble with gesture detection I switched to the
approach described in this thread
http://groups.google.com/group/openni-dev/browse_thread/thread/1947fd202d569a47
using a depth threshold for touching. This is a lot less elegant and
usability suffers but it's a little more straight forward to
implement. Once I'm done I'll see if I can get it to work using
PushDetectors and let you know.

Sebastian

Lior Cohen

unread,
Jan 12, 2011, 10:12:55 AM1/12/11
to OpenNI
Dear Sebastian and others,

I've received a failure notice on my last posting, so I've removed the attachment and added a link to the posting instead:

I wanted to offer a new alternative way for addressing the Multiple Hands issue that might assist you.

I've uploaded a Windows-based sample to http://rapidshare.com/#!download|76|442037836|SecondaryPrimaryPoint.zip|15.292, containing the code of XnVSecondaryFilter. The sample contains the class itself and a program that uses it, all in the main.cpp file.

Lior Cohen

unread,
Jan 12, 2011, 10:50:02 AM1/12/11
to openn...@googlegroups.com
Dear Iker,

Very nice work; I only wanted to emphasize the differences between the two approaches.

The advantage of putting the decision about which point should be the primary point in a PointFilter is that:
1) The logic of choosing and setting a new primary point is done once.
2) You can add any control(s) as a listener to it, and the entire NITE branch connected to the PointFilter will work with this primary point.
On the other hand, when you change the control(s) to choose a hand point to work with, you are doing the "point filtering" in each one of the controls, and you also need to create a new collection of "special" controls (MyWaveDetector, MySwipeDetector, MySelectableSlider1D, ...).

Another technical problem that I encountered when trying to use MyPushDetector was that because both of its bases, XnVPushDetector and BaseControl, inherit from XnVPointControl, and XnVPointControl is also an XnVPointListener, there is an ambiguity as to which "XnVPointListener" should be used.

Perhaps, if you wish to create an independent MyPushDetector that can use a different hand point as its primary, the way to do it is to create a Compound Control that combines a modified PointFilter connected to a regular XnVPushDetector in a single class. You can do the same for any other control, e.g., MyWaveDetector. You can look at the "Sample-Boxes" included in the NITE installation for reference code on creating a Compound Control (MyBox).

Best Regards,
Lior Cohen


-----Original Message-----
From: openn...@googlegroups.com [mailto:openn...@googlegroups.com] On Behalf Of Iker Saint
Sent: Wednesday, January 05, 2011 9:05 PM
To: OpenNI
Subject: [OpenNI-dev] Re: Multi Hand Cursor with Kinect


Iker Saint

unread,
Jan 12, 2011, 3:58:21 PM1/12/11
to OpenNI
The problem that I see in the use of a filter is that you have to
create a message generator for each point, which could be a little
tricky in the code.
The ambiguity problem is well known in C++; in fact you can use a
pointer to the base class (allocating a MyPushDetector) for the
control and it is solved; you don't need any function of PushDetector in
practice. I'm exploring other ways to get a better and more flexible
primary point replacement; at this point the base class seems to be
the most flexible to me. The only problem is that in each control you
have to allocate a new hands object and reassign the primary; this isn't
directly a problem, but it is not the most elegant way.

Does anyone have another idea or implementation?

Iker.

Chanakyan

unread,
Jan 12, 2011, 10:28:39 PM1/12/11
to OpenNI
Hello Lior,

The link that you posted -> http://rapidshare.com/#!download|76|442037836|SecondaryPrimaryPoint.zip|15.292
does not seem to work. Can you please check and send the correct one?

Thanks
Chanakyan


Chanakyan

unread,
Jan 12, 2011, 10:31:24 PM1/12/11
to OpenNI

Please ignore my previous message. It was just a connectivity issue
on my side.

Thanks
Chanakyan

asboy83

unread,
Mar 7, 2011, 10:02:08 AM3/7/11
to OpenNI
Hello all,
I'm trying to implement the code from Lior in C# with ManagedNite.dll,
but without success; for example, ++iter produces an error and iter.nID
does not exist. I'm using the latest unstable version of NITE. Does
someone have an example of tracking multiple primary points using C#?
Many thanks in advance.

> On Wed, Jan 5, 2011 at 8:05 PM, Iker Saint <iker.sa...@gmail.com> wrote:
> > The IDs change constantly but... you always have a notification; actually I have multiple gestures (swipe, push,
> > > I want to identify the push gestures of both hands of a single user (later I also want to deal with multiple users) at the same time. I'm basically applying Peter's hack:
> > >
> > > void MyXnVPushDetector::Update(const XnVMultipleHands &hands)
> > > {
> > >     if (hands.ActiveEntries() > 1)
> > >     {
> > >         XnVMultipleHands newhands(hands);
> > >         XnUInt32 primaryHandID = (newhands.GetPrimaryContext())->nID;
> > >         for (XnVMultipleHands::Iterator it = newhands.begin(); it != newhands.end(); it++)
> > >         {
> > >             if (newhands.IsActive((*it)->nID) && (*it)->nID != primaryHandID)
> > >             {
> > >                 newhands.ReassignPrimary((*it)->nID);
> > >                 XnVPushDetector::Update(newhands);
> > >                 return;
> > >             }
> > >         }
> > >     }
> > >     else
> > >     {
> > >         XnVPushDetector::Update(hands);
> > >     }
> > > }
> > >
> > > The problem is that only the push gestures of one hand are identified. If one hand is being tracked, then all push gestures of that one hand are identified. If two hands are being tracked, then only the pushes of the second hand are recognized. I don't see how this is supposed to work for two hands at the same time. I know Iker suggested 2 MyXnVPushDetector objects tracking different IDs, but the problem I see is that IDs constantly change as hands enter or leave a scene. If I register a specific ID for one detector object, the ID becomes invalid when the hand is not being tracked anymore. Any ideas?
> > >
> > > Sebastian

Lior Cohen

unread,
Mar 8, 2011, 10:11:18 AM3/8/11
to openn...@googlegroups.com
Dear asboy83,

At the moment there is no access to XnVMultipleHands via the C# wrapper (ManagedNite.dll). What you can do for now is create a managed C++ layer that handles this exact code and exposes a C# API for your application: either wrap the XnVSecondaryFilter or the XnVMultipleHands (which is probably harder).

Best Regards,
Lior Cohen

--

Ricardo Torres

unread,
Apr 25, 2011, 11:30:13 AM4/25/11
to OpenNI
Hi all,

Is there any new solution for multi hand gesture recognition based on
the new versions of the C# libraries?

Best regards
Ricardo Torres


vakuljindal

unread,
Jun 21, 2011, 7:46:44 AM6/21/11
to openn...@googlegroups.com
I did what you have written here. The primary point of the Push Detector gets
changed, but then it stops recognising push gestures... What should I do then?


--
View this message in context: http://openni-discussions.979934.n3.nabble.com/OpenNI-dev-Multi-Hand-Cursor-with-Kinect-tp2107192p3090288.html
Sent from the OpenNI discussions mailing list archive at Nabble.com.

juned.munshi

unread,
Jul 26, 2011, 9:11:41 AM7/26/11
to openn...@googlegroups.com
Is there any workaround for the C# wrapper to support multi-hand gestures in the new release?

