About skeleton tracking with multiple Kinects


pcp

Jan 24, 2012, 8:01:37 AM
to OpenNI
Hi people,

I'm new here. I've been using OpenNI recently, and I'm still getting
accustomed to it. Well, it had been nice to program with until
yesterday (I'm using the C# wrapper). What I've been trying to do since
then is to create TWO different UserGenerators (skeletons) that rely on
TWO different depth nodes, from TWO different Kinect devices.

I've built the production nodes programmatically (since some posts
state it is harder to get this working with XML --- I'm interested to
know whether that's true, or whether somebody knows how to declare the
two Kinects via XML).

I've been able to print out the UserGenerators' Info, which gave,
among other things, these trees I've simplified:

User1 <-- Depth1 <-- Device1.
User1 <-- Depth1 <-- Device2.

(I'm guessing Depth1 and User1 carry the same names for both devices
since they are the first Depth/User nodes for each of them --- correct
me if I'm wrong).

My output? Well, I'm getting DIFFERENT depth maps, but THE SAME user
information for BOTH Kinects (that means I have two images on a Windows
Form: one showing a skeleton over a user, and another one showing the
same skeleton over a different depth map --- from the other device).

I've been reading different discussion lists, and found things such
as:

http://groups.google.com/group/openni-dev/browse_thread/thread/aa7a558382572b36/b2db9e614edfa855
http://openni-discussions.979934.n3.nabble.com/OpenNI-dev-User-generator-with-multiples-devices-td2753125.html

In the second link, the programmer wonders how to "link" or "feed" the
UserGenerator with the appropriate DepthGenerator (So do I).

So the question remains: how can one get two Kinects working with
UserGenerators FROM different Depth nodes? How can I get different
users/skeletons for each device?

Thanks a lot in advance.

pcp

Jan 24, 2012, 11:59:02 AM
to OpenNI
I've also found this, which is exactly what happens to me ... but no
answer is given on how to "assign" that depth map:

http://groups.google.com/group/openni-dev/browse_thread/thread/32f19fc33aba6c61/d70926374cba873e

I'm not getting 4 user generators when enumerating, but dozens of
them ...

pcp.


pcp

Jan 25, 2012, 3:47:50 AM
to OpenNI
This is my init (C#):

context = new Context();

System.Console.WriteLine("Enumerating all depth nodes ...");
List<DepthGenerator> depthGens = new List<DepthGenerator>();
NodeInfoList depthList = context.EnumerateProductionTrees(NodeType.Depth, new Query());

foreach (NodeInfo node in depthList)
{
    System.Console.WriteLine(" - " + node.Description.Name
        + "; vendor = " + node.Description.Vendor
        + "; instance = " + node.InstanceName);
    depthGens.Add(context.CreateProductionTree(node) as DepthGenerator);
}

System.Console.WriteLine("Node naming after initialization ...");
foreach (NodeInfo node in depthList)
{
    System.Console.WriteLine(" - " + node.Description.Name
        + "; vendor = " + node.Description.Vendor
        + "; instance = " + node.InstanceName);
}

depth1 = depthGens[0];
Query query1 = new Query();
query1.AddNeededNode(depth1.Name);
user1 = new UserGenerator(context, query1);

depth2 = depthGens[1];
Query query2 = new Query();
query2.AddNeededNode(depth2.Name);
user2 = new UserGenerator(context, query2);

depth1.StartGenerating();
depth2.StartGenerating();

user1.StartGenerating();
user2.StartGenerating();

pcp


pcp

Jan 25, 2012, 3:49:41 AM
to OpenNI
With the init I just posted, I get a "Can't create any node of the
requested type!" error.

pcp.

Eddie Cohen

Jan 25, 2012, 7:49:37 AM
to openn...@googlegroups.com

Hi,

 

There are two bugs that currently prevent you from doing it that way:

1. It seems there's a bug when using AddNeededNode on a query object in the C# wrapper.

2. In the latest version, the UserGenerator has a complex production graph, which might cause pose detection to take place on one sensor and the skeleton algorithm on the other.

To work around both bugs, use the following code:

using System.Collections.Generic;
using OpenNI;

class Program
{
    private static bool IsUsingAnotherDepth(NodeInfo info, DepthGenerator depth)
    {
        if (info.Description.Type == NodeType.Depth)
        {
            return (info.Instance != depth);
        }
        else
        {
            foreach (NodeInfo needed in info.NeededNodes)
            {
                if (IsUsingAnotherDepth(needed, depth))
                {
                    return true;
                }
            }
        }

        // no other depth found
        return false;
    }

    private static NodeInfo FindUserForDepth(NodeInfoList userInfoList, DepthGenerator depth)
    {
        foreach (NodeInfo info in userInfoList)
        {
            if (!IsUsingAnotherDepth(info, depth))
                return info;
        }

        return null;
    }

    static void Main()
    {
        Context context = new Context();

        List<DepthGenerator> depthList = new List<DepthGenerator>();

        // search and create available depth nodes (there should be one for every device)
        NodeInfoList depthInfoList = context.EnumerateProductionTrees(NodeType.Depth, null);
        foreach (NodeInfo depthInfo in depthInfoList)
        {
            DepthGenerator depth = (DepthGenerator)context.CreateProductionTree(depthInfo);
            depthList.Add(depth);
        }

        // search and create a user generator for each one of the depth nodes
        List<UserGenerator> userList = new List<UserGenerator>();
        NodeInfoList userInfoList = context.EnumerateProductionTrees(NodeType.User, null);

        foreach (DepthGenerator depth in depthList)
        {
            // find the first user generator that depends solely on this depth generator
            NodeInfo userInfo = FindUserForDepth(userInfoList, depth);
            if (userInfo == null)
            {
                throw new System.Exception("Can't find any user alternative for depth!");
            }

            // create it
            UserGenerator user = (UserGenerator)context.CreateProductionTree(userInfo);
            userList.Add(user);
        }

        // we have it all, now start reading frames
        context.StartGeneratingAll();

        while (true)
        {
            context.WaitAnyUpdateAll();
        }
    }
}
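From there, hooking skeleton tracking onto each user generator separately would look roughly like the sketch below. This is only an outline based on the usual .NET wrapper usage (NewUser, RequestCalibration, CalibrationComplete, StartTracking); double-check the exact event and argument names against the wrapper version you have installed:

foreach (UserGenerator user in userList)
{
    SkeletonCapability skeleton = user.SkeletonCapability;
    skeleton.SetSkeletonProfile(SkeletonProfile.All);

    // a user appeared on THIS sensor: request calibration
    // (with the newer NITE, no pose is needed)
    user.NewUser += delegate(object sender, NewUserEventArgs e)
    {
        skeleton.RequestCalibration(e.ID, true);
    };

    // calibration finished: start tracking that user on this sensor only
    skeleton.CalibrationComplete += delegate(object sender, CalibrationProgressEventArgs e)
    {
        if (e.Status == CalibrationStatus.OK)
        {
            skeleton.StartTracking(e.ID);
        }
    };
}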

 

Eddie.


pcp

Jan 25, 2012, 8:52:21 AM
to OpenNI
Thank you very much for your answer. I'll give that code a try, and
I'll post my results.

I really appreciate your rapid reply and all the effort you've put
into this,

pcp.

pcp

Jan 25, 2012, 11:24:37 AM
to OpenNI
Just WOW, I've seen the info generated by the "userList[i].Info"s ...
and it seems to be working perfectly!

User2 <-- Depth2 <-- Device2
User1 <-- Depth1 <-- Device1

The problem now is that I'm getting an AccessViolationException when
running ... I don't know exactly what's causing it; the stack trace
says it happens when freeing the NodeInfoList ...

System.AccessViolationException was unhandled
  Message=Attempted to read or write protected memory. This is often
an indication that other memory is corrupt. (my system prints this in
Spanish; that's the standard English wording)
  Source=OpenNI.Net
  StackTrace:
    at OpenNI.SafeNativeMethods.xnNodeInfoListFree(IntPtr pList)
    at OpenNI.NodeInfoList.FreeObject(IntPtr ptr, Boolean disposing)
    at OpenNI.ObjectWrapper.Dispose(Boolean disposing)
    at OpenNI.ObjectWrapper.Finalize()
  InnerException:

I also activated the Log, and saw this error (happening twice, once
per device?):

ERROR Could not open file mapping object (2).

I don't have a clue whether these two errors are related to the former
exception ...

Tarek Belkahia

Jan 26, 2012, 3:33:27 AM
to OpenNI
Hi

I have been trying to make this multiple-skeleton thing work for
weeks now, but in vain.
Here are my conclusions; maybe they will help someone.

First, let's try to understand what we have when we enumerate the user
generator nodes.
When I plug in 2 Kinects (on two different USB controllers --- I know
that because I managed to make 2 depth generators work in parallel)
and then call context.EnumerateProductionTrees(XN_NODE_TYPE_USER,
NULL, list, NULL), here's what I get: http://pastebin.com/VQH2ShWj

Each * represents a potential user generator. I displayed the whole
tree of needed nodes for each user generator, with the type, class
and version (the number is the XnPredefinedProductionNodeType; see
XnCppWrapper.h).
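(My actual code is C++, but in terms of the C# wrapper used elsewhere in this thread, producing a dump like that is essentially a recursion over NeededNodes. Just a sketch --- I've left the version column out:)

static void PrintTree(NodeInfo info, int level)
{
    // one '-' per level of the dependency tree; the number is the node type
    System.Console.WriteLine(new string('-', level)
        + " Node: " + (int)info.Description.Type
        + " " + info.Description.Name
        + " (" + info.InstanceName + ")");

    foreach (NodeInfo needed in info.NeededNodes)
    {
        PrintTree(needed, level + 1);
    }
}

// one tree per enumeration result:
// foreach (NodeInfo info in userInfoList) PrintTree(info, 0);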

We can draw these conclusions:
- There are two types of user generators: one needing only a depth
generator (tree structure 6->2->1), available with all versions of
NITE and OpenNI, and one with a more complicated structure, needing a
gesture generator and a scene analyzer (tree structure 6->(9->2->1),
(10->2->1)), available only with the latest version of OpenNI and
NITE. After some tests, the complicated version is the one that does
not need the user pose for calibration, while the simpler version
does.
- We somehow need to pick the right user generator nodes in order to
avoid the data from one Kinect at one level getting mixed with the
data from another Kinect (assuming this is possible).

Then I decided to go with the more developed tree structure and tried
to manually build two different user generator nodes with the whole
tree structure, making sure no node from the first tree depended on
one from the other. I proceeded from the bottom up, enumerating the
devices (I found only 2, so I assumed one is Kinect 1 and the other
Kinect 2), then I created 2 depth nodes, each one depending on one of
these device nodes, then 2 gesture nodes, then scene nodes, and so on.
I used instance names to tell the nodes apart: I set an instance name
on the node info before calling context.CreateProductionTree(...), and
I also specifically made sure the nodes were all from the same, latest
version available.

In the end, my structure of created nodes was like this:

* Node: 6 XnVSkeletonGenerator 1.5.2.21 User 1
-Node: 9 XnVGestureGenerator 1.5.2.21 Gesture 1
--Node: 2 SensorKinect 5.1.0.25 Depth 1
---Node: 1 SensorKinect 5.1.0.25 Kinect 1
-Node: 10 XnVSceneAnalyzer 1.5.2.21 Scene 1
--Node: 2 SensorKinect 5.1.0.25 Depth 1
---Node: 1 SensorKinect 5.1.0.25 Kinect 1
* Node: 6 XnVSkeletonGenerator 1.5.2.21 User 2
-Node: 9 XnVGestureGenerator 1.5.2.21 Gesture 2
--Node: 2 SensorKinect 5.1.0.25 Depth 2
---Node: 1 SensorKinect 5.1.0.25 Kinect 2
-Node: 10 XnVSceneAnalyzer 1.5.2.21 Scene 2
--Node: 2 SensorKinect 5.1.0.25 Depth 2
---Node: 1 SensorKinect 5.1.0.25 Kinect 2

So up to here, everything seems fine.

I ran the program, displaying basic data from each user generator,
like the confidence and the position of the head.
But unfortunately the data is not good; there's still inconsistency.
For example, if I place myself in front of one Kinect only, with the
other one facing a wall, I still obtain data for the second Kinect.

Here's the result of that experiment (user in front of kinect1,
kinect2 facing a wall): http://pastebin.com/PwAeZnmb

So, how can we explain that?

I think it's one of the following hypotheses:

1. With the present implementation of XnVSkeletonTracking in NITE, it
is NOT possible to track users on 2 sensors simultaneously; in that
case we're fucked, and we can move on to something else and pray/wish
that the people at PrimeSense will take care of this.

And honestly, I think this is the most likely explanation.

2. The way I implemented the solution makes the data go nuts; for
example, I used the same callbacks for both user generators, and stuff
like that. But here I think I can't do anything more, because I've
tried everything. Maybe if someone has a look at the code they'll find
something.

3. Multiple skeleton tracking doesn't work with the complicated user
generator nodes but may work with the simpler ones.
-> NO, I've tried the same thing with the simpler nodes and still got
nothing; it was even worse (no tracking at all, stuck in user
detection/calibration).

4. A workaround or trick to make multiple skeleton tracking work
would be to pick 2 user generators with different versions -> they may
be unrelated/independent, thus avoiding the problems I stated earlier.
I'm not sure, I don't think so, but I didn't try it because I'd had
enough.

Voila, hope this helps someone; or if someone got it working, please
tell me where I fucked up.
My code is available at https://github.com/tokou/KinectDrone (the git
repo is still a mess, but you can find what you're looking for in the
TestMultiKinect directory). Please have a look, and feel free to ask
if you want to know anything else.

Oh, and if some people from PrimeSense read this, please tell us if
you know that it is impossible to do, or if you know how to do it, or
if you know that it will be implemented in future versions... because
I've seen a lot of people lately trying to do this without any luck,
and I'm one of them. Thanks in advance.

Tarek

pcp

Jan 26, 2012, 5:45:11 AM
to OpenNI
Hi Tarek,

I've tried "mixing" UserGenerators of both approaches (psi-pose
"needed" --1.5.0.0-- and "not needed" --latest--), as you suggested
might work, and it didn't (just as it doesn't with only the
psi-pose-needed UserGenerators --1.5.0.0--). The problem, as I've been
told (thanks, Eddie), is a bug in the underlying skeleton algorithm.

I completely understand your 'anger?' ... but try to calm down,
ooooohmm

pcp.

Tarek Belkahia

Jan 26, 2012, 9:30:19 AM
to OpenNI
Hi pcp

Thank you for your answer, now we know.

And no I'm not angry at all :)
I was just a bit frustrated that I couldn't make it work, but now that
I know that it is not my fault, well... there's nothing I can do about
it.

Oooooooooooohhhhmmmm

Tarek

pankaj.deharia

Oct 10, 2012, 9:51:33 AM
to openn...@googlegroups.com
Is this issue resolved?
If yes, then please let me know the steps.




David Menard

Oct 10, 2012, 10:08:21 AM
to openn...@googlegroups.com
Not resolved yet.
The workaround is to start a separate process for each device.

For example, I launch a server that starts two processes, each capturing from one device. Each process then sends its skeleton information to the server, which does its thing. It works great, and I am able to use as many devices as I want, across multiple machines.
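Roughly, the pattern looks like the sketch below (illustrative only, not my actual code; the capture executable name, port, and message format are placeholders):

using System;
using System.Diagnostics;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Threading;

class SkeletonServer
{
    static void Main()
    {
        // launch one capture process per Kinect; "KinectCapture.exe" stands in
        // for a small OpenNI program that opens a single device and streams
        // its skeleton data to the server
        for (int device = 0; device < 2; device++)
        {
            Process.Start("KinectCapture.exe", device + " 127.0.0.1:9000");
        }

        // accept connections and read newline-delimited skeleton messages
        TcpListener listener = new TcpListener(IPAddress.Loopback, 9000);
        listener.Start();

        while (true)
        {
            TcpClient client = listener.AcceptTcpClient();
            new Thread(delegate()
            {
                using (StreamReader reader = new StreamReader(client.GetStream()))
                {
                    string line;
                    while ((line = reader.ReadLine()) != null)
                    {
                        // e.g. "device=0 user=1 head=0.12,0.43,2.31"
                        Console.WriteLine(line);
                    }
                }
            }).Start();
        }
    }
}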