Tracking multiple objects question


Pedro Ferreira

Aug 29, 2017, 1:27:51 PM
to Bonsai Users
Hey there,

I'm currently trying to build a workflow where I am tracking two mice at the same time. By using BinaryRegionAnalysis, LargestBinaryRegion and BinaryRegionExtremes I can get a centroid and extreme points on the animals when they are together. But when they are not touching, I can only analyse one of them, and can't figure out how to track the other one. Is there a way to apply BinaryRegionExtremes to all binary regions detected? 

Thank you :-)

Gonçalo Lopes

Aug 30, 2017, 10:23:53 AM
to Pedro Ferreira, Bonsai Users
Hi Pedro,

Thank you, this is a nice question. The basic idea is that in every frame we get a list of objects, and we would like to loop through this list and apply the BinaryRegionExtremes operator to each of them.

We can do this using the SelectMany operator (see the attached example workflow). Inside the SelectMany we create a nested pipeline that specifies what to do with each list of objects arriving from the outer pipeline.
You can use whichever detection pipeline works best for you. A few main points to notice here:

1) I am using SortBinaryRegions to ensure that the largest objects are always placed at the beginning of the list.
2) Each time a new frame is processed, SelectMany gets a new list and runs the workflow inside. We need to flatten this list with Concat to get at the individual elements.
3) After applying BinaryRegionExtremes to each element of the list, we turn the sequence back into an array. This ensures the output of SelectMany is an array of extremes that we can manipulate further.
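Conceptually, the per-frame processing inside the SelectMany can be sketched in plain Python. The region dictionaries and brute-force extremes function below are hypothetical stand-ins, not Bonsai API; in the actual workflow, SortBinaryRegions, Concat and BinaryRegionExtremes do this work:

```python
def extremes(region):
    """Return the two most distant points of a region (brute force)."""
    pts = region["points"]
    best = (pts[0], pts[0])
    best_d = -1.0
    for a in pts:
        for b in pts:
            d = (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
            if d > best_d:
                best_d = d
                best = (a, b)
    return best

def process_frame(regions):
    # Step 1: sort largest-first, like SortBinaryRegions
    ordered = sorted(regions, key=lambda r: r["area"], reverse=True)
    # Steps 2-3: flatten the list, apply extremes to each element,
    # and collect the results back into an array
    return [extremes(r) for r in ordered]
```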

The final PythonTransform is some custom code I wrote to pick out the extremes in order:

import clr
clr.AddReference("OpenCV.Net")
from OpenCV.Net import *
from System import Tuple

# Placeholder returned when fewer than two regions are detected
nanpoint = Point2f(float.NaN,float.NaN)

# Always emit a fixed 4-point tuple (extremes of the two largest regions),
# padding with NaN points when fewer regions are present
@returns(Tuple[Point2f,Point2f,Point2f,Point2f])
def process(value):
  if len(value) == 0:
    return Tuple.Create(nanpoint,nanpoint,
                        nanpoint,nanpoint)
  elif len(value) == 1:
    return Tuple.Create(value[0].Item1,value[0].Item2,
                        nanpoint,nanpoint)
  else:
    return Tuple.Create(value[0].Item1,value[0].Item2,
                        value[1].Item1,value[1].Item2)

I'm also attaching an example workflow with these operations.

Hope this helps.

--
You received this message because you are subscribed to the Google Groups "Bonsai Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to bonsai-users+unsubscribe@googlegroups.com.
Visit this group at https://groups.google.com/group/bonsai-users.
To view this discussion on the web visit https://groups.google.com/d/msgid/bonsai-users/4b2f8582-e3e6-4d01-8320-cd89ce42170b%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

multipleextremes.bonsai
multipleextremes.bonsai.layout

Pedro Ferreira

Aug 30, 2017, 11:20:27 AM
to Bonsai Users, pedro.bio...@gmail.com
Well that worked perfectly, thank you very much!

I also altered the script a bit to include centroid tracking for both objects as well. I'll post it here for whoever might need it.


import clr
clr.AddReference("OpenCV.Net")
from OpenCV.Net import *
from System import Tuple

nanpoint = Point2f(float.NaN,float.NaN)

# Each input element pairs the extremes tuple with the region centroid,
# hence the nested Item1.Item1/Item1.Item2 accessors below
@returns(Tuple[Point2f,Point2f,Point2f,Point2f,Point2f,Point2f])
def process(value):
  if len(value) == 0:
    return Tuple.Create(nanpoint,nanpoint,
                        nanpoint,nanpoint,
                        nanpoint,nanpoint)
  elif len(value) == 1:
    return Tuple.Create(value[0].Item1.Item1,value[0].Item1.Item2,
                        nanpoint,nanpoint,
                        value[0].Item2,nanpoint)
  else:
    return Tuple.Create(value[0].Item1.Item1,value[0].Item1.Item2,
                        value[1].Item1.Item1,value[1].Item1.Item2,
                        value[0].Item2,value[1].Item2)


A follow-up question: is it possible to get these points to show on the video, as happens when you drag the BinaryRegionExtremes node onto the video while it plays?

Gonçalo Lopes

Aug 30, 2017, 11:23:32 AM
to Pedro Ferreira, Bonsai Users
Glad to hear you got it working!

Actually, you can, but it is a bit clunky and they will all have the same color for now. You need to select each point individually from the final tuple (by right-clicking each one), then drag each point, one at a time, on top of the video. Bonsai should remember the layout for subsequent replays.


Pedro Ferreira

Aug 30, 2017, 11:50:44 AM
to Bonsai Users, pedro.bio...@gmail.com
Yes, a bit clunky, but it will do. It was just to help visualize it.
Again, thank you very much!

lrmc...@ucsd.edu

Apr 18, 2019, 1:19:06 PM
to Bonsai Users
Hello all,
I know this thread is older, but I am in a similar boat, trying to adapt the workflow and Python code Gonçalo suggested above to track larval swimming behavior in a chamber. I am struggling because I put up to 20 individuals in the chamber at once, and the current Python code only returns coordinates (e.g. 112, 81) for 4 objects, even though the binary regions and binary region extremes are picking up more than 4 objects. GoodFeaturesToTrack also picks up the larvae very well and tracks all of them, but I have no idea how to get the coordinates from that. Can anyone please advise? I apologize if this is an easy fix; I have only been working with Bonsai for a few weeks.
Thank you!
Lillian

Gonçalo Lopes

Apr 30, 2019, 10:31:54 PM
to lrmc...@ucsd.edu, Bonsai Users
Hi Lillian,

The new preview release of Bonsai should support recording multiple properties per row directly to a file. In the meantime, if you are using the old version, you can simply replicate the code above until enough values are stored, e.g. for 6 objects:

# Reuses the imports and nanpoint definition from the earlier script;
# the else branch assumes at least 6 regions are detected
@returns(Tuple[Tuple[Point2f,Point2f,Point2f,Point2f],Tuple[Point2f,Point2f]])
def process(value):
  if len(value) == 0:
    return Tuple.Create(Tuple.Create(nanpoint,nanpoint,nanpoint,nanpoint),
                        Tuple.Create(nanpoint,nanpoint))
  else:
    return Tuple.Create(Tuple.Create(value[0].Centroid, value[1].Centroid,
                                     value[2].Centroid, value[3].Centroid),
                        Tuple.Create(value[4].Centroid, value[5].Centroid))

This is definitely annoying, but it will do the job. There will be some examples of the dynamic case soon.
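For many objects, the same padding can be written once instead of replicated per slot. The helper below is a plain-Python sketch of that idea (the centroid tuples are hypothetical; in a PythonTransform you would still need a fixed tuple return type):

```python
def pad_centroids(centroids, n=20):
    """Pad a per-frame list of centroid points to a fixed length n with
    NaN points, so every frame produces the same number of columns."""
    nan = (float("nan"), float("nan"))
    clipped = list(centroids[:n])
    return clipped + [nan] * (n - len(clipped))
```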

lrmc...@ucsd.edu

May 7, 2019, 1:40:48 PM
to Bonsai Users
Hello-
Thanks very much! This worked really well. I now have the ability to pick up data from 20 objects. 
I now have several issues:
1) When I use Concat inside the SelectMany and then BinaryRegionExtremes, it stops picking up actual objects and just makes the whole video look blurred, so the coordinates/extremes I am getting don't mean anything. The code doesn't work if I remove BinaryRegionExtremes, so is there a way to just get coordinates for the centroid of each object it's picking up? Every time I try to extract data from the centroid points, I just get an error when I run the video. (As a side note, GoodFeaturesToTrack picks up the eyes very well, but I don't know how to get centroid data from it.)

2) Because I want the video to track the objects from frame to frame, I removed SortBinaryRegions, since it sorts by size and I don't want the objects to shuffle around between frames. Is there any way to select individual points and follow them?
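One common way to keep identities stable between frames is greedy nearest-neighbour matching against the previous frame's positions. This is only a plain-Python sketch under that assumption, not a Bonsai operator, and it assumes points as (x, y) tuples:

```python
import math

def match_to_previous(prev, curr):
    """Reorder curr so index i stays with the object nearest prev[i]
    (greedy nearest-neighbour matching; a sketch, not a Bonsai operator)."""
    remaining = list(curr)
    ordered = []
    for p in prev:
        if not remaining:
            break
        j = min(range(len(remaining)),
                key=lambda k: math.dist(p, remaining[k]))
        ordered.append(remaining.pop(j))
    return ordered + remaining  # any new detections go at the end
```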
Thanks very much! I've attached a short video clip of the objects I am trying to detect (larvae) and an image of my workflow.
Thanks,
Lillian
BehaviorTrack_LRM_main.png
BehaviorTrack_LRM_selectmany.png
Squid6_short.mp4

Gonçalo Lopes

May 15, 2019, 8:02:20 PM
to lrmc...@ucsd.edu, Bonsai Users
Hi Lillian,

When you say the code doesn't work, do you mean the python code I posted above? That one didn't use the binary extremes, so it should work with just the normal centroid. Can you attach the entire workflow so we can test what the problem may be?
