Color tracking with depth


Arnold X Irizarry Rosas

Jul 17, 2013, 4:24:39 PM
to ros-by-...@googlegroups.com
In the object tracker example from the book, is there any way to make the robot move forward, rather than just rotating in place, without going to PCL?

Patrick Goebel

Jul 18, 2013, 10:40:35 AM
to ros-by-...@googlegroups.com
Hi Arnold,

Just making sure I understand your question first: do you mean you want
to estimate depth without having a depth camera? I ask because the
follower.py script described in section 11.3.2 does not use PCL nodelets
but simply iterates through all the points in the point cloud to figure
out how the robot should move. If you do have a depth camera but you do
not want to use point clouds at all, then you can use the depth image
published on the /camera/depth_registered/image_rect topic.
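
If it helps to see the idea, here is a very rough sketch of that
point-by-point approach. This is illustrative only, not the actual
follower.py code; the topic name, gains, and goal distance below are
assumptions:

#!/usr/bin/env python
# Illustrative sketch, not the real follower.py: average the positions
# of all points in the cloud, then steer toward that centroid.

import rospy
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2
from geometry_msgs.msg import Twist

GOAL_Z = 0.8  # assumed following distance in meters

def cloud_callback(msg):
    sum_x = sum_z = 0.0
    n = 0
    # Iterate over every valid (x, y, z) point in the cloud
    for x, y, z in point_cloud2.read_points(
            msg, field_names=("x", "y", "z"), skip_nans=True):
        sum_x += x
        sum_z += z
        n += 1
    if n == 0:
        return
    twist = Twist()
    twist.linear.x = sum_z / n - GOAL_Z  # drive to hold the goal distance
    twist.angular.z = -sum_x / n         # rotate toward the centroid
    cmd_vel_pub.publish(twist)

rospy.init_node('follower_sketch')
cmd_vel_pub = rospy.Publisher('/cmd_vel', Twist)
rospy.Subscriber('/camera/depth_registered/points', PointCloud2,
                 cloud_callback)
rospy.spin()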

--patrick

Arnold X Irizarry Rosas

Jul 18, 2013, 11:02:32 AM
to ros-by-...@googlegroups.com
Hi Patrick,

I have a Kinect sensor as my depth camera, but I don't know much about point clouds. I want my TurtleBot to follow the object (by color) wherever it moves, like follower.py does, but using camshift.py.

--
Arnold

Patrick Goebel

Jul 18, 2013, 1:56:53 PM
to ros-by-...@googlegroups.com
Hi Arnold,

OK, I understand.  Which version of the ROS By Example code are you using?  Electric, Fuerte or Groovy?

--patrick


Arnold X Irizarry Rosas

Jul 18, 2013, 2:19:15 PM
to ros-by-...@googlegroups.com
Hi Patrick,

The version I'm using is Groovy.

Patrick Goebel

Jul 18, 2013, 6:22:52 PM
to ros-by-...@googlegroups.com
OK good. I have uploaded a new object tracker script called
object_tracker2.py that uses the depth information from the
/camera/depth_registered/image_rect topic. I have only tested it with
the fake TurtleBot, so be sure to try it on your end with the fake
TurtleBot before running it on your real robot.

To get the new script and associated launch file
(object_tracker2.launch), run the commands:

$ roscd rbx1
$ git pull

To test it out using the fake TurtleBot, run this series of commands
similar to the instructions in the book:

$ roslaunch rbx1_vision openni_node.launch

$ roslaunch rbx1_bringup fake_turtlebot.launch

$ rosrun rviz rviz -d `rospack find rbx1_nav`/sim.rviz

$ roslaunch rbx1_apps object_tracker2.launch

$ roslaunch rbx1_vision camshift.launch

Now select the object you want to track in the CamShift window and watch
the motion of the fake TurtleBot in RViz as you move the object in front
of the camera. Remember that the Kinect is blind to depths within about
60cm of the lens, so moving the object closer than that will not cause
the robot to move backward, since it can no longer see depth values. (It
can still rotate left/right, however, since the RGB image is not
affected by the minimum-distance limitation.)

Once you are satisfied with the simulated results, try it out with your
robot. You can play with the parameters in the launch file
(object_tracker2.launch), especially z_scale and x_scale, to change how
quickly the robot responds forward/backward (z_scale) or left/right
(x_scale). Increase these a little at a time; e.g. from 1.0 to 1.5.

Also, you can run the new object tracker with any other node that
publishes on the /roi topic, such as the face tracker or lk_tracker.

Let me know if you run into trouble.

To see how the script works, look at the set_cmd_vel() function in the
object_tracker2.py script. Here we compute the average depth value over
the ROI returned by the CamShift node. The depth values are published
on the registered depth topic, /camera/depth_registered/image_rect. The
camera is set into depth registration mode in the object_tracker2.launch
file by setting the parameter /camera/driver/depth_registration to
True. This causes the x-y coordinates in the RGB image to correspond to
the same x-y coordinates in the depth image.
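
In case a simplified example helps, here is a rough sketch of that
averaging step. The function name, parameter names, and default values
below are placeholders, not the actual code from the script; it assumes
the depth image has already been converted (e.g. via cv_bridge) to a
floating-point NumPy array in meters:

import numpy as np
from geometry_msgs.msg import Twist

def compute_cmd_vel(depth_image, roi, image_width,
                    goal_z=0.8, z_scale=1.0, x_scale=1.0):
    # roi is the sensor_msgs/RegionOfInterest published on /roi
    depth_roi = depth_image[roi.y_offset:roi.y_offset + roi.height,
                            roi.x_offset:roi.x_offset + roi.width]

    # Drop invalid readings (NaNs inside the Kinect's ~60cm blind zone)
    valid = depth_roi[np.isfinite(depth_roi)]

    twist = Twist()

    # Rotate to keep the ROI centered horizontally in the image
    roi_center_x = roi.x_offset + roi.width / 2.0
    twist.angular.z = (-x_scale * (roi_center_x - image_width / 2.0)
                       / float(image_width))

    # Move forward/backward to hold the goal distance, but only if
    # there are valid depth values to average
    if valid.size > 0:
        twist.linear.x = z_scale * (float(np.mean(valid)) - goal_z)

    return twist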

--patrick

Patrick Goebel

Jul 18, 2013, 6:45:29 PM
to ros-by-...@googlegroups.com
P.S. If you happened to pull the update right after I posted the
previous message, please do another pull, as I found a bug shortly
after I sent it out.

--patrick

Arnold X Irizarry Rosas

Jul 19, 2013, 9:38:02 AM
to ros-by-...@googlegroups.com
Hi Patrick,

The script worked perfectly, and I also tested it on the TurtleBot 2, which was outstanding.

Thank you so much for your help

P.S. Great Book!

Patrick Goebel

Jul 19, 2013, 10:19:36 AM
to ros-by-...@googlegroups.com
Hi Arnold,

Great to hear it! Especially that it works with a TurtleBot 2.

--patrick

Tao Cao

Jul 19, 2013, 10:32:54 AM
to ros-by-...@googlegroups.com
Dear Patrick,

Many thanks for your detailed guidance on how to use the Kinect depth
camera. I also have a TurtleBot. Could you please clarify which version
of ROS I should use for your test script? Cheers.

Kind Regards

Tao


Arnold X Irizarry Rosas

Jul 19, 2013, 10:43:54 AM
to ros-by-...@googlegroups.com

Hi Patrick,

I have used all your code on the TurtleBot 2, and it works perfectly.
To use the TurtleBot 2 with your code, I only had to change the base
platform in the turtlebot_minimal_create.launch file to kobuki, and
everything automatically started working.

 --Arnold

Patrick Goebel

Jul 19, 2013, 11:38:38 AM
to ros-by-...@googlegroups.com
Hi Tao,

The original instructions (earlier in this thread) are for ROS Groovy.
I have just made a version of the script and launch file for ROS Fuerte.
The script might also work under ROS Electric, but I no longer have a
copy of Electric to test it with.

To get the Fuerte version (which might also work under Electric):

$ roscd rbx_vol_1
$ svn update

Then to test it out under ROS Fuerte:

$ roslaunch rbx1_vision openni_node_fuerte.launch

$ roslaunch rbx1_bringup fake_turtlebot.launch

$ rosrun rviz rviz -d `rospack find rbx1_nav`/sim_fuerte.vcg

$ roslaunch rbx1_apps object_tracker2.launch

$ roslaunch rbx1_vision camshift.launch

To test it under ROS Electric:

$ roslaunch rbx1_vision openni_node.launch

$ roslaunch rbx1_bringup fake_turtlebot.launch

$ rosrun rviz rviz -d `rospack find rbx1_nav`/sim_electric.vcg

$ roslaunch rbx1_apps object_tracker2.launch

$ roslaunch rbx1_vision camshift.launch

--patrick

Tao Cao

Jul 19, 2013, 11:57:19 AM
to ros-by-...@googlegroups.com
Dear Patrick,

Many thanks!

Kind Regards

Tao

Patrick Goebel

Jul 19, 2013, 12:06:27 PM
to ros-by-...@googlegroups.com
Excellent! Glad to hear it is that simple. (Of course, it is also a
testament to the power of ROS--one should be able to program ROS nodes
that work on any hardware whose drivers support the basic message topics
like /cmd_vel and /odom).

--patrick