Let me get this straight:


Alan Federman

Jul 9, 2021, 10:03:06 PM
to 'Camp Peavy' via HomeBrew Robotics Club
On the Nvidia Nano and Xavier you can only run Foxy and Noetic in Docker. BTW, the install process takes about 5 hours.

Docker is designed primarily for CLI use, but it can work with a GUI in most situations.

Apparently the Nvidia Docker images for ROS don't support the Nvidia-accelerated display. I found this out by installing the ROS 2 turtlesim and finding that it can't run on a Nano.

So at this point I am reaching the conclusion that the Nvidia Jetson is just not worth it as a ROS development platform.

Chris Albertson

Jul 9, 2021, 11:38:02 PM
to hbrob...@googlegroups.com
I think you are correct. The Nvidia is designed as an "edge" computer, that is, a way to get a GPU closer to the real-world action rather than in the data center.

The GPU is not used in these systems for graphics acceleration; it is there to run neural networks. Quoting from the Nvidia website, the Nano's purpose is to "let you run multiple neural networks in parallel for applications like image classification, object detection, segmentation, and speech processing. All in an easy-to-use platform that runs in as little as 5 watts."

A good way to use these in a ROS-based robot is to place the Jetson inside the robot and connect it via WiFi to a server-class PC running Linux/ROS. That said, I get very good, perhaps better, performance at lower cost by running a Raspberry Pi 4 with a Google/Coral "TPU" accelerator plugged into a USB port on the Pi.

--
You received this message because you are subscribed to the Google Groups "HomeBrew Robotics Club" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hbrobotics+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/hbrobotics/890092334.267025.1625882584394%40connect.xfinity.com.


--

Chris Albertson
Redondo Beach, California

Kar-Han Tan

Jul 10, 2021, 1:04:49 AM
to HomeBrew Robotics Club
I believe Nvidia intended the Jetson to be the onboard computer for robotics, autonomous vehicles, etc., and expects development and neural-network training to happen on "host" workstations and cloud servers. I was confused reading some of the documentation (e.g. SDK Manager) until I realized that a host-Jetson pair is the intended setup. Similarly, the Isaac SDK is also intended to be developed on a host workstation, cross-compiled, and then deployed to a Jetson, and they did make it quite easy to do this.

Chris Albertson

Jul 10, 2021, 1:39:19 AM
to hbrob...@googlegroups.com
I agree, Jetson is the "on-device" computer.   I used the term "edge" in my last post.

That said, between then and now I got an email describing a ROS-based robot on Jetson that uses the GPU to run neural network-based inverse kinematics.



Kevin Chow

Jul 10, 2021, 1:48:11 AM
to HomeBrew Robotics Club
Have you tried installing ROS without using a Docker image? I haven't used a Jetson Nano before, but I have used a Jetson TX2 with ROS (not from a Docker image), and it works. (Although I didn't set up that TX2 myself; I think most of the ROS-related stuff was just installed via apt-get.)

Alan Federman

Jul 10, 2021, 10:04:08 AM
to hbrob...@googlegroups.com, Kevin Chow
ROS Melodic works fine, but since it uses Python 2, it is not suitable as a development platform. After upgrading the Nano to 20.04, it runs Noetic just fine, but has no access to the accelerated GPU.

What I am thinking is that for sophisticated, GPU-accelerated ROS development you need a dedicated platform ($$$$) that can then deploy those portions of the code to an onboard Jetson.

This is the approach used for the Nvidia JetBot.

Well, learning is often a painful process.

Chris Albertson

Jul 10, 2021, 12:10:15 PM
to hbrob...@googlegroups.com
On Sat, Jul 10, 2021 at 7:04 AM Alan Federman <anfed...@comcast.net> wrote:
ros melodic works fine, but since it uses python 2, it is not suitable as a development platform. Once upgrading the Nano to 20.04, It runs Noetic just fine, but has no access to the accelerated GPU.

I don't understand. Nothing in ROS uses the GPU. You could write a new ROS node that does use the GPU, and I can't see what would stop you. As I posted above, there is an example of exactly this. The GitHub link below has a walkthrough of using the Nvidia GPU and ROS. The key is to install the CUDA libraries. https://github.com/youtalk/iknet

That said, I would never use the Nano as a development system for ROS. You'd be better off with a $200 refurbished PC from Newegg. The Jetson is just too limited for development.

Another budget idea is to buy a three-generation-old Nvidia datacenter-class GPU card and put it inside that $200 used PC; then you have a real "powerhouse." Look at this: a 12GB GPU card for $159. ebay.com/itm/194213291799 I have something like this and it's almost "overkill" for my needs. But the Jetson is frustratingly underpowered for development. It is, as everyone says, for running inside the device you are building. It is what software engineers call "the target platform," not "the development platform."


 

Kevin Chow

Jul 10, 2021, 7:57:14 PM
to HomeBrew Robotics Club
Well, maybe not ROS itself, but there may be some ROS packages that use the GPU; in particular, stuff dealing with OpenCV and other vision processing.

Yousof Ebneddin

Jul 14, 2021, 2:30:47 AM
to HomeBrew Robotics Club
Another option is to run the roscore on the Jetson and set up your PC to connect to the roscore on the Jetson rather than launching it on the PC. Run the nodes that do not need a GUI on the Jetson, and run the nodes that need a GUI, like rviz, on your PC. If you set it up correctly, they should be able to communicate with each other over the network.

I had such a setup with a Duckiebot with a 2GB Jetson. I was running all the nodes that read the sensor data and publish them on the Jetson, and the rest on my desktop. The Duckiebot framework made it easy to set up the communication; with a bunch of arguments, it started working.
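For ROS 1 this split comes down to two environment variables on each machine; a minimal sketch (hypothetical helper function; the roscore master listens on port 11311 by default, and `ROS_IP` tells other nodes how to reach this machine):

```python
def ros_network_env(master_ip, my_ip):
    """Env vars a machine needs so its ROS 1 nodes find a roscore
    running on master_ip (here, the Jetson) and advertise a reachable
    address for themselves. Illustrative helper, not part of ROS."""
    return {
        "ROS_MASTER_URI": f"http://{master_ip}:11311",  # where roscore runs
        "ROS_IP": my_ip,  # address other nodes use to reach this machine
    }

# On the desktop, pointing at a Jetson at (say) 192.168.1.42:
desktop_env = ros_network_env("192.168.1.42", "192.168.1.10")
```

The Jetson itself would use its own address for both values. With that in place, rviz on the desktop subscribes to topics published on the Jetson transparently.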

Steve " 'dillo" Okay

Jul 14, 2021, 10:04:58 AM
to HomeBrew Robotics Club
I have the Jetson Nano on Tenacity set up to run "headless" as well. The Pi runs roscore and rosserial to talk to the Dynamixels and Arduino(s); the Jetson runs the camera(s) and streams their topics back over the network.
My laptop is really the only piece of the whole system that has all the GUI/display packages and drivers installed. Unless you're going to have a display mounted on the 'bot, I'd avoid installing a GUI/"desktop" flavor OS on it.
It eats up storage, memory, and processor time, and can cause headaches when the "desktop" tries to come up and doesn't find a screen attached.

HTH,
'dillo

Chris Albertson

Jul 14, 2021, 1:02:19 PM
to hbrob...@googlegroups.com
Yes, the below is, I think, what most people do. Think how hard it would be to do otherwise: if rviz ran on a mobile robot, you would have to follow the robot around to see the display.

A purpose of that GPU on the Jetson is to turn high-bandwidth video data into lower-bandwidth information, like odometry and a list of recognized objects.

On Tue, Jul 13, 2021 at 11:30 PM Yousof Ebneddin <youe...@gmail.com> wrote:
Another option is to run the roscore on the Jetson and set up your PC to connect to the roscore on the Jetson rather than launching it on the PC. Run the nodes that do not need a GUI on the Jetson, and run the nodes that need a GUI, like rviz, on your PC. If you set it up correctly, they should be able to communicate with each other over the network.

Ryan D

Jul 14, 2021, 5:16:24 PM
to hbrob...@googlegroups.com
Hopped on and noticed this conversation.

ROS development is moving towards Foxy, and not being able to apt-get install or compile Foxy on the Jetson Nano, since there is no Ubuntu 20.04 for it, is a major enough reason not to use it in my book.

I've gotten ROS 2 Foxy to run on the Pi 4 by building it on Raspbian with no Docker nonsense, using my process here (https://answers.ros.org/question/364425/how-do-i-access-the-source-of-mimick-that-mimick_vendors-cmake-file-references-in-order-to-edit-mimicks-cmake-file-to-give-it-arm7l-compatability/?comment=370623#post-id-370623), but I had to remove the graphics stuff and offload rendering to my Ubuntu 20.04 PC instead.

I'm only just starting to get odometry to work on my robot, so I'm not too sure how problematic the weaker GPU is with visual odometry, but being able to run ROS 2 Foxy WITHOUT Docker on Raspbian (so GPIO pins and peripherals don't break), for like $50-75, is a big plus in my book.


Charles de Montaigu

Jul 15, 2021, 4:29:39 PM
to hbrob...@googlegroups.com
For the Nano 1a/1b, the compromise for ROS 2 Foxy+ with GPU support on the Nano is a Docker container (see Nvidia's "Dusty" stuff) with a ROS 2 version built from source and Nvidia SDK 4.2+ (Docker support with GPU acceleration, like DeepStream).

The Jetson Xavier NX and Jetson Xavier are based on the same GPU architecture, pending Nvidia SDK 5.x, which should be available with Ubuntu 20.04 support and a 5.x kernel in the next six months. The Jetson Nano is a year-plus out; its next hardware release should rejoin the other members of the family with 20.04 support.

The Pi is cool, and OpenCV can get you a lot, but if you want to do "AI" you need an accelerator, since the Pi GPU is a closed design and does not have support for any of the AI frameworks (TensorFlow, PyTorch, MXNet, ...). Most commercial robots I have looked at all have a TX2 or Xavier NX as baseline hardware.

Chris Albertson

Jul 15, 2021, 4:37:12 PM
to hbrob...@googlegroups.com
One good option with the Pi is to plug in a Coral TPU. They work exactly as advertised on the site: https://coral.ai/products/accelerator/
Note this is an 8-bit integer accelerator and is only useful for TensorFlow at run time, not for training or for general GPU use.

If your robot needs to run TensorFlow networks, you can add the Google/Coral TPU to any computer that has a USB port.
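For context on why 8-bit integers are enough at run time: TensorFlow Lite's affine quantization scheme maps each float to an int8 code via real ≈ scale * (q - zero_point). A sketch of the round trip, with made-up scale values:

```python
def quantize(x, scale, zero_point):
    """Map a float to an int8 code: q = round(x/scale) + zero_point,
    clamped to the int8 range [-128, 127]."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))

def dequantize(q, scale, zero_point):
    """Recover an approximate float from the int8 code."""
    return scale * (q - zero_point)

# A weight of 0.5 with a (made-up) scale of 0.02 survives the round trip:
q = quantize(0.5, 0.02, 0)        # -> 25
approx = dequantize(q, 0.02, 0)   # -> 0.5
```

The worst-case error per value is half a scale step, which trained networks tolerate well; that is why a pure int8 device like the Coral TPU can run inference but not training.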





Kar-Han Tan

Jul 15, 2021, 5:17:18 PM
to HomeBrew Robotics Club
Coral only supports TensorFlow, so Intel's Neural Compute Stick is another option for deep neural net inference. Nvidia GPUs support a wider range of parallel computing workloads, say for image processing or point cloud processing.

Charles de Montaigu

Jul 15, 2021, 5:17:49 PM
to hbrob...@googlegroups.com
.... for vision workloads you can use a Movidius, but the TPU is the current accelerator of choice for the Pi. I am not impressed with Google's community support in the DIY space compared to Nvidia's.

e...@okerson.com

Jul 15, 2021, 7:03:54 PM
to hbrob...@googlegroups.com, Chris Albertson
You can actually run Neural Nets on the Pi natively with ArmNN:

https://github.com/ARM-software/armnn

It currently supports TensorFlow Lite and ONNX models. It is
considerably faster than doing it in Python.

Ed

Alan Federman

Jul 17, 2021, 4:17:28 PM
to hbrob...@googlegroups.com, Chris Albertson
So, having exhausted my attempts to get ROS 1 working with my cameras (Raspicam, Xtion) and my platforms (AMD, Pi, Nano), I thought I'd give ROS 2 a try. I got the talker and listener to work on all three platforms, but the Raspicam won't work on the Pi, and there is no Xtion support in ROS 2.

The reason I want to do this is that for a robot to navigate and coexist in a crowded space, it has to react to hazards and avoid hitting people. This needs a lot of onboard visual and other data processing and can't rely on the cloud or networks.

My goal of getting an inexpensive, GPU-supported, onboard mobile robot system on just a Pi 4 or Nano just isn't possible at present. At a minimum, onboard multiprocessing is required. So I hope one of you guys saved the matchbook with the ad for the famous truck driver's school.

Tim Craig

Jul 17, 2021, 7:29:32 PM
to hbrob...@googlegroups.com

Alan,

Fergs started porting the Openni2 package supporting the Xtions to ROS 2, and I took it and worked out some of the kinks I ran into. Last I played with it, under Foxy with Ubuntu 20.04, I could get images as direct topics, but using Image Transport didn't work. I saw online that there was a fix for the problems with Image Transport in general, but never got around to trying it. I figured it would be part of Galactic, but I haven't had a chance to work on it lately.

https://github.com/TimCraig/openni2_camera_param

Tim

Steve " 'dillo" Okay

Jul 17, 2021, 9:14:49 PM
to HomeBrew Robotics Club
It would be good to have the Xtions as an option for indoor depth cams.
I have a small pile of them from my Service Robot Labs days and it would be a shame if they weren't usable on modern ROS stacks.
'dillo

Michael Ferguson

Jul 17, 2021, 9:23:28 PM
to hbrob...@googlegroups.com
So at some point I imagine we'll have a properly merged and supported openni2_camera driver for ROS 2, but there are a few key pieces still missing from ROS 2 to support things like lazy subscribers/publishers (AFAIK, these didn't make it into Galactic either). Without true lazy publishers/subscribers throughout the driver AND image_proc/depth_image_proc, you really can't support the versatility that these drivers had in ROS 1. I have a bit of a write-up on my openni2_camera port here: https://www.robotandchisel.com/2020/06/16/ubr1-on-ros2/
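The lazy-publisher idea can be sketched without ROS: the driver runs its expensive pipeline only while someone is subscribed. A plain-Python sketch of the pattern (the `LazyCameraDriver` class is hypothetical; in a real ROS 1 driver the count would come from the publisher's `get_num_connections()`):

```python
class LazyCameraDriver:
    """Starts the camera stream only while at least one subscriber exists."""

    def __init__(self):
        self.subscribers = 0
        self.streaming = False

    def set_subscriber_count(self, n):
        # In a real driver this count comes from the middleware;
        # here it is injected so the logic is testable on its own.
        self.subscribers = n
        self._update()

    def _update(self):
        if self.subscribers > 0 and not self.streaming:
            self.streaming = True   # open the device, start pushing frames
        elif self.subscribers == 0 and self.streaming:
            self.streaming = False  # close the device, save CPU and USB bandwidth

driver = LazyCameraDriver()
driver.set_subscriber_count(1)  # stream starts
driver.set_subscriber_count(0)  # stream stops again
```

The point Fergs makes is that this check has to exist at every stage of the pipeline (driver, image_proc, depth_image_proc), or an idle subscriber downstream still forces the whole chain to run.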

-Fergs

Steve " 'dillo" Okay

Jul 18, 2021, 9:57:24 AM
to HomeBrew Robotics Club
Heheh, that "Famous Truck Driving School" is preparing their students to manage the coming revolution in self-driving trucks.
I'm afraid you're going to be stuck working with robots no matter where you go Alan! :)

Alan Federman

Jul 18, 2021, 1:01:02 PM
to hbrob...@googlegroups.com, Michael Ferguson
Thanks Fergs,
 
I am glad I am just not hallucinating. I am confused about whether to work with Foxy or Galactic, since both are advertised as working with Ubuntu 20.04.

Any reason to use one over the other? Will your openni2 port work with Galactic?

My goal is, on one smaller robot with one ARM64 computer, to do people detection and navigation; in other words, to solve the last-mile problem. (Yes, I know the other 99 miles haven't been solved yet.)

Chris Albertson

Jul 18, 2021, 2:04:15 PM
to hbrob...@googlegroups.com
On Sun, Jul 18, 2021 at 10:01 AM Alan Federman <anfed...@comcast.net> wrote:
My goal is on one smaller robot with one ARM64 computer do people detection and navigation - in other words solve the last mile problem.

The best solution today is the Intel RealSense depth camera. They start at $159, which is not expensive for what you get: on-camera 3D processing and a wide-enough field of view.

The camera does not identify objects; semantic segmentation of the image needs to be done in a trained neural network. But as it turns out, this type of processing can be done in 8-bit integers, so low-cost, low-power GPUs work well. (The $60 Google TPU with a USB interface works for 8-bit TensorFlow-based networks.) All the higher-speed work is done inside the camera using special ASIC chips; it does the 90 FPS stereo processing for you.

I'm experimenting with this and found that you really cannot just use something simple like Yolo. It generates too much junk and noise, but if you retain a history or "memory" and look at what Yolo has told you over the last second or two, you get reliable information.
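That history or "memory" idea can be sketched as a voting filter over recent frames: only report a label once it has appeared in most of a sliding window, which suppresses single-frame noise. The `DetectionFilter` class and its thresholds are made up for illustration:

```python
from collections import deque

class DetectionFilter:
    """Report a label only if it appeared in >= min_hits of the last
    `window` frames. Illustrative sketch; thresholds are arbitrary."""

    def __init__(self, window=30, min_hits=20):  # roughly 1 s at 30 FPS
        self.history = deque(maxlen=window)
        self.min_hits = min_hits

    def update(self, labels_this_frame):
        self.history.append(set(labels_this_frame))
        counts = {}
        for frame in self.history:
            for label in frame:
                counts[label] = counts.get(label, 0) + 1
        return {label for label, n in counts.items() if n >= self.min_hits}

f = DetectionFilter(window=5, min_hits=4)
for labels in (["person"], ["person", "dog"], ["person"], ["person"], ["person"]):
    stable = f.update(labels)
# "person" was seen in 5 of 5 frames and is reported;
# "dog" flickered in for one frame and is suppressed.
```

A real tracker would also smooth bounding-box positions, but presence/absence voting alone removes most of the one-frame junk.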

As for which ROS to use: hands down you want the new navigation stack in ROS 2, because it allows you to write path-planner "plug-ins" that can use the data described above. As for which ROS 2: on a new-start project I'd start on Foxy, as this gives you the longest run before you are forced to change again in five years.

But while you are using Foxy, also keep a copy of "Rolling" and make sure the stuff you write builds on Rolling. You don't want to be surprised later.





 

Dheera Venkatraman

Jul 18, 2021, 2:27:29 PM
to hbrob...@googlegroups.com
https://github.com/jetsonhacks/installROS2

____________________
Dheera Venkatraman
http://dheera.net

Chris Albertson <alberts...@gmail.com> wrote on Sunday, July 18, 2021 at 11:04 AM:

Martin Spencer

Jul 18, 2021, 5:18:23 PM
to Chris Albertson, hbrob...@googlegroups.com, dpa, Steve 'dillo Okay, Brian Higgins, Alan Federman
Yes, FOV sufficiency can be achieved with two of the RealSense depth cameras.

Yes, you will need to abstract the data cloud from the depth cameras for feeding to the nav engine.

Unless the total WCET is human-quick, even with FOV sufficiency, this solution will not be 100% safe.

This is what must be achieved to be compliant with AI ethics.


Tim Craig

Jul 18, 2021, 8:22:03 PM
to hbrob...@googlegroups.com

Mike,

I decided to sacrifice "versatility" for a working solution. Since I think most of the time people aren't interested in switching things around all the time, I went with selecting which streams are published through parameters.

Tim

Chris Albertson

Jul 18, 2021, 10:25:02 PM
to hbrob...@googlegroups.com
On Sun, Jul 18, 2021 at 10:01 AM Alan Federman <anfed...@comcast.net> wrote:
Thanks Fergs,
 
   I am glad I am just not hallucinating. I am confused as to working with Foxy or Galactic, since both are advertised as working with Ubuntu 20.04
 
Any reason to use one over the other?  Will your openni2 port work with Galactic?

I am asking myself this too.   I'm using Foxy and wonder if I should change.  I think I will.

Unless you need a greater period of stability, go with Galactic. In your case, "doing last mile," some big changes happened in Nav2. For example, Galactic adds "NavigateThroughPoses," where rather than giving just the end goal you can pass a list of poses. Changes to the costmap too, for example keep-out zones. This could work well with a camera.

As for the goal of using just one ARM computer: who would not want that? But you can't possibly know whether that will work until you have all the software up and running. My plan, the one I've used for 30+ years, is to do development on the most powerful computer I can get my hands on, then back-port it to the target hardware. With luck the target hardware will improve over time.

I think the release of the Raspberry Pi Pico will cause me to do some redesign. Now I can fit the entire ROS 2 base controller on the Pico. I may buy a Jetson to replace my Pi 4, but today I run on an Intel 16-core Xeon with 64GB RAM.

My goal is an "explorer 'bot." It rolls around and looks at stuff. You tell it where to go, and then you can ask it what it saw.

Navigation is key, and Galactic seems to have over a dozen upgrades to Nav2.


Michael Ferguson

Jul 18, 2021, 10:35:12 PM
to hbrob...@googlegroups.com
I haven't tried my branch on Galactic yet; the low-level ROS 2 APIs haven't been super stable, so I wouldn't be surprised if there is a patch or two needed with either Galactic or Rolling, but it should all be documented in the release notes for Galactic.

You really have three choices of ROS2 releases right now:

 * Foxy - still has 4 years of support. I would suggest this for any commercial product.
 * Galactic - has a little under 18 months of support left. Probably a good option for hobby projects, assuming you plan to eventually upgrade to Ubuntu 22.04 as well.
 * Rolling - this is the bleeding edge. It's supported forever, but will keep moving forward with Ubuntu releases (I think it jumps to 22.04 shortly after that comes out next year). Probably the best option for developers.

-Fergs

Scott Horton

Jul 19, 2021, 9:16:40 PM
to HomeBrew Robotics Club

I thought I would add my two cents worth since the ElsaBot project I have been working on (the one I showed at the last club challenge meeting) includes some of the ingredients discussed above…

My robot is running Ubuntu 20.04 and a clone of ROS 2 Rolling from around mid April.  I’m running a bunch of nodes including:

  • Behavior tree execution node for high-level control
  • Roomba Create2 driver
  • Rplidar driver
  • Pose publisher
  • Nav2 nodes
  • Vision node for controlling and processing NN output from an OAK-D camera
  • Tracker node for controlling servos that move the camera to track a person in view
  • Speech output and speech input/wake word monitoring nodes
  • Other misc nodes including ros2 web bridge and web video server

I planned on using one RPi4 but found that just the core nodes for robot base control and nav consumed around 75% of the processor, and the other nodes needed around 45%. To keep the project going, I cloned the RPi4 SD card and just added another RPi4 to the robot. That allowed plenty of processing headroom.

I grouped the base control and nav related nodes on one RPi and the rest of the nodes on the other.  While developing, I usually run the nodes I am working on from my laptop since builds are so much quicker there.  It’s easy to then commit the changes and push to Github and then fetch those changes to the RPi where they normally run (and then wait for the painfully slow compile to finish).
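That grouping decision can be framed as a tiny load-balancing problem. A toy sketch with made-up CPU numbers, loosely inspired by the 75%/45% figures above (the `partition` helper is hypothetical, and greedy rather than optimal):

```python
def partition(loads, machines=2):
    """Greedy bin packing: sort node groups by CPU% descending, assign
    each to the currently least-loaded machine. Not optimal in general,
    but close enough for a handful of node groups."""
    bins = [[] for _ in range(machines)]
    totals = [0.0] * machines
    for name, cpu in sorted(loads.items(), key=lambda kv: -kv[1]):
        i = totals.index(min(totals))  # least-loaded machine so far
        bins[i].append(name)
        totals[i] += cpu
    return bins, totals

# Hypothetical per-group CPU percentages, not measured values:
loads = {"nav2": 45, "base": 30, "vision": 25, "speech": 10, "web": 10}
bins, totals = partition(loads)
```

In practice the split also has to respect hardware constraints (which Pi owns the lidar, camera, etc.), which is why Scott's by-function grouping is the sensible starting point.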

I switched from Foxy to Rolling since I encountered a bug that was causing a deadlock (https://github.com/ros2/rclcpp/issues/1285). It is handy building from source in case you want to better understand the code by putting in some log prints or making some minor change to prove out your understanding of things.

The robot uses the OAK-D camera to run a human pose detection NN and a MobileNet SSD NN for person detection. Except for the post-processing required for the pose detection NN, the AI processing doesn’t take too much CPU. I’ve constrained the video output resolution from the OAK-D to 640x360 to reduce the overhead of sending it to the web UI clients that display it.

The OAK-D has an Intel Movidius Myriad X, which can be configured to process the output from one of the on-device cameras or can be used independently of the video sources (like the Google TPU use case). I'm using Python for the vision and tracking nodes since the post-processing, setup, and image handling are much easier with Python.

I’m thinking of moving to a Celeron-based SBC such as this one (https://www.seeedstudio.com/Odyssey-Blue-J4125-128GB-p-4921.html) so I can get back to one CPU. It’s still somewhat power-consumption friendly and can be powered from 12-19VDC. It also has a built-in Arduino, which would be handy for low-level control. The x86 aspect might also make life a little easier. Anybody tried one of those or a Celeron-based board for ROS 2?

Scott Horton

Martin Spencer

Jul 20, 2021, 9:41:32 AM
to Scott Horton, hbrob...@googlegroups.com
While this sounds like great and impressive work, given the technical detail, unless WCET is human-quick coupled with FOV sufficiency, it is not revealed whether it is safe around puppies/kittens/infants, or moving people.

Is safety not a concern? <ouch> Having a car that will do 100 in a quarter mile, but with bad brakes, is known to be imprudent and unsafe.

The same applies to all forms of mobile robots. Nothing is more important than safety. Nothing.




Chris Albertson

Jul 20, 2021, 12:02:38 PM
to hbrob...@googlegroups.com, Scott Horton

> I’m thinking of moving to a Celeron-based SBC such as this one so I can get back to one CPU.


Try moving the base controller to Micro-ROS on a Raspberry Pi Pico. It may seem counterintuitive to downgrade the CPU, but the PID loops and odometry use up a lot of the Pi 4's time, and the Pico is very efficient. The built-in state machines (there are 8 of them) look like they can decode a pair of quadrature encoders with zero CPU time. Same with PWM: PWM is a little bit expensive on a Pi 4 but can be done with zero CPU time on the Pico. I just bought four of these Picos and am just getting started redesigning a small robot to use Galactic on a Pi 4 and the Pico with micro-ROS.
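What those state machines do for an encoder can be sketched in software: each (previous, current) pair of A/B pin states maps to a count delta of -1, 0, or +1 via the standard quadrature transition table. On the Pico the PIO runs this in hardware with zero CPU; the plain-Python version below just shows the logic:

```python
# Transition table: index = (prev_AB << 2) | curr_AB, value = count delta.
# Valid Gray-code steps give +/-1; unchanged state and illegal jumps give 0.
QDEC = [0, +1, -1, 0,
        -1, 0, 0, +1,
        +1, 0, 0, -1,
        0, -1, +1, 0]

class QuadratureDecoder:
    def __init__(self):
        self.prev = 0   # A and B both low
        self.count = 0

    def sample(self, a, b):
        """Feed one sample of the A and B pins; returns the running count."""
        curr = (a << 1) | b
        self.count += QDEC[(self.prev << 2) | curr]
        self.prev = curr
        return self.count

dec = QuadratureDecoder()
for a, b in [(0, 1), (1, 1), (1, 0), (0, 0)]:  # one full forward cycle
    pos = dec.sample(a, b)
# pos == 4: four valid forward transitions counted
```

Reversing the sequence counts down instead, which is exactly what the odometry node needs from each wheel.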

I might take your advice to use Rolling and not have to go through updates again.


 

Scott Horton

Jul 20, 2021, 1:56:35 PM
to HomeBrew Robotics Club
Thanks Chris.  That Pi Pico sounds interesting and makes a lot of sense for the low level processing.

However, for the Create2 base the low-level work is done in the Roomba firmware. It provides a serial stream of encoder counts and executes commands for controlling the motors. I'm finding that the Nav2 nodes account for the lion's share of the CPU load. Is that to be expected on an RPi4, or do I maybe have something goofed up in the way of Nav2 config?
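Turning that serial stream of encoder counts into pose updates is the standard differential-drive odometry step. A sketch with assumed constants (the tick scale here is made up; the Create 2's real encoder resolution and wheel geometry are in its Open Interface spec):

```python
import math

TICKS_PER_METER = 2500.0   # hypothetical value; use the base's real spec
WHEEL_BASE = 0.235         # meters between wheel centers (assumed)

def odom_step(x, y, theta, d_left_ticks, d_right_ticks):
    """Advance the pose (x, y, theta) by one encoder delta. Exact for
    straight motion, a good approximation for small per-step arcs."""
    dl = d_left_ticks / TICKS_PER_METER
    dr = d_right_ticks / TICKS_PER_METER
    d = (dl + dr) / 2.0                 # distance traveled by the center
    dtheta = (dr - dl) / WHEEL_BASE     # heading change
    x += d * math.cos(theta + dtheta / 2.0)  # midpoint heading
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

# Both wheels forward 2500 ticks -> 1 m straight ahead:
pose = odom_step(0.0, 0.0, 0.0, 2500, 2500)
```

This per-tick math is cheap; as Scott observes, it is the planners and costmaps in Nav2, not the odometry itself, that eat the CPU.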

Scott Horton


Chris Albertson

Jul 20, 2021, 2:18:59 PM
to hbrob...@googlegroups.com
Ok, in the case of a Roomba vacuum, there already is a microcontroller.

But for the rest of us, the Pico, I think, will be the uP of choice. I think it is MUCH easier to use than an Arduino because the Pico can be programmed in Python, is over-the-top well documented and supported, and costs $4.

As for what to use to run Nav2: I think a plan would be to run it on a PC notebook and look at the CPU load on an Intel platform, then choose an embedded platform that provides 100% headroom (50% CPU utilization while running your current load). The Celeron-based Odyssey Blue might be a good step up from the Pi, but how much? And how much power does it use? I don't know.

The Pico is easy for beginners to use with MicroPython, but others can program it using C and FreeRTOS or micro-ROS. You can also mix C and Python, and those state machines do over 100 million ops per second, and there are 8 of them. It will be a while yet until people learn to use them.
