https://launchpad.net/~marco-sonic/+archive/ubuntu/rasppios/+packages might not work as expected!!


Marco Walther

Feb 3, 2026, 3:53:05 AM
to hbrob...@googlegroups.com
Sorry, I'm trying to build and publish newer versions, but every step
takes a long time (the Launchpad servers have to build each step and
publish it - >> 1h :-( )
libcamera 0.6 is pushed now, but the rpicamera stuff is not yet, so things
will not update cleanly right now!
I have a build of rpicamera; I'll push it later today (> 8:00 PST)

Sorry,
-- Marco

Marco Walther

Feb 3, 2026, 3:25:37 PM
to hbrob...@googlegroups.com
OK, new packages are available at the PPA now ;-) Please do the usual

$ sudo apt-get update && sudo apt-get dist-upgrade

and then

$ ros2 run camera_ros camera_node

should work again ;-)

Please let me know if you find any problems.

Thanks,
-- Marco


Ross Lunan

Feb 3, 2026, 9:38:21 PM
to HomeBrew Robotics Club
Marco, Sergei, Alan
Ta-da - we have contact! Installing Marco's latest libcamera 0.6 apps packages on my RPi5/v2.1 Camera system and running the Debian-installed ros-jazzy-camera-ros works. The $ ros2 run camera_ros camera_node command on a Remote Desktop with rqt_image_view displays the camera's 800x600 image, as does a $ ros2 launch camera_ros camera.launch.py command on the RPi5 with a connected monitor (see the attached happy-camper picture). The rpicam-hello CLI works:

$ rpicam-hello --list-cameras

Available cameras
-----------------
0 : imx219 [3280x2464 10-bit RGGB] (/base/axi/pcie@120000/rp1/i2c@88000/imx219@10)
    Modes: 'SRGGB10_CSI2P' : 640x480 [103.33 fps - (1000, 752)/1280x960 crop]
                             1640x1232 [41.85 fps - (0, 0)/3280x2464 crop]
                             1920x1080 [47.57 fps - (680, 152)/1920x2160 crop]
                             3280x2464 [21.19 fps - (0, 0)/3280x2464 crop]
           'SRGGB8' : 640x480 [103.33 fps - (1000, 752)/1280x960 crop]
                      1640x1232 [41.85 fps - (0, 0)/3280x2464 crop]
                      1920x1080 [47.57 fps - (680, 152)/1920x2160 crop]
                      3280x2464 [21.19 fps - (0, 0)/3280x2464 crop]

However, the other rpicam commands (rpicam-hello, rpicam-still, rpicam-vid) run but do not save an image or video. Any ideas to investigate?


So I can now revise my repo Raspberry-Pi-Camera-ROS to reflect using a camera_ros Debian install with Marco's PPA packages.

$ sudo dpkg -l | grep libcamera

ii  gstreamer1.0-libcamera:arm64  0.6.0+rpt20251202-1ubuntu1~marco1  arm64  complex camera support library (GStreamer plugin)
ii  libcamera-ipa:arm64           0.6.0+rpt20251202-1ubuntu1~marco1  arm64  complex camera support library (IPA modules)
ii  libcamera-tools               0.6.0+rpt20251202-1ubuntu1~marco1  arm64  complex camera support library (tools)
ii  libcamera-v4l2:arm64          0.6.0+rpt20251202-1ubuntu1~marco1  arm64  complex camera support library (V4L2 module)
ii  libcamera0.6:arm64            0.6.0+rpt20251202-1ubuntu1~marco1  arm64  complex camera support library
ii  python3-libcamera             0.6.0+rpt20251202-1ubuntu1~marco1  arm64  complex camera support library (Python bindings)
ii  ros-jazzy-camera-ros          0.5.2-1noble.20260124.190824       arm64  node for libcamera supported cameras (V4L2, Raspberry Pi Camera Modules)
ii  rpicam-apps-core              1.11.0-1ubuntu1~marco1             arm64  Camera based applications for Raspberry Pi using the libcamera framework


Ross

RPI5-RPiv.2.1Cam_camera_ros_640_20260203.JPG

Marco Walther

Feb 3, 2026, 10:07:31 PM
to hbrob...@googlegroups.com
On 2/3/26 18:38, 'Ross Lunan' via HomeBrew Robotics Club wrote:
> Marco, Sergei, Alan
> Tah Dah - We have contact ! Installing Marco's latest libcamera0.6 apps
> packages on my RPi5/v2.1 Camera system and running debian installed ros-
> jazzy-camera-ros works. The $ ros2 run camera-ros camera_node command on
> a Remote Desktop with rqt_image_view displays the camera 800x600 image
>  as does a $ ros2 launch camera_ros camera.launch.py command on the
> RPI5 with connected Monitor (see the attached happy camper picture).
>  The rpicam-hello cli works
>
> $ rpicam-hello --list-cameras

Yeah, the problem is that libcamera adds the 'minor' version to the
package and shared-library names :-( So switches from 0.5 -> 0.6 -> ...
will always be a pain :-(

No real way around that while trying to stay within the established
naming scheme :-(

Please let me know when you run into problems [again].

-- Marco


Nathan Lewis

Feb 4, 2026, 2:39:32 PM
to Marco Walther, hbrob...@googlegroups.com
Hey Marco,

Are you building locally on the Pi?

If so, I could probably help you set up a foreign-architecture build environment. Basically, if cross-compiling is too much of a pain or impossible, you can do some fun tricks with qemu user mode to run your RPi image as a Docker container on an x86_64 host.

- Nathan
-- 
You received this message because you are subscribed to the Google Groups "HomeBrew Robotics Club" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hbrobotics+...@googlegroups.com.


Michael Wimble

Feb 4, 2026, 4:36:46 PM
to hbrob...@googlegroups.com
If someone wants to write articles on:
1. How to set up a kernel source build on a Pi.
2. How Marco figured out what to build and how to build it.

I'd love to have those two articles in my big bucket o' bits repository.


Marco Walther

Feb 5, 2026, 4:08:31 AM
to Nathan Lewis, hbrob...@googlegroups.com
On 2/4/26 11:39, Nathan Lewis wrote:
> Hey Marco,
>
> Are you building locally on the Pi?
>
> If so I could probably help you set up a foreign architecture build
> environment. Basically if cross compiling is too much of a pain or
> impossible, you can do some fun tricks with qemu user mode to run your
> RPi image as a docker container on an x86_64 host.

The bigger pain point is the Launchpad infrastructure :-( I have a Pi5
with 16GB and an SSD to build locally. That works fine. But pushing things
to Launchpad is hard, especially since the packages have dependencies
between them. Push one, wait an hour+ until it's built & published, and
only then can you push the next package, which depends on the first. Do
that for four or five layers and you spend the whole day pretty much
waiting :-(

Usually the builds are << 10 minutes per source package, but the publish
takes some time.

Thanks,
-- Marco



Chris Albertson

Feb 5, 2026, 12:47:43 PM
to hbrob...@googlegroups.com, Nathan Lewis
For Pi development work, I run a virtual machine on my Mac. The OS from the Pi runs in the VM without emulation, because both the Mac and the Pi use ARM CPUs. Except that the ARM in the Mac has 12 cores, and each is MUCH faster than the Pi4's. I have a folder called “Projects” that holds the data, and it gets mounted on both the Mac and the Pi. So when I have something to run on the Pi, there is no need to transfer the data or software. It is always present on all the computers I use.

So I build Pi software in effectively zero time. I have never noticed any kind of delay or lag. I don't know if it is worth buying a modern Mac just for Pi development; that would depend on how much of it you do. Prices are reasonable, < $400 for a used Mini. Which Mac? Any of them will be more than an order of magnitude faster than a Pi. Possibly even 100X faster for some things, because the storage on the Mac tests at about 6,300 MB per second, and I'm lucky to see 100 MB/sec on the Pi's SD card.

Sergei Grichine

Feb 5, 2026, 2:07:05 PM
to hbrob...@googlegroups.com
Well, if a “< $400 used Mac Mini” (ARM-based) can run the Raspberry Pi edition of Ubuntu 24.04 in a virtual machine, with full ROS 2 support, it could actually make a solid on-board computer for robots.

The problem, of course, is how to power it from a typical 12 V battery, along with the complete lack of I2C and GPIO access. A Teensy over USB could handle all sensors and actuators, with a bit of creative programming.

It would open a lot of possibilities, including AI.

Has anybody actually tried this? Any thoughts?

Best Regards,
-- Sergei


e...@okerson.com

Feb 5, 2026, 3:30:46 PM
to hbrob...@googlegroups.com
Here is an ESP32 solution for the GPIO.

https://www.hackster.io/news/give-any-computer-raspberry-pi-style-gpio-pins-1e12f919feb4

Ed Okerson


Nathan Lewis

Feb 5, 2026, 4:25:14 PM
to hbrob...@googlegroups.com
This is void-your-warranty territory, but if you crack the lid on your Mac Mini, the AC-DC power supply board is separate from the rest of the system. If you supply 12V @ 5.5A, the Mac will happily run off a DC supply. There are a number of mods floating around the internet for this, including Power-over-Ethernet mods.

Supposedly the Mac will still happily boot up without the 3-pin sense cable connected between the logic board and power supply.

Admittedly, you probably want to run things under macOS if you’re going this route so you get the NPU among other stuff.

- Nathan

Michael Wimble

Feb 5, 2026, 4:59:39 PM
to hbrob...@googlegroups.com
I've been a big Mac user since almost the beginning of the Mac line of Apple devices. I worked for Apple in the years just before Jobs returned. I even worked for Microsoft for a bit. I run Parallels on my Mac and use that to run the one program I still need to run in Windows a few times a year (though I've nearly written my own version of it to run on Linux), and I run Linux as well. It's not unusual for me to be running macOS, Windows, and Linux all at the same time. The only problem is that getting Gazebo to run on Apple Silicon with the LIDAR emulation is problematic. I haven't owned a computer running winders since I left Microsoft. My Mac has much better quality hardware for my money, and the screen quality is unparalleled. On top of that, I can run everything on it.

My setup, though, is that I have a Linux desktop whose motherboard is identical to the PC embedded in my robot. It’s fairly fast and has lots of RAM. I do this because I want to be sure there isn’t some bizarre driver problem making things work on one machine and not the other. When I stop working on those two machines in the evening, having checked everything into GitHub, I go to my laptop and run native Visual Studio for some things and bring up one of my Linux images under Parallels for other things and keep developing until early in the morning.

I figured out the painful path to running ROS2/Jazzy natively on my Mac desktop so that I could experiment with writing some ROS2 code I wanted to run on my phone and iPad. Not all of RVIZ2, and I don't think much of Gazebo, was working in that mode, but I made no effort to solve the last of those problems. For a long time, I had all of ROS2 and Gazebo working on Parallels/Linux, but the ability to keep the LIDAR simulator working seems to be fragile. Every now and then, I make the effort to get it going again.

I also was using Docker a lot for a while. Docker does rely on the underlying instruction-set architecture, though, so my Linux images do not move between the AMD machines and the Apple Silicon machines. But I pretty much used nearly identical Dockerfiles to create the images. I just never got into the habit of using an alias for the monster command line you use to run Docker with all the fiddly bits working as needed. And I never formed the convenient habit of doing the sudo apt-get install to keep the images up to date and then saving the image so the old alias still worked. All due to laziness on my part, for sure, but it was just ADT (Another Damn Thing TM) to learn, so it's down a few slots on my list.

Right now my immediate problem is that my AI, which does object recognition on the OAKD with a YOLO model, needs to convert the centroid of the recognized object to a base_link-based 3D pose by using the depth map. It runs too damn slowly at the moment.
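For anyone curious about that centroid-to-pose step, here is a minimal sketch of the pinhole back-projection into the camera frame (the base_link transform would then be a TF2 lookup). The intrinsics fx, fy, cx, cy and the sample values are made up for illustration, not taken from Michael's setup:

```python
def pixel_to_camera_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with depth (meters) into the camera frame.

    Standard pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    Transforming the result into base_link is a separate TF2 lookup.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Hypothetical intrinsics for illustration only.
fx = fy = 500.0
cx, cy = 320.0, 240.0

# A centroid at the optical center maps straight down the optical axis.
print(pixel_to_camera_point(320, 240, 2.0, fx, fy, cx, cy))  # (0.0, 0.0, 2.0)
```

In practice the intrinsics would come from the camera's calibration (e.g. a camera_info topic) and the depth value from the aligned depth map at the centroid pixel.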

Michael Wimble

Feb 5, 2026, 5:07:05 PM
to hbrob...@googlegroups.com
Oh, also, I do very little with GPIO and such on any Linux box, and if you don't know why, ask someone. Get yourself a big pot of coffee before you sit down to listen to the answer. I run nearly all of my I/O on my Teensy 4.1 MCUs, though I've played with a few other devices as well. I'm about to add e-Ink devices driven by ESP32s. Those devices talk USB or WiFi back to the Linux box. I bet there are GPIO-to-USB dongles out there if you want, or USB-to-I2C, USB-to-SPI, etc. If not, I've already paid the price of learning KiCad and EasyCAD Pro (two instances of ADT; see quoted text below) so I could easily build a board that does it. Oh, wait, I've already built some 6 or 7 board versions now that do something like that, which is why I have three of them in my robot talking to Teensy 4.1s that control a lot of I/O without any of the problems of connecting sensors to Linux :-)

Sergei Grichine

Feb 5, 2026, 5:59:36 PM
to hbrob...@googlegroups.com
(Sorry, Marco for stealing your thread...)

OK, let’s focus on some realistic goals here. Full disclosure: I haven’t touched Macs—or any other fruit-branded products—since about 1995. Allergy, you know…

- The hardware (a used Mac Mini) appears to be hackable to accept a 12 V input (though the motherboard’s “power stable” signal must be handled carefully).
- It will need to run its native OS while hosting an exact Raspberry Pi Ubuntu 24.04 image in a virtual machine—binaries, possibly even an R/T kernel. Is this feasible? If yes, which Mini models (M1…M4) would work best? Is "Apple silicon" ARM enough for this?
- I personally don’t need any GUI on my robots—headless operation is perfectly fine. That means Ubuntu Server with ROS 2 Base (Jazzy, for example). But again, binaries.
- The GPIO and I²C limitations can be addressed by using, for example, Michael Wimble’s Teensy boards and software (especially if ros2_control support is adopted). All sensors and actuators would connect to the MCU controller board.
- And finally—has anyone actually done this before? Not compiling ROS 2 from source and hacking through the weeds, but using ARM binaries: `apt` installs, updates, the full enchilada?

The ChatGPT.com digest:

Here is a short practical comparison for on-board robot compute:

- Raspberry Pi 5
  Pros: very low power, native GPIO/I2C, cheap, strong ROS2 community support
  Cons: limited CPU for perception/planning, no GPU acceleration
  Best use: small indoor robots, navigation-only systems

- Jetson Orin Nano
  Pros: excellent GPU for AI/vision, ROS2 ecosystem support
  Cons: higher cost, higher power draw, still limited CPU for heavy planning
  Best use: vision-heavy robots, AI perception pipelines

- x86 Mini-PC (NUC / similar)
  Pros: strong CPU performance, easy Ubuntu/ROS2 binary installs, good virtualization
  Cons: higher power draw, limited GPIO
  Best use: mid- to large robots needing planning/perception

- Mac Mini (M1-M4)
  Pros: very high performance per watt, quiet, inexpensive used units
  Cons: no native GPIO/I2C, virtualization complexity, robotics stack less tested
  Best use: high-compute robots using MCU I/O bridges (Teensy, STM32)

Simple rule of thumb

  • Navigation-only rover → RPi 5

  • Vision / ML workloads → Jetson

  • General robotics compute (safe choice) → x86 mini-PC

  • Experimental high-performance low-power compute (with MCU hardware bridge) → Mac Mini

Here is an AI digest on the ARM-based Minis:

The Apple Mac mini lineup consists of several generations; the current compact models feature M4 and M4 Pro chips designed for Apple Intelligence, with 16GB+ RAM. Previous Apple Silicon versions include the M2/M2 Pro (2023) and M1 (2020), which replaced the earlier Intel-based models (2010-2018).
Current & Recent Mac mini Models (Apple Silicon)
  • Mac mini (M4 / M4 Pro, 2024): The latest model (released Oct/Nov 2024), featuring a smaller design, M4/M4 Pro chips, Thunderbolt 4 or 5, and 16GB RAM as base.
  • Mac mini (M2 / M2 Pro, 2023): Introduced in Jan 2023, offering high-performance alternatives to the M1 with faster speeds and more configuration options.
  • Mac mini (M1, 2020): The first Apple Silicon model, released in Nov 2020, featuring significant performance gains over Intel models.
Key Differences
  • Performance: M4/M4 Pro offers the best performance and AI capabilities, followed by M2 Pro, M2, and M1.
  • Memory/Storage: Newer models (M4) start at higher base RAM (16GB) compared to older M1/M2 (8GB) models.
  • Ports: M4 models feature Thunderbolt 4/5, while M1/M2 models use Thunderbolt/USB 4.
And the prices:

Used Mac mini prices for M1–M4 models range from roughly $250 for base M1 units up to over $1,300 for high-spec M4 Pro machines. M4 models (16GB/256GB) are frequently found used or open-box for $350–$500. Older M1/M2 models generally sell for $250–$400, depending on storage and RAM.

Best Regards,
-- Sergei


--
You received this message because you are subscribed to the Google Groups "HomeBrew Robotics Club" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hbrobotics+...@googlegroups.com.

Sergei Grichine

Mar 23, 2026, 12:56:58 PM
to hbrob...@googlegroups.com
Just wanted to re-confirm that Marco Walther's current RPi camera packages function correctly on a Raspberry Pi 5 running Ubuntu 24.04 Server.

I am currently playing with a dual Arducam setup for stereo vision and PointCloud2 generation. The fresh installation is performing as expected.



Best Regards,
-- Sergei

Michael Wimble

Mar 23, 2026, 1:38:27 PM
to hbrob...@googlegroups.com
I'd generally agree with the platform assessment. My OAKD Yolo5-based recognizer is running about 15 FPS with a 416x416 image frame. It also provides depth annotation and produces a point cloud, three kinds of camera images, and the detection messages, so I know the x, y, z of every pixel of any detected object. My Pi5 with the faster AI hat does a 640x640 Yolo11 model at over 20 FPS, but there is no point cloud, so it's only good for 2D positioning. I pretty much use it for X/Y servoing of the gripper, whereas the output from the OAKD is used to rapidly get the X/Y/Z position of the gripper to within a centimeter or so before I switch to the Pi recognizer for final gripper positioning.

Oh, both the OAKD and Pi5 outputs are rate-limited. I spent a good amount of time fixing networking to make it faster and more reliable, but even at 200 Mbps WiFi, I can easily saturate the network if I don't rate-limit all the output. I need to re-send my latest network-fix paper, as it turns out that if you use Google Nest mesh routers, the protocol you need on the robot's WiFi to keep pairing to the best repeater is different than I thought.
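As an aside for anyone hitting the same bandwidth wall: a per-publisher rate limit can be as simple as a minimum-interval gate that drops messages arriving too fast. This is a generic sketch, not Michael's code; the clock is injected as a callable so the logic can be exercised without real time:

```python
class RateLimiter:
    """Drop messages that arrive faster than max_hz (minimum-interval gate)."""

    def __init__(self, max_hz, clock):
        self.min_interval = 1.0 / max_hz
        self.clock = clock          # callable returning seconds, e.g. time.monotonic
        self._last = None

    def allow(self):
        now = self.clock()
        if self._last is None or now - self._last >= self.min_interval:
            self._last = now
            return True             # publish this message
        return False                # drop it

# Fake clock for illustration: under a 10 Hz limit, only ticks >= 100 ms apart pass.
ticks = iter([0.00, 0.05, 0.10, 0.12, 0.20])
limiter = RateLimiter(10.0, lambda: next(ticks))
print([limiter.allow() for _ in range(5)])  # [True, False, True, False, True]
```

In a real node the clock would be time.monotonic, and each publisher would hold its own limiter.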

My AMD 7900x takes about 60W of power to run and has so much excess CPU power that I can run Gazebo and RVIZ2 on it, and it hardly moves the CPU-use meter. My Pi5 with its AI-Hat-assisted detector is running flat out without a lot of wiggle room. The AMD 7900x was especially chosen for its low power consumption at 5 GHz even using all 24 threads, and the mini-ITX motherboard was chosen for its small size so it can fit in the robot.

I run Linux under Parallels on my Macs, and they work a treat, but I really only use them for Gazebo simulation, and there are a handful of problems in trying to do that. On my desktop Mac, I did the arduous task of getting ROS2/Jazzy to work natively except for Gazebo and RViz2, and decided that keeping that running wasn't worth the effort. So I use WebSockets to push whatever I want from the robot over the wire and write custom visualizers on the Mac, hardly a good use of the Mac's CPU power.

I have three Teensy 4.1s to augment my robot because I need performance in a way that even my PC can't provide. And even at 600 MHz with a lot of RAM, the Teensy cannot handle all of my sensors and still provide the required sensor read rates. Eventually, the fact that each sensor requires a small number of milliseconds to read, no matter how tricky you get, adds up sequentially and limits how many sensors you can use with a board. So I had to split sensor duties across two boards. The third board has to be physically mounted far away in order to run stepper motors for the gripper, so that's just a third board that doesn't have much to do.

I used to run MicroROS on the Teensy boards, but installing it is a real pain in the butt, and keeping it up to date compounds that pain. In the end, it really didn't offer any functionality that was useful to me, so I replaced it with just very high-speed USB communication with the PC. It's very reliable communication; it doesn't need any frame overhead; I use JSON with short literals and have bandwidth to spare. The tricky part is the multithreading on the PC side to make sure nothing stalls. The PC side converts ROS messages to and from JSON, and everyone participates in a sophisticated safety system.
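One common way to frame a JSON-over-USB link like the one Michael describes is newline-delimited messages. The sketch below is a generic pattern, not his actual protocol, and the message fields are made up; the decoder reassembles complete messages from arbitrary byte chunks, which is what a serial read hands you:

```python
import json

def encode_msg(msg):
    # One compact JSON object per line; '\n' is the frame delimiter.
    return (json.dumps(msg, separators=(",", ":")) + "\n").encode("utf-8")

class LineDecoder:
    """Reassemble newline-delimited JSON from arbitrary byte chunks."""

    def __init__(self):
        self._buf = b""

    def feed(self, chunk):
        self._buf += chunk
        msgs = []
        while b"\n" in self._buf:
            line, self._buf = self._buf.split(b"\n", 1)
            if line:
                msgs.append(json.loads(line))
        return msgs

# Frames survive being split across reads, as on a real serial port.
dec = LineDecoder()
frame = encode_msg({"t": "odom", "v": 0.25})   # hypothetical message shape
print(dec.feed(frame[:7]))   # []  (incomplete frame stays buffered)
print(dec.feed(frame[7:]))   # [{'t': 'odom', 'v': 0.25}]
```

On the PC side the chunks would come from the serial port read loop; the same encode/decode pair works unchanged on both ends.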

Mind, I’m not trying to run a humanoid robot — I’m building a personal assistive robot that, currently, I don’t need or want to fold clothes for me. I do want it to patrol the house looking for all sorts of things out of place, including me lying on the floor, and perform some small duties, like being able to fetch the beer and to distinguish between various medicine bottles and bring me what I need reliably, without failing because of power cords on the floor, or floor mats that have gotten folded up or that cause the wheels to slip.

It's taken me a few years to get to this point. I've generated about 72 thousand lines of code, and with the latest rewrite, finished a few hours ago after two months of work, I now have about 200 tests in the test suite, covering all of the Teensy functionality (with simulators for the hardware) as well as the PC side.

There’s still work to be done, of course. 

I’m now considering whether anyone would be interested in learning from my experiences and maybe getting on with finishing my book, “ROS for Mere Mortals,” and starting my blog all over again on how to design the very sort of robot I’m building.

I do take advantage of Sergei’s prodigious output. It’s a good resource. And the Tuesday night ROS discussion group is now filled with people who have a lot to contribute. I couldn’t have gotten here without all that community support. And, of course, the three Horsemen of the Apocalypse: Claude, Gemini and GPT.

Chris Albertson

Mar 23, 2026, 6:15:39 PM
to hbrob...@googlegroups.com

On Feb 5, 2026, at 12:30 PM, ed via HomeBrew Robotics Club <hbrob...@googlegroups.com> wrote:

Here is an ESP32 solution for the GPIO.

You don't need the custom PCB, unless you really need the Pi-specific I/O header. A "stock" ESP32 dev board will do the exact same thing, just with different pin locations.

But this is a waste of a very powerful device. The ESP32 is a 32-bit dual-core microcontroller with up to 16MB of flash, and many of the development systems you use for the ESP32 have FreeRTOS under the hood doing low-level stuff like drivers, event handling, and running concurrent tasks. I think even Arduino has this. It allows the ESP to host a web site while also running motors and listening for "robot commands" over some WiFi interface. Maybe your robot's traction motor controller does not need a web site, but it makes a great debug and status tool.

The S3 version has SIMD instructions for AI vector math. All for under $10.





Pito Salas

Mar 27, 2026, 4:10:23 PM
to hbrob...@googlegroups.com
Hi Michael

You are successfully using OAKD. I have some questions:

- Is that an OAK-D Lite or another model?
- Are you running the NN on board the camera's processor or off board?
- If on board, which NN model are you using?
- What exactly are you recognizing?

Thanks!!

Pito

Boston Robot Hackers &&
Comp. Sci Faculty, Brandeis University (Emeritus)

