https://launchpad.net/~marco-sonic/+archive/ubuntu/rasppios/+packages might not work as expected!!


Marco Walther

Feb 3, 2026, 3:53:05 AM
to hbrob...@googlegroups.com
Sorry, I'm trying to build and publish newer versions, but every step
takes a long time (the Launchpad servers have to build each step and
publish it - >> 1h :-()
libcamera 0.6 is pushed now but the rpicam-apps stuff is not yet, so things
will not update cleanly right now!
I have a build of rpicam-apps, but I'll push it later today (> 8:00 PST).

Sorry,
-- Marco

Marco Walther

Feb 3, 2026, 3:25:37 PM
to hbrob...@googlegroups.com
OK, new packages are available at the PPA now;-) Please do the usual

$ sudo apt-get update && sudo apt-get dist-upgrade

and the

$ ros2 run camera_ros camera_node

should work again;-)
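
If the PPA isn't already configured on the machine, it presumably needs to be added once first, along the lines of

$ sudo add-apt-repository ppa:marco-sonic/rasppios

before the update/upgrade above.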

Please let me know if you find any problems.

Thanks,
-- Marco


Ross Lunan

Feb 3, 2026, 9:38:21 PM
to HomeBrew Robotics Club
Marco, Sergei, Alan
Ta-dah - we have contact! Installing Marco's latest libcamera 0.6 apps packages on my RPi5 / v2.1 Camera system and running the Debian-installed ros-jazzy-camera-ros works. The $ ros2 run camera_ros camera_node command on a Remote Desktop with rqt_image_view displays the camera 800x600 image, as does a $ ros2 launch camera_ros camera.launch.py command on the RPi5 with a connected monitor (see the attached happy camper picture). The rpicam-hello CLI works:

$ rpicam-hello --list-cameras

Available cameras

-----------------

0 : imx219 [3280x2464 10-bit RGGB] (/base/axi/pcie@120000/rp1/i2c@88000/imx219@10)

    Modes: 'SRGGB10_CSI2P' : 640x480 [103.33 fps - (1000, 752)/1280x960 crop]

                             1640x1232 [41.85 fps - (0, 0)/3280x2464 crop]

                             1920x1080 [47.57 fps - (680, 152)/1920x2160 crop]

                             3280x2464 [21.19 fps - (0, 0)/3280x2464 crop]

           'SRGGB8' : 640x480 [103.33 fps - (1000, 752)/1280x960 crop]

                      1640x1232 [41.85 fps - (0, 0)/3280x2464 crop]

                      1920x1080 [47.57 fps - (680, 152)/1920x2160 crop]

                      3280x2464 [21.19 fps - (0, 0)/3280x2464 crop]

However, the other rpicam apps (rpicam-still, rpicam-vid) run but do not save an image or video. Any ideas on what to investigate?
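
For reference, the explicit-output invocations would be something like

$ rpicam-still -t 2000 -o ~/test.jpg

$ rpicam-vid -t 5000 -o ~/test.h264

in case the default output handling is what's failing.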


So I can now revise my repo Raspberry-Pi-Camera-ROS to reflect using a camera_ros Debian install with Marco's PPA packages.

$ sudo dpkg -l | grep libcamera

ii  gstreamer1.0-libcamera:arm64  0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (GStreamer plugin)

ii  libcamera-ipa:arm64           0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (IPA modules)

ii  libcamera-tools               0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (tools)

ii  libcamera-v4l2:arm64          0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (V4L2 module)

ii  libcamera0.6:arm64            0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library

ii  python3-libcamera             0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (Python bindings)

ii  ros-jazzy-camera-ros          0.5.2-1noble.20260124.190824      arm64 node for libcamera supported cameras (V4L2, Raspberry Pi Camera Modules)

ii  rpicam-apps-core              1.11.0-1ubuntu1~marco1            arm64 Camera based applications for Raspberry Pi using the libcamera framework


Ross

RPI5-RPiv.2.1Cam_camera_ros_640_20260203.JPG

Marco Walther

Feb 3, 2026, 10:07:31 PM
to hbrob...@googlegroups.com
On 2/3/26 18:38, 'Ross Lunan' via HomeBrew Robotics Club wrote:
> Marco, Sergei, Alan
> Tah Dah - We have contact ! Installing Marco's latest libcamera0.6 apps
> packages on my RPi5/v2.1 Camera system and running debian installed ros-
> jazzy-camera-ros works. The $ ros2 run camera-ros camera_node command on
> a Remote Desktop with rqt_image_view displays the camera 800x600 image
>  as does a $ ros2 launch camera_ros camera.launch.py command on the
> RPI5 with connected Monitor (see the attached happy camper picture).
>  The rpicam-hello cli works
>
> $ rpicam-hello --list-cameras

Yeah, the problem is that libcamera adds the 'minor' version to the
package and shared library names :-( So switches from 0.5 -> 0.6 -> ...
will always be a pain :-(

No real way around that while trying to stay within the established
naming scheme :-(
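
If anyone wants to check which libcamera runtime a binary actually links against after such a switch, something like

$ ldd $(ros2 pkg prefix camera_ros)/lib/camera_ros/camera_node | grep libcamera

should show it (the lib/camera_ros path is my assumption about where the node executable lands).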

Please let me know when you run into problems [again].

-- Marco


Nathan Lewis

Feb 4, 2026, 2:39:32 PM
to Marco Walther, hbrob...@googlegroups.com
Hey Marco,

Are you building locally on the Pi?

If so, I could probably help you set up a foreign-architecture build environment. Basically, if cross-compiling is too much of a pain or impossible, you can do some fun tricks with QEMU user mode to run your RPi image as a Docker container on an x86_64 host.
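
A minimal sketch of that trick, assuming Docker is already installed on the x86_64 host:

$ sudo apt-get install qemu-user-static binfmt-support

$ docker run --rm --platform linux/arm64 ubuntu:24.04 uname -m   # prints aarch64 once binfmt is registered

$ docker run -it --platform linux/arm64 -v "$PWD":/work -w /work ubuntu:24.04 bash

The ubuntu:24.04 image is just an example; the same flags work with an arm64 Raspberry Pi OS or Ubuntu-for-Pi container image.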

- Nathan


Michael Wimble

Feb 4, 2026, 4:36:46 PM
to hbrob...@googlegroups.com
If someone wants to write articles on:
1. How to set up a kernel source build on a Pi.
2. How Marco figured out what to build and how to build it.

I'd love to have those two articles in my big bucket o' bits repository.


Marco Walther

Feb 5, 2026, 4:08:31 AM
to Nathan Lewis, hbrob...@googlegroups.com
On 2/4/26 11:39, Nathan Lewis wrote:
> Hey Marco,
>
> Are you building locally on the Pi?
>
> If so I could probably help you set up a foreign architecture build
> environment. Basically if cross compiling is too much of a pain or
> impossible, you can do some fun tricks with qemu user mode to run your
> RPi image as a docker container on an x86_64 host.

The bigger pain point is the Launchpad infrastructure :-( I have a Pi5
with 16GB and an SSD to build locally. That works fine. But pushing things
to Launchpad is hard, especially since the packages have dependencies
between them. Push one, wait an hour-plus until it's built & published, and
only then can you push the next package, which depends on the first. Do that
for four or five layers and you spend the whole day pretty much waiting :-(

Usually the builds are << 10 minutes per source package, but the publish
takes some time.
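
For anyone curious, the per-package cycle looks roughly like

$ debuild -S -sa

$ dput ppa:marco-sonic/rasppios ../libcamera_0.6.0+rpt20251202-1ubuntu1~marco1_source.changes

then wait for Launchpad to build and publish before uploading the next package in the dependency chain (the .changes name above is just illustrative).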

Thanks,
-- Marco

Chris Albertson

Feb 5, 2026, 12:47:43 PM
to hbrob...@googlegroups.com, Nathan Lewis
For Pi development work, I run a virtual machine on my Mac. The OS from the Pi runs in the VM without emulation because both the Mac and the Pi are ARM CPUs. The difference is that the ARM in the Mac has 12 cores and each is MUCH faster than the Pi4's. I have a folder called “Projects” that holds the data, and it gets mounted on both the Mac and the Pi. So when I have something to run on the Pi, there is no need to transfer the data or software; it is always present on all the computers I use.

So I build Pi software in effectively zero time. I have never noticed any kind of delay or lag. I don’t know if it is worth buying a modern Mac just for Pi development. That would depend on how much of it you do. Prices are reasonable, < $400 for a used Mini. Which Mac? Any of them will be well more than an order of magnitude faster than a Pi. Possibly even 100X faster for some things because the storage on the Mac tests at about 6,300 MB per second, and I’m lucky to see 100 MB/sec on the Pi’s SD card.

Sergei Grichine

Feb 5, 2026, 2:07:05 PM
to hbrob...@googlegroups.com
Well, if a “< $400 used Mac Mini” (ARM-based) can run the Raspberry Pi edition of Ubuntu 24.04 in a virtual machine—with full ROS 2 support—it could actually make a solid on-board computer for robots.

The problem, of course, is how to power it from a typical 12 V battery, along with the complete lack of I2C and GPIO access. A Teensy over USB could handle all sensors and actuators, with a bit of creative programming.

It would open a lot of possibilities, including AI.

Has anybody tried this? Any thoughts?

Best Regards,
-- Sergei


e...@okerson.com

Feb 5, 2026, 3:30:46 PM
to hbrob...@googlegroups.com
Here is an ESP32 solution for the GPIO.

https://www.hackster.io/news/give-any-computer-raspberry-pi-style-gpio-pins-1e12f919feb4

Ed Okerson


Nathan Lewis

Feb 5, 2026, 4:25:14 PM
to hbrob...@googlegroups.com
This is void-your-warranty territory, but if you crack the lid on your Mac Mini, the AC-DC power supply board is separate from the rest of the system. If you supply 12V @ 5.5A, the Mac will happily run off a DC supply. There are a number of mods floating around the internet for this, including Power-over-Ethernet mods.

Supposedly the Mac will still happily boot up without the 3-pin sense cable connected between the logic board and power supply.

Admittedly, you probably want to run things under macOS if you’re going this route so you get the NPU among other stuff.

- Nathan

Michael Wimble

Feb 5, 2026, 4:59:39 PM
to hbrob...@googlegroups.com
I’ve been a big Mac user since almost the beginning of the Mac line of Apple devices. I worked for Apple in the years just before Jobs returned. I even worked for Microsoft for a bit. I run Parallels on my Mac and use it to run the one program I still need to run in Windows a few times a year (though I’ve nearly written my own version of it to run on Linux), and I run Linux as well. It’s not unusual for me to be running macOS, Windows, and Linux all at the same time. The only problem is that getting Gazebo to run on Apple Silicon with the LIDAR emulation is problematic. I haven’t owned a computer running winders since I left Microsoft. My Mac has much better quality hardware for my money and the screen quality is unparalleled. On top of that, I can run everything on it.

My setup, though, is that I have a Linux desktop whose motherboard is identical to the PC embedded in my robot. It’s fairly fast and has lots of RAM. I do this because I want to be sure there isn’t some bizarre driver problem making things work on one machine and not the other. When I stop working on those two machines in the evening, having checked everything into GitHub, I go to my laptop and run native Visual Studio for some things and bring up one of my Linux images under Parallels for other things and keep developing until early in the morning.

I have figured out the painful path to run ROS2/Jazzy natively on my Mac desktop so that I could experiment with writing some ROS2 code I wanted to run on my phone and iPad. Not all of RViz2 worked in that mode, and I don’t think much of Gazebo did, but I made no effort to solve the last of those problems. For a long time, I had all of ROS2 and Gazebo working on Parallels/Linux, but the ability to keep the LIDAR simulator working seems to be fragile. Every now and then, I make the effort to get it going again.

I also was using Docker a lot for a while. Docker does rely on the underlying instruction set architecture, though, so my Linux images do not move between the AMD machines and the Apple Silicon machines. But I pretty much used nearly identical Dockerfiles to create the images. I just never got into the habit of using an alias for the monster command line you use to run Docker with all the fiddly bits working as needed. And I never settled into a convenient habit for doing the sudo apt-get installs to keep the images up to date and then saving the image so the old alias still worked. All due to laziness on my part, for sure, but it was just ADT (Another Damn Thing TM) to learn, so it’s down a few slots on my list.
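
For what it's worth, the alias + commit pattern is only a few lines (the image and mount names here are made up):

$ alias rosdev='docker run -it --rm --net=host -v "$HOME/Projects":/work -w /work my-jazzy-dev:latest bash'

$ docker run -it --name jazzy-update my-jazzy-dev:latest bash   # apt-get update && upgrade inside, then exit

$ docker commit jazzy-update my-jazzy-dev:latest   # save it so the alias keeps using the fresh image

$ docker rm jazzy-update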

Right now my immediate problem is that my AI, which does object recognition on the OAK-D with a YOLO model, needs to convert the centroid of each recognized object into a base_link-based 3D pose using the depth map. It runs too damn slowly at the moment.
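
For the record, the per-detection math is just the pinhole deprojection, assuming fx, fy, cx, cy come from the depth camera's CameraInfo and the depth image is aligned to that camera:

Z = depth(u, v)
X = (u - cx) * Z / fx
Y = (v - cy) * Z / fy

then transform the resulting point from the camera's optical frame to base_link with tf2. Done only at each detection's centroid, that is a handful of multiplies per object, so the slowness is more likely in the YOLO inference or the depth alignment than in the deprojection itself.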

Michael Wimble

Feb 5, 2026, 5:07:05 PM
to hbrob...@googlegroups.com
Oh, also, I do very little with GPIO and such on any Linux box, and if you don’t know why, ask someone. Get yourself a big pot of coffee before you sit down to listen to the answer. I run nearly all of my I/O on my Teensy 4.1 MCUs, though I’ve played with a few other devices as well. I’m about to add e-Ink devices driven by ESP32s. Those devices talk USB or WiFi back to the Linux box. I bet there are GPIO-to-USB dongles out there if you want, or USB-to-I2C, USB-to-SPI, etc. If not, I’ve already paid the price of learning KiCad and EasyCAD Pro (two instances of ADT) so I could easily build myself a board that does it. Oh wait, I’ve already built some 6 or 7 board versions now that do something like that, which is why I have three of them in my robot, talking to Teensy 4.1s that control a lot of I/O without any of the problems of connecting sensors to Linux :-)

Sergei Grichine

Feb 5, 2026, 5:59:36 PM
to hbrob...@googlegroups.com
(Sorry, Marco, for stealing your thread...)

OK, let’s focus on some realistic goals here. Full disclosure: I haven’t touched Macs—or any other fruit-branded products—since about 1995. Allergy, you know…

- The hardware (a used Mac Mini) appears to be hackable to accept a 12 V input (though the motherboard’s “power stable” signal must be handled carefully).
- It will need to run its native OS while hosting an exact Raspberry Pi Ubuntu 24.04 image in a virtual machine—binaries, possibly even an R/T kernel. Is this feasible? If yes, which Mini models (M1…M4) would work best? Is "Apple Silicon" ARM enough for this? (A rough sketch follows after this list.)
- I personally don’t need any GUI on my robots—headless operation is perfectly fine. That means Ubuntu Server with ROS 2 Base (Jazzy, for example). But again, binaries.
- The GPIO and I²C limitations can be addressed by using, for example, Michael Wimble’s Teensy boards and software (especially if ros2_control support is adopted). All sensors and actuators would connect to the MCU controller board.
- And finally—has anyone actually done this before? Not compiling ROS 2 from source and hacking through the weeds, but using ARM binaries: `apt` installs, updates, the full enchilada?
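
One possible sketch (untested by me): on an Apple Silicon Mini, Multipass or UTM can boot a generic arm64 Ubuntu 24.04 Server guest under hardware virtualization - not the Pi-specific image, but the same arm64 apt archives - and ROS 2 Jazzy then installs from the normal binary packages, e.g.

$ brew install --cask multipass

$ multipass launch 24.04 --name jazzy --cpus 4 --memory 8G --disk 40G

$ multipass shell jazzy

and inside the guest, add the ROS 2 apt repository per docs.ros.org, then $ sudo apt install ros-jazzy-ros-base. Whether an R/T kernel and the Pi-specific device-tree bits carry over is a separate question.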

The ChatGPT.com digest:

Here is a short practical comparison for on-board robot compute:

Platform: Raspberry Pi 5
  Pros: Very low power, native GPIO/I²C, cheap, strong ROS2 community support
  Cons: Limited CPU for perception/planning, no GPU acceleration
  Best use: Small indoor robots, navigation-only systems

Platform: Jetson Orin Nano
  Pros: Excellent GPU for AI/vision, ROS2 ecosystem support
  Cons: Higher cost, higher power draw, still limited CPU for heavy planning
  Best use: Vision-heavy robots, AI perception pipelines

Platform: x86 Mini-PC (NUC / similar)
  Pros: Strong CPU performance, easy Ubuntu/ROS2 binary installs, good virtualization
  Cons: Higher power draw, limited GPIO
  Best use: Mid- to large robots needing planning/perception

Platform: Mac Mini (M1–M4)
  Pros: Very high performance per watt, quiet, inexpensive used units
  Cons: No native GPIO/I²C, virtualization complexity, robotics stack less tested
  Best use: High-compute robots using MCU I/O bridges (Teensy, STM32)

Simple rule of thumb

  • Navigation-only rover → RPi 5

  • Vision / ML workloads → Jetson

  • General robotics compute (safe choice) → x86 mini-PC

  • Experimental high-performance low-power compute (with MCU hardware bridge) → Mac Mini

Here is an AI digest on the ARM-based Minis:

The Apple Mac mini lineup consists of several generations; the current compact models feature M4 and M4 Pro chips designed for Apple Intelligence and ship with 16GB+ RAM. Previous Apple Silicon versions include the M2/M2 Pro (2023) and M1 (2020), which replaced the earlier Intel-based models (2010–2018).
Current & Recent Mac mini Models (Apple Silicon)
  • Mac mini (M4 / M4 Pro, 2024): The latest model (released Oct/Nov 2024), featuring a smaller design, M4/M4 Pro chips, Thunderbolt 4 or 5, and 16GB RAM as base.
  • Mac mini (M2 / M2 Pro, 2023): Introduced in Jan 2023, offering high-performance alternatives to the M1 with faster speeds and more configuration options.
  • Mac mini (M1, 2020): The first Apple Silicon model, released in Nov 2020, featuring significant performance gains over Intel models.
Key Differences
  • Performance: M4/M4 Pro offers the best performance and AI capabilities, followed by M2 Pro, M2, and M1.
  • Memory/Storage: Newer models (M4) start at higher base RAM (16GB) compared to older M1/M2 (8GB) models.
  • Ports: M4 models feature Thunderbolt 4/5, while M1/M2 models use Thunderbolt/USB 4.
And the prices:

Used Mac mini prices for M1–M4 models range from roughly $250 for base M1 units up to over $1,300 for high-spec M4 Pro machines. M4 models (16GB/256GB) are frequently found used or open-box for $350–$500. Older M1/M2 models generally sell for $250–$400, depending on storage and RAM.

Best Regards,
-- Sergei

