$ rpicam-hello --list-cameras
Available cameras
-----------------
0 : imx219 [3280x2464 10-bit RGGB] (/base/axi/pcie@120000/rp1/i2c@88000/imx219@10)
    Modes: 'SRGGB10_CSI2P' : 640x480 [103.33 fps - (1000, 752)/1280x960 crop]
                             1640x1232 [41.85 fps - (0, 0)/3280x2464 crop]
                             1920x1080 [47.57 fps - (680, 152)/1920x2160 crop]
                             3280x2464 [21.19 fps - (0, 0)/3280x2464 crop]
           'SRGGB8' : 640x480 [103.33 fps - (1000, 752)/1280x960 crop]
                      1640x1232 [41.85 fps - (0, 0)/3280x2464 crop]
                      1920x1080 [47.57 fps - (680, 152)/1920x2160 crop]
                      3280x2464 [21.19 fps - (0, 0)/3280x2464 crop]
However, rpicam-hello, rpicam-still, and rpicam-vid all run, but none of them saves an image or video. Any ideas on what to investigate?
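Since --list-cameras sees the imx219, a few quick checks may narrow it down. This is only a sketch: the /tmp output paths are arbitrary, and the dmesg grep pattern is my guess at relevant driver names for this board.

```shell
# Ask rpicam-still for an explicit output file and confirm it was written.
# (-n suppresses the preview window; /tmp/test.jpg is an arbitrary path.)
rpicam-still -n -o /tmp/test.jpg && ls -l /tmp/test.jpg

# Same check for video: record 5 seconds of H.264.
rpicam-vid -n -t 5000 -o /tmp/test.h264 && ls -l /tmp/test.h264

# If no file appears, look for sensor/CSI errors in the kernel log
# (pattern is an assumption based on the imx219 and rp1 device path above).
dmesg | grep -iE 'imx219|csi|rp1' | tail -n 20
```

If the commands exit zero but no file appears, it is also worth checking that the current directory (or the path given to -o) is writable.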
So I can now revise my Raspberry-Pi-Camera-ROS repo to reflect a camera_ros Debian install using Marco's PPA packages.
sudo dpkg -l |grep libcamera
ii gstreamer1.0-libcamera:arm64 0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (GStreamer plugin)
ii libcamera-ipa:arm64 0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (IPA modules)
ii libcamera-tools 0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (tools)
ii libcamera-v4l2:arm64 0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (V4L2 module)
ii libcamera0.6:arm64 0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library
ii python3-libcamera 0.6.0+rpt20251202-1ubuntu1~marco1 arm64 complex camera support library (Python bindings)
ii ros-jazzy-camera-ros 0.5.2-1noble.20260124.190824 arm64 node for libcamera supported cameras (V4L2, Raspberry Pi Camera Modules)
ii rpicam-apps-core 1.11.0-1ubuntu1~marco1 arm64 Camera based applications for Raspberry Pi using the libcamera framework
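One quick sanity check on a mixed apt/PPA setup like this is that every libcamera package reports the same version string; a sketch of that check against the listing above:

```shell
# Print the distinct version strings across installed libcamera packages.
# More than one libcamera version in the output means a mismatch between
# the library, IPA modules, and bindings -- a common source of breakage.
dpkg -l | grep libcamera | awk '{print $3}' | sort -u
```

Here the grep also matches ros-jazzy-camera-ros (its description mentions libcamera), which follows its own ROS versioning; everything else should agree on a single 0.6.0+rpt20251202-1ubuntu1~marco1 line.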
Ross
--You received this message because you are subscribed to the Google Groups "HomeBrew Robotics Club" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hbrobotics+...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/hbrobotics/61c183df-2896-4c33-be96-dd7a33c586a3%40gmail.com.
Here is a short practical comparison for on-board robot compute:
| Platform | Pros | Cons | Best use |
|---|---|---|---|
| Raspberry Pi 5 | Very low power, native GPIO/I²C, cheap, strong ROS2 community support | Limited CPU for perception/planning, no GPU acceleration | Small indoor robots, navigation-only systems |
| Jetson Orin Nano | Excellent GPU for AI/vision, ROS2 ecosystem support | Higher cost, higher power draw, still limited CPU for heavy planning | Vision-heavy robots, AI perception pipelines |
| x86 Mini-PC (NUC / similar) | Strong CPU performance, easy Ubuntu/ROS2 binary installs, good virtualization | Higher power draw, limited GPIO | Mid- to large robots needing planning/perception |
| Mac Mini (M1–M4) | Very high performance per watt, quiet, inexpensive used units | No native GPIO/I²C, virtualization complexity, robotics stack less tested | High-compute robots using MCU I/O bridges (Teensy, STM32) |
Simple rule of thumb
Navigation-only rover → RPi 5
Vision / ML workloads → Jetson
General robotics compute (safe choice) → x86 mini-PC
Experimental high-performance low-power compute (with MCU hardware bridge) → Mac Mini