Look at sigyn_to_teensy for the PC-side code. It sends ROS messages to the Teensy as ASCII, reads the ASCII messages coming back from the Teensy, and turns them into ROS messages. It's a trivial program.
And teensy_to_sigyn is the Teensy-side program. It's not complete yet and only handles the battery monitor and cmd_vel stuff. It isn't tuned yet, but it's small now and easy to see how it all works.
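To give a feel for how thin that bridge is, here is a minimal PC-side sketch in Python using rclpy and pyserial. The line formats (TWIST:, BATT:), the port name, and the topic names are made up for illustration; the real sigyn_to_teensy code defines its own framing and message set.

```python
#!/usr/bin/env python3
# Minimal sketch of a PC <-> Teensy ASCII bridge. The "TWIST:" / "BATT:"
# line formats and the /dev/teensy port are illustrative, not the real protocol.
import serial
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from std_msgs.msg import Float32


class TeensyBridge(Node):
    def __init__(self):
        super().__init__('teensy_bridge')
        self.port = serial.Serial('/dev/teensy', 115200, timeout=0)
        # ROS -> Teensy: turn each cmd_vel into one ASCII line.
        self.create_subscription(Twist, 'cmd_vel', self.on_cmd_vel, 10)
        # Teensy -> ROS: republish parsed battery lines.
        self.battery_pub = self.create_publisher(Float32, 'main_battery_volts', 10)
        self.buffer = b''
        self.create_timer(0.02, self.poll_serial)

    def on_cmd_vel(self, msg: Twist):
        line = f'TWIST:{msg.linear.x:.3f},{msg.angular.z:.3f}\n'
        self.port.write(line.encode('ascii'))

    def poll_serial(self):
        self.buffer += self.port.read(256)
        while b'\n' in self.buffer:
            raw, self.buffer = self.buffer.split(b'\n', 1)
            text = raw.decode('ascii', errors='ignore').strip()
            if text.startswith('BATT:'):
                try:
                    self.battery_pub.publish(Float32(data=float(text[5:])))
                except ValueError:
                    pass  # ignore malformed lines


def main():
    rclpy.init()
    rclpy.spin(TeensyBridge())


if __name__ == '__main__':
    main()
```

Newline-delimited ASCII in both directions is the whole trick, which also makes it easy to watch the traffic with a plain serial monitor.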
The package teensy_monitor/tm5 is the older micro-ROS code with more functionality.
teensy_monitor/oldCode has the older Teensy code, including the tricky time-of-flight and SONAR code that handles lots of sensors efficiently to keep frame rates high. It also has the old touch screen monitor code.
For those interested in what I did today, I took the old ROS1 teleop_twist_keyboard package and had AI rewrite it for ROS2 with some redesign and improvements. Importantly, it supports command-line options for default linear.x and angular.z values so you don't always have to pound the keyboard a bunch of times to get to the rates you want. It supports limits on the speed and turn values. You can decide if you want stamped or unstamped cmd_vel messages. You can set the frame_id. You can set the repeat rate. And when you stop pressing a key, it stops sending cmd_vel messages.
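If you haven't hit the stamped/unstamped distinction before, here is roughly what that option boils down to. This is just a sketch, not the rewritten node: the parameter names (stamped, frame_id, repeat_rate, speed, turn) are illustrative, they're shown as ROS parameters rather than whatever command-line handling the real node uses, and the keyboard handling is left out entirely.

```python
#!/usr/bin/env python3
# Sketch of the stamped vs. unstamped cmd_vel switch. Parameter names are
# illustrative; keyboard input and key-release handling are omitted.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist, TwistStamped


class CmdVelRepeater(Node):
    def __init__(self):
        super().__init__('cmdvel_repeater')
        self.declare_parameter('stamped', False)
        self.declare_parameter('frame_id', 'base_link')
        self.declare_parameter('repeat_rate', 10.0)  # Hz
        self.declare_parameter('speed', 0.25)        # default linear.x in m/s
        self.declare_parameter('turn', 0.5)          # default angular.z in rad/s
        self.stamped = self.get_parameter('stamped').value
        self.frame_id = self.get_parameter('frame_id').value
        msg_type = TwistStamped if self.stamped else Twist
        self.pub = self.create_publisher(msg_type, 'cmd_vel', 10)
        period = 1.0 / self.get_parameter('repeat_rate').value
        self.create_timer(period, self.publish_once)

    def publish_once(self):
        twist = Twist()
        twist.linear.x = self.get_parameter('speed').value
        twist.angular.z = self.get_parameter('turn').value
        if not self.stamped:
            self.pub.publish(twist)
            return
        stamped = TwistStamped()
        stamped.header.stamp = self.get_clock().now().to_msg()
        stamped.header.frame_id = self.frame_id
        stamped.twist = twist
        self.pub.publish(stamped)


def main():
    rclpy.init()
    rclpy.spin(CmdVelRepeater())


if __name__ == '__main__':
    main()
```

Publishing TwistStamped just wraps the same Twist with a time stamp and a frame_id, which is why it can be a runtime option instead of a separate node.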
Other interesting packages include:
* bluetooth_joystick — pairs with my SteelSeries Nimbus joystick (from my Mac) to send cmd_vel messages, implement a deadman switch, operate my elevator and horizontal gripper, and open and close the testicle twister on my gripper. (A rough deadman-switch sketch follows this list.)
* gripper — this is the micro-ROS Teensy-side package to operate the elevator and horizontal extender.
* sigyn_behavior_trees — the beginning of my special code and custom behavior trees for my robot. Currently, it's mostly a demo of how to extend and customize ROS2 behavior trees. I'll fill in more later.
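The deadman switch in bluetooth_joystick is the part worth copying: no button held, no motion, and one zero cmd_vel goes out the moment the button is released so the base actually stops. Here is the rough idea assuming a sensor_msgs/Joy input; the button and axis indices and the speed limits are placeholders, not the actual Nimbus mapping.

```python
#!/usr/bin/env python3
# Deadman-switch idea: publish cmd_vel only while a chosen button is held,
# and publish one zero Twist the moment it is released. The indices and
# limits below are placeholders, not the real Nimbus mapping.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

DEADMAN_BUTTON = 4   # placeholder button index
LINEAR_AXIS = 1      # placeholder axis index
ANGULAR_AXIS = 0     # placeholder axis index
MAX_LINEAR = 0.4     # m/s
MAX_ANGULAR = 1.0    # rad/s


class JoyTeleop(Node):
    def __init__(self):
        super().__init__('joy_teleop')
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_subscription(Joy, 'joy', self.on_joy, 10)
        self.was_active = False

    def on_joy(self, msg: Joy):
        active = bool(msg.buttons[DEADMAN_BUTTON])
        twist = Twist()
        if active:
            twist.linear.x = MAX_LINEAR * msg.axes[LINEAR_AXIS]
            twist.angular.z = MAX_ANGULAR * msg.axes[ANGULAR_AXIS]
            self.pub.publish(twist)
        elif self.was_active:
            # Button just released: send one zero command so the base stops.
            self.pub.publish(twist)
        self.was_active = active


def main():
    rclpy.init()
    rclpy.spin(JoyTeleop())


if __name__ == '__main__':
    main()
```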
There is more, but I’m not going to describe it all here.
Other interesting repositories include:
* sigyn_testicle_twister - code that runs on a Pi 5, subscribes to a cmd_vel topic, and operates the opening and closing of the end-effector gripper. It also includes code to control a servo by hardware PWM on a Pi 4 (but not on a Pi 5 running Ubuntu), by software PWM, and via a PCA9685 dongle. (A PCA9685 sketch follows this list.)
* sigyn_video_server — shows the video stream from my v3 wide-angle Pi camera on a web page and includes a button to capture a frame as a JPG to a directory. Those images will later be used to train my object recognition AI.
* wifi_logger_visualizer — captures Wi-Fi data and communication speed from the robot to the desktop and saves it into a database for display as an RVIZ overlay. It can also generate a standalone heat map of the Wi-Fi data, showing how well Wi-Fi is working throughout your house.
* min_max_curr_rviz_overlay — takes the Wi-Fi logger data and shows it as a costmap overlay in RVIZ so I can see how well the Wi-Fi performs on my robot throughout the house. It also has a widget for responding to critical conditions, like a battery that's about to fail.
* ros2_roboclaw_driver — a recent rewrite of a driver for the RoboClaw motor controller. It works on a PC and has special instructions for use on Raspberry Pis.
* laser_lines — looks at laser scans and detects lines, such as walls or the edges of tables. It will be used for localization by recognizing the shape of each room of my house and figuring out where the robot is within the room.
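For the PCA9685 route mentioned under sigyn_testicle_twister above, the gist is letting the board generate the PWM over I2C so the same code runs on a Pi 4 or a Pi 5. This sketch uses the Adafruit ServoKit library; the channel number and the open/close angles are guesses, not the values the repository uses.

```python
# Sketch of driving a gripper servo through a PCA9685 over I2C using the
# Adafruit ServoKit library. Channel and angles are placeholder guesses.
import time
from adafruit_servokit import ServoKit

GRIPPER_CHANNEL = 0   # placeholder: whichever PCA9685 channel the servo is on
OPEN_ANGLE = 10       # placeholder, degrees
CLOSED_ANGLE = 150    # placeholder, degrees

kit = ServoKit(channels=16)  # 16-channel PCA9685 board on the default I2C bus


def set_gripper(open_it: bool):
    """Move the servo to the open or closed position."""
    kit.servo[GRIPPER_CHANNEL].angle = OPEN_ANGLE if open_it else CLOSED_ANGLE


if __name__ == '__main__':
    set_gripper(True)    # open
    time.sleep(1.0)
    set_gripper(False)   # close
```

Because the PCA9685 does the pulse timing itself, this path doesn't care whether the Pi can do hardware PWM, which is what makes it the easy option on a Pi 5.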
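And for laser_lines, the core job is fitting line segments to scan points. One common approach (not necessarily what the package does) is to convert the scan to Cartesian points and run RANSAC; here is a rough numpy-only sketch that finds a single dominant line.

```python
# Rough sketch: find one dominant line in a laser scan with RANSAC.
# Real line extraction, including laser_lines, does more than this: it splits
# the scan into segments and fits several lines, not just the single best one.
import numpy as np


def scan_to_points(ranges, angle_min, angle_increment, max_range=10.0):
    """Convert a LaserScan-style range array to Nx2 Cartesian points."""
    ranges = np.asarray(ranges, dtype=float)
    angles = angle_min + angle_increment * np.arange(len(ranges))
    good = np.isfinite(ranges) & (ranges > 0.05) & (ranges < max_range)
    return np.column_stack((ranges[good] * np.cos(angles[good]),
                            ranges[good] * np.sin(angles[good])))


def ransac_line(points, iterations=200, tolerance=0.03, rng=None):
    """Return (point_on_line, unit_direction, inlier_mask) for the best line."""
    rng = rng or np.random.default_rng()
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(iterations):
        i, j = rng.choice(len(points), size=2, replace=False)
        direction = points[j] - points[i]
        norm = np.linalg.norm(direction)
        if norm < 1e-6:
            continue  # the two samples are (nearly) the same point
        direction = direction / norm
        normal = np.array([-direction[1], direction[0]])
        distances = np.abs((points - points[i]) @ normal)
        inliers = distances < tolerance
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
            best_model = (points[i], direction)
    if best_model is None:
        raise ValueError('not enough distinct points for a line fit')
    return best_model[0], best_model[1], best_inliers
```

Walls show up as long runs of inliers, which is the kind of feature you can match against a known room outline for localization.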