A nice way to control my robot when I am not at my computer


Pito Salas

Sep 8, 2025, 7:47:34 PM
to hbrob...@googlegroups.com
I am working out how to control my robot in ways other than using the shell to run launch files. It’s a small differential drive robot running ROS2. There is also a remote Desktop computer that likely will be part of the ROS2 system running one or more nodes. I don’t want to have to be sitting at my desktop computer.

I imagine there will primarily be “commands,” possibly with some arguments, and maybe some feedback. As examples, commands could be:

Read map file and launch map server
Drive to target X
Robot is at start position
Restart navigation stack

Etc. No idea what they will be. For now my missions are very simple, like navigate around my house, go to a destination etc. Later it will be to do that without a map but build the map as it goes.
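One minimal way to start on the "commands with arguments" idea is a plain string dispatcher that can later be wired to whatever transport you settle on (a ROS2 `std_msgs/String` topic, an HTTP endpoint, MQTT, ...). This is only a sketch; command names like `drive_to` and the returned feedback strings are placeholders, not anything from an existing stack:

```python
# Minimal sketch: parse "command arg1 arg2" strings and dispatch them.
# The command names and feedback strings are hypothetical placeholders.
import shlex

HANDLERS = {}

def command(name):
    """Register a handler function under a command name."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@command("drive_to")
def drive_to(target):
    return f"driving to {target}"         # would publish a nav goal here

@command("restart_nav")
def restart_nav():
    return "restarting navigation stack"  # would relaunch nav2 here

def dispatch(line):
    """Split a command line and run the matching handler, returning feedback."""
    parts = shlex.split(line)
    if not parts or parts[0] not in HANDLERS:
        return f"unknown command: {line!r}"
    return HANDLERS[parts[0]](*parts[1:])
```

Calling `dispatch("drive_to kitchen")` returns the feedback string; the same dispatcher works unchanged whether the line arrives from a ROS2 topic callback, a web request, or a gamepad button mapped to a canned string.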
I know this is a common problem. You see the driver of a robot carrying around something that looks like an RC remote or an iPad, or a laptop where they're sitting at the shell with RViz running, etc.
Here are the options that I see clearly:

• A laptop running full ROS2 on Linux that I can carry around.
• A laptop running something like Foxglove.
• A dedicated app on an iPad.
• A PlayStation controller where the colored buttons do certain things.

Maybe there’s a utility running on the desktop which coordinates things and gets commands from an iPad app or something.
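That "utility on the desktop" could start as nothing more than a tiny HTTP endpoint that any phone or tablet browser (or an iOS Shortcut) can hit, with no special app needed. A hedged stdlib-only sketch, where `run_command` is a placeholder for whatever actually launches things:

```python
# Hedged sketch of a desktop "coordinator": a tiny HTTP endpoint reachable
# from a phone/tablet browser. run_command() is a placeholder; in a real
# setup it would call ros2 launch, publish goals, etc.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def run_command(cmd):
    # Placeholder: wire this to your actual launch/teardown logic.
    return f"ok: ran {cmd}"

class CommandHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse ?c=<command> out of the request URL.
        query = parse_qs(urlparse(self.path).query)
        cmd = query.get("c", ["status"])[0]
        body = run_command(cmd).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(port=8080):
    """Blocking; visit http://<desktop-ip>:8080/?c=drive_to%20kitchen from the iPad."""
    HTTPServer(("0.0.0.0", port), CommandHandler).serve_forever()
```

The appeal of starting here is that the "app on an iPad" option degrades gracefully into a bookmarked URL, and feedback is just the response body.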

As I am a one-person shop, I will want to order my approach in a practical way: start really simple and build on that.
What have you done? What ideas do you have? If you work with commercial products, what do they do?


Best,

Pito

Boston Robot Hackers &
Comp. Sci. Faculty, Brandeis University

Chris Albertson

Sep 8, 2025, 9:58:21 PM
to hbrob...@googlegroups.com
My thinking about a domestic robot is that “the house is the robot.”

Then I think about what I want it to do
1) unlock the front door when it sees I am there trying to get in
2) turn the lights off when a room is empty
3) If there is a delivery left on the porch, bring it in and put it where it goes.
4) Tell me where I put my car keys, or some other object I misplaced
5) vacuum the floor
6) Fetch me a beer

Basically, there is one AI agent and it controls multiple physical objects;  some of the objects are as simple as a light bulb, and another might be a humanoid robot.

So now control is different.  I don’t tell Fido the robo dog to go fetch the Amazon package.   I tell the system I want it brought in.  The AI makes a plan that might involve Fido or maybe using both Fido and Rover.  The security camera on the doorbell sees the package, the electric door bolt opens and so on.    Well that is the goal, someday in the next century.  But I do have tiny bits of it working.

Physical control devices: this is great; I have two of them I’m using (link below). You can wipe the firmware and reprogram the ESP32 just like any ESP32. The device is AC powered and fits inside a standard light-switch wall box, no battery. There are two AC relays, two physical buttons, and a color touch screen. They are cheap enough that you can afford to put one or two in every room. They connect to WiFi. I can push data to them or have them send data. You can create a complex menu tree or just use it as a light dimmer by moving a finger up or down on the screen. It is completely hackable and will run the open-source ESPHome software.

Mine is left in screen-saver mode (dark screen) until I touch it; then it lights up with a status display, no buttons. I swipe to change screens. One screen is a menu tree, but others are things like light dimmers. I use the physical buttons for “all lights off” or on.
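For the "push data to them" direction, ESPHome devices that enable the web_server component expose a REST API where actions are POSTed to URLs like `/light/<id>/turn_on`. A hedged sketch of building and sending such a request; the hostname `panel.local` and entity id `kitchen_light` are made-up placeholders, and this assumes the web_server component is actually enabled on the device:

```python
# Hedged sketch: command an ESPHome-flashed panel over its web_server
# REST API. Host and entity id below are placeholder values.
from urllib import request, parse

def esphome_url(host, domain, entity_id, action, **params):
    """Build an ESPHome REST URL, e.g. http://host/light/<id>/turn_on?brightness=128."""
    url = f"http://{host}/{domain}/{entity_id}/{action}"
    if params:
        url += "?" + parse.urlencode(params)
    return url

def send(url):
    # ESPHome action endpoints expect a POST; no request body is needed.
    return request.urlopen(request.Request(url, method="POST"))

# Example (not run here): dim the hypothetical kitchen light
# send(esphome_url("panel.local", "light", "kitchen_light", "turn_on", brightness=128))
```

The same pattern covers switches (`/switch/<id>/turn_on`) and reading state with a plain GET, which is what makes these panels easy to fold into a larger home-control script.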

Maybe you have a rover security robot; then swipe to its main camera. The other way to control a robot is voice. I don't want to have to find the robo-dog to talk to it. Siri on my watch should connect somehow. I think this is possible but I've not tried yet.

camp .

Sep 9, 2025, 11:19:22 AM
to hbrob...@googlegroups.com
>  As I am a one person shop I will want to order my approach in a practical way. Start really simple and build on that. What have you done? What ideas do you have? If you work with commercial products, what do they do?

    My MVP (Minimum Viable Product) is the "Smart Table": a tray table that can reliably get from one place to another. The tabletop IS the end effector.

    Mobile robots are all about delivery. Even a robot vacuum cleaner basically delivers a brush and a negative-pressure vacuum to every corner of the house.

    For example, a parent could prepare a snack for the kids, place it on the Smart Table, send it to the living room where the kids could eat the snack, and then return to the Kitchen later to deal with leftovers and dishes. It's human-assisted delivery.

    The Smart Table is a tray table that comes to you when you need a place to put your coffee cup. I need my computer, toolbox, CPAP machine, blood pressure monitor, and any other necessary items to be brought to me.

    I need my music to follow me around. It could be a trash can that comes to you, rather than you having to go to it. It could be a mobile valet with a hat and coat rack. It moves and waits with your stuff at the door (keys, phone, wallet, pocketknife, etc.) before you leave each day, reminding you of your appointments.

    When I get home at the end of a hard day, I'd like my robot to greet me at the door as a Smart Table so I can settle in by putting away my bags, keys, and mail. Artificial intelligence should be proactive and find me when something interesting happens. Future development may include voice labels that represent map locations. 

    This could be a godsend for older adults and individuals with disabilities. Eventually, we will get to pick-move-and-place; that is, mobile manipulation, but the starting point is the mobile base and human-assisted delivery. This is the beginning of a robotics lifestyle.

    With a tray table that can reliably transport items from one place to another, in the home or in the business, one day you might be saying, "How did we ever get along without this?"

Homebrewed Ultrasonic ROS2 Sensor



Enjoy,
Camp

camp .

Sep 9, 2025, 12:37:54 PM
to hbrob...@googlegroups.com
    For my audio AI, I've Velcro'd a battery-powered Google Mini on top (Google is integrating Gemini). And in the 'eating your own dog food' space, I've even delivered food from the kitchen to the lounge, eaten off the Smart Table, and then sent it back to the trash can in the kitchen, where I eventually unloaded it.

    Sure, it's contrived, but add voice control... and that might be something. In the meantime, let's get RViz on the phone!

Smart Table

Enjoy,
Camp


camp .

Sep 9, 2025, 12:58:43 PM
to hbrob...@googlegroups.com
    Let me add, this is an interim step; like the TurtleBot, the Smart Table takes advantage of the ubiquitous robot-vacuum cleaning base. In the longer term, larger wheels are needed. These bases work fine for vacuuming but can struggle with a 10 lb payload, especially with floor/carpet transitions.

    This is why we're all fans of https://github.com/linorobot/linorobot2, as you can scale up the Linorobot; that is, use bigger wheels.

Smarty the Smart Table


Thanks,
Camp

Thomas Messerschmidt

Sep 9, 2025, 3:38:34 PM
to hbrob...@googlegroups.com
It looks like you are living the dream!


Thomas




Ed Katz

Sep 11, 2025, 4:27:54 AM
to HomeBrew Robotics Club
Chris,

If you look into "Ambient Intelligence" (AKA "AmI"), https://en.wikipedia.org/wiki/Ambient_intelligence , you might find it to be very similar to what you've expressed.

It came about in the 1990s and was rather popular 10-12 years ago. The Internet (which is your friend) should have many resources on this topic.

Best regards,

--Ed

Ed Katz

Sep 11, 2025, 7:11:17 PM
to HomeBrew Robotics Club
Does the average person really want humanoid robots in their home? (Yes, we know you do.)

Many other people do not. Read about it here: https://spectrum.ieee.org/home-humanoid-robots-survey

Chris Albertson

Sep 13, 2025, 2:36:43 AM
to hbrob...@googlegroups.com
The problem with humanoid robots, and also the reason companies like to say they are working on them, is that humans seem to be hard-wired to assume that human-looking objects are human-like.

So if a company wants to make its simple-minded AI seem smart, it can wrap it in a humanoid form.

A great example of this was Tesla’s demo at the movie studio in LA. They had a humanoid hand out trinkets and say things like “have a nice day.” People thought the robot was intelligent. But what if an industrial robot arm handed out the same trinkets with a speaker nearby that said “have a nice day”? No one would think the machine was smart. But it is identical technology.

My vision of “ambient intelligence” (good to have a name for it, thanks) is that we don’t have to care whether the robot is humanoid or looks like a dog, has wheels, or even flies like a drone. The AI just figures out how to do what is needed given what is available, which might even mean asking the human for help.

We can all experiment with this by starting very simple. Try turning the lights in your kitchen on and off. How do you control the lights? You very quickly find that no one wants to use a phone app, and even voice is annoying most of the time. Wall switches work, but not always.

Then try something harder. I convinced my sister to let me install a very complex lighting system in her open-floor-plan first floor. We put in more than a dozen lights, each of which can be set to any color from hot pink to pastel green to the more common “daylight” and “warm white.” Every light bulb and LED strip is individually controlled and can smoothly change from one color or brightness to the next. Clearly a wall switch can’t work; you would need to put in 50 or 60 sliders like a recording-studio mixing desk. But people still expect a switch. They like it for parties, and they entertain quite a lot. We can ask the system for a Caribbean color palette heavy in pinks for a baby shower, but with “daylight” in the kitchen, and at night turn on a very dim, warm nightlight.

In the end I had it working using voice with Apple’s Siri, some custom wall-mounted four-button remote controls that look like wall-mounted light switches, and a few motion sensors.

My next sensor will be radar. Hue lights can now use their built-in radio to ping the room, and if you have at least three lights they can detect a human by how they affect radio propagation. There is an API for this. It will be some months until I can try it.

The better sensor will be the security camera and a local AI model. The example project I’m reading counts the number of birds in a chicken coop every five minutes, but I think it could be adapted to count people in a kitchen and then guess what they are doing.

One thing I learned is that any “ambient intelligence” has to be rock solid, 100% reliable and run for years without futzing around.  This means it is not C++ running on an old PC.   No one wants to reformat a hard drive at 3:00 am so they can turn on the lights to use the bathroom.  The system has to be fault-tolerant.

Summary: If you cannot control a light bulb and make it easy to use and fault-tolerant, you have no hope of controlling a domestic robot. At present, I have the light bulb problem about 80% solved; the remaining 20% is the hard part. I'd give myself a “B-” as a grade.




