Open Claw - any good for ROS 2 robots?


Sergei Grichine

Feb 19, 2026, 10:25:32 PM
to hbrob...@googlegroups.com

There’s a lot of chatter about OpenClaw.

It “...clears your inbox, sends emails, manages your calendar, checks you in for flights — all from WhatsApp, Telegram, or any chat app you already use.”

For us, the interesting part is its agentic nature and its ability to create and execute workflows. It could be applied to robots as an "interpretive and creative" layer on top of ROS 2.

Here is a good explanation of its capabilities from Rob Braxman: https://youtu.be/CreaIkyZAd4

More info:
  • https://openclaw.ai/ - home site
  • https://www.youtube.com/watch?v=KjxYpRkPT48 - OpenClaw on RPi 5

For AI thoughts on the topic, follow this link: https://github.com/slgrobotics/articubot_one/wiki/Conversations-with-Overlords#question-14

Has anybody tried it at home? Any thoughts?

Best Regards,
-- Sergei

Jeremy Williams

Feb 19, 2026, 10:27:55 PM
to hbrob...@googlegroups.com

https://x.com/jeremynow/status/2024167561026310237?s=46

Interesting take on OpenClaw vulnerabilities by a friend of mine from Cisco.

I’ve not been brave enough to try it yet lol  

Jeremy 

--
You received this message because you are subscribed to the Google Groups "HomeBrew Robotics Club" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hbrobotics+...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/hbrobotics/CA%2BKVXVPQ7PyeskKzn%2BfaUsFHgQ-F4mAvcZp0SHtOdqQZb5hW8Q%40mail.gmail.com.

Alan Timm

Feb 19, 2026, 11:21:53 PM
to HomeBrew Robotics Club
My curiosity has finally gotten the best of me and I'm going to be setting one up this weekend.

I want speech to be the primary interface though, so tonight's goal was to get one of those speech-to-speech LLM models running.
I'm not having a lot of luck with Qwen 2.5 Omni.
I think I'm going to revert to a dedicated VAD + Parakeet + LLM + Qwen3-TTS as the frontend for my OpenClaw.
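For what it's worth, the modular pipeline described above (VAD gating an STT -> LLM -> TTS chain) can be sketched generically. This is only a plumbing sketch: the model-specific pieces (Parakeet for STT, Qwen3-TTS, whatever LLM backs the agent) are stand-in callables, and the buffering is deliberately naive (one utterance per silence gap).

```python
# Sketch of a VAD -> STT -> LLM -> TTS voice frontend.
# All four stages are injected as plain callables so real models
# (webrtcvad/Silero, Parakeet, an LLM endpoint, Qwen3-TTS) can be
# swapped in without changing the plumbing.

from dataclasses import dataclass
from typing import Callable, Iterable, List


@dataclass
class VoicePipeline:
    is_speech: Callable[[bytes], bool]    # VAD: does this audio frame contain voice?
    transcribe: Callable[[bytes], str]    # STT: utterance audio -> text
    respond: Callable[[str], str]         # LLM: user text -> reply text
    synthesize: Callable[[str], bytes]    # TTS: reply text -> audio

    def run(self, frames: Iterable[bytes]) -> List[bytes]:
        """Buffer voiced frames until a silent frame, then run STT -> LLM -> TTS."""
        replies, voiced = [], []
        for frame in frames:
            if self.is_speech(frame):
                voiced.append(frame)
            elif voiced:
                # Silence after speech marks the end of one utterance.
                text = self.transcribe(b"".join(voiced))
                replies.append(self.synthesize(self.respond(text)))
                voiced = []
        return replies
```

A real frontend would also need endpointing hysteresis (a few hundred ms of trailing silence rather than a single frame), but the stage boundaries stay the same.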

Alan

Thomas Messerschmidt

Feb 20, 2026, 12:29:07 AM
to hbrob...@googlegroups.com
I read the article on OpenClaw. It sounds like it could easily get out of hand. 

Your thoughts? 

Ken Gregson

Feb 20, 2026, 1:40:14 AM
to hbrob...@googlegroups.com
Havoc? Just keep in mind these issues which I'm sure you've seen already:

OpenClaw continues to be at the center of a major cybersecurity crisis, with multiple new incidents compounding existing vulnerabilities. As of February 20, 2026, researchers have confirmed a fresh wave of attacks, including infostealer malware targeting OpenClaw configuration files and gateway tokens, marking a shift toward direct exploitation of personal AI agent data. This follows a massive supply-chain attack dubbed ClawHavoc, which flooded the official ClawHub marketplace with over 1,184 malicious skills—many disguised as crypto tools or productivity plugins—designed to steal SSH keys, browser cookies, API keys, and crypto wallets. 

Further exacerbating the situation, more than 135,000 OpenClaw instances remain exposed to the public internet, many running with default settings that bind the bot to 0.0.0.0:18789, making them easy targets for remote code execution (RCE) attacks. A critical RCE vulnerability (CVE-2026-25253, CVSS 8.8) was patched in version 2026.1.29, but over 50,000 vulnerable instances are still active. Additionally, Moltbook, the social network for AI agents, suffered a data leak exposing 1.5 million agent tokens and 35,000 email addresses, with no ability to delete accounts, creating long-term risks. 

Security experts warn that OpenClaw’s architecture inherently amplifies risk—its ability to access APIs, cloud services, and internal systems means a single compromise can lead to full system takeover. Gartner and CSO Online now advise enterprises to block OpenClaw downloads, rotate all associated credentials, and treat it as a high-risk Shadow AI tool. The platform’s rapid rise has outpaced security, making it a persistent security nightmare for individuals and organizations alike.
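For anyone who runs it anyway, the exposed-gateway item at least has a cheap mitigation: keep the port (18789, per the summary above) off the public interface. A ufw sketch, assuming a Linux host with ufw enabled:

```shell
# Block inbound connections to the gateway port from other machines.
# Loopback traffic is not filtered by ufw, so local access keeps working.
sudo ufw deny 18789/tcp
sudo ufw status numbered
```

Binding the service to 127.0.0.1 instead of 0.0.0.0 in its own config would be the cleaner fix, but a firewall rule is a safety net that survives a bad default.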

-Ken

Sergei Grichine

Feb 20, 2026, 1:45:48 PM
to hbrob...@googlegroups.com
one less thing to try...

I attempted to install OpenClaw on my Nvidia Jetson Nano (Dev Kit)—no joy. The app won't install on the native, updated Ubuntu 18.04 because it requires an incompatible version of Node.js. NVM didn't help here.

Using Docker (with Ubuntu 24.04 or Debian Bookworm images), I managed to reach the OpenClaw "curl..." step, but the host OS crashes and reboots—likely because 4GB of RAM isn't enough.

And yes, from a privacy/security standpoint, OpenClaw is a disaster by design (thanks for the link, Jeremy, and Ken for clarification!):
https://cantechit.com/2026/02/17/openclaw-the-passion-driven-ai-agent-thats-exploding-but-honestly-most-people-shouldnt-touch-it/
However, I’m not sure that having OpenClaw manage a robot's high-level functions presents much danger, especially if it's isolated within a network bubble.
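Plain Docker flags can approximate both fixes here: the memory cap for the 4 GB host-reboot problem, and the "network bubble". Names and limits below are only examples:

```shell
# "Network bubble": an internal bridge network with no route to the outside.
docker network create --internal claw-net

# Cap container memory (and disable extra swap) so a runaway install
# OOM-kills the container instead of taking down the host.
docker run --rm -it \
  --memory=3g --memory-swap=3g \
  --network claw-net \
  debian:bookworm bash
```

Note that `--internal` also blocks the container from reaching package mirrors, so you'd typically build the image on a normal network first and only run the agent inside the bubble.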

Best Regards,
-- Sergei


Kyoung Choe

Feb 20, 2026, 2:56:23 PM
to HomeBrew Robotics Club
Try nanobot, which is much smaller: https://github.com/HKUDS/nanobot

Sergei Grichine

Feb 21, 2026, 1:59:46 PM
to hbrob...@googlegroups.com
I created a page describing the process of building and running a generic isolated Docker container on a desktop machine:

https://github.com/slgrobotics/articubot_one/wiki/Docker-on-a-Desktop
I'll be adding to it as I go. I am not an expert in Docker, let me know if you find anything wrong (or worth adding to it).

Best Regards,
-- Sergei


Marco Walther

Feb 21, 2026, 4:58:40 PM
to hbrob...@googlegroups.com, Sergei Grichine
Thanks for trying this;-)

One quick comment on the Dockerfile. Each 'RUN' command in the
Dockerfile creates its own 'layer' in the image, as far as I know. So
you might want to combine related commands and their cleanup into one
RUN. Something like

RUN apt-get update && \
    apt-get dist-upgrade -y && \
    apt-get install -y net-tools avahi-daemon inetutils-ping \
        curl git sudo python3 make g++ && \
    curl -fsSL https://deb.nodesource.com/setup_22.x | bash - && \
    apt-get install -y nodejs && \
    node --version && \
    npm --version && \
    apt-get clean && rm -rf /var/lib/apt/lists/*

Another Q: Why should the claw user be able to become root? Isn't that
supposed to be a restricted user just for that one task?

Thanks,
-- Marco



Sergei Grichine

Feb 21, 2026, 8:35:38 PM
to Marco Walther, hbrob...@googlegroups.com
Thanks, Marco - I'm guilty as charged. Didn't know much about the layers. As I said, I am a Docker newbie...

My few RUNs were supposed to separate the section where I installed the "application" (Node.js) from the base OS installs.
As for the "claw" user being able to sudo - that was one of those "just-in-case" items, meant to show how it is done. It should be removed for any practical use.

At the moment I'm not yet compelled to clean it up, as I really don't know yet where this whole exercise is taking me.

So far, I added a note after the Dockerfile:
---------------------------------------------------------

Note: (thanks, Marco Walther!)

  • In Docker, every RUN, COPY, and ADD instruction creates a new layer. Because Docker uses a Union File System, these layers are additive and read-only. My Dockerfile currently deviates from best practices, as it does not combine installation commands into a single RUN instruction.
  • Additionally, the claw user should be restricted; it should only have the permissions necessary for its specific task and should not be able to escalate to root.
--------------------------------------------------------

BTW, does anybody remember this book by John Edward Mullen, a former RSSC member: https://www.amazon.com/Digital-Dick-John-Mullen-ebook/dp/B010R13P6M
It is a good read, and funny. I used to have a paper copy but can't find it now.
Talk about a prophecy...

Best Regards,
-- Sergei

James H Phelan

Feb 21, 2026, 9:15:46 PM
to hbrob...@googlegroups.com

Thanks for the book recommendation Digital Dick.  I got the Kindle edition free with Amazon media points!

James H Phelan
"Nihil est sine ratione cur potius sit quam non sit"
Leibniz

Alan Timm

Feb 22, 2026, 11:49:29 AM
to HomeBrew Robotics Club
Are any of you running openclaw with a local model?  What's working best for you?

After you get everything set up, one of the first things you'll get to play with is how powerful prompt and context engineering can be.  You'll be adjusting the abilities and behaviors of your agent not with code, but with text.

Alan Timm

Feb 22, 2026, 3:59:38 PM
to HomeBrew Robotics Club
Quick update on my end.

I was getting really substandard behavior even with local Qwen2.5-30B-A3B, not great instruction following, not great tool calling.

So I took a quick trip around https://openrouter.ai/ to try out a few models and have settled on sonnet 4.6.
It's like night and day.  Actually helpful, actually useful.  You owe it to yourself to at least try it so you have a good behavioral baseline for whatever else you're working with.

I'm keeping sonnet 4.6 for openclaw, but it can also call claude code to complete tasks, which for me defaults to opus 4.6. Sweet.

There are a few other models that I am eventually going to test that people are getting great results with.
  • sonnet 4.6 - the standard bearer
  • minimax m2.5
  • kimi k2.5?
  • glm5?

And Anthropic confirmed that you can use your existing OAuth claude code key with openclaw as long as it's for personal use.
I'll probably flip back over to that in a few minutes when my 5 hour limit resets.  :-)

So. Much. Fun.

Sergei Grichine

Feb 22, 2026, 10:30:39 PM
to hbrob...@googlegroups.com
I've set up an isolated local LLM (Large Language Model) on my machine, which is surprisingly easy to configure and appears to be very secure.

Although I'm not sure if it's directly related to OpenClaw or NanoBot, I see the potential value in exploring this technology - especially since I'm still learning about its applications. 

As a total newbie, covering the basics first seems like a good plan before thinking about integrating this knowledge into my robots.


BTW, the above text was grammar and style checked by this local genie. Yes, it added some "corporate" flavor to my draft line ;-) 

Best Regards,
-- Sergei


Sergei Grichine

May 8, 2026, 1:29:46 PM (7 days ago)
to hbrob...@googlegroups.com
An interesting tutorial from DFRobot - for about $30 anybody can have a local OpenClaw AI on a robot.


"...it breaks through the traditional limitations of cloud dependency and the inability of devices to think independently, bringing the “intelligence” of AI agents down to the edge chip..."

Hardware - ESP32 UNIHIKER K10 AI Agent Coding Board:

Any takers or personal experience?

Best Regards,
-- Sergei

Alan Timm

May 8, 2026, 1:44:05 PM (7 days ago)
to HomeBrew Robotics Club
What fun!  The real unlock there is the ESP-Claw package.

Two surprises: 1) that it works, and 2) that it's an officially supported package from the Espressif folks.

2026 is the year of the 'claw!

Thomas Messerschmidt

May 8, 2026, 6:52:22 PM (7 days ago)
to hbrob...@googlegroups.com
Are you saying that with this you would be able to use a cloud model such as ChatGPT or Claude to control a robot?

Just this week I was looking into software that would allow me to control my robots with the larger (for pay) cloud models for the simple reason that they’re larger, faster, and can do image analysis.


Thomas Messerschmidt

-  

Need something prototyped, built or coded? I’ve been building prototypes for companies for 15 years. I am now incorporating generative AI into products.

Contact me directly or through LinkedIn:   





Chris Albertson

May 8, 2026, 8:40:00 PM (7 days ago)
to hbrob...@googlegroups.com
First off, I think ESP will take over from Raspberry Pi. The Pi is expensive and a battery hog, while Espressif is 100% committed to open source and very low pricing.

The article is very disingenuous. It did not show even one simple motor being controlled by this “ESP Claw”. What we saw was a simple pass-through to a chat bot. Text in, text out. Nothing was being controlled. And they try to sell an $18 device for $60. (Go to Amazon and search for “ESP32 screen” and you get this https://www.amazon.com/dp/B0D92C9MMH/ref=sspa_dk_detail_0?pd_rd_i=B0D92C9MMH&th=1 ) and it will do what was shown in the article.


But can this be used to allow an LLM to control a robot? You could prompt the AI with “write a Python script that would move a ROS-controlled robot 1 meter forward”. Then you get back some Python code. But you have to snip out the Python code and place it in a string called, maybe, CoolAIscript. Then from within your own Python code you do “exec(CoolAIscript)”. So yes, it might work.
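That snip-and-exec step is easy to mock up. Everything below (the helper name, the fake reply text) is invented for illustration, and exec'ing model output unsandboxed is exactly the risk discussed earlier in this thread, so only do this with code you have read or inside an isolated container:

```python
import re


def run_llm_code(reply: str, namespace: dict) -> dict:
    """Pull the first fenced Python block out of an LLM reply and exec it.

    WARNING: exec() runs the code with no sandboxing at all.
    """
    match = re.search(r"```(?:python)?\n(.*?)```", reply, re.DOTALL)
    if match is None:
        raise ValueError("no fenced code block in reply")
    exec(match.group(1), namespace)  # mutates `namespace` in place
    return namespace


# A hypothetical LLM reply containing a fenced snippet:
reply = "Sure! Here's the code:\n```python\ncmd = ('forward', 1.0)\n```"
ns = run_llm_code(reply, {})
```

After the call, `ns` holds whatever names the generated snippet defined, which your robot-side code can then inspect before acting on.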


Then you can ask for something more complex than just moving one meter. You might try “Write a Python script for a ROS-controlled robot with this URDF to do the command spoken in this audio file”. But I doubt the LLM is good enough to do that for you.

The real intended use of ESP Claw is to say “write a code snippet to blink the LED on pin 5 when pin 6 is grounded”.  Doing a lot more than this is pushing past current limits.

But I really am a huge Espressif fan. You can buy an ESP32-S3 for under $10 delivered and have a WiFi-enabled 32-bit dual-core processor that you can program using the native ESP-IDF, Arduino, or MicroPython. I know for sure the little chip can drive 12 servo motors in real time, compute accelerations, and produce smooth motion, and still have time for other tasks.

Here is my latest favorite $6 “computer” and my new favorite $8 radar sensor. And yes, you can say “send this data to the radar, and when this happens, send the data over that interface”, and then you get to debug the LLM's mistakes and get it to work. I think this is more suited to controlling the things in the photo below than a full-up humanoid robot. But try it: place the code it returns inside an “exec()” and then you have real-time control, though this is not unique to ESP Claw.

In theory you could let the AI control your robot by asking it to create messages that you then dump into ROS's DDS system, but I don't think we are there yet. Try “please blink an LED” first. Even the chip on the right is slightly too complex for today.





[Attached photos: IMG_2114.jpeg, IMG_2115.jpeg]




Sergei Grichine

May 9, 2026, 1:49:49 AM (7 days ago)
to hbrob...@googlegroups.com
Well, I am not very familiar with the *Claw variants, but it looks like they all can turn some spoken requests into calls to agents.

Consider a simple use case.
- I have a ROS2 robot able to navigate to X,Y coordinates in a house. It had mapped the house before.
- I designated certain places and named them - Fridge=4.7,8.16 etc.
- I ask the robot to approach the fridge - text, voice, whatever in any human language.

Could the Claw look up the fridge in a file and call an agent I have written beforehand, which would just invoke a Behavior Tree plugin assigning a Nav2 goal?

With a bunch of such agents we can cover a lot of useful behaviors.
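The lookup half of this use case is trivial to sketch. The place table, `goto_place`, and the `send_goal` hook below are all invented for illustration; in ROS 2 the hook would wrap a Nav2 `NavigateToPose` action client rather than the stub used here:

```python
# Hypothetical "named place" agent: map a spoken/typed name to map
# coordinates and hand the result to a navigation hook. Only the dict
# lookup is real here; `send_goal` is whatever your ROS 2 side provides
# (e.g. sending a NavigateToPose action goal, or ticking a BT plugin).

PLACES = {  # built from your own map, e.g. Fridge=4.7,8.16
    "fridge": (4.7, 8.16),
    "couch": (1.2, 3.4),
}


def goto_place(name: str, send_goal) -> str:
    """Agent entry point: resolve a place name and dispatch a nav goal."""
    key = name.strip().lower()
    if key not in PLACES:
        return f"unknown place: {name!r}"
    x, y = PLACES[key]
    send_goal(x, y)  # delegate to the robot-side navigation stack
    return f"heading to {key} at ({x}, {y})"
```

The agent stays dumb on purpose: the LLM layer only has to pick the tool and the name, and everything safety-critical lives behind `send_goal`.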

Am I wrong?

Best Regards, 
-- Sergei
   


Chris Albertson

May 9, 2026, 6:24:25 PM (6 days ago)
to hbrob...@googlegroups.com
If using an AI to perform actions, at some point you have to put the AI’s text output into actions on a computer. With co-pilot, you compile the code and put the result in the computer yourself. With real-time agents like Claw, the code has to be placed in something like Python’s “exec()”. What ESP Claw does is build the code for you on a cloud server and then flash it into your chip over USB/serial, but what it builds is a Lua interpreter and some Lua code. The Lua code can do many things like look for a pin to be grounded, send a canned search to a search engine, or write to a screen or set an output pin.

Lua can control a device, but the complexity of the device you can control is very limited. An LED clearly works; a radar does not.


Could it program a robot? In theory, I could say “pin 4 controls the left knee, pin 5 goes to an accelerometer, … paint my house using rollers and an airless sprayer”. But we are not there yet. Doing an Internet search and writing the result to the screen is okay, though, and blinking many LEDs in a nice pattern is within reach. Blinking an LED to show the time of day works too.

I think this ESP Claw could work well for a toaster oven controller. You would write maybe two dozen rules about how the control panel works. Claw stuffs the binary file into FLASH ROM for you. The code it puts in the FLASH is a Lua interpreter and your rules translated into Lua. Lua can do things like web searches.


Here is the problem: the code is stuffed into the FLASH with no human eye looking. Let’s say my toaster oven works, but one day a small child presses the 3 and 7 keys at the same time five times in a row, and the toaster explodes because all the MOSFETs turn on and short the mains power. Who would have guessed this, and how could you test that every possible input is safe? I really can imagine a Lua script processing all valid input correctly and then doing random stuff on invalid inputs. And you can never test all invalid inputs.

A real-time agent runs the AI-generated code in real time without any possibility of testing it. Even if you want to test it, the code is redeployed every time. The code is not stored on GitHub, only in RAM, and then erased after it runs. This is the nature of “agents” that do a task in real time and might never do that task again. This makes testing hard.


> On May 8, 2026, at 10:49 PM, Sergei Grichine <vital...@gmail.com> wrote:
>
> Well, I am not very familiar with the *Claw variants, but it looks like they all can turn some spoken requests into calls to agents.
>
> Consider a simple use case.
> - I have a ROS2 robot able to navigate to X,Y coordinates in a house. It had mapped the house before.
> - I designated certain places and named them - Fridge=4.7,8.16 етс.
> - I ask the robot to approach the fridge - text, voice, whatever in any human language.

I think the Claw can do whatever the internal Lua script can do. I doubt it ships with a Lua/ROS API library. The question is whether you could write one and teach the AI to use it.


The current Lua library seems to have functions like “write text to screen”, toggle an IO pin, ask a search engine, do an if/then/else block, …

To do your tests, I think all that might be needed is a general-purpose interface to the ROS DDS system, so you could send a message to the ROS nav stack to go to a point.

I don’t think Claw will ever have a Lua function to decode a spoken command. But it might have one to send audio data to a URL and then run the Lua code that server returns.


My current robot is my house. It has a full-time Internet connection and a bare-metal hypervisor hosting a stack of small servers and MANY sensors. Just the other day I was in the bathroom: the Doppler radar sensor turned on the lights, I closed the door, and then the lights went out, because that sensor only sees moving objects. I thought, “What a stupid computer! I could not have left the room without first opening the door, and it knows the door is closed.” So I added one more rule to the list. But it is imperfect, because I could open and shut the door without leaving the room and then stand motionless, and the computer would think I left. We could catch that too, because the RSSI on the nearby WiFi lightbulb should change if a person stands near it, and that could confirm whether a person actually left. You see the problem? Even if I only have to speak the rules in English, there are an endless number of imperfect rules. And what if I decide RSSI is not enough and want to pull CSI from the Hue lightbulb? That is real-time phase and amplitude data over the whole radio spectrum. I’d be speaking paragraphs of rules and still not cover cases where two people pass in the hallway. Vibe coding with “Claw” is intractable except for simple stuff.

I’m not doing this to make the lights work; it is a computer science experiment in finding ways to program things that are intractable. I see no solution, none. But that is a good thing. It works well enough now that failures are surprising/interesting. Every month I spend $100 or so on more sensors and such.

BTW, this CSI stuff is real. All of your WiFi devices send CSI data to each other, and this allows the router to adjust the phase at each antenna, shape the radiation pattern, and cancel echoes and such to track your phone. Cell towers do this. It is ubiquitous. “CSI” is a list of about 1,000 complex numbers defined by the WiFi standards. A human body absorbs and reflects radio signals, and you can see a huge (like 10 dB) signal change if you move a few feet. If you happen to have a dozen WiFi devices, they are sending lots of detailed data; all you need to do is listen to it. Making sense of the data is a way-hard physics problem, but people have figured it out, well enough to decode gestures and arm movement and detect breathing motion. All with zero added hardware. I’ll buy the commercial solution, but there is open source too. https://www.philips-hue.com/en-us/support/article/motionawaretm--transform-your-hue-lights-into-motion-sensors/000011
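The crude RSSI version of this idea (ignore CSI entirely, just watch for the signal near a device shifting against a rolling baseline) fits in a few lines. The window length and 6 dB threshold below are arbitrary, and real CSI processing is far harder; this is only the simplest possible "did something absorb the signal?" check:

```python
# Toy presence detector: flag a sample that deviates from a rolling
# baseline of recent RSSI readings by more than a threshold.

from collections import deque


class RssiPresence:
    def __init__(self, window: int = 10, threshold_db: float = 6.0):
        self.baseline = deque(maxlen=window)  # recent RSSI samples, in dBm
        self.threshold_db = threshold_db

    def update(self, rssi_db: float) -> bool:
        """Return True if this sample deviates from the rolling baseline."""
        if len(self.baseline) == self.baseline.maxlen:
            avg = sum(self.baseline) / len(self.baseline)
            moved = abs(rssi_db - avg) > self.threshold_db
        else:
            moved = False  # still learning the baseline
        self.baseline.append(rssi_db)
        return moved
```

As the thread notes, a single-link threshold like this can't distinguish "person left" from "person standing still", which is exactly why the full CSI approaches exist.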

This might work for robots. After all, if the ’bot has a Raspberry Pi inside, your router is likely aiming a beam at the Pi, computed from the CSI data the Pi's WiFi chip sends back to the router. With effort, the Pi could “point a finger” at 4 different WiFi devices and know where it is, with no need for LIDAR.

This is ideal for a humanoid robot because it adds no mass, only firmware, and then the robot can navigate with no added sensors.


