Stanford Researchers Build $400 Self-Navigating Smart Cane


Edward Katz

Oct 15, 2021, 1:25:01 AM10/15/21
to Homebrew Robotics Club of Silicon Valley
Researchers at Stanford University have now introduced an affordable robotic cane that guides people with visual impairments safely and efficiently through their environments.

Using tools from autonomous vehicles, the research team has built the augmented cane, which helps people detect and identify obstacles, move easily around those objects, and follow routes both indoors and out. The augmented cane is equipped with a LIDAR sensor. The cane has additional sensors including GPS, accelerometers, magnetometers, and gyroscopes, like those on a smartphone, that monitor the user's position, speed, direction, and so forth. The cane makes decisions using artificial intelligence-based wayfinding and robotics algorithms such as simultaneous localization and mapping (SLAM) and visual servoing — steering the user toward an object in an image.
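In its simplest image-based form, the visual servoing mentioned above reduces to a proportional controller on the target's horizontal offset in the camera image. A minimal sketch of that idea follows; the function name, gain, and sign convention are illustrative assumptions, not the Stanford team's actual control law:

```python
# Minimal image-based visual-servoing sketch: steer toward a target detected
# in a camera frame by turning in proportion to how far the target sits from
# the image center. Names and gain are illustrative, not from the paper.

def steering_command(target_px: float, image_width: int, gain: float = 2.0) -> float:
    """Return a turn rate: negative steers left, positive steers right."""
    center = image_width / 2.0
    # Normalized horizontal offset in [-1, 1]
    offset = (target_px - center) / center
    return gain * offset

# A target left of center yields a left turn; a centered target yields no turn.
```

A real controller would add smoothing and limits, but the proportional term is the core of "steering toward an object in an image."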

The augmented cane is not the first smart cane. Research sensor canes can be heavy and expensive — weighing up to 50 pounds with a cost of around $6,000. Currently available sensor canes are technologically limited, only detecting objects right in front of the user. The augmented cane sports cutting-edge sensors, weighs only 3 pounds, can be built at home from off-the-shelf parts and free, open-source software, and costs $400.

The researchers hope their device will be an affordable and useful option for the more than 250 million people with impaired vision worldwide.


Brian Higgins

Oct 15, 2021, 10:07:20 AM10/15/21
to hbrob...@googlegroups.com
I posed as a test subject for Patrick. 50 lb.? Who would carry that? He couldn't even afford a decent white cane, and the wheel was so tiny it would only work indoors.
News stories can embellish.
Brian

Sent from my iPhone 12 Pro Max



--
You received this message because you are subscribed to the Google Groups "HomeBrew Robotics Club" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hbrobotics+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/hbrobotics/CAFMPTVQGZN7MZ1jmFSkUnLBEt-8SEd_fJ88TUSj3HVUjh4JkiA%40mail.gmail.com.

Brian Higgins

Oct 15, 2021, 10:17:16 AM10/15/21
to hbrob...@googlegroups.com
He's using an RPLIDAR A1 with a Raspberry Pi in a plastic colander (vegetable strainer) on a cheap cane. I guess academia can say anything.
The tiny, flimsy wheel would break the first time out.

sorry if I’m getting riled up

Brian 

Sent from my iPhone 12 Pro Max




Brian Higgins

Oct 15, 2021, 10:18:40 AM10/15/21
to hbrob...@googlegroups.com
He didn't have a smartphone on the one I tried.


Sent from my iPhone 12 Pro Max




Sergei Grichine

Oct 15, 2021, 10:59:36 AM10/15/21
to hbrob...@googlegroups.com
Remember the Microsoft HoloLens? It has amazing accuracy in mapping the surroundings (at least indoors). It has stereo audio and can be programmed to do anything you want. Basically, it's a perfect tool for at least researching mapping ideas and perfecting audio feedback.

Oculus v2 and other VR glasses are available at relatively low prices. I'd guess the problem now is in the algorithms and feedback, and creating specialized hardware before that is understood seems counterproductive. Not to diminish the fine effort of the Stanford researchers, of course; they are probably way ahead on this path.

Best Regards, 
-- Sergei Grichine
   

Chris Albertson

Oct 15, 2021, 12:15:46 PM10/15/21
to hbrob...@googlegroups.com
I found the link to the actual paper. Brian is right: news articles are typically clueless. The actual paper is here: https://www.science.org/doi/10.1126/scirobotics.abg6594

But this is supposed to be open source. I'm looking for the design files and source code. Has anyone found them?

As to Brian's comment about it being too heavy, and some other issues: if this is open source, then all of this can be modified. If someone were to build custom PCBs in place of off-the-shelf parts, you might get a four-times reduction in size and weight. Or maybe the computer goes in a backpack and connects wirelessly to the cane. That is the good part of open source. Again, has anyone found the design files?

At first glance, this looks very well done.  I'd be interested to hear if the user interface works well.





--

Chris Albertson
Redondo Beach, California

Chris Albertson

Oct 15, 2021, 12:27:44 PM10/15/21
to hbrob...@googlegroups.com
I'm trying to understand how VR glasses would help a blind person.  

Better, I think, is this new "spatial audio." I think it is basically stereo that keeps the sound field stationary as the listener moves, so a person could hear where a drum kit was and point to the same location even as they walked around with headphones. I need to read up on how it works and how the API works.
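The core of that idea — a source that stays put in the world while the listener's head turns — can be sketched with simple amplitude panning. The panning law and function names below are illustrative assumptions; real spatial audio uses HRTFs rather than plain stereo gains:

```python
import math

# Sketch of the "spatial audio" idea: keep a sound source fixed in the world
# by re-panning it as the listener's head rotates. Constant-power panning is
# used here for illustration; real implementations use HRTF filtering.

def stereo_gains(source_bearing_deg: float, head_yaw_deg: float):
    """Return (left_gain, right_gain) for a source at a fixed world bearing,
    given the listener's current head yaw (both in degrees)."""
    rel = math.radians(source_bearing_deg - head_yaw_deg)
    pan = math.sin(rel)  # -1 (hard left) .. +1 (hard right)
    # Constant-power pan keeps perceived loudness steady across positions.
    left = math.cos((pan + 1) * math.pi / 4)
    right = math.sin((pan + 1) * math.pi / 4)
    return left, right

# As the listener turns to face the source, rel -> 0 and the gains equalize,
# so the sound appears to stay at one spot in the room.
```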

I'll read the paper over the weekend. But I see a Raspberry Pi and think it might be a bit clunky and use too much battery power. A new Apple iPhone has 10x more compute power and storage, and also an IMU, GPS, and a full-time data connection over Bluetooth, Wi-Fi, and cell networks, plus two cameras and a screen -- and the thing runs BSD UNIX.



Sergei Grichine

Oct 15, 2021, 2:32:12 PM10/15/21
to hbrob...@googlegroups.com
"How VR glasses would help a blind person" - when you ignore the visual part of the HoloLens, you are left with a Kinect-like 3D sensor, a full IMU (sensing head rotation as well), lots of built-in processing that delivers the full scene, a huge CPU, and a very sophisticated SDK for your own programming, with tons of example code. And yes, stereo audio that you can generate, already attached to the person's ears. To me, that looks like a very good piece of hardware to start with. One has to love .NET and Unity, though :-)

Here are some pointers: 

Chris Albertson

Oct 15, 2021, 3:09:27 PM10/15/21
to hbrob...@googlegroups.com
Yes, you get a lot with the VR glasses, but so much is wasted, as the majority of the processing, bulk, and power go to vision. I think you get the same with a used iPhone 5. I bought one with a broken home button for $2. Yes, $2 for a camera, microphone, IMU, and dual-core ARM processor. Junk phones are bargains. I can rubber-band it to my bicycle handlebars (actually I use blue painter's tape) and have a cheap action camera. I'm trying to run the captured video through YOLO later to see if it can find obstacles that I already avoided.

Most of us may already have a collection of unused cell phones. They are very powerful devices.
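The offline experiment Chris describes — running captured video through a detector to find obstacles after the fact — mostly comes down to filtering the detector's output. The sketch below assumes the per-frame detections have already been produced by a model such as YOLO; the class list, threshold, and function name are illustrative:

```python
# Sketch of the offline obstacle-flagging step: after running each captured
# frame through a detector such as YOLO, keep only the detections that matter
# to a cyclist. The detector call itself is assumed to have happened upstream;
# the obstacle classes and confidence threshold are illustrative choices.

OBSTACLE_CLASSES = {"person", "car", "bicycle", "dog", "pole"}

def flag_obstacles(detections, min_confidence=0.5):
    """detections: list of (class_name, confidence, bbox) for one frame.
    Returns the confident detections of obstacle classes only."""
    return [
        (name, conf, bbox)
        for name, conf, bbox in detections
        if name in OBSTACLE_CLASSES and conf >= min_confidence
    ]

frame_dets = [("person", 0.91, (10, 20, 50, 120)),
              ("kite", 0.80, (200, 5, 40, 40)),
              ("car", 0.30, (0, 0, 30, 30))]
# Only the confident "person" detection survives the filter: the kite is not
# an obstacle class, and the car detection is below threshold.
```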



Gmail

Oct 15, 2021, 5:08:07 PM10/15/21
to hbrob...@googlegroups.com
Interesting. I have a close friend who is blind. If anyone needs a real test subject, let me know.



Thomas




Brian Higgins

Oct 15, 2021, 7:22:23 PM10/15/21
to hbrob...@googlegroups.com
I spent an hour using this white stick in different cardboard half-hallways. I have a close-up picture of the rig that I will find tonight.


Sent from my iPhone 12 Pro Max




Ed Katz

Oct 18, 2021, 1:49:33 AM10/18/21
to HomeBrew Robotics Club

I contacted Patrick Slade, the primary author of this article, about accessing the open-source content.

Here is his reply:

Hi Ed,

Here's the link to the open source materials https://github.com/pslade2/AugmentedCane

It's in the data and code availability section of the paper at the end, so kinda tricky to find.

Best,
Patrick

Brian Higgins

Oct 18, 2021, 9:58:58 AM10/18/21
to hbrob...@googlegroups.com
Thanks 


Sent from my iPhone 12 Pro Max
