bot hand


A J

Sep 29, 2025, 11:00:28 PM
to HomeBrew Robotics Club
Some ETH-Zurich researchers are open-sourcing their bot hand class.

I marvel at the complexity of the human hand; it has served us well.


Stephen Williams

Sep 30, 2025, 2:19:26 AM
to hbrob...@googlegroups.com, A J

That is a great reference.

Helps validate whether all of the options have been considered, for the main mechanics anyway.
This only mentions the sensor side.

My current obsession is thinking about how much of a full working hand I can include in a single FDM 3D print, including a 2-material IDEX print.

Stephen
--
You received this message because you are subscribed to the Google Groups "HomeBrew Robotics Club" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hbrobotics+...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/hbrobotics/946db0d3-82fa-493e-8e19-9b8634f84597n%40googlegroups.com.
--

Stephen D. Williams
Founder: VolksDroid, Blue Scholar Foundation

Thomas Messerschmidt

Sep 30, 2025, 3:10:52 AM
to hbrob...@googlegroups.com
I personally like the three-finger hand—two fingers and an opposable thumb, with the thumb positioned between the two fingers. That configuration allows gripping almost anything, yet it is only 3/5 as complex.




Thomas



On Sep 29, 2025, at 11:19 PM, 'Stephen Williams' via HomeBrew Robotics Club <hbrob...@googlegroups.com> wrote:



Dave Everett

Sep 30, 2025, 3:24:39 AM
to hbrob...@googlegroups.com
On Tue, 30 Sep 2025 at 17:10, Thomas Messerschmidt <thomas...@gmail.com> wrote:
I personally like the three-finger hand—two fingers and an opposable thumb, with the thumb positioned between the two fingers. That configuration allows gripping almost anything,

Except for a drinking straw, sheet of paper, sock, wire, wet noodle :)

I have strict limits on my parallel gripper: drink can, jar, mug, plate, gold nuggets.

Dave

Stephen Williams

Sep 30, 2025, 10:21:52 PM
to hbrob...@googlegroups.com, Dave Everett

On 9/30/25 12:24 AM, Dave Everett wrote:



On Tue, 30 Sep 2025 at 17:10, Thomas Messerschmidt <thomas...@gmail.com> wrote:
I personally like the three-finger hand—two fingers and an opposable thumb, with the thumb positioned between the two fingers. That configuration allows gripping almost anything,

Except for a drinking straw, sheet of paper, sock, wire, wet noodle :)
Good point!

I have strict limits on my parallel gripper: drink can, jar, mug, plate, gold nuggets.


If you can do 3 fingers well, 4-5 shouldn't be much harder.  While the uneven geometry of the human hand seems unnecessary, sometimes it comes in handy.  So to speak.  In any case, it is an interesting design & engineering problem.


sdw



Dave

A J

Oct 1, 2025, 12:29:56 PM
to HomeBrew Robotics Club
I heard the other day, on the news or a podcast, a bot scientist saying that there is a huge library of videos of hands doing various tasks. He mentioned that much of the RL training can be driven by these captured tasks.

Some of the VR glasses, like Apple's, create a near-real panorama of the user's environment. I 
wonder if future versions of GPT can connect hand movements and speech to generate instructions
to bot hands.

Chris Albertson

Oct 1, 2025, 1:52:00 PM
to hbrob...@googlegroups.com
If you put tape over your two smallest fingers and use the other three, you can do almost anything you could do with all five fingers.   Maybe with practice, you could even be good at it.  What makes this possible is that you are still using a human brain to control the three fingers.

A three, four, or five-fingered hand is easy to build.  We all have 3D printers, and if you can draw it, you can make it.    But what the robot hand lacks is the human brain.     I think if you had any kind of hand that looks like it might work, it likely will if only you could control it.  99 percent of the work is the software.   Once the hand is good enough to put three points of contact on an object, it is good enough.

In fact, control is such a hard problem that we should not even be asking what hand geometry is best for grasping. We should ask "which hand geometry is easiest to control?"

By that measure, I think the Yale hands still win.
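Chris's "three points of contact" criterion can actually be made precise. As a toy illustration (my own sketch, not from any of the hands discussed here): in 2D, a two-contact grasp achieves force closure — it can resist any applied force — exactly when the line connecting the contacts lies inside both friction cones. A few lines of Python:

```python
import math

def antipodal_grasp_ok(p1, n1, p2, n2, mu):
    """2D antipodal force-closure test for a two-contact grasp.

    p1, p2: contact points; n1, n2: inward-pointing unit normals.
    The grasp holds if the line connecting the contacts lies inside
    both friction cones (half-angle atan(mu), mu = friction coeff.).
    """
    half_angle = math.atan(mu)
    # Unit vector from contact 1 toward contact 2.
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    ux, uy = dx / d, dy / d
    # Angle between each normal and the connecting line (clamped dot).
    a1 = math.acos(max(-1.0, min(1.0, n1[0] * ux + n1[1] * uy)))
    a2 = math.acos(max(-1.0, min(1.0, -(n2[0] * ux + n2[1] * uy))))
    return a1 <= half_angle and a2 <= half_angle
```

A parallel gripper squeezing opposite flat faces passes trivially; offset contacts fail once the connecting line leaves the cone. That kind of check is one way to quantify "easiest to control."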






Kyoung Choe

Oct 1, 2025, 4:27:50 PM
to HomeBrew Robotics Club

Here is an example of how large-scale human hand videos can be used to train a robot hand policy with RL: https://ivl.cs.brown.edu/research/gigahands.html (see the "Application: Motion Retargeting (with Physics)" section at the bottom for the RL-trained policy).

Vision-language-action models are also trained using human hand videos, and these models seem to produce more human-hand-like motions than models trained with gripper videos: https://beingbeyond.github.io/Being-H0/
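The core geometric step in that kind of retargeting — turning tracked human hand keypoints into robot joint targets — is simple in isolation. A minimal sketch (the function and keypoint names are mine, not from GigaHands or ManipTrans): compute the flexion angle at a joint from three 3D keypoints, then command the matching robot joint.

```python
import math

def flexion_angle(a, b, c):
    """Flexion at the middle keypoint b, given 3D keypoints a-b-c
    (e.g. MCP joint, PIP joint, fingertip). 0 rad = fully straight."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    inner = math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))
    # A straight finger has an inner angle of pi, i.e. zero flexion.
    return math.pi - inner
```

The physics-based retargeting in those papers does far more (contact forces, dynamics), but per-joint angle extraction like this is where the pipeline starts.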

Stephen Williams

Oct 1, 2025, 6:31:36 PM
to hbrob...@googlegroups.com, Chris Albertson

The hardware should not be hard or expensive or bad.  But, so far, it mostly still is.  Seems like we can bridge that shortly.  I have some ideas to try now.

The whole point of trying to solve the hardware well is to get to the point where we can concentrate on control, without artificial gaps that make it even harder than it fundamentally is. How many existing robotic hands would be competitive with a human typing on a keyboard, playing a piano, or even tying your shoestrings - even with perfect control software, or a human controlling them remotely?


Sdw

Stephen Williams

Oct 1, 2025, 6:32:35 PM
to hbrob...@googlegroups.com

Hoping to get this kind of training running on my ML server soon.  If anyone has run these, please share experience.


sdw


Kyoung Whan Choe

Oct 1, 2025, 6:59:33 PM
to hbrob...@googlegroups.com

RL training can typically be done with a single 4090 (or 5090). ManipTrans (used in GigaHands, https://maniptrans.github.io/) mentions ~2 days of training with a 4090.

VLA training/fine-tuning requires more compute. The Being-H0 paper mentions using 32 x A800-80G GPUs.

Kyoung


Dan

Oct 1, 2025, 7:07:23 PM
to hbrob...@googlegroups.com, Chris Albertson
Of course it still is HARD. The control would be easy if you could build the HARDware. 
There are 27 degrees of freedom in the human hand, and another 6 in the wrist.
I have seen lots of hands with fewer DOFs built with servos. IMHO, using fewer DOFs just complicates the control. Additionally, there are the thousands of touch points within the hand itself.
Duplicating the function of Merkel's discs and Meissner's corpuscles is also needed within the hardware. Don't forget there is an incredibly interconnected nervous system that needs to be designed and built.
Artificial muscles and hydrogel skins will probably have the best chance.
We are years away within the material sciences realm.

But if you think you can build a servo-based, four-fingered hand with about a dozen DOFs that can do what a human hand can do, then please build it!

Those that CAN....DO, those that DID....TEACH...
Everyone else is just a dreamer or a critic.





Thomas Messerschmidt

Oct 1, 2025, 8:55:44 PM
to hbrob...@googlegroups.com
This leads me to believe that we are a long way from dexterous humanoid robots. After spending a bit of time reading over the website, it looks like it would require a "moonshot" kind of attempt to make this work. 

Perhaps if all (?) of the videos available online (estimates are somewhere over 20 billion hours on YouTube alone) were dissected, annotated, and used as examples, then fed into a vision-action model as training material, robotics would proceed at a pace similar to LLMs. Of course, you would have all of the video creators in an uproar, just like the writers and artists are now. (That is what the lawyers live for. ;)


Stephen Williams

Oct 1, 2025, 9:24:16 PM
to hbrob...@googlegroups.com, Thomas Messerschmidt

I don't think there are any copyright issues at all for training an RL model on hand movements & activities from any video that you can legally obtain.  So almost everything on the Internet, YouTube, etc. should be fair game.  An RL model is not going to duplicate a movie or a book, unless we're talking sign language.

sdw

Stephen Williams

Oct 1, 2025, 9:30:05 PM
to hbrob...@googlegroups.com

2 very fast 96GB GPUs + very fast 768GB of RAM + very fast 64c128t CPU cores should compare reasonably well with 32 x A800-80G GPUs - just might take 12 times as long to compute.  Hope this kind of training checkpoints well.  However, server class cards with NVLink sharing memory may be much better than more serial processing on a couple PCIe-linked GPUs.  That's the kind of thing I need to find out in detail.

I wonder how feasible it is to do distributed GPU training.  The amount of data that needs to be exchanged and how often it needs to be exchanged will control that.
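A back-of-the-envelope for that data-exchange question (my own arithmetic sketch, with assumed numbers): in data-parallel training, each step ends with an all-reduce of the gradient, and a ring all-reduce moves roughly 2*(N-1)/N times the gradient size per GPU. Dividing by link bandwidth gives a floor on the communication cost per step:

```python
def allreduce_seconds_per_step(params, bytes_per_param, n_gpus, link_bytes_per_s):
    """Rough time for one ring all-reduce of the full gradient.

    Each GPU sends/receives about 2*(N-1)/N times the gradient size;
    ignores latency, overlap with compute, and gradient compression.
    """
    grad_bytes = params * bytes_per_param
    traffic = 2 * (n_gpus - 1) / n_gpus * grad_bytes
    return traffic / link_bytes_per_s
```

For example (assumed numbers), a 7B-parameter fp16 gradient (14 GB) synced between two boxes over 10 GbE (~1.25 GB/s) costs about 11 seconds per step, versus well under a second over PCIe 4.0 x16 inside one box - which is the gap NVLink-class interconnects are built to close.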


Stephen

Stephen Williams

Oct 1, 2025, 9:36:39 PM
to hbrob...@googlegroups.com

The hand is an interesting problem because there are so many features and details, but also constraints in most practical situations.

I'd like to get most of the degrees of freedom in the core hardware, but then be able to share motors in simplified / minimized versions.  Using spring return for fingers / hand (except perhaps forefinger + thumb) simplifies a lot and isn't too much of a loss as nearly all of human actuation can be mimicked with that.
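The spring-return idea has a nice property: the static pose of the whole finger is determined by a single tendon tension, so one motor can drive several joints. A toy statics model (my own sketch; parameter names and values are made up, not from any specific design):

```python
def joint_angles(tension, radii, stiffness, preload, max_angle):
    """Static pose of a tendon-driven, spring-return finger.

    One tendon tensioned by a single motor flexes every joint, and
    torsion springs return them. Each joint settles where the tendon
    torque (tension * pulley radius) balances the spring torque
    (preload + stiffness * angle), clamped at its mechanical stop.
    """
    angles = []
    for r, k, pre in zip(radii, stiffness, preload):
        torque = tension * r
        angle = 0.0 if torque <= pre else (torque - pre) / k
        angles.append(min(angle, max_angle))
    return angles
```

Grading the spring preloads along the finger makes the joints curl in sequence as tension rises, which is how underactuated hands get adaptive wrap-around grasps from one actuator.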

sdw

Steve " 'dillo" Okay

Oct 1, 2025, 11:06:27 PM
to HomeBrew Robotics Club
On Wednesday, October 1, 2025 at 10:52:00 AM UTC-7 Chris Albertson wrote:
If you put tape over your two smallest fingers and use the other three, you can do almost anything you could do with all five fingers.   Maybe with practice, you could even be good at it.  What makes this possible is that you are still using a human brain to control the three fingers.

I've broken fingers while skateboarding/snowboarding and had to go through this exact experience. The worst was in 2013 when I fractured my pinkie at the socket. I had to get it taped to the next finger over and then was cast from the medial joint down past my wrist so it could sit back in the socket. 

 So I spent my first month or so at Willow Garage typing with one full hand and a single finger and a thumb on the other poking out of the cast. I could certainly type with one hand and just the one finger and thumb on the other, but I couldn't drive and certainly couldn't grip anything with the cast hand. 
My brain still kept trying to use the other fingers though even though they were immobilized.  
Which is a really long, roundabout way of saying I agree with you. Unless you have some sort of sensory feedback between the participating joints, it's MUCH harder to reliably and repeatedly manipulate things from all angles & orientations. 

'dillo

Chris Albertson

Oct 2, 2025, 1:31:33 AM
to Dan, hbrob...@googlegroups.com
Making a full copy of a human hand is not easy.  But it is not so hard to make a pretty usable hand. 

The people who have been working the hardest on human-like hands are those making prosthetic hands for amputees. Some of these not only look human but can be used for things like cooking, eating, and lifting luggage; there is even a video of a girl catching a ball one-handed with her 3D-printed prosthetic hand. I think if these hands can work for a double amputee, they could work for a humanoid robot. The hands are not able to do everything a natural hand can do, but there are workarounds. A hand that cannot flip a pencil end-for-end to use the eraser might do a two-handed flip, or set the pencil down and re-grip it. If you want human-like hands, look into the high-end prosthetics.

Some time ago, I found an open-source design for a good prosthetic hand. About that time, Tilly Lockey was about 15 years old and showing them off on TV and YouTube. She lost both of her hands as a baby and does not remember having them. She now uses a pair of these open-source hands. Or she did; I think she now has a newer design.

She is a little older now and still showing what she can do with the newer hands: https://youtu.be/xXbqqeUU7js

Here is an early video that shows hands that are more like the open-source versions I downloaded: https://openbionics.com/how-to-use-a-hero-arm/

I moved the design files into Fusion 360 and started work on making one.


It very quickly became clear that controlling it would be nearly impossible. It is not even clear what the inputs to a controller should be, because the hand needs to be controlled in real time. Even specifying the problem is hard if you don't know the weight, surface friction, and rigidity (or not) of the object. Many times, we have to discover these by attempting to pick up the object. Even we humans don't pre-plan; we make it up as we go.

As best I can figure, the controller has to be given some high-level goal - "Fetch the beer can from the top shelf of the fridge" - but then it sees there is a milk bottle blocking access to the beer. The robot plans a sequence of actions starting from the current state. It then executes the plan for a millisecond and recomputes from the now-current state, so the plan continuously evolves and is updated 100+ times per second. This seems OK, but it is a poor algorithm because it depends on reaching a goal linearly. A better solution is to search through a larger range of options in a simulated environment and constantly recompute the search. "MPC" (model predictive control) is what they call it. Intro here: https://www.mathworks.com/help/mpc/gs/what-is-mpc.html
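That replan-every-tick loop is easy to demo at toy scale. A minimal receding-horizon sketch for a 1D point mass (my own example, nothing like a real hand controller, and exhaustive enumeration only works for tiny action sets): simulate every short action sequence, keep only the first action of the best one, then throw the plan away and recompute next tick.

```python
import itertools

def mpc_step(pos, vel, goal, horizon=4, dt=0.1):
    """One receding-horizon step for a 1D point mass, accel in {-1, 0, 1}.

    Exhaustively rolls out every action sequence over the horizon,
    scores terminal distance-to-goal plus a small terminal-speed
    penalty, and returns only the FIRST action of the best sequence --
    the rest of the plan is discarded and recomputed next tick.
    """
    best_cost, best_first = None, 0.0
    for seq in itertools.product((-1.0, 0.0, 1.0), repeat=horizon):
        p, v = pos, vel
        for a in seq:  # forward-simulate the candidate plan
            v += a * dt
            p += v * dt
        cost = abs(p - goal) + 0.1 * abs(v)
        if best_cost is None or cost < best_cost:
            best_cost, best_first = cost, seq[0]
    return best_first
```

A real MPC solver replaces the brute-force search with an optimizer over continuous actions, but the structure - plan, execute one step, replan from the new state - is exactly the loop described above.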


But if all you need to do is pick up trash from the ground and place it in a trash can, then geometric grippers and software like MoveIt work today: https://moveit.ai/

Stephen Williams

Oct 2, 2025, 2:12:20 AM
to hbrob...@googlegroups.com

For avatar use, there is a human in the loop, so the hard part of the control loop doesn't have to be solved.  That is a big use that people keep expecting to skip completely.  I expect it to be a big thing for a while.

One trick to simplifying hand control is to put a camera in the palm or over the knuckles.  Then the target is relative to both the hand and the camera at once, eliminating remote sensing & association.
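Here is roughly what that buys you, as a hypothetical sketch (the function, gain value, and sign convention are all my assumptions; the real mapping depends on how the camera is mounted): with the camera in the palm, the target's pixel error is directly the hand's alignment error, so a simple proportional law can center the grasp with no hand-eye calibration or external tracking.

```python
def servo_command(target_px, image_center, gain=0.002):
    """Image-based visual servo step for a palm-mounted camera.

    Because the camera rides on the hand, the target's pixel offset
    from the image center IS the hand's alignment error. Returns a
    lateral velocity command (scaled by `gain`, in velocity units per
    pixel of error) that drives the hand until the target is centered.
    Sign convention assumed: positive error -> move hand toward target.
    """
    ex = target_px[0] - image_center[0]
    ey = target_px[1] - image_center[1]
    return (gain * ex, gain * ey)
```

Run inside the control loop, this converges as the target drifts toward the image center; depth (approach distance) still needs a separate cue such as target size or a rangefinder.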

Control will be difficult, although RL might rescue us.  I'd just be happy to be in that problem space with the hand hardware solved satisfactorily.  I have a lot of constraints that I want to solve for that: high-quality*, lightweight, slim, quiet, and cheap or very cheap.  Either repairable or so cheap it is just recycled & replaced when it fails.

*many degrees of freedom, precise (in a closed loop), strong, fast.


Stephen

On 10/1/25 10:31 PM, Chris Albertson wrote:

Chris Albertson

Oct 2, 2025, 7:00:22 PM
to hbrob...@googlegroups.com
E-Nable is a mechanical device.   It does not even use electricity.  I don’t think it is adaptable to a robot.     

I’m seeing that Open Bionics is no longer open source. I have a Brunel hand imported into Fusion 360 if anyone needs it. But it is very complex, I think by necessity. Human hands have lots of parts.

The Brunel hand build technique is good. There are 3D-printed “bones” and then some female molds. In some cases you place the “bone” in the mold and pour in polyurethane resin, and it forms the joints and finger pads. Kevlar strings are used to make the fingers move. Each finger, the thumb, and the palm can move.

But as I said, while you can build one in a week, no one has the software to make it work on a robot. There is a bigger problem to solve first.

Steven Nelson

Oct 3, 2025, 2:06:07 PM
to HomeBrew Robotics Club

Here are a few more robotic hands and projects, even one of mine from a few years ago.

"E-NABLE": a 3D-printed prosthetic hand that I have used with the InMoov forearm servo mount. https://hub.e-nable.org/s/e-nable-forum/wiki/How+to+Make+an+e-NABLE+Device+for+Yourself

Mine is a hack of a hack: basically, I combined two 3D-printed designs to fit my evil purposes, Mwa Ha ha... I added a homemade DATA Glove that I made from five Adafruit flexible resistors, five 10,000-ohm resistors, a work glove, and 1/4" heat-shrink tubing, all hot-glued to the glove's fingers.
YouTube videos of my E-NABLE hand, InMoov forearm, and DATA Glove: https://youtu.be/vgytoMCJ-Ns?si=c4C_nrt2epr_9k49

Another video of my project
https://youtu.be/-GQYPJoBU24?si=ly_-NnpIqamTKKxP

"InMoov"
https://inmoov.fr/hand-and-forarm/
Their very cool forearm with a 5-servo mount.

InMoov STL files for the whole robot with a parts menu...
https://inmoov.fr/inmoov-stl-parts-viewer/


Alt-Bionics
https://www.altbionics.com/
A very cool commercial-grade hand I just saw at the 2025 Bay Area Maker Faire that uses a modular design and linear actuators.
This is one of their hands at the faire, dressed up like "Thing" from the Addams Family, riding a skateboard. You can see it is very maneuverable:
https://www.facebook.com/steven.nelson.1650/videos/1359176705921758/?mibextid=9drbnH

They say that it can handle 35 lbs. of weight. 

Inspection time: The period of time required to inspect a recently welded piece of metal using your bare hand.
The hotter the metal the shorter the period of inspection time


Sergei Grichine

Oct 10, 2025, 11:23:47 PM
to hbrob...@googlegroups.com
Here is a video from Boston Dynamics with some explanation of reasons behind their three-finger hand design:






--
Best Regards,
-- Sergei

Stephen Williams

Oct 11, 2025, 10:57:20 AM
to hbrob...@googlegroups.com

That is a nice design, especially if the goal is maximum lift capacity, and a good argument for 7 degrees of freedom being enough for most things. Because it is relatively large, they can fit significant gear motors right into the hand; a smaller hand might not work as well. But it simplifies robot design to have the hand be self-contained rather than forearm-driven, as human hands mostly are. Our thumb angle and finger spread use local muscles - a good observation when designing a hand: an anthropomorphic design would get lifting strength via tendons, plus local lateral/angle selection by local motors.

I like the way the thumb/finger converts between an additional finger vs a fully opposable thumb.  It seems to be able to align with the middle finger or to oppose between the other fingers.  That allows configuration for both of those important modes.

It is not going to fit anthropomorphic accommodations - spaces and handles sized for human hands - very well, except large ones. A more humanoid hand would still be better for those.


Stephen
