Our January 2026 *IN PERSON* RSSC Meetup @ CSULB!


Alan Timm

Jan 6, 2026, 11:23:42 PM
to RSSC-List

Our Monthly Meetup @ CSULB!

Please join us this Saturday January 10th at 10:00am for our next *in person* Meetup!


Plug 1331 Palo Verde Ave, Long Beach, CA into your GPS and Google will take you right to the front door. :-)

We'll also post a sign at our old-new meeting place for anyone who wanders over there.


Here's what's on the agenda...


9:30 am: The Doors open. Stop by early and say hi! Or log in early to make sure your mic works. We'll be here.

9:45 am: Business meeting & Elections!


10:00 am: Teaching Alfie New Tricks with GR00T N1.6 Part 1: The Training Pipeline
What does it take to fine-tune a humanoid foundation model on your own robot? In part one, we're breaking down the entire training pipeline—how to capture example behaviors on Alfie, wrangle them into the format GR00T N1.6 expects, and get everything staged for fine-tuning. Part two brings the payoff: teaching Alfie to spot, localize, and grab a soda can! (Alan)
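To make the "wrangle them into the format GR00T N1.6 expects" step concrete, here's a minimal sketch of packing one teleop recording (per-step joint states, commanded actions, camera frame paths, and a task prompt) into a single episode record, the general shape imitation-learning pipelines expect. The names (`Episode`, `pack_episode`) and the exact field layout are illustrative assumptions, not GR00T's actual schema:

```python
# Hypothetical sketch: flatten one captured teleop episode into a
# JSON-serializable record for a fine-tuning pipeline.
import json
from dataclasses import dataclass, field

@dataclass
class Episode:
    task: str                                     # natural-language instruction
    states: list = field(default_factory=list)    # joint positions per step
    actions: list = field(default_factory=list)   # commanded targets per step
    frames: list = field(default_factory=list)    # camera image paths per step

def pack_episode(ep: Episode) -> dict:
    """Flatten one recording into a JSON-serializable record."""
    assert len(ep.states) == len(ep.actions) == len(ep.frames), "misaligned steps"
    return {
        "task": ep.task,
        "length": len(ep.states),
        "steps": [
            {"state": s, "action": a, "image": f}
            for s, a, f in zip(ep.states, ep.actions, ep.frames)
        ],
    }

# A tiny two-step demo episode.
ep = Episode(task="pick up the soda can")
for t in range(2):
    ep.states.append([0.0, 0.1 * t])
    ep.actions.append([0.0, 0.1 * (t + 1)])
    ep.frames.append(f"frames/ep0/{t:06d}.jpg")

record = pack_episode(ep)
print(json.dumps(record)[:80])
```

The key design point is the alignment check: every state, action, and frame must belong to the same timestep, or the fine-tuned policy learns from shifted pairs.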

11:00 am: See, Think, Do: A Practical Framework for VLM-Driven Robotics on NVIDIA Jetson
In this talk we'll explore how to add real-time vision intelligence to a robot using Vision-Language Models running on low-power NVIDIA Jetson hardware. We'll go beyond perception and dive into what it actually takes to refactor existing robot behaviors into AI-callable tools, and how to design prompts that allow an LLM to reason, select actions, and chain behaviors intelligently. The result is a practical framework for building robots that can see, understand, and act autonomously in dynamic environments! (Jim)
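The "behaviors as AI-callable tools" idea in the abstract can be sketched in a few lines: register each behavior with a name and description, assemble the descriptions into the model's prompt, and dispatch the model's reply back to the matching function. All names here are illustrative assumptions, not Jim's actual framework:

```python
# Hypothetical sketch: a tool registry that exposes robot behaviors
# to a language model and dispatches the model's tool-call replies.
TOOLS = {}

def tool(description):
    """Register a behavior so a language model can call it by name."""
    def wrap(fn):
        TOOLS[fn.__name__] = (description, fn)
        return fn
    return wrap

@tool("Rotate the robot base by the given angle in degrees.")
def turn(angle: float) -> str:
    return f"turned {angle} deg"

@tool("Drive forward the given distance in meters.")
def forward(distance: float) -> str:
    return f"moved {distance} m"

def tool_prompt() -> str:
    """The tool menu that gets prepended to the VLM/LLM prompt."""
    return "\n".join(f"- {name}: {desc}" for name, (desc, _) in TOOLS.items())

def dispatch(reply: str) -> str:
    """Parse a 'name value' reply from the model and run that behavior."""
    name, arg = reply.split()
    _, fn = TOOLS[name]
    return fn(float(arg))

# Pretend the model answered with a tool call:
print(dispatch("turn 90"))
```

Chaining behaviors then becomes a loop: feed each tool's result back into the conversation and let the model pick the next call.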

12:00 pm: 30 minute grab-n-go Lunch Break

12:30 pm: Announcements / Show-n-Tell / Open Session. It's a great opportunity to ask questions and get advice. Our members will bring in their projects and share what we've been up to. Are you working on something? Bring it in!


For anyone unable to attend in person, here's the Zoom info for this Saturday's meeting.

Join Zoom Meeting
https://csulb.zoom.us/j/85130575860
Meeting ID: 851 3057 5860

See you there!

Alan Timm

Jan 9, 2026, 4:13:43 PM
to RSSC-List
Happy Friday!

I'm really excited about both of our talks this Saturday.  
There are a handful of us using Jetsons for on-device compute and they are pretty powerful for their size and price-point.

We're also having our 2026 elections tomorrow, so please let one of us know if you're interested in taking a more active role with the club.

For everyone participating in the Can-Do Challenge next month, our show'n'tell and open session is a good time to test everything out one more time before the contest.

And I'll have everything set up for anyone who wants to try tele-operating Alfie. It's a lot of fun!

See you tomorrow!


Alan Timm

Jan 10, 2026, 12:49:01 PM
to RSSC-List
We're all set up and ready to rock.

See you there!

For anyone unable to attend in person, here's the Zoom info for this Saturday's meeting.

Join Zoom Meeting
https://csulb.zoom.us/j/85130575860
Meeting ID: 851 3057 5860


Alan Timm

Jan 16, 2026, 7:13:12 PM
to RSSC-List
Hey there!

The recording of our latest meetup has been posted here:
https://www.youtube.com/watch?v=t6U9eNKnx88

Links to individual topics:

00:00:00 Club Meeting Introductions

00:01:24 Teaching Alfie New Tricks with GR00T N1.6 Part 1: The Training Pipeline (Alan)

00:53:41 See, Think, Do: A Practical Framework for VLM-Driven Robotics on NVIDIA Jetson (Jim)

02:01:16 Show'n'Tell: (Andrew) Andrew talks about his autonomous RC car and asks for battery recommendations

02:13:58 Show'n'Tell: (Jim) Progress on his LEGO Mindstorms robot for the competition next month!

02:18:44 Show'n'Tell: (JimU) Demo with his custom robot for the competition next month!

02:26:22 Show'n'Tell: (Brian) Demo with his custom robot for the competition next month!

02:33:49 Show'n'Tell: (Ben) Progress with designing the robot claw for the competition next month!

02:37:44 Show'n'Tell: (Alan) Progress on Alfie showing teleoperation, then having others take him for a spin!

Chat Transcript:

03:46:14 Alan Timm: sorry everyone we've been having some camera challenges today.  I'm rebooting the camera right now.

03:48:28 Jim DiNunzio: Battery Packs I recommend: 12v small: Talentcell Rechargeable 12V 3000mAh Lithium ion Battery Pack for LED Strip, CCTV Camera and More, DC 12V/5V USB Dual Output External Battery Power Bank with Charger, Black




04:41:17 Chris: Is it possible to swap the left / right views so we can see this in stereo with cross-eyed viewing?





On Tuesday, January 6, 2026 at 8:23:42 PM UTC-8 Alan Timm wrote: