https://www.youtube.com/watch?v=jWv5ty0YmQw
After more than a year of stalling and waiting, I spent some time this past weekend tying together some recent advances in AI to build Lloyd, my personal AI assistant.
For now he lives on my PC, and I interact with him through speech and a simple Gradio interface. But for the first time, AI agents are capable of thinking and doing things in the real world.
Central to his capabilities is a shared memory system I already use: Obsidian. It's my collection of notes and ideas that I can access anywhere. Now Lloyd and I share it, and he's been reorganizing it to make it more useful for both of us. He uses my Obsidian vault as his memory repo, so it's easy to see what he's learned.
Under the covers it uses a new search engine called QMD, which specializes in indexing and searching markdown files. It sounds crazy simple -- but it works.
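To make the idea concrete, here's a minimal sketch of what "search over a vault of markdown notes as agent memory" looks like: a tiny inverted index with AND-style queries. This is purely illustrative and says nothing about how QMD actually works internally.

```python
import re
from collections import defaultdict

def tokenize(text: str) -> list[str]:
    """Lowercase and split into alphanumeric tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def build_index(notes: dict[str, str]) -> dict[str, set[str]]:
    """Map each token to the set of note names containing it."""
    index: defaultdict[str, set[str]] = defaultdict(set)
    for name, body in notes.items():
        for tok in tokenize(body):
            index[tok].add(name)
    return index

def search(index: dict[str, set[str]], query: str) -> set[str]:
    """Return notes containing every query token (AND semantics)."""
    toks = tokenize(query)
    if not toks:
        return set()
    results = index.get(toks[0], set()).copy()
    for tok in toks[1:]:
        results &= index.get(tok, set())
    return results

# Toy vault (in a real setup these would be .md files on disk).
notes = {
    "projects.md": "# Projects\nBuild Lloyd the assistant",
    "ideas.md": "Assistant memory lives in markdown notes",
}
index = build_index(notes)
print(search(index, "assistant memory"))  # -> {'ideas.md'}
```

The appeal of the approach is exactly this: the "database" is just plain files the human can read and edit too.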
The current software stack:
- Openwakeword
- Moonshine ASR
- Qwen3 30B A3B 2507 Instruct for local processing
- OpenClaw w/ Sonnet 4.6
- Qwen3-TTS for TTS
It all works together OK. I'm not happy about using a wakeword, but nothing else out there makes it as easy to get his attention. He's completely interruptible and preemptible by wakeword, and I've coded several extensions to his memory system that use the knowledge-graph features to connect and traverse ideas in there.
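A rough sketch of the interruptible/preemptible behavior described above: the wakeword callback sets an event that the speech loop checks between sentences. This is a simplified stand-in; the `on_wakeword` trigger and the sentence-level check are assumptions, not the real openWakeWord/TTS plumbing.

```python
import threading

class Assistant:
    """Toy model of an assistant whose output can be preempted by a wakeword."""

    def __init__(self) -> None:
        self.interrupt = threading.Event()
        self.log: list[str] = []  # record of what actually got spoken

    def speak(self, sentences: list[str]) -> None:
        """Speak sentence by sentence, bailing out if the wakeword fires."""
        for s in sentences:
            if self.interrupt.is_set():
                self.log.append("<interrupted>")
                return
            self.log.append(s)

    def on_wakeword(self) -> None:
        """Wakeword detected: stop current output so the user can talk."""
        self.interrupt.set()

a = Assistant()
a.speak(["Hello."])            # plays normally
a.on_wakeword()                # wakeword fires mid-conversation
a.speak(["A", "long", "reply"])  # preempted before it starts
print(a.log)  # -> ['Hello.', '<interrupted>']
```

In a real pipeline the wakeword detector runs on a separate audio thread, so an `Event` (or similar) is a natural way to signal the TTS side to yield.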
He also maintains a list of "skills": procedures to accomplish goals. Among other things, he will spin off a sub-agent running Claude Code to handle anything complicated. It's kinda wild.
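One way to picture the skills-plus-sub-agent setup: a registry of named procedures, with anything unrecognized falling through to a delegated sub-agent. The registry shape, skill names, and the stubbed `delegate_to_subagent` are all hypothetical illustrations, not Lloyd's actual code.

```python
from typing import Callable

# Registry of named skills: name -> callable procedure.
SKILLS: dict[str, Callable[[str], str]] = {}

def skill(name: str):
    """Decorator that registers a procedure under a skill name."""
    def register(fn: Callable[[str], str]) -> Callable[[str], str]:
        SKILLS[name] = fn
        return fn
    return register

@skill("summarize_note")  # hypothetical example skill
def summarize_note(arg: str) -> str:
    return f"summary of {arg}"

def delegate_to_subagent(task: str) -> str:
    # Stand-in for spawning a coding sub-agent (e.g. a Claude Code
    # session in a subprocess); stubbed out here.
    return f"[sub-agent handled: {task}]"

def run(task: str, arg: str = "") -> str:
    """Run a known skill directly; hand anything else to the sub-agent."""
    handler = SKILLS.get(task)
    return handler(arg) if handler else delegate_to_subagent(task)

print(run("summarize_note", "ideas.md"))  # -> summary of ideas.md
print(run("refactor the TTS module"))     # -> [sub-agent handled: refactor the TTS module]
```

The nice property of this split is that cheap, well-defined tasks stay local and fast, while open-ended work gets the full coding agent.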
I've already made a bunch of changes to make him smarter and faster than he is out of the box.
Next step is to write an Android Flutter app that connects him directly to my earbuds for access anywhere.
2026 is turning out to be wild.