contenox/runtime: An open-source state machine for GenAI workflows in Go


Alexander Ertli

Aug 30, 2025, 1:00:07 PM
to golang-nuts
Hi everyone,

I'm excited to share a project I've been heads-down on for the better part of this year: an LLM backend management and orchestration API written in Go.

After a few failed attempts and several months of work, it's reached a point where I'd genuinely value feedback from developers who understand this space.

In a nutshell, it's a system to manage multiple LLM backends (Ollama, vLLM, OpenAI, etc.), execute complex, conditional workflows ("Task Chains") that can branch based on model output, call external hooks, and handle a variety of data types.
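To make the "Task Chain" idea concrete, here is a minimal sketch in plain Go of how a conditional, branching chain could be modeled. The type names (`TaskStep`, `RunChain`) and the routing-on-a-parsed-score logic are hypothetical illustrations, not the project's actual schema or API:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// TaskStep is a hypothetical, simplified model of one step in a chain:
// Run produces output from input, and Next inspects that output
// (e.g. a reply parsed as a number) to pick the following step.
type TaskStep struct {
	ID   string
	Run  func(input string) string
	Next func(output string) string // returns "" to end the chain
}

// RunChain executes steps starting at startID until a step routes to "".
func RunChain(steps map[string]TaskStep, startID, input string) string {
	id := startID
	for id != "" {
		step := steps[id]
		input = step.Run(input)
		id = step.Next(input)
	}
	return input
}

// exampleSteps builds a three-step chain: "score" stands in for an LLM
// call whose reply is parsed as a number, then the chain branches.
func exampleSteps() map[string]TaskStep {
	return map[string]TaskStep{
		"score": {
			ID:  "score",
			Run: func(in string) string { return strings.TrimSpace(in) },
			Next: func(out string) string {
				n, err := strconv.Atoi(out)
				if err == nil && n >= 5 {
					return "approve"
				}
				return "reject"
			},
		},
		"approve": {
			ID:   "approve",
			Run:  func(string) string { return "approved" },
			Next: func(string) string { return "" },
		},
		"reject": {
			ID:   "reject",
			Run:  func(string) string { return "rejected" },
			Next: func(string) string { return "" },
		},
	}
}

func main() {
	// Branches to "approve" because the parsed score 7 is >= 5.
	fmt.Println(RunChain(exampleSteps(), "score", " 7 "))
}
```

In the real system the steps are declared as data rather than Go closures, but the control flow is the same shape: each step's output is parsed and drives the branch to the next step.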

Key Components:

- Unified API: Manage models, backends, and provider configs through a single API, documented with an OpenAPI 3.1 spec.

- Affinity Groups: Control exactly which models are available to which backends for routing and access control.

- Powerful Task Engine: Define workflows with multiple steps that can conditionally branch, parse responses (as numbers, scores, ranges, etc.), and integrate with external systems via hooks.

- OpenAI-Compatible: Includes endpoints that mimic the OpenAI API, making it easier to integrate with existing tools.
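Because the endpoints mimic the OpenAI API, an existing client only needs to be pointed at the runtime's base URL. As a sketch, here is how a request body in the standard OpenAI chat-completions format could be built in Go; the host `localhost:8080` and model name are placeholder assumptions, not defaults of the project:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// ChatRequest mirrors the core of the OpenAI chat-completions request
// body, the wire format an OpenAI-compatible endpoint accepts.
type ChatRequest struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
}

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// BuildChatBody marshals a single-message chat request.
func BuildChatBody(model, prompt string) ([]byte, error) {
	return json.Marshal(ChatRequest{
		Model:    model,
		Messages: []Message{{Role: "user", Content: prompt}},
	})
}

func main() {
	body, err := BuildChatBody("llama3", "Hello!")
	if err != nil {
		panic(err)
	}
	// /v1/chat/completions is the standard OpenAI route; the host is
	// a placeholder for wherever the runtime is deployed.
	fmt.Println("POST http://localhost:8080/v1/chat/completions")
	fmt.Println(string(body))
}
```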

It's Apache 2.0 licensed and available on GitHub.

I'd be incredibly grateful if you could take a look, star it if it seems interesting, and open an issue with any thoughts, feedback, or questions—no matter how small.

GitHub: https://github.com/contenox/runtime

Docs & API Spec: https://github.com/contenox/runtime/blob/main/docs/api-reference.md

Thanks for your time and any feedback you might have.

Best,
Alexander Ertli
