Hi Changkun and apologies for the late reply.
As you wrote, Python-based tools and libraries play a huge role in creating models. At this time, the Go team has no vision or intention to compete in that area: Python serves users' needs well, and is the right tool for that job. And infrastructure projects like Ollama and LocalAI do a great job of serving these models and making them available to services written in any other language, including Go.
Where the Go team sees an opportunity (and as you wrote) is in the scalable production components of AI-powered products. This may include tools for facilitating the serving of models — Ollama and LocalAI are both great examples of this — but in my talk I was speaking to the opportunity more broadly than just serving models. Rather, the two areas of opportunity I have in mind are: (a) the frameworks, tools, libraries, etc., that make creating and serving AI-powered Go applications faster, easier, and more reliable; and (b) the other, non-model-serving infrastructure that AI-powered applications use, like vector databases. (Separately, I also see a lot of room for tutorials, guides, etc., that can help Go developers get started with AI-powered applications, but I think that’s outside the scope of your question.)
All of that said, the Go team intends to support all these use cases, including those related to model serving, in two ways: First, we'll continue to make Go more performant and well suited for scalable production components in general. There is a lot of overlap in what makes Go well suited for AI-powered applications vs. other kinds of applications, so we'll keep our focus there to ensure we and the ecosystem around us continue to shine. Eli Bendersky published this blog post earlier today that touches on this point and more.
Second, we will also look at gaps that make building some kinds of AI-focused tools and capabilities difficult in Go. For example, we've heard feedback that Go's lack of native support for vectorized instructions limits some kinds of infrastructure projects. As the community looks to build more of these AI-focused tools and libraries, we want to hear about the friction they encounter so that we can prioritize the work needed to support them.
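To make that concrete, here is a minimal sketch (my own illustration, with made-up names and values) of the kind of hot loop where this friction shows up: a dot product over embedding vectors, which vector databases and similarity-search code run constantly. Today this compiles as a scalar loop, so projects that need SIMD throughput for kernels like it tend to fall back to hand-written assembly or cgo.

// An illustrative sketch of the kind of inner loop the feedback
// above is about: a dot product over embedding vectors, as run by
// vector databases and similarity search. Names and values here are
// purely for illustration.
package main

import "fmt"

// dot computes the dot product of two equal-length float32 slices.
// Without native support for vectorized instructions, this runs as a
// scalar loop; projects that need SIMD throughput for kernels like
// this typically resort to hand-written assembly or cgo.
func dot(a, b []float32) float32 {
	var sum float32
	for i := range a {
		sum += a[i] * b[i]
	}
	return sum
}

func main() {
	a := []float32{1, 2, 3, 4}
	b := []float32{4, 3, 2, 1}
	fmt.Println(dot(a, b)) // prints 20
}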
Cameron