Blue Gopher Meets Stochastic Parrot: About Go and AI

Go code can make use of LLMs and LLMs can make use of Go code, as I demonstrated in the previous two Spotlights. Because I wanted to see the concepts of LLM calling and MCP server clearly in front of me, I decided to use no library, SDK, or framework. These would only obscure what's going on behind the scenes. Pure, dependency-free Go code, on the other hand, reveals the underlying mechanics of Go-LLM communication.

This doesn't mean that packages, SDKs, or frameworks are useless. They can hide boilerplate code behind convenient APIs and save developers from reinventing the wheel. This Spotlight is about them.

The Gopher and the Parrot

You may have heard the term “stochastic parrots” being used for LLMs, as they statistically mimic thinking and reasoning without real understanding. Still, they're capable of performing tasks that few would have deemed possible until the release of GPT-3.5 attracted worldwide attention.

With Go, you can tap the capabilities of LLMs as well as provide extra functionality to LLMs. Moreover, many useful applications and tools are written in Go (which, from a pure user perspective, isn't thaaaat interesting, but as a Gopher, you can read the sources and maybe participate in the development in one way or another).

The following list is by no means exhaustive; it serves as an entry point for digging deeper.

Libraries

I'll start with some libraries that you can use to enrich your apps with AI capabilities or build AI tools.

LangChainGo

The first one, LangChainGo, is a classic. Created in 2023 (which is quite a while back, considering the rather short (public) history of LLMs) as a port of the Python project LangChain, LangChainGo enables Go applications to integrate with LLMs, process documents for LLM use, and develop agentic tools and workflows.

LangChainGo predates the Model Context Protocol (MCP) but builds upon a similar idea: to chain (hence the name) LLMs, tools, and processing steps together to achieve what we would today call an agentic workflow. Built-in chains can call LLMs, chain steps together, do map-reduce operations to process large documents, chain LLM conversations together to build conversation memory, or provide document retrieval. A wealth of specialized components help with prompting, integrating models and embeddings, parsing output, splitting data into smaller chunks, retrieving text from vector databases, and constructing agents.

LangChainGo is a good choice if you seek LangChain-style composability or if you want to port a LangChain project over to Go.

tmc/langchaingo: LangChain for Go, the easiest way to write LLM-based programs in Go

The MCP Go SDK

When building LLM agents today, there is no way around the Model Context Protocol, or MCP for short. Although made by a single company (Anthropic), it's open-source and has been adopted across vendors. Anthropic offers SDKs for multiple languages, Go included. The Go SDK is still in an early phase, however, so expect breaking changes and a fluctuating API. Still, for future projects, the official SDK is a robust choice.

modelcontextprotocol/go-sdk: The official Go SDK for Model Context Protocol servers and clients. Maintained in collaboration with Google.

mark3labs/mcp-go

If you want an MCP library that's more stable than the official SDK, mark3labs/mcp-go is a popular package and certainly a good alternative while the official SDK is in its pre-1.0 phase. The author of mcp-go, Ed Zynda, was added to the MCP steering committee back in May, which underscores mcp-go’s central position in the Go MCP landscape. Moreover, the official Go SDK is heavily inspired by mcp-go (alongside other SDKs such as mcp-golang and go-mcp).

mark3labs/mcp-go: A Go implementation of the Model Context Protocol (MCP), enabling seamless integration between LLM applications and external data sources and tools.

Related to mcp-go is mark3labs/mcphost. An MCP Host brings LLMs, MCP clients, and MCP servers together. Being a CLI tool, it's configured through JSON config files and can run interactively or in batch mode.
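For illustration, such a config might look like the sketch below. It follows the `mcpServers` layout popularized by Claude Desktop and adopted by many MCP hosts; the server name and command are made up, and I haven't verified mcphost's exact schema, so consult its README before copying:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```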

mark3labs/mcphost

Dive

Dive by Deep Noodle (I love the company name! Here's where it originates from) is a modern AI/LLM toolkit. If you want to build agent frameworks (including hierarchical multi-agent systems) and workflow automation with multiple LLM backends and streaming support, Dive is worth a closer look.

(BTW, Deep Noodle are also the makers of Risor, an embeddable scripting language for Go.)

deepnoodle-ai/dive: Dive is an AI toolkit for Go that can be used to create specialized AI agents, automate workflows, and quickly integrate with the leading LLMs.

Chromem

If you want to build AI apps with retrieval-augmented generation (RAG) or other embedding-based features, you need a vector database. Chromem is an alternative to stand-alone vector databases, as you can embed it into your app—no extra running binary required. Moreover, Chromem has zero dependencies. Think “The SQLite of vector databases.”

philippgille/chromem-go: Embeddable vector database for Go with Chroma-like interface and zero third-party dependencies. In-memory with optional persistence.

Applications

Libraries and SDKs aren't the only way to make your AI-development life easier. Applications can provide infrastructure and tooling as an alternative to hosted services.

Ollama

For running LLMs on consumer hardware, Ollama is a great choice. With Ollama, installing and running an LLM locally is like pulling and running Docker images. With a decent choice of open-source LLMs, most offered in several sizes to match the available hardware, you can spend hours playing with models and watching the GPU in your task monitor go to 100%. The OpenAI-compatible /chat/completions API makes Ollama accessible to LLM clients that speak the OpenAI protocol.

ollama/ollama: Get up and running with OpenAI gpt-oss, DeepSeek-R1, Gemma 3 and other models.

LocalAI

LocalAI, together with its sister projects LocalAGI and LocalRecall, goes beyond pure text LLMs. LocalAI aims at providing a drop-in replacement for OpenAI’s API backed by locally running models. LocalAGI adds support for building no-code agents, and LocalRecall manages knowledge bases stored in local vector databases powered by Chromem (see above).

A unique selling point for LocalAI is its support of multimodal LLMs. Whether you want to generate audio from text, generate or analyze images, recognize speech, detect voice activity, or generate videos, LocalAI can run the required models and orchestrate your workflow.

mudler/LocalAI: 🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference

Crush

A local, terminal-based, open-source, no-strings-attached tool for AI-assisted coding, written in Go? Meet Crush, the AI coding TUI app that you can connect to LLMs and language servers of your choice (gopls comes preconfigured, yay) and let the LLM hack away.

Crush came into existence as an offspring of OpenCode after Charm hired OpenCode's main developer. OpenCode continues as a separate project under a new GitHub org. There was some confusion about the two projects, as Charm kept the original name for a while, leaving two identically named projects living side by side. But the name “OpenCode” wouldn't have been a good fit for Charm anyway, who name their software “Bubble Tea”, “Lip Gloss”, “Huh”, “Glamour”, and “VHS”, to list a few. So eventually, OpenCode became Crush.

Anyway, I tested Crush and I love it. Hope it'll help me crush(*) development times for the stockpile of apps I have in my head…

(*) haha, silly pun, haha, yes, pun intended, lol

charmbracelet/crush: The glamourous AI coding agent for your favourite terminal 💘

The Final Verdict™

Outside the realm of AI science and research (where researchers seem to love Python and Jupyter notebooks to pieces), AI becomes increasingly language-agnostic, and Go, being a perfect glue language, steadily grows into a first-class AI apps language.

The above list of Go AI libraries and tools should pave the way for exploring AI with Go. If you try out one or more of these and suddenly find that time flew by in an instant… well, it wasn't me!! No, no, certainly not!