[ruby-talk:444789] [ANN] llm.rb 4.11.0 released

Hi all,

I’ve been working on a Ruby library called llm.rb and wanted to share today's
release (v4.11.0), since the project has come a long way.

Most LLM libraries focus on requests and responses. In practice, once you build
anything non-trivial, you end up dealing with state, tool execution, streaming,
and concurrency as first-class concerns.

llm.rb is built around that layer.

It provides:

- Stateful contexts that can be serialized to JSON and restored to a Ruby object
- Explicit tool execution (local, remote, and MCP)
- Streaming with structured callbacks and tool-call events
- The ability to start tool execution while streaming (overlapping latency)
- Explicit concurrency (threads, fibers, async tasks)
- Integration with MCP servers over stdio and HTTP
- A unified interface across providers (OpenAI, Anthropic, Google, Ollama, etc.)
- A stdlib-only default, with opt-in extras such as async and net-http-persistent
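
To make the "stateful context" bullet concrete, here is a minimal plain-Ruby
sketch of the idea: a conversation whose history round-trips through JSON.
The Context class and its method names are hypothetical, written for
illustration only; they are not llm.rb's actual API (see the README for the
real interface).

```ruby
require "json"

# Hypothetical illustration of a serializable conversation context.
# NOT llm.rb's API -- a plain-Ruby sketch of the pattern.
class Context
  attr_reader :messages

  def initialize(messages = [])
    @messages = messages
  end

  # Record one message as a role/content pair.
  def add(role, content)
    @messages << { "role" => role, "content" => content }
    self
  end

  # Serialize the full conversation state to JSON.
  def serialize
    JSON.generate(@messages)
  end

  # Restore a context object from a JSON dump.
  def self.deserialize(json)
    new(JSON.parse(json))
  end
end

ctx = Context.new
ctx.add("user", "What's 5 + 15?").add("assistant", "20")
restored = Context.deserialize(ctx.serialize)
restored.messages == ctx.messages # => true
```

The point is that the whole conversation is plain data, so it can be stored
in a database or handed to another process and resumed later.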

The goal is to treat LLMs as part of your system architecture rather than just
as API calls, while keeping everything explicit and stdlib-friendly.
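
"Explicit" tool execution means the model proposes a call but your code decides
whether and how to run it. The sketch below shows that pattern in plain Ruby;
the ToolCall struct and tools hash are hypothetical names for illustration,
not llm.rb's API.

```ruby
require "time"

# Hypothetical shape of a tool call proposed by the provider.
ToolCall = Struct.new(:name, :arguments)

tools = {
  # Fixed epoch time so the example is deterministic.
  "system_time" => ->(_args) { Time.at(0).utc.iso8601 }
}

# Pretend the provider streamed back this tool call.
call = ToolCall.new("system_time", {})

# Execution is explicit: look the tool up and invoke it yourself,
# then feed the result back into the context for the next turn.
result = tools.fetch(call.name).call(call.arguments)
# result == "1970-01-01T00:00:00Z"
```

Keeping the dispatch in your own code means you can authorize, sandbox, or
log each call before anything runs.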

Repo: https://github.com/llmrb/llm.rb
Docs: README (YARD documentation)

Thanks


______________________________________________
ruby-talk mailing list -- ruby-talk@ml.ruby-lang.org
To unsubscribe send an email to ruby-talk-leave@ml.ruby-lang.org
ruby-talk info -- https://ml.ruby-lang.org/mailman3/postorius/lists/ruby-talk.ml.ruby-lang.org/