
OpenWalrus Documentation

Build and run autonomous AI agents locally with OpenWalrus — a Rust-powered, local-first agent runtime.

OpenWalrus is an open-source agent runtime that runs autonomous AI agents entirely on your machine. Built in Rust, it bundles LLM inference, tool execution, persistent memory, and messaging integrations into a single binary.

How it works

OpenWalrus runs a background daemon that manages agents, routes events, and dispatches tool calls. You interact with agents through the CLI, or connect them to Telegram and Discord.

[Sources] ──→ [Event Loop] ──→ [Agents]
 Socket         mpsc channel     Agent 1
 Telegram                        Agent 2
 Discord                         ...
 Tool calls

Every event flows through a single unbounded mpsc channel. The event loop spawns each event handler as its own async task, so agents never block each other.
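This fan-in/fan-out pattern can be sketched with the standard library alone (a simplified illustration, not OpenWalrus's actual code: `Event` and `fan_out` are hypothetical names, and OS threads stand in for the runtime's async tasks):

```rust
use std::sync::mpsc;
use std::thread;

// Illustrative event type; the runtime's real events carry richer payloads.
#[derive(Debug)]
enum Event {
    Message { agent: usize, text: String },
}

// Fan `n` events from concurrent sources through one channel, then
// dispatch each on its own worker. Returns how many were handled.
fn fan_out(n: usize) -> usize {
    // One unbounded channel: every source shares a clone of the sender.
    let (tx, rx) = mpsc::channel::<Event>();

    for i in 0..n {
        let tx = tx.clone();
        thread::spawn(move || {
            tx.send(Event::Message { agent: i, text: "hello".into() }).unwrap();
        });
    }
    drop(tx); // close the channel so the receive loop terminates

    // Each event gets its own worker, so a slow handler for one agent
    // never blocks the others.
    let handles: Vec<_> = rx
        .into_iter()
        .map(|event| thread::spawn(move || format!("handled {event:?}")))
        .collect();

    handles.into_iter().filter_map(|h| h.join().ok()).count()
}

fn main() {
    println!("dispatched {} events", fan_out(3));
}
```

Because every sender holds a clone of the same channel handle, new sources (a new chat integration, say) plug in without touching the loop itself.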

Architecture

The runtime is composed of focused crates:

Crate            Purpose
walrus-core      Agent execution, runtime, hooks, model trait
walrus-model     LLM providers (OpenAI, Claude, DeepSeek, local)
walrus-memory    Persistent memory (SQLite, filesystem, in-memory)
walrus-daemon    Background service, config, event loop
walrus-channel   Telegram and Discord integrations
walrus-socket    Unix domain socket transport
walrus-cli       CLI interface and REPL

Each crate can be used independently. The daemon composes them all into a running system.
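The composition style can be sketched in a few lines (all names here — `Model`, `Memory`, `Runtime`, and the impls — are hypothetical illustrations of trait-based wiring, not the crates' actual APIs):

```rust
use std::collections::HashMap;

// Hypothetical traits standing in for the boundaries between crates:
// a model crate provides completion, a memory crate provides storage.
trait Model {
    fn complete(&self, prompt: &str) -> String;
}

trait Memory {
    fn store(&mut self, key: &str, value: &str);
    fn recall(&self, key: &str) -> Option<String>;
}

// Toy implementations, as a local provider or in-memory store might supply.
struct EchoModel;
impl Model for EchoModel {
    fn complete(&self, prompt: &str) -> String {
        format!("echo: {prompt}")
    }
}

struct InMemory(HashMap<String, String>);
impl Memory for InMemory {
    fn store(&mut self, key: &str, value: &str) {
        self.0.insert(key.into(), value.into());
    }
    fn recall(&self, key: &str) -> Option<String> {
        self.0.get(key).cloned()
    }
}

// The daemon-style composition root owns trait objects, so any
// implementation can be swapped in without changing the runtime.
struct Runtime {
    model: Box<dyn Model>,
    memory: Box<dyn Memory>,
}

fn main() {
    let mut rt = Runtime {
        model: Box::new(EchoModel),
        memory: Box::new(InMemory(HashMap::new())),
    };
    let reply = rt.model.complete("hi");
    rt.memory.store("last_reply", &reply);
    assert_eq!(rt.memory.recall("last_reply").as_deref(), Some("echo: hi"));
    println!("{reply}");
}
```

Coding against traits rather than concrete types is what lets each crate stand alone: swapping SQLite memory for the in-memory store, or one LLM provider for another, changes only the construction site.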

What you can do
