An agent runtime for building tool-using LLM applications with Rust. It is:
- Composable: Mentra gives you a runtime builder, a provider abstraction, and async tool traits, so you can assemble the agent loop you actually want.
- Controllable: Built-in policy and authorization hooks let you run permissive demos, fail closed, or inspect tool requests before execution.
- Persistent: Agents, teams, background work, and runtime state can live in a SQLite-backed store instead of disappearing after a single turn.
Crate README | API Docs | Examples | Issues
Mentra is a Rust runtime for applications where language models need to reason, call tools, and keep working across turns. At a high level, it provides a few major pieces:
- A runtime builder for wiring model providers, persistence, policies, skills, and host application state.
- Tool execution primitives, including the built-in `shell`, `background_run`, `check_background`, `files`, `task`, and team coordination tools.
- Provider integrations for OpenAI, OpenRouter, Anthropic, Gemini, Ollama, and LM Studio, with streaming responses and normalized token usage reporting.
- Persistence and coordination for agents, subagents, teams, task boards, snapshots, memory compaction, and background notifications.
This repository is a small workspace:
- `mentra/`: the publishable runtime crate.
- `mentra-provider/`: the publishable provider-core crate.
- `examples/`: runnable examples built on top of the runtime.
- `docs/`: design notes and feature-specific documentation.
Publish mentra-provider first, then mentra.
Add Mentra and Tokio to your `Cargo.toml`:

```toml
[dependencies]
mentra = "0.5.0"
tokio = { version = "1.50.0", features = ["macros", "rt-multi-thread"] }
```

Then, in your `main.rs`:
```rust
use mentra::{BuiltinProvider, ContentBlock, ModelSelector, Runtime};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let runtime = Runtime::builder()
        .with_provider(BuiltinProvider::OpenAI, std::env::var("OPENAI_API_KEY")?)
        .build()?;

    let model = runtime
        .resolve_model(BuiltinProvider::OpenAI, ModelSelector::NewestAvailable)
        .await?;

    let mut agent = runtime.spawn("Assistant", model)?;
    let message = agent
        .send(vec![ContentBlock::text(
            "Summarize why tool-using agents matter.",
        )])
        .await?;

    println!("{}", message.text());
    Ok(())
}
```

More examples can be found in the `examples/` workspace crate, including:

- `quickstart`: minimal single-agent setup.
- `chat`: interactive, persisted runtime with skills, policies, and multiple providers.
- `custom_tool`: registering a custom tool with `ToolSpec::builder(...)` and `ToolExecutor`.
- `subagent_tool`: disposable subagent delegation inside a tool.
- `team_collaboration`: persistent teammate workflows.
- `openai_oauth`: OpenAI OAuth-backed provider setup.
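The `custom_tool` example centers on `ToolSpec::builder(...)` and `ToolExecutor`, both named above. The sketch below suggests what registering a tool could look like; apart from those two names, every type, method, and signature here is a hypothetical stand-in, so treat it as pseudocode and consult the example itself for the real API:

```rust
// Hypothetical sketch only: `ToolSpec::builder` and `ToolExecutor` appear in
// this README, but `ToolError`, `register_tool`, and all signatures here are
// stand-ins invented for illustration.
struct WordCount;

impl ToolExecutor for WordCount {
    async fn execute(&self, input: String) -> Result<String, ToolError> {
        // Return the whitespace-separated word count as the tool result.
        Ok(input.split_whitespace().count().to_string())
    }
}

let spec = ToolSpec::builder("word_count")
    .description("Count whitespace-separated words in the input")
    .build()?;
runtime.register_tool(spec, WordCount);
```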
The built-in runtime `shell` tool uses `/bin/sh` on Unix hosts and `cmd.exe` on
Windows hosts. The OpenAI OAuth example keeps `PersistentTokenStoreKind::Auto`
platform-native as well: macOS uses the Keychain, while Windows and Linux use the
file-backed store.
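That platform split can be sketched in plain Rust with the `cfg!` macro; this is an illustration of the behavior described above, not Mentra's actual implementation:

```rust
/// Pick the host shell as described above: `/bin/sh -c` on Unix hosts,
/// `cmd.exe /C` on Windows hosts (illustrative sketch, not Mentra's code).
fn host_shell() -> (&'static str, &'static str) {
    if cfg!(windows) {
        ("cmd.exe", "/C") // cmd.exe runs the command that follows /C
    } else {
        ("/bin/sh", "-c") // POSIX sh runs the command that follows -c
    }
}

fn main() {
    let (shell, flag) = host_shell();
    // A runtime would typically pass these to std::process::Command.
    println!("{shell} {flag}");
}
```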
If you want to explore the workspace after cloning the repository, the quickest path is the `examples/` crate.
Run the lightweight quickstart example:

```shell
cargo run -p mentra-examples --example quickstart -- "Summarize the benefits of tool-using agents."
```

Run the richer interactive example:

```shell
cargo run -p mentra-examples --example chat
```

The examples load environment variables from `.env` when available. Set
`OPENAI_API_KEY` for the OpenAI-backed quickstart, or `OPENAI_API_KEY`,
`OPENROUTER_API_KEY`, `ANTHROPIC_API_KEY`, and/or `GEMINI_API_KEY` for the
interactive chat example. You can also set `MENTRA_MODEL` to force a specific
OpenAI model instead of resolving the newest available one.
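For reference, a minimal `.env` for the examples might look like the fragment below; the key values and the model id are placeholders, not real values:

```shell
# .env — loaded by the examples when present (all values are placeholders)
OPENAI_API_KEY=sk-replace-me
# Optional extras for the interactive chat example:
# OPENROUTER_API_KEY=replace-me
# ANTHROPIC_API_KEY=replace-me
# GEMINI_API_KEY=replace-me
# Optional: pin a specific OpenAI model instead of resolving the newest one.
# MENTRA_MODEL=replace-with-a-model-id
```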
First, check the crate README and the API documentation. If you want more
implementation detail, the `docs/` directory includes notes on file operations,
memory, shell safety, and parallel tool calls.
For release notes and migration guidance for the current tooling architecture, see CHANGELOG.md.
If the answer is not there, please open an issue on the issue tracker.
Thanks for helping improve Mentra.
Before sending changes, run the same checks as CI:

```shell
cargo fmt --all --check
cargo clippy --workspace --all-targets -- -D warnings
cargo test --workspace
```

Mentra currently targets Rust 1.85 or newer.
This project is licensed under the MIT license.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in Mentra by you shall be licensed as MIT, without any additional terms or conditions.