
Commit b9b480a: Add commands
1 parent 0db6341 commit b9b480a

4 files changed: 289 additions & 59 deletions

assets/basics/aitools.md

Lines changed: 2 additions & 0 deletions

````diff
@@ -21,7 +21,9 @@
 
 ## AI Assistants
 ```[openclaw](/man/openclaw)```
+```[zeroclaw](/man/zeroclaw)```
 ```[nanobot](/man/nanobot)```
+```[picoclaw](/man/picoclaw)```
 ```[nanoclaw](/man/nanoclaw)```
 ```[leon](/man/leon)```
````

assets/commands/nanobot.md

Lines changed: 43 additions & 59 deletions

````diff
@@ -1,98 +1,82 @@
 # TAGLINE
 
-ultra-lightweight open-source AI assistant built in approximately 4,000 lines
+Ultra-lightweight personal AI assistant
 
 # TLDR
 
-**Start an interactive session**
+**Initialize** configuration and set up credentials
 
-```nanobot```
+```nanobot onboard```
 
-**Send a one-off command**
+**Start an interactive** CLI chat session with the AI agent
 
-```nanobot "[task]"```
+```nanobot agent```
 
-**Run with a specific local model**
+**Start the multi-channel gateway** for chat platform integrations
 
-```nanobot --model [model_name]```
+```nanobot gateway```
 
-**Start in server mode for Telegram integration**
+**Authenticate** with a chat platform (e.g. WhatsApp QR linking)
 
-```nanobot serve --telegram```
-
-**Schedule a recurring task**
-
-```nanobot schedule "[task]" --cron "[cron_expression]"```
-
-**List active scheduled tasks**
-
-```nanobot tasks```
+```nanobot channels login```
 
 # SYNOPSIS
 
-**nanobot** [_options_] [_command_]
-
-**nanobot** **serve** [_options_]
-
-**nanobot** **schedule** [_options_]
+**nanobot** [_command_] [_options_]
 
 # PARAMETERS
 
-_COMMAND_
-> Natural language task or instruction for the AI assistant.
+**onboard**
+> Initialize configuration and set up the environment for first-time use. Creates config at **~/.nanobot/config.json**.
 
-**serve**
-> Start nanobot in server mode for messaging platform integration.
+**agent**
+> Start an interactive CLI chat session with the AI agent.
 
-**--telegram**
-> Enable Telegram bot integration.
+**gateway**
+> Run the multi-channel gateway to connect chat platforms such as Telegram, Discord, WhatsApp, Slack, and others.
 
-**--whatsapp**
-> Enable WhatsApp integration.
+**channels login**
+> Authenticate with chat platforms, primarily used for WhatsApp QR-code linking.
 
-**schedule**
-> Create a scheduled or recurring task.
-
-**--cron** _EXPRESSION_
-> Cron expression for task scheduling.
-
-**tasks**
-> List all active scheduled tasks.
-
-**--model** _MODEL_
-> Specify the local LLM to use.
-
-**--local**
-> Force fully local operation without external APIs.
+# DESCRIPTION
 
-**--port** _PORT_
-> Port number for server mode (default: 3000).
+**nanobot** is an ultra-lightweight personal AI assistant that delivers core agent functionality in approximately 4,000 lines of Python code. It connects closed and open-source LLMs to a local coding agent that can run commands, read logs, execute scripts, and search files on your machine.
 
-**--verbose**
-> Enable verbose logging output.
+Nanobot supports multiple LLM providers including OpenRouter, Anthropic (Claude), OpenAI (GPT), DeepSeek, Google Gemini, Groq, and custom OpenAI-compatible endpoints. It integrates with chat platforms such as Telegram, Discord, WhatsApp, Feishu, Slack, Email, QQ, DingTalk, and Mochat, allowing the agent to be accessed from various messaging services via the gateway command.
 
-**--help**
-> Display help information.
+The tool uses MCP (Model Context Protocol) for extending capabilities with external tools and services.
 
-**--version**
-> Display version information.
+# CONFIGURATION
 
-# DESCRIPTION
+Configuration is stored in **~/.nanobot/config.json**. Run **nanobot onboard** for interactive setup.
 
-**nanobot** is an ultra-lightweight open-source AI assistant built in approximately 4,000 lines of code. It supports interaction through messaging platforms including Telegram and WhatsApp, enabling voice and text-based communication from any device.
+Minimal configuration requires a provider API key and model selection:
 
-The assistant runs local language models for privacy-focused operation and can execute scheduled tasks autonomously using cron expressions. It supports multi-step task automation, reminders, and conversational interaction without requiring cloud AI services.
+```
+{
+  "providers": {
+    "openrouter": {
+      "apiKey": "sk-or-v1-xxx"
+    }
+  },
+  "agents": {
+    "defaults": {
+      "model": "anthropic/claude-opus-4-5"
+    }
+  }
+}
+```
 
-nanobot is designed to be simple to deploy and extend, with minimal dependencies and a small footprint suitable for running on low-resource hardware including Raspberry Pi devices.
+Supported providers: **openrouter**, **anthropic**, **openai**, **deepseek**, **groq**, **gemini**, **minimax**, and **custom** (any OpenAI-compatible endpoint).
 
 # CAVEATS
 
-Messaging platform integrations require API tokens from their respective services. Running local models requires sufficient RAM and compute resources. Scheduled tasks persist only while the server process is running unless configured with a process manager.
+Nanobot executes commands on your local machine with your user permissions. Always review agent actions before granting broad tool access. The tool requires network access for LLM API calls and chat platform integrations. Provider API keys are stored in plaintext in the config file.
 
 # HISTORY
 
-nanobot was created by **the HKUDS research group** (Data Intelligence Lab at the University of Hong Kong). It was designed as a minimalist alternative to larger AI assistant frameworks, emphasizing simplicity and local-first operation. The project gained attention for achieving broad AI assistant functionality in a remarkably small codebase.
+Nanobot was developed by **HKUDS** (Hong Kong University Data Science Lab) as an ultra-lightweight alternative to larger AI coding assistants. First released in **2025**, it aimed to provide core agent functionality with a minimal codebase, emphasizing research-readiness and a clean, modifiable architecture. The project gained traction as an accessible entry point for developers exploring agentic AI workflows.
 
 # SEE ALSO
 
-[openclaw](/man/openclaw)(1), [claude](/man/claude)(1), [ollama](/man/ollama)(1), [llm](/man/llm)(1)
+[picoclaw](/man/picoclaw)(1), [claude](/man/claude)(1)
````
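The minimal JSON shown in nanobot's CONFIGURATION section can be generated and sanity-checked programmatically. This is an illustrative sketch, not nanobot's own code: the key names mirror the example above, the API key is a placeholder, and a temporary directory stands in for **~/.nanobot/**.

```python
import json
import tempfile
from pathlib import Path

# Minimal config mirroring the man page's CONFIGURATION example.
# "sk-or-v1-xxx" is a placeholder, not a real key.
config = {
    "providers": {
        "openrouter": {"apiKey": "sk-or-v1-xxx"},
    },
    "agents": {
        "defaults": {"model": "anthropic/claude-opus-4-5"},
    },
}

# A temp dir stands in for ~/.nanobot/ so this sketch stays self-contained.
config_path = Path(tempfile.mkdtemp()) / "config.json"
config_path.write_text(json.dumps(config, indent=2))

# Round-trip to confirm the file on disk is valid JSON with the expected keys.
loaded = json.loads(config_path.read_text())
```

After **nanobot onboard** writes a real file at **~/.nanobot/config.json**, a round-trip like this is a quick way to verify hand edits did not break the JSON.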

assets/commands/picoclaw.md (new file)

Lines changed: 88 additions & 0 deletions

# TAGLINE

Ultra-lightweight AI assistant for resource-constrained devices

# TLDR

**Initialize** configuration and set up credentials

```picoclaw onboard```

**Start an interactive** CLI chat session with the AI agent

```picoclaw agent```

**Send a one-shot message** without entering interactive mode

```picoclaw agent -m "[question or task]"```

**Start the multi-channel gateway** for chat platform integrations

```picoclaw gateway```

# SYNOPSIS

**picoclaw** [_command_] [_options_]

# PARAMETERS

**onboard**
> Initialize configuration and set up the environment for first-time use. Creates config at **~/.picoclaw/config.json**.

**agent**
> Start an interactive CLI chat session with the AI agent.

**agent -m** _message_
> Send a single message to the agent and receive a response without entering interactive mode.

**gateway**
> Start the multi-channel gateway for chat platform integrations including Telegram, Discord, QQ, DingTalk, LINE, and WeCom.

# DESCRIPTION

**picoclaw** is an ultra-lightweight personal AI assistant written in Go, designed to run on extremely resource-constrained hardware. It uses less than 10MB of RAM and ships as a single self-contained binary with sub-second boot times.

PicoClaw supports RISC-V, ARM64, and x86 architectures natively, making it deployable on devices ranging from $10 RISC-V boards to powerful servers. Despite the minimal footprint, it provides full agent capabilities including planning, web search integration, command execution, and automation workflows.

The tool supports multiple LLM providers including OpenAI, Anthropic (Claude), Google Gemini, and Zhipu. It can integrate with chat platforms via the gateway command for bot deployments on Telegram, Discord, and other messaging services.

# CONFIGURATION

Configuration is stored in **~/.picoclaw/config.json**. Run **picoclaw onboard** for interactive setup.

Key configuration fields:

```
{
  "agents": {
    "defaults": {
      "workspace": "/path/to/workspace",
      "model_name": "claude-opus-4-5",
      "max_tokens": 8192,
      "temperature": 0.7,
      "max_tool_iterations": 10
    }
  },
  "model_list": [
    {
      "model_name": "claude",
      "model": "claude-opus-4-5",
      "api_key": "sk-ant-xxx"
    }
  ]
}
```

Web search can be enabled via **tools** configuration with support for Brave, Tavily, and DuckDuckGo search providers.

# CAVEATS

PicoClaw executes commands on your machine with your user permissions. API keys are stored in plaintext in the config file. The project was largely AI-bootstrapped, with approximately 95% of the core system generated by an AI agent, which may affect code quality and maintainability. Being Go-based, it requires downloading precompiled binaries or building from source rather than installing via a package manager.

# HISTORY

PicoClaw was developed by **Sipeed**, a company known for RISC-V hardware, as an AI assistant optimized for their low-cost development boards. First released in **2025**, it was written in Go through a self-bootstrapping process in which the AI agent itself drove the architectural design and code optimization. The project gained attention for running full agent capabilities on hardware costing as little as $10.

# SEE ALSO

[nanobot](/man/nanobot)(1), [claude](/man/claude)(1)
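The picoclaw example above keeps defaults (model_name) separate from credentials (model_list). How picoclaw itself resolves the two is not documented here; the sketch below assumes the defaults' **model_name** refers to an alias defined in **model_list**, purely to illustrate the shape of the config. All values are placeholders.

```python
import json

# A picoclaw-style config fragment, mirroring the CONFIGURATION section.
# Values ("sk-ant-xxx", the "claude" alias) are illustrative placeholders.
raw = """
{
  "agents": {
    "defaults": {
      "model_name": "claude",
      "max_tokens": 8192,
      "temperature": 0.7,
      "max_tool_iterations": 10
    }
  },
  "model_list": [
    {
      "model_name": "claude",
      "model": "claude-opus-4-5",
      "api_key": "sk-ant-xxx"
    }
  ]
}
"""
cfg = json.loads(raw)

def resolve_model(cfg: dict) -> dict:
    """Look up the defaults' model_name in model_list.

    A sketch of alias resolution under the assumption stated above;
    NOT picoclaw's actual loader.
    """
    wanted = cfg["agents"]["defaults"]["model_name"]
    for entry in cfg["model_list"]:
        if entry["model_name"] == wanted:
            return entry
    raise KeyError(f"no model_list entry named {wanted!r}")

entry = resolve_model(cfg)
```

Keeping credentials in a list keyed by alias makes it cheap to switch the default model without touching provider entries.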

assets/commands/zeroclaw.md (new file)

Lines changed: 156 additions & 0 deletions

# TAGLINE

Autonomous AI agent runtime built in Rust

# TLDR

**Initialize** configuration with interactive setup wizard

```zeroclaw onboard --interactive```

**Start an interactive** CLI agent session

```zeroclaw agent```

**Send a single message** without entering interactive mode

```zeroclaw agent -m "[message]"```

**Start the full autonomous daemon** with channels and scheduler

```zeroclaw daemon```

**Check system health** and run diagnostics

```zeroclaw doctor```

**List all supported** LLM providers

```zeroclaw providers```

# SYNOPSIS

**zeroclaw** [_command_] [_subcommand_] [_options_]

# PARAMETERS

**onboard**
> Initialize or reconfigure the workspace. Creates **~/.zeroclaw/config.toml** and scaffold files.

**onboard --interactive**
> Run the full 9-step configuration wizard.

**onboard --api-key** _KEY_ **--provider** _PROVIDER_ [**--model** _MODEL_]
> Non-interactive setup with API key and provider in one command.

**onboard --channels-only**
> Repair or reconfigure channels and allowlists only.

**agent** [**-m** _MESSAGE_]
> Run interactive CLI agent or send a single message with **-m**.

**agent --provider** _PROVIDER_
> Override the default provider for this session.

**gateway** [**--port** _PORT_]
> Start the HTTP/WebSocket server for external integrations. Default port: 42617.

**daemon**
> Start the full autonomous runtime with gateway, channels, heartbeat, and scheduler.

**status**
> Display comprehensive system status including provider, memory, channels, and security.

**doctor** [**models** | **traces**]
> Run system diagnostics. Optionally check model catalogs or inspect runtime traces.

**service install** | **start** | **stop** | **status** | **restart**
> Manage zeroclaw as a background system service (systemd or OpenRC).

**channel list** | **doctor** | **start**
> List, health-check, or start configured messaging channels.

**channel add** _TYPE_ _JSON_CONFIG_
> Add a new messaging channel configuration.

**channel bind-telegram** _USER_ID_
> Add a Telegram user to the allowlist.

**auth login** _PROVIDER_ [_PROFILE_]
> Authenticate via OAuth or device code flow.

**auth status** | **logout** _PROFILE_ID_
> Show or remove authentication profiles.

**memory stats** | **list** | **search** _QUERY_ | **delete** _KEY_ | **prune**
> Manage the built-in memory system.

**cron add** _NAME_ _SCHEDULE_ _MESSAGE_ [**--tz** _TIMEZONE_]
> Schedule a recurring task with a cron expression.

**cron list** | **remove** _NAME_
> List or remove scheduled tasks.

**skills list** | **install** _NAME_ | **remove** _NAME_
> Manage agent skills and extensions.

**migrate openclaw** [**--dry-run**]
> Import memory from an OpenClaw runtime.

**completions** _SHELL_
> Generate shell completions for bash, zsh, fish, powershell, or nushell.

**providers**
> List all supported LLM providers and aliases.

**estop** [**--resume**]
> Engage or resume from emergency stop.

# DESCRIPTION

**zeroclaw** is a lightweight, security-first autonomous AI agent runtime built entirely in Rust. It serves as infrastructure for agentic workflows, abstracting models, tools, memory, and execution into a single binary that can be deployed across ARM, x86, and RISC-V architectures.

The runtime compiles to an approximately 9MB binary with cold start under 10 milliseconds and less than 5MB RAM usage at idle. It uses a trait-driven architecture where providers, channels, memory backends, and tools are swappable through configuration without code changes.

ZeroClaw includes a built-in hybrid search memory system combining vector embeddings with keyword search (SQLite-backed), requiring no external dependencies. It supports 28+ LLM providers including OpenRouter, Anthropic, OpenAI, Gemini, Ollama, and any OpenAI-compatible endpoint. Messaging integrations cover 70+ channels including Telegram, Discord, Slack, iMessage, Matrix, Signal, and WhatsApp.

The agent supports multi-turn conversations with context preservation, tool execution (shell, file, git, browser), scheduled tasks via cron expressions, and hardware peripheral control for IoT devices.

# CONFIGURATION

Configuration is stored in **~/.zeroclaw/config.toml**. Run **zeroclaw onboard --interactive** for guided setup.

```
[providers]
default = "openrouter"

[providers.openrouter]
api_key = "sk-or-v1-xxx"
model = "anthropic/claude-opus-4-5"

[memory]
backend = "sqlite"

[runtime]
kind = "native"

[channels.telegram]
bot_token = "123:ABC..."
allowed_users = ["123456789"]
```

Supported memory backends: **sqlite** (default, hybrid search), **postgres**, **markdown**, and **none** (stateless).

Environment variables override config values: **ZEROCLAW_API_KEY**, **ZEROCLAW_PROVIDER**, **ZEROCLAW_MODEL**, **ZEROCLAW_WORKSPACE**.

# CAVEATS

ZeroClaw executes shell commands and file operations with your user permissions. The agent can modify files and run arbitrary commands when tool use is enabled. API keys are stored in **config.toml** and **auth-profiles.json** (encrypted at rest if secrets encryption is enabled). Channel integrations require the daemon to be running. Compilation from source requires at least 2GB RAM and 6GB disk space due to the Rust toolchain.

# HISTORY

ZeroClaw was created by **ZeroClaw Labs** and first released in **2025** as a Rust-based alternative to existing AI agent runtimes. It was designed around the principle of zero overhead and zero lock-in, targeting deployment on resource-constrained hardware while maintaining production-grade extensibility. The project introduced a trait-driven architecture allowing providers, channels, and tools to be swapped without code changes, and gained attention for achieving sub-10ms cold start times with a single-binary deployment model.

# SEE ALSO

[nanobot](/man/nanobot)(1), [picoclaw](/man/picoclaw)(1), [claude](/man/claude)(1), [openclaw](/man/openclaw)(1)
