Commit 7bfb496
Author: Sami Rusani

Reframe README around model-flexible continuity

1 parent b7cfb84

1 file changed: README.md (13 additions, 5 deletions)
@@ -21,7 +21,9 @@ Alice fixes that.
 
 It provides a **local-first memory and continuity engine** for capture, recall, resumption, open-loop tracking, and correction-aware, trust-aware memory, so you do not have to rebuild context from scratch every time work resumes.
 
-**Works via CLI, MCP, OpenClaw import, Hermes integration, and local-first workflows.**
+**Bring your own models, keep one continuity layer.**
+
+**Works across local, self-hosted, enterprise, and external-agent workflows via CLI, MCP, provider runtime, OpenClaw import, and Hermes integration.**
 
 ## Current phase
 
@@ -33,8 +35,8 @@ Phase 11 is now the active planning and execution phase:
 - `P11-S2` Ollama + llama.cpp Adapters is shipped
 - `P11-S3` vLLM Adapter + Self-Hosted Performance Path is shipped
 - `P11-S4` Model Packs Tier 1 is shipped
-- `P11-S5` Azure Adapter + AutoGen Integration is the active sprint
-- later Phase 11 work adds tier-2 packs and launch-clarity assets
+- `P11-S5` Azure Adapter + AutoGen Integration is shipped
+- `P11-S6` Model Packs Tier 2 + Launch Clarity Assets is the active sprint
 - Historical planning and control docs: [docs/archive/planning/2026-04-08-context-compaction/README.md](docs/archive/planning/2026-04-08-context-compaction/README.md)
 
 ## Why Alice exists
@@ -101,6 +103,11 @@ That makes it easier to audit why an answer appeared, how it was derived, and ho
 
 Alice Core runs locally and exposes the same continuity semantics through the CLI and MCP, so you can use it with your own workflows instead of being locked into a closed assistant product.
 
+### Swap providers, not behavior
+
+Alice is now model-flexible.
+You can switch or standardize model backends across local, self-hosted, enterprise, and external-agent environments without rewriting Alice's continuity, memory, approval, or provenance behavior.
+
 ## Use Alice with your existing agents
 
 Alice is designed to be a **continuity layer**, not a closed assistant silo.
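The "Swap providers, not behavior" section added in this commit describes switching model backends while continuity semantics stay fixed. As a rough illustration of that separation (a hypothetical sketch, not Alice's actual API — every name below is invented), the pattern is a narrow provider interface behind a continuity layer that owns the memory:

```python
from dataclasses import dataclass, field
from typing import Protocol


class ModelProvider(Protocol):
    """Hypothetical narrow interface each backend adapter implements."""

    name: str

    def complete(self, prompt: str) -> str:
        ...


@dataclass
class EchoProvider:
    """Stand-in for a real backend (Ollama, vLLM, Azure, ...)."""

    name: str

    def complete(self, prompt: str) -> str:
        return f"[{self.name}] {prompt}"


@dataclass
class ContinuityLayer:
    """Memory and continuity behavior live here, independent of backend."""

    provider: ModelProvider
    memory: list[str] = field(default_factory=list)

    def ask(self, prompt: str) -> str:
        self.memory.append(prompt)   # continuity: capture every turn
        return self.provider.complete(prompt)

    def swap_provider(self, provider: ModelProvider) -> None:
        self.provider = provider     # memory survives the swap


layer = ContinuityLayer(provider=EchoProvider(name="local-ollama"))
layer.ask("resume yesterday's work")
layer.swap_provider(EchoProvider(name="enterprise-azure"))
layer.ask("what was I doing?")
print(layer.memory)  # both turns retained across the provider swap
```

The point of the sketch: the continuity layer never inspects which backend it holds, so swapping providers cannot change capture, recall, or approval behavior.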
@@ -112,9 +119,10 @@ It already supports:
 - **Hermes integration paths**
 - **Hermes external memory provider**
 - **Provider runtime abstraction for workspace-scoped model/provider integration**
+- **Local, self-hosted, enterprise, and external-agent deployment paths**
 - imported workflow data from Markdown and ChatGPT exports
 
-That means you can use Alice to upgrade an existing agent stack instead of rebuilding everything around a new runtime.
+That means you can use Alice as shared continuity infrastructure across providers and frameworks instead of rebuilding memory behavior per runtime.
 
 ## What ships today
 
@@ -127,7 +135,7 @@ The current open-source surface includes:
 - shared explainability across recall, resume, open-loop review, and explain surfaces
 - scheduled archive maintenance, ops status reporting, and failure alerting
 - Hermes external memory provider for always-on continuity prefetch and Alice memory tools inside Hermes
-- provider runtime abstraction with workspace-scoped provider registration, capability snapshots, OpenAI-compatible base adapter, local Ollama/llama.cpp adapters, and Azure adapter support
+- provider runtime abstraction with workspace-scoped provider registration, capability snapshots, OpenAI-compatible base adapter, local Ollama/llama.cpp, self-hosted vLLM, enterprise Azure, model packs, and external-agent integration paths
 - importers for OpenClaw, Markdown, and ChatGPT exports
 - OpenClaw adapter and demo path
 - evaluation harness and integration docs
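The provider-runtime bullet above pairs "workspace-scoped provider registration" with "capability snapshots". A minimal sketch of what that pairing could mean (hypothetical names only — this is not the actual Alice runtime): each workspace keeps its own registry, and registration records an immutable snapshot of what the provider supported at that moment.

```python
from dataclasses import dataclass, field


@dataclass(frozen=True)
class CapabilitySnapshot:
    """Immutable record of a provider's capabilities at registration time."""

    provider: str
    supports_streaming: bool
    max_context_tokens: int


@dataclass
class WorkspaceProviderRegistry:
    """Providers are registered per workspace, not globally."""

    workspace: str
    _snapshots: dict[str, CapabilitySnapshot] = field(default_factory=dict)

    def register(self, snapshot: CapabilitySnapshot) -> None:
        # Registering again replaces the snapshot for that provider.
        self._snapshots[snapshot.provider] = snapshot

    def capabilities(self, provider: str) -> CapabilitySnapshot:
        return self._snapshots[provider]


registry = WorkspaceProviderRegistry(workspace="team-docs")
registry.register(CapabilitySnapshot("vllm-selfhosted", True, 32768))
registry.register(CapabilitySnapshot("azure-gpt", True, 128000))
print(registry.capabilities("vllm-selfhosted").max_context_tokens)  # 32768
```

Scoping the registry to a workspace keeps one team's provider choices from leaking into another's, while the frozen snapshot makes capability checks auditable after the fact.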
