An open-source multi-agent research assistant that connects to your Zotero library. Search, summarize, and explore your research papers through a conversational interface.
- A Zotero account with API access
- An LLM provider: OpenAI, Anthropic, or Ollama (free, runs locally)
1. Install
```
pip install zori
```

Install from source:

```
git clone https://github.com/nazbn/zori.git
cd zori
uv sync
```

When installed from source, prefix all commands with `uv run` (e.g. `uv run zori init`, `uv run zori ingest`, `uv run zori`).
2. Initialize
```
mkdir my-zori && cd my-zori
zori init
```

`zori init` creates `config.yaml` and `.env` in the current directory. Always run `zori` from this directory.
3. Configure
Edit `.env` with your Zotero API key and library ID, and edit `config.yaml` to choose your LLM and embeddings providers (see LLM options and Embeddings options).
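For orientation, the two files might look roughly like this. The variable names and key layout below are assumptions for illustration, not taken from zori's documentation; check the files `zori init` generated for the real names.

```
# .env — hypothetical variable names
ZOTERO_API_KEY=your-key-here
ZOTERO_LIBRARY_ID=1234567
```

```yaml
# config.yaml — hypothetical layout; the provider values match the tables below
llm:
  provider: openai
embeddings:
  provider: ollama
  model: nomic-embed-text
```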
4. Ingest your library
```
zori ingest
```

Downloads your Zotero PDFs, extracts text, and builds the search index in `.zori/`. Run time depends on library size and embedding provider; you only need to do a full ingest once.

To index items added or modified in Zotero since the last ingest, run `zori ingest --sync`.
5. Start the assistant
```
zori
```

Zori supports natural-language queries for searching and summarizing papers:

```
> papers on diffusion models
> papers by Vaswani
> papers from 2023 on neural radiance fields
> summarize the first one
> find attention is all you need
> summarize it
```
Queries use hybrid search (keyword + semantic). References to previous results (e.g. "the first one", "that paper") are resolved in context.
Type `exit` to quit; use `--new-session` to reset conversation history.
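To give a feel for what "hybrid search" means here, the sketch below blends a crude keyword score with a toy similarity score and ranks documents by the weighted sum. This is an illustrative minimal example, not zori's actual implementation: real systems use BM25-style keyword scoring and neural embeddings, and the `alpha` weight and helper names are inventions of this sketch.

```python
from collections import Counter
import math

def keyword_score(query, doc):
    # Fraction of query terms that appear in the document (crude keyword match).
    terms = query.lower().split()
    words = set(doc.lower().split())
    return sum(t in words for t in terms) / len(terms)

def embed(text):
    # Toy "embedding": a term-frequency vector. Real systems use a neural model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_rank(query, docs, alpha=0.5):
    # Blend keyword and semantic scores; alpha weights the keyword side.
    qv = embed(query)
    scored = [
        (alpha * keyword_score(query, d) + (1 - alpha) * cosine(qv, embed(d)), d)
        for d in docs
    ]
    return [d for _, d in sorted(scored, key=lambda s: s[0], reverse=True)]

docs = [
    "Attention Is All You Need: transformers for sequence modeling",
    "Denoising diffusion probabilistic models for image synthesis",
    "A survey of neural radiance fields",
]
print(hybrid_rank("diffusion models", docs)[0])  # the diffusion paper ranks first
```

The blend is what lets a query match both exact phrases ("attention is all you need") and paraphrases ("papers on image generation").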
| Provider | config.yaml | Requires |
|---|---|---|
| OpenAI | `provider: openai` | `OPENAI_API_KEY` in `.env` |
| Anthropic | `provider: anthropic` | `ANTHROPIC_API_KEY` in `.env` |
| Ollama (free, local) | `provider: ollama` | Ollama running locally |
LLM and embeddings are configured independently — any combination works.
| Provider | config.yaml | Setup |
|---|---|---|
| OpenAI | `provider: openai`, `model: text-embedding-3-small` | `OPENAI_API_KEY` in `.env` |
| Ollama (free, local) | `provider: ollama`, `model: nomic-embed-text` | Ollama running + `ollama pull nomic-embed-text` |
| HuggingFace (free, local) | `provider: huggingface`, `model: <model>` (e.g. `all-MiniLM-L6-v2`) | `pip install "zori[huggingface]"` |
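Since the LLM and embeddings providers are independent, a mixed setup is possible, e.g. a hosted LLM with free local embeddings. The nesting below is an assumed layout for illustration; the provider and model values come from the tables above, but confirm the section names against your generated `config.yaml`.

```yaml
# Hypothetical config.yaml layout: Anthropic for the LLM, Ollama for embeddings
llm:
  provider: anthropic
embeddings:
  provider: ollama
  model: nomic-embed-text
```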
MIT — see LICENSE.
For questions, bug reports, or feature requests, open an issue on the GitHub issue tracker or reach out at nazanin.bagherinejad@rwth-aachen.de.
This repository was developed with the assistance of Claude (Anthropic).