Zori

An open-source multi-agent research assistant that connects to your Zotero library. Search, summarize, and explore your research papers through a conversational interface.

Built with Python, LangChain, LangGraph, and ChromaDB. MIT-licensed and available on PyPI.

Requirements

  • A Zotero account with API access
  • An LLM provider: OpenAI, Anthropic, or Ollama (free, runs locally)

Setup

1. Install

```shell
pip install zori
```

Or install from source:

```shell
git clone https://github.com/nazbn/zori.git
cd zori
uv sync
```

When installed from source, prefix all commands with uv run (e.g. uv run zori init, uv run zori ingest, uv run zori).

2. Initialize

```shell
mkdir my-zori && cd my-zori
zori init
```

zori init creates config.yaml and .env in the current directory. Always run zori from this directory.

3. Configure

Edit .env with your Zotero API key and library ID, and config.yaml to choose your LLM and embeddings provider (see LLM options and Embeddings options).
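For illustration only (the exact keys here are assumptions; the config.yaml generated by zori init is the authoritative template), a setup using Ollama for both the LLM and embeddings might look like:

```yaml
# Hypothetical sketch; check the file generated by `zori init` for the real keys.
llm:
  provider: ollama
embeddings:
  provider: ollama
  model: nomic-embed-text
```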

4. Ingest your library

```shell
zori ingest
```

Downloads your Zotero PDFs, extracts text, and builds the search index in .zori/. Run time depends on library size and embedding provider. You only need to do a full ingest once. To index new or modified items added to Zotero since the last ingest, run zori ingest --sync.
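Conceptually, an ingest pass boils down to: extract text from each PDF, split it into chunks, embed each chunk, and store the vectors in an index. A toy sketch of that shape (not Zori's actual code; the chunk size and placeholder embedding are illustrative only):

```python
def chunk(text, size=500):
    """Split text into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(chunk_text):
    """Placeholder embedding; a real system calls an embedding model."""
    return [float(ord(c)) for c in chunk_text[:8]]

def ingest(papers):
    """Build a simple in-memory index of (paper_id, chunk, vector) triples."""
    index = []
    for paper_id, text in papers.items():
        for c in chunk(text):
            index.append((paper_id, c, embed(c)))
    return index

# 1080 characters of text -> 3 chunks of up to 500 characters each
index = ingest({"vaswani2017": "Attention is all you need. " * 40})
print(len(index))  # 3
```

Real ingestion stores the vectors in ChromaDB rather than a Python list, but the pipeline shape is the same.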

5. Start the assistant

```shell
zori
```

Usage

Zori supports natural language queries for searching and summarizing papers:

```
> papers on diffusion models
> papers by Vaswani
> papers from 2023 on neural radiance fields
> summarize the first one
> find attention is all you need
> summarize it
```

Queries use hybrid search (keyword + semantic). References to previous results are resolved in context (e.g. "the first one", "that paper").
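As an illustration of how a keyword ranking and a semantic ranking can be combined, here is a generic reciprocal rank fusion (RRF) sketch; this is a common fusion technique, not necessarily how Zori merges its results:

```python
def rrf(keyword_ranking, semantic_ranking, k=60):
    """Reciprocal rank fusion: merge two ranked lists of paper IDs.

    Each paper scores 1 / (k + rank) per list it appears in, so papers
    ranked highly in either list float to the top of the fused ranking.
    """
    scores = {}
    for ranking in (keyword_ranking, semantic_ranking):
        for rank, paper_id in enumerate(ranking, start=1):
            scores[paper_id] = scores.get(paper_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Keyword-match order vs. embedding-similarity order for the same query
fused = rrf(["paper_a", "paper_b", "paper_c"], ["paper_b", "paper_d", "paper_a"])
print(fused[0])  # paper_b: near the top of both lists, so it ranks first
```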

Type exit to quit, and use the --new-session flag to reset conversation history.

LLM options

| Provider | config.yaml | Requires |
| --- | --- | --- |
| OpenAI | provider: openai | OPENAI_API_KEY in .env |
| Anthropic | provider: anthropic | ANTHROPIC_API_KEY in .env |
| Ollama (free, local) | provider: ollama | Ollama running locally |

Embeddings options

LLM and embeddings are configured independently — any combination works.

| Provider | config.yaml | Setup |
| --- | --- | --- |
| OpenAI | provider: openai, model: text-embedding-3-small | OPENAI_API_KEY in .env |
| Ollama (free, local) | provider: ollama, model: nomic-embed-text | Ollama running + ollama pull nomic-embed-text |
| HuggingFace (free, local) | provider: huggingface, model: \<model\> (e.g. all-MiniLM-L6-v2) | pip install "zori[huggingface]" |

License

MIT — see LICENSE.

Contact

For questions, bug reports, or feature requests, open an issue on the GitHub issue tracker or reach out at nazanin.bagherinejad@rwth-aachen.de.


This repository was developed with the assistance of Claude (Anthropic).
