Harbor is a containerized LLM toolkit — a large Docker Compose project with a CLI and a Tauri app for managing AI services. Not to be confused with the Harbor container registry, which is a completely different, unrelated project. This repository, Harbor, is the LLM toolkit.
- `harbor.sh` — main CLI (too large to read in full; search for specific functions)
- `services/` — all service directories and compose files (e.g., `services/ollama/`, `services/compose.ollama.yml`)
- `compose.yml` — base compose file, always included
- `app/` — Tauri GUI app
- `docs/` — service and user documentation
- `routines/` — CLI internals rewritten in Deno
- `.scripts/` — dev scripts in Deno/Bash, run via `harbor dev <script>`
- `profiles/default.env` — default config distributed to users
harbor ps # list running containers
harbor ls # list all available services
harbor up <service> # start service(s)
harbor down # stop and remove containers
harbor logs <service> # ⚠️ TAILS BY DEFAULT (HANGS AGENT). Use docker logs <container> instead
harbor build <service>
harbor shell <service> # interactive shell in container
harbor exec <service> <cmd>
harbor eject # output standalone Compose config for current selection
$(harbor cmd <service>) # raw docker compose command for a service

harbor config get <KEY>
harbor config set <KEY> <VALUE>
harbor config update # propagate profiles/default.env → .env
harbor config search <query> # search config keys and values

Never edit `.env` directly — always use `harbor config get/set`.
harbor env <service> # list override vars for a service
harbor env <service> <key> # get a specific var
harbor env <service> <key> <value> # set a specific var

harbor dev scaffold <service_name> # scaffold a new service
harbor dev docs # regenerate docs
harbor dev seed # seed test data
harbor dev add-logos [--dry-run] # resolve and write service logos

Dev scripts live in `.scripts/` and must be run via `harbor dev`, not `deno run` directly.
harbor routine <name> # run internal Deno routines (routines/)

Use the new-service skill: `.agents/skills/new-service/SKILL.md`.
After editing profiles/default.env, run harbor config update to apply changes to the current .env. The two files are not automatically synced.
`services/compose.x.<service>.<integration>.yml` files are applied when multiple services run together. When a satellite service can use a backend (e.g., Ollama):
- Add `depends_on` for the backend
- Mount config templates needed for the integration
- Set environment variables
- Override the entrypoint if config rendering is needed at startup
Example: `services/compose.x.photoprism.ollama.yml`
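Based on the steps above, a cross file for a hypothetical satellite service (here `myservice`) using Ollama might look roughly like this; the service name, env var, paths, and the `render-config` step are illustrative, not taken from the repo:

```yaml
# Hypothetical: services/compose.x.myservice.ollama.yml
services:
  myservice:
    depends_on:
      - ollama
    environment:
      # Points the satellite at the Ollama backend on the compose network
      - MYSERVICE_OLLAMA_URL=http://ollama:11434
    volumes:
      # Config template rendered at container startup
      - ./myservice/config.tpl:/app/config.tpl:ro
    # render-config is a stand-in for whatever rendering step the service needs
    entrypoint: ["/bin/sh", "-c", "render-config /app/config.tpl && exec /app/start"]
```

The real photoprism example above follows this same pattern; check it for the exact conventions used in this repo.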
- Default model: `HARBOR_<SERVICE>_MODEL` in `profiles/default.env`
- Config templates use `${HARBOR_*}` vars rendered at container startup
- Run `harbor config update` after changing `profiles/default.env`
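For instance, a default-model entry for a hypothetical service `myservice` would live in `profiles/default.env`; the variable name follows the `HARBOR_<SERVICE>_MODEL` convention, and the model value is only an example:

```env
# profiles/default.env (hypothetical entry)
HARBOR_MYSERVICE_MODEL="llama3.2"
```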
After any change to service shape (volumes, config, integrations), update the corresponding doc in docs/ immediately. Cover all new env vars, startup behaviors, and integration steps.
Logos are static URL strings in app/src/serviceMetadata.ts, resolved once via:
harbor dev add-logos # resolve and write
harbor dev add-logos --dry-run # preview only

Resolution order: GitHub homepage favicon → dashboardicons.com → GitHub owner avatar.
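Per the above, an entry in `app/src/serviceMetadata.ts` carries a static logo URL string. A sketch of what one might look like (the field and service names are assumptions and the URL is a placeholder; check the real file for the actual shape):

```typescript
// Hypothetical sketch; the real serviceMetadata.ts shape may differ.
export const serviceMetadata: Record<string, { logo?: string }> = {
  myservice: {
    // Static URL written by `harbor dev add-logos`; placeholder value here
    logo: "https://example.com/logos/myservice.svg",
  },
};
```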
- Comments only for non-obvious logic — never restate what the code does
- No emojis in UI or copy — use Lucide icons instead
When updating the `## News` / changelog section in `README.md`, always use a bulleted list format: `- **vx.x.x** - one sentence`. Do not use a table.