
Commit d800cd4

feat(install): add comprehensive full-features install script
Created install-codegraph-full-features.sh with ALL available features.

Embedding Providers:
- embeddings-local (Candle CPU/GPU)
- embeddings-openai (OpenAI API)
- embeddings-ollama (Ollama local)
- embeddings-jina (Jina AI cloud)
- embeddings-lmstudio (LM Studio local)

LLM Providers:
- codegraph-ai/all-cloud-providers (Anthropic, OpenAI, xAI, Ollama, LM Studio)

Features:
- daemon (file watching & auto re-indexing)
- ai-enhanced (agentic tools)
- server-http (HTTP + SSE streaming)
- autoagents-experimental (AutoAgents framework)
- qwen-integration (Qwen model support)
- codegraph-graph/surrealdb (SurrealDB backend)

The script:
- Validates prerequisites (Homebrew, Rust, SurrealDB)
- Optionally checks for Ollama
- Compiles with all features enabled
- Provides comprehensive configuration examples for all providers
- Shows both STDIO and HTTP server startup options
- Documents daemon mode setup

This is the maximum capability build - useful for development and testing all available features.
1 parent 8957ef2 commit d800cd4

1 file changed

Lines changed: 121 additions & 0 deletions

File tree

install-codegraph-full-features.sh

@@ -0,0 +1,121 @@
#!/bin/bash
# ABOUTME: Installs CodeGraph CLI with ALL available features enabled.
# ABOUTME: Maximum capability build including all embedding providers, LLMs, server modes, and experimental features.

set -euo pipefail

# Comprehensive feature set - everything enabled
FEATURES="daemon,ai-enhanced,embeddings-local,embeddings-openai,embeddings-ollama,embeddings-jina,embeddings-lmstudio,codegraph-vector/jina,codegraph-graph/surrealdb,codegraph-ai/all-cloud-providers,server-http,autoagents-experimental,qwen-integration"
SURR_URL="${CODEGRAPH_SURREALDB_URL:-ws://localhost:3004}"
SURR_NAMESPACE="${CODEGRAPH_SURREALDB_NAMESPACE:-ouroboros}"
SURR_DATABASE="${CODEGRAPH_SURREALDB_DATABASE:-codegraph}"
HTTP_PORT="${CODEGRAPH_HTTP_PORT:-3000}"
INSTALL_PATH="${CARGO_HOME:-$HOME/.cargo}/bin"
info() { printf '[INFO] %s\n' "$1"; }
warn() { printf '[WARN] %s\n' "$1"; }
fail() { printf '[ERROR] %s\n' "$1"; exit 1; }
info "Preparing to install CodeGraph (FULL FEATURES)"
info "This build includes ALL embedding providers, LLMs, and experimental features"

[[ "${OSTYPE:-}" == darwin* ]] || fail "This installer targets macOS."
command -v brew >/dev/null 2>&1 || fail "Homebrew is required (https://brew.sh)."
command -v cargo >/dev/null 2>&1 || fail "Rust is required (https://rustup.rs)."

# Check for SurrealDB
if ! command -v surreal >/dev/null 2>&1; then
    warn "SurrealDB CLI not found; installing via Homebrew..."
    brew install surrealdb/tap/surreal >/dev/null
    info "SurrealDB CLI installed"
else
    info "SurrealDB CLI detected"
fi

# Check for Ollama (optional but recommended)
if ! command -v ollama >/dev/null 2>&1; then
    warn "Ollama not found - local LLM/embedding support will be limited"
    warn "Install from: https://ollama.com/download"
else
    info "Ollama detected"
fi
export MACOSX_DEPLOYMENT_TARGET=11.0
info "Compiling CodeGraph with ALL features..."
info "Features: ${FEATURES}"
info ""
info "This may take 5-10 minutes depending on your machine..."

cargo install --path crates/codegraph-mcp --features "${FEATURES}" --force

info "CodeGraph (full features) installed to ${INSTALL_PATH}"
cat <<EOF

✅ Installation Complete - Full Feature Build
=============================================

Enabled Features:
-----------------
🔧 Daemon mode (file watching & auto re-indexing)
🤖 AI-enhanced (agentic tools with tier-aware reasoning)
🧠 All embedding providers:
   - Local: Candle (CPU/GPU), ONNX Runtime
   - Cloud: OpenAI, Jina AI
   - Local API: Ollama, LM Studio
🗣️ All LLM providers:
   - Anthropic Claude (Sonnet, Opus)
   - OpenAI (GPT-4, GPT-5)
   - xAI (Grok)
   - Ollama (local)
   - LM Studio (local)
   - OpenAI-compatible endpoints
🌐 HTTP server (SSE streaming support)
🔬 AutoAgents framework (experimental)
🗄️ SurrealDB backend with HNSW vector search

Next Steps
----------
1. Start SurrealDB with persistent storage:
   surreal start --bind 0.0.0.0:3004 --user root --pass root file://\$HOME/.codegraph/surreal.db

2. Configure your preferred providers in ~/.codegraph/config.toml:

   [embedding]
   provider = "lmstudio"    # or ollama, jina, openai, onnx, local
   model = "jina-embeddings-v4"
   lmstudio_url = "http://localhost:1234/v1"
   dimension = 2048

   [llm]
   enabled = true
   provider = "anthropic"   # or openai, xai, ollama, lmstudio
   model = "claude-sonnet-4"

   [surrealdb]
   url = "${SURR_URL}"
   namespace = "${SURR_NAMESPACE}"
   database = "${SURR_DATABASE}"

3. Set API keys for cloud providers (if using):
   export ANTHROPIC_API_KEY=sk-ant-...
   export OPENAI_API_KEY=sk-...
   export JINA_API_KEY=jina_...

4. Index your repository:
   codegraph init .
   codegraph index . --languages rust,python,typescript

5. Start the MCP server:
   # STDIO mode (Claude Desktop, etc.)
   codegraph start stdio

   # HTTP mode (SSE streaming)
   codegraph start http --host 127.0.0.1 --port ${HTTP_PORT}

6. Enable daemon mode for auto-indexing (optional):
   codegraph daemon start . --foreground

Ensure ${INSTALL_PATH} is on your PATH so editors can find the binary.

For more info: https://github.com/yourusername/codegraph-rust
EOF
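A note on the completion banner: it is printed with an unquoted heredoc, so shell variables such as ${SURR_URL} expand when the script runs, while the escaped \$HOME survives literally for the reader to substitute later. A minimal standalone sketch of that distinction:

```shell
#!/bin/bash
# Unquoted heredoc: ${SURR_URL} expands at runtime,
# but the escaped \$HOME is emitted literally.
SURR_URL="ws://localhost:3004"
cat <<EOF
url = ${SURR_URL}
path = file://\$HOME/.codegraph/surreal.db
EOF
```

Run on its own, this prints `url = ws://localhost:3004` followed by `path = file://$HOME/.codegraph/surreal.db`, with `$HOME` left unexpanded.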

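The connection settings near the top of the script use the ${VAR:-default} expansion, so callers can override them from the environment without editing the file. A small sketch of the pattern, reusing the script's own variable names and defaults:

```shell
#!/bin/bash
# ${VAR:-default} substitutes the default when VAR is unset or empty,
# so environment variables act as optional overrides.
SURR_URL="${CODEGRAPH_SURREALDB_URL:-ws://localhost:3004}"
HTTP_PORT="${CODEGRAPH_HTTP_PORT:-3000}"
echo "url=${SURR_URL} port=${HTTP_PORT}"
```

So, hypothetically, running the installer as `CODEGRAPH_HTTP_PORT=4000 ./install-codegraph-full-features.sh` would produce the same binary but print startup instructions for port 4000.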
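The prerequisite checks rely on `command -v`, which succeeds only when the named tool is on PATH. A generic sketch of the same guard; the `require` helper and the fake tool name are illustrative, not part of the script:

```shell
#!/bin/bash
# command -v exits 0 when the tool exists on PATH, nonzero otherwise;
# the installer uses this to fail fast on missing prerequisites.
require() {
    command -v "$1" >/dev/null 2>&1 || { printf '[ERROR] %s is required\n' "$1" >&2; return 1; }
}
require sh && echo "sh found"
require no-such-tool-xyz || echo "no-such-tool-xyz missing"
```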