A LangChain-based customer support agent with:

- Retrieval-augmented responses from a local knowledge base (`knowledge/*.txt`, optional PDFs)
- Optional backend API tooling for order/customer workflows
- Two local interfaces: CLI and Streamlit web chat

Repository: https://github.com/Stennis1/Customer-Service-AI-Agent
Live URL: https://customer-service-ai-agent-a97r.onrender.com/
## Features

- Tool-enabled agent (`search_knowledge_base`, `check_order_status`, `escalate_to_human`)
- Conversation memory (last 10 exchanges)
- Chroma vector store for semantic document retrieval
- Optional REST API integration for live order status/tickets/customers
- CLI mode for terminal usage
- Web mode for browser chat via Streamlit
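The bounded conversation memory can be sketched with a `deque`; this `ConversationMemory` class is a hypothetical stand-in for the agent's actual memory handling, with the 10-exchange cap matching the feature above:

```python
from collections import deque


class ConversationMemory:
    """Keeps only the most recent N user/assistant exchanges."""

    def __init__(self, max_exchanges: int = 10):
        # Each entry is one (user_message, assistant_reply) exchange;
        # deque(maxlen=...) silently drops the oldest entry when full.
        self.exchanges = deque(maxlen=max_exchanges)

    def add(self, user_message: str, assistant_reply: str) -> None:
        self.exchanges.append((user_message, assistant_reply))

    def as_context(self) -> str:
        """Render stored exchanges as plain text for the prompt."""
        return "\n".join(f"User: {u}\nAgent: {a}" for u, a in self.exchanges)


memory = ConversationMemory(max_exchanges=10)
for i in range(12):
    memory.add(f"question {i}", f"answer {i}")
print(len(memory.exchanges))  # 10 — the oldest two exchanges were evicted
```

The `maxlen` argument makes eviction automatic, so the prompt context stays bounded no matter how long the chat runs.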
## Tech Stack

- Python 3.10+
- LangChain
- OpenAI API
- ChromaDB
- Streamlit
- Requests
## Project Structure

```
.
├── agent.py            # Core agent class and tools wiring
├── app.py              # CLI entrypoint
├── web_app.py          # Streamlit web UI entrypoint
├── knowledge_base.py   # Document loading + vector retrieval
├── api_tools.py        # Optional external API client tools
├── knowledge/          # Support knowledge corpus (.txt/.pdf)
├── requirements.txt    # Top-level dependencies
├── .env.example        # Environment variable template
└── test_setup.py       # Basic OpenAI connection check
```
## How It Works

- The user sends a question in the CLI or Web UI.
- `CustomerServiceAgent` in `agent.py` receives the message.
- The agent selects tools when needed:
  - `search_knowledge_base` -> semantic search via Chroma in `knowledge_base.py`
  - `check_order_status` -> external API (`api_tools.py`) if configured, else a mock fallback
  - `escalate_to_human` -> escalation response
- The agent returns the final answer to the interface.
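The routing above can be sketched in plain Python. Tool names match this repository, but the keyword-based selector and the mock tool bodies here are illustrative stand-ins for the LLM-driven tool choice in `agent.py`:

```python
def search_knowledge_base(query: str) -> str:
    # Stand-in for the Chroma semantic search in knowledge_base.py.
    return f"[kb results for: {query}]"


def check_order_status(order_id: str) -> str:
    # Stand-in for the live/mock order lookup in api_tools.py.
    return f"[status of {order_id}]"


def escalate_to_human(reason: str) -> str:
    return f"[escalated: {reason}]"


def route(message: str) -> str:
    """Crude keyword router; the real agent lets the LLM pick the tool."""
    lower = message.lower()
    if "order" in lower:
        # Naively treat the last token as the order ID.
        return check_order_status(order_id=message.split()[-1])
    if "human" in lower or "agent" in lower:
        return escalate_to_human(reason=message)
    return search_knowledge_base(query=message)


print(route("Where is order ORD123"))  # [status of ORD123]
```

The real agent replaces `route` with model-driven tool selection, but the shape — one message in, one tool dispatched, one answer out — is the same.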
## Prerequisites

- Python 3.10 or newer
- OpenAI API key
- `pip` and virtual environment support
## Setup

- Clone and enter the project:

  ```bash
  git clone https://github.com/Stennis1/Customer-Service-AI-Agent.git
  cd Customer-Service-AI-Agent
  ```

- Create and activate a virtual environment:

  ```bash
  python3 -m venv .venv
  source .venv/bin/activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Configure environment:

  ```bash
  cp .env.example .env
  ```

  Edit `.env` with at least:

  ```
  OPENAI_API_KEY=your_openai_key
  ```

  Optional API integration:

  ```
  CUSTOMER_API_BASE_URL=https://api.example.com
  CUSTOMER_API_KEY=your_customer_api_key
  ```

## Usage

CLI:

```bash
python app.py
```

Web UI:

```bash
streamlit run web_app.py
```

Streamlit will print a local URL (typically http://localhost:8501).
## Deployment (Render)

This repository includes a Render Blueprint file, `render.yaml`, in the project root.

Live deployment: https://customer-service-ai-agent-a97r.onrender.com/

Deploy steps:

- Push the latest code to GitHub.
- In Render, choose New -> Blueprint.
- Select this repository (Render reads `render.yaml` automatically).
- Set the required environment variables in Render:
  - `OPENAI_API_KEY` (required)
  - `OPENAI_MODEL` (optional)
  - `CUSTOMER_API_BASE_URL` and `CUSTOMER_API_KEY` (optional, for live backend API calls)
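The `render.yaml` in the repository is authoritative; as a rough orientation, a minimal Blueprint for a Streamlit web service tends to look something like this (service name, commands, and the env-var list here are assumptions, not the repo's actual file):

```yaml
services:
  - type: web
    name: customer-service-ai-agent   # hypothetical service name
    runtime: python
    buildCommand: pip install -r requirements.txt
    startCommand: streamlit run web_app.py --server.port $PORT --server.address 0.0.0.0
    envVars:
      - key: OPENAI_API_KEY
        sync: false   # secret — set manually in the Render dashboard
```

Marking a key with `sync: false` tells Render to prompt for the value instead of storing it in the repo, which is why the deploy steps above ask you to set the secrets in the dashboard.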
## Knowledge Base

- Add/update support content under `knowledge/`.
- Supported loaders: `.txt` via `TextLoader`, `.pdf` via `PyPDFLoader`.
- Chunks are embedded and stored in `./chromadb`.
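The chunking step can be sketched with a simple character splitter — a stand-in for the LangChain text splitter presumably used in `knowledge_base.py`; the sizes are illustrative:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks, the shape fed to the embedder."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap  # advance less than a full chunk so chunks overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks


doc = "refund policy " * 100  # pretend contents of a knowledge/*.txt file
chunks = chunk_text(doc, chunk_size=200, overlap=20)
print(len(chunks), len(chunks[0]))  # → 8 200
```

The overlap keeps sentences that straddle a chunk boundary retrievable from both sides; each chunk is then embedded and written to the Chroma store.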
## Testing

Basic OpenAI connection test:

```bash
python test_setup.py
```

Syntax check:

```bash
python -m py_compile agent.py app.py web_app.py api_tools.py knowledge_base.py
```

## Live vs Mock API Mode

If `CUSTOMER_API_BASE_URL` and `CUSTOMER_API_KEY` are set, `check_order_status` uses live HTTP calls. If they are not set, the agent falls back to mock order-status behavior for `ORD*` IDs.
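The fallback logic can be sketched as follows. The function and env-var names match this README, but the endpoint path and the mock payload are assumptions:

```python
import os


def check_order_status(order_id: str) -> dict:
    """Use the live backend when configured, else mock data for ORD* IDs."""
    base_url = os.environ.get("CUSTOMER_API_BASE_URL")
    api_key = os.environ.get("CUSTOMER_API_KEY")
    if base_url and api_key:
        # Live mode — the /orders/{id} path is hypothetical.
        import requests  # deferred so mock mode works without the dependency
        resp = requests.get(
            f"{base_url}/orders/{order_id}",
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()
    # Mock fallback: only recognize IDs with the ORD prefix.
    if order_id.startswith("ORD"):
        return {"order_id": order_id, "status": "shipped", "mock": True}
    return {"order_id": order_id, "status": "unknown", "mock": True}
```

Branching on the presence of both variables means a fresh checkout works out of the box, and the same code path upgrades to live calls once the backend is configured.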
## Environment Variables

Required:

- `OPENAI_API_KEY`: OpenAI API key used by the model and embeddings

Optional:

- `OPENAI_MODEL`: Chat model identifier (default: `openai:gpt-4o-mini`)
- `CUSTOMER_API_BASE_URL`: Base URL for the customer/order/support backend
- `CUSTOMER_API_KEY`: Bearer token for the backend API
- `LANGCHAIN_API_KEY`: For LangSmith tracing (if you enable it)
- `VECTOR_DB_URL`: Reserved placeholder, not currently used at runtime
## Troubleshooting

- `OPENAI_API_KEY` missing:
  - Symptom: agent initialization/search errors.
  - Fix: set the key in `.env`.
- Empty knowledge results:
  - Symptom: poor or no retrieval output.
  - Fix: ensure documents exist under `knowledge/` and restart the app.
- API requests fail:
  - Symptom: order/customer/ticket tool errors.
  - Fix: verify `CUSTOMER_API_BASE_URL`, `CUSTOMER_API_KEY`, and backend reachability.
## Development Notes

- Keep `app.py` and `web_app.py` as thin entrypoints; put logic in `agent.py`.
- `knowledge_base.py` provides a backward-compatible alias `KnowledegeBase` for legacy typoed imports.
- Prefer adding tests before expanding tool or agent prompt complexity.
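The backward-compatible alias mentioned above is just a module-level rebinding; a sketch (with the class body elided):

```python
class KnowledgeBase:
    """Loads documents and serves semantic search (body elided)."""


# Legacy alias: older code imported the misspelled name, so keep it working.
KnowledegeBase = KnowledgeBase
```

Both names refer to the same class object, so `isinstance` checks and subclassing behave identically through either import.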
## Roadmap

- Add `pytest` tests for:
  - Tool behavior
  - Fallback paths when env vars are missing
  - Knowledge retrieval quality
- Add structured logging and request tracing
- Add a Dockerfile and compose profile for a reproducible local runtime
- Add a CI workflow (lint + test + type checks)
## License

No license file is currently included. Add a LICENSE before public reuse or distribution.