A production-grade, CLI-based conversational AI agent that remembers important facts across conversations and responds intelligently using contextual memory.
This project is designed to be:
- Easy to clone & run
- Robust in the face of API failures
- Useful even across multiple sessions
The agent uses Gemini LLMs for response generation, FAISS for semantic memory retrieval, and SQLite for persistent chat and memory storage.
Unlike simple chatbots, this agent:
- 🧠 Remembers important facts (e.g., user name)
- 🗂 Separates short-term and long-term memory
- 🚫 Does NOT blindly load old chats
- 🎯 Uses memory only when relevant
- 💻 Runs entirely in the terminal (CLI)
- Important user facts are extracted and stored
- Relevant memories are retrieved using semantic similarity
- Memory persists across runs
- Uses vector embeddings for memory retrieval
- Ensures only relevant past context is injected
- Prevents prompt pollution
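The retrieval step can be sketched with plain NumPy (the project uses FAISS; this brute-force cosine-similarity version and the `min_sim` threshold are illustrative assumptions, not the repo's actual code):

```python
import numpy as np

def retrieve_relevant(query_vec, memory_vecs, memory_texts, top_k=2, min_sim=0.3):
    """Return only memories similar enough to the query to be worth injecting.

    The similarity floor (min_sim) is what keeps stale, unrelated context
    out of the prompt -- the "prompt pollution" guard described above.
    """
    q = query_vec / np.linalg.norm(query_vec)
    m = memory_vecs / np.linalg.norm(memory_vecs, axis=1, keepdims=True)
    sims = m @ q                       # cosine similarity per stored memory
    best = np.argsort(-sims)[:top_k]   # indices of the top_k closest memories
    return [memory_texts[i] for i in best if sims[i] >= min_sim]
```

FAISS replaces the brute-force sort with an index (e.g. inner product over normalized vectors), but the filtering idea is the same.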
- Stores:
  - Full chat history
  - Structured long-term memories
- Zero external DB setup required
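A minimal sketch of the zero-setup storage layer (the table and column names here are assumptions for illustration; the real schema in this repo may differ):

```python
import sqlite3

def init_db(path="agent.db"):
    # SQLite ships with Python, so no external database server is needed.
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS chat_history (
                        id INTEGER PRIMARY KEY AUTOINCREMENT,
                        role TEXT NOT NULL,          -- 'user' or 'assistant'
                        content TEXT NOT NULL,
                        created_at TEXT DEFAULT CURRENT_TIMESTAMP)""")
    conn.execute("""CREATE TABLE IF NOT EXISTS memories (
                        id INTEGER PRIMARY KEY AUTOINCREMENT,
                        fact TEXT NOT NULL UNIQUE,   -- structured long-term fact
                        created_at TEXT DEFAULT CURRENT_TIMESTAMP)""")
    conn.commit()
    return conn
```

Because both tables live in one file (or `:memory:` for tests), chat history and long-term memories persist across runs with no setup beyond `sqlite3.connect`.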
- Uses the official `google-genai` SDK
- Graceful handling of:
  - Quota exhaustion
  - API errors
- Clean fallback messages instead of crashes
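The fallback behaviour can be sketched as a wrapper around whatever function actually calls the model (`generate_fn` here is a stand-in for a thin `google-genai` client call; the fallback message text is an assumption, not the project's wording):

```python
def safe_generate(generate_fn, prompt,
                  fallback="Sorry, the model is unavailable right now. Please try again later."):
    """Call the LLM and return a clean fallback message instead of crashing.

    generate_fn: any callable taking a prompt string and returning text,
    e.g. a small wrapper around the google-genai client.
    """
    try:
        return generate_fn(prompt)
    except Exception as exc:  # quota exhaustion, network errors, API errors
        print(f"[warn] generation failed: {exc}")
        return fallback
```

Catching at this single choke point means every code path that talks to Gemini degrades to a readable message rather than a traceback.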
- No UI required
- Commands: `/help`, `/exit`
- Designed for developers & terminal users
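Command dispatch in a loop like this can be sketched as follows (the function name and return convention are illustrative, not taken from `main.py`):

```python
def handle_command(line):
    """Return (handled, should_exit) for a line typed at the `You >` prompt."""
    cmd = line.strip().lower()
    if cmd == "/help":
        print("Commands: /help (show this message), /exit (quit)")
        return True, False
    if cmd == "/exit":
        return True, True
    return False, False  # not a command -> pass through to the agent as chat
```

Anything that is not a recognised slash command falls through to the normal chat path.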
- Python 3.10+
- Gemini LLM (via `google-genai`)
- FAISS (vector similarity search)
- SQLite (persistent storage)
- dotenv (environment management)
Clone the repository:

```bash
git clone https://github.com/malikdeepak09/Conversation_Agent.git
cd Conversation_Agent
```

Create and activate a virtual environment (Linux/macOS):

```bash
python3 -m venv myenv
source myenv/bin/activate
```

On Windows:

```bash
python -m venv myenv
myenv\Scripts\activate
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Add your API key to a `.env` file:

```env
GEMINI_API_KEY=your_gemini_api_key_here
```

Run the agent:

```bash
python main.py
```

```text
🤖 Conversation Agent (CLI)
Type /help for commands, /exit to quit
----------------------------------------
You >
```

If you'd like to contribute to this project, feel free to fork it and submit pull requests. Please ensure that your code follows the existing style and includes proper tests.
This project is licensed under the MIT License - see the LICENSE file for details.