# Groq AI Translator: Llama 3.1 + LangChain
This notebook demonstrates how to build a fast, accurate AI translator using:

- Groq Llama 3.1 8B Instant
- LangChain LCEL pipeline
- ChatPromptTemplate
- System & Human messages
- OpenAI + Groq setup

You can translate any text into any target language.
## Features

- Translate English → Turkish
- Translate any text into any target language
- Dynamic LCEL pipeline: prompt → model → parser
- Uses high-speed Groq inference
- Demonstrates real LangChain constructs:
  - `SystemMessage`
  - `HumanMessage`
  - `ChatPromptTemplate`
  - `StrOutputParser`
  - `model.invoke()`
  - `model | parser`
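The `model | parser` syntax works because LangChain runnables overload Python's `|` operator to compose steps into a chain. As a rough illustration of that idea in plain Python (this is a toy sketch, not LangChain's actual implementation):

```python
# Toy illustration of LCEL-style composition via the | operator.
# NOT LangChain's implementation -- just the underlying idea.

class Runnable:
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, value):
        return self.fn(value)

    def __or__(self, other):
        # Piping two runnables yields a new runnable that feeds
        # the first step's output into the second step.
        return Runnable(lambda value: other.invoke(self.invoke(value)))

model = Runnable(lambda text: f"[translated] {text}")   # stand-in for the LLM
parser = Runnable(lambda msg: msg.upper())              # stand-in for StrOutputParser

chain = model | parser
print(chain.invoke("hello"))  # [TRANSLATED] HELLO
```

The real `Runnable` class adds batching, streaming, and async support, but the composition mechanism is the same.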
## Project Structure

```
groq-ai-translator/
├── translator.ipynb
├── README.md
├── requirements.txt
├── .env.example
└── .gitignore
```
## Installation

Create a virtual environment (optional):

```bash
python -m venv venv
source venv/bin/activate   # Mac/Linux
venv\Scripts\activate      # Windows
```

Install dependencies:

```bash
pip install -r requirements.txt
```
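The contents of `requirements.txt` are not shown here; a plausible minimal set for this notebook (exact package names and version pins are assumptions) would be:

```text
langchain
langchain-core
langchain-groq
langchain-openai
python-dotenv
```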
## Environment Setup

1. Duplicate `.env.example` and rename it to `.env`.
2. Add your keys:

```
OPENAI_API_KEY=your_openai_key
GROQ_API_KEY=your_groq_key
```

3. Open `translator.ipynb`.
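The notebook reads these keys from the environment before building the model. If you want to sanity-check your `.env` without extra dependencies, a minimal standard-library loader for the `KEY=value` format above (an illustrative sketch, not part of this repo) looks like:

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: one KEY=value per line; blank lines and
    '#' comments are skipped; existing environment variables win."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

In practice, the more common approach is `python-dotenv`'s `load_dotenv()`, which handles quoting and other edge cases; after loading, check `os.environ.get("GROQ_API_KEY")` before constructing the model.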
## Example Usage

### Using message objects

```python
from langchain_core.messages import SystemMessage, HumanMessage

messages = [
    SystemMessage(content="Translate this from English to Turkish"),
    HumanMessage(content="This is an AI translator app"),
]
response = model.invoke(messages)
```

### Using an LCEL pipeline

```python
chain = model | parser
chain.invoke(messages)
```

### Using a prompt template (dynamic translation)

```python
chain = prompt | model | parser
chain.invoke({"language": "french", "text": "Hello"})
```
## Author

**Shehjad Patel**
AI Engineer | LangChain | LLMs | Groq