LLM-powered search router that chooses direct answers or live web retrieval, returning concise, validated responses with optional source citations via a modular Express + LangChain backend.
Updated Feb 24, 2026 - TypeScript
LangChainRunnables is a Python-based project showcasing and teaching five distinct LangChain workflows—Branch, Lambda, Parallel, Passthrough, and Sequence—using OpenRouter’s free API. It demonstrates AI-driven text-processing tasks: generating facts, summarizing reports, creating notes and quizzes, responding to sentiment feedback, and crafting jokes.
A Multi-Domain AI Personal Assistant built with LangChain, OpenAI, and Streamlit. The application integrates multiple specialized capabilities, including document-based question answering (QA); Python code execution, debugging, explanation, and optimization; web search; latest-news retrieval; and currency conversion.
A collection of basic code snippets and experiments created while learning Generative AI fundamentals.