Part of the Lamatic AgentKit · `kits/embed/chat`
An AI-powered Next.js app that converts raw meeting transcripts into structured insights (summary, action items, risks, next steps, follow-up email) and delivers them to Slack — all through a Lamatic chat widget embedded in the UI.
```
kits/embed/chat/
├── app/
│   ├── page.js                    # Landing page — Server Component
│   ├── layout.js                  # Root layout (Geist font + Vercel Analytics)
│   ├── globals.css                # Tailwind v4 theme + CSS variables
│   └── Screenshots/               # Demo screenshots
│       ├── 1.png
│       ├── fromLamatic-Running.png
│       ├── FromwebPage-With Followup mail-Running.png
│       └── Slack_integrated-Summarizer.png
├── components/
│   ├── LamaticChat.js             # Widget lifecycle — mounts root div + script
│   ├── HeroActions.jsx            # Interactive hero buttons (Client Component)
│   ├── TranscriptPlayground.jsx   # Textarea + Analyze button
│   └── ui/                        # shadcn/ui primitives
├── flows/
│   └── embedded-chatbot-chatbot/  # Exported Lamatic flow JSON
├── .env.local                     # ← create this (see below)
└── package.json
```
- Node.js 18+
- A Lamatic account with your meeting intelligence flow deployed
- A Slack incoming webhook URL (configured in your Lamatic flow)
```bash
cd kits/embed/chat
npm install
```

Create `.env.local` with:

```bash
NEXT_PUBLIC_LAMATIC_PROJECT_ID=your_project_id
NEXT_PUBLIC_LAMATIC_FLOW_ID=your_flow_id
NEXT_PUBLIC_LAMATIC_API_URL=https://your-project.lamatic.dev
```

All three values are available in Lamatic Studio → Project Settings → Embed Widget.

```bash
npm run dev
# Open http://localhost:3000
```

To deploy:

```bash
# Option A — Vercel CLI
vercel --root kits/embed/chat

# Option B — click the button at the top of this file
```

Add the same three `NEXT_PUBLIC_*` env vars in the Vercel dashboard.
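Because all three values are read client-side, a missing variable only surfaces once the widget tries to boot. A small guard can fail fast instead; the sketch below is illustrative (`readWidgetConfig` is a hypothetical helper, not part of the kit):

```javascript
// Hypothetical helper: collects the three NEXT_PUBLIC_* values the widget
// needs and throws a descriptive error if any of them is missing.
function readWidgetConfig(env = process.env) {
  const keys = [
    "NEXT_PUBLIC_LAMATIC_PROJECT_ID",
    "NEXT_PUBLIC_LAMATIC_FLOW_ID",
    "NEXT_PUBLIC_LAMATIC_API_URL",
  ];
  const missing = keys.filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing Lamatic env vars: ${missing.join(", ")}`);
  }
  return {
    projectId: env.NEXT_PUBLIC_LAMATIC_PROJECT_ID,
    flowId: env.NEXT_PUBLIC_LAMATIC_FLOW_ID,
    apiUrl: env.NEXT_PUBLIC_LAMATIC_API_URL,
  };
}
```

Calling a guard like this at the top of the widget component turns a silent blank widget into an explicit build/runtime error.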
After deploying, add your Vercel production URL to the Allowed Domains list in your Lamatic Chat Trigger node, then redeploy the flow.
| # | Node | Purpose |
|---|---|---|
| 1 | Chat Trigger | Receives widget messages; whitelist your domain here |
| 2 | Generate Text | LLM call — extracts summary, action items, risks, next steps, email |
| 3 | Generate JSON | Structures the LLM output for Slack formatting |
| 4a | Slack API | Sends formatted card to your Slack channel |
| 4b | Chat Response | Streams the result back to the widget UI |
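The exact schema produced by the Generate JSON node lives in the exported flow JSON; the shape below, and the `toSlackBlocks` formatter, are illustrative assumptions of what the step 3 → 4a handoff might look like:

```javascript
// Illustrative only: the real schema is defined in the exported flow.
const insights = {
  summary: "Q3 roadmap sync: shipped items reviewed, two launches at risk.",
  actionItems: ["Alice: finalize pricing page by Friday"],
  risks: ["Launch date slips if legal review is late"],
  nextSteps: ["Schedule follow-up with legal"],
};

// Turns the structured insights into Slack Block Kit sections (step 4a).
function toSlackBlocks(i) {
  const section = (title, lines) => ({
    type: "section",
    text: { type: "mrkdwn", text: `*${title}*\n${lines.join("\n")}` },
  });
  return [
    section("Summary", [i.summary]),
    section("Action items", i.actionItems),
    section("Risks", i.risks),
    section("Next steps", i.nextSteps),
  ];
}
```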
In the Chat Trigger node config, add `*` for development or your exact domain for production. Without this, the widget returns 400 on every message.
Screenshots (in `app/Screenshots/`): Web App · Lamatic Studio · Slack Output · Landing Page.
```
page.js (Server Component)
└── <LamaticChat />   (Client Component)
        │
        │  On mount (useEffect):
        ├── Creates <div id="lamatic-chat-root"
        │             data-api-url="..."
        │             data-flow-id="..."
        │             data-project-id="..."
        │           /> and appends to document.body
        │
        └── Injects <script type="module"
                      src="https://widget.lamatic.ai/chat-v2?projectId=...">
```

The widget's React app mounts into `#lamatic-chat-root`, fetches `chatConfig`, creates an IndexedDB session, and is ready to send messages.
Why bootstrap on mount (not on button click)?
The widget needs ~500 ms to fetch chatConfig and create a session. Bootstrapping immediately on page load means the widget is fully ready before the user clicks "Open Copilot", preventing the "unexpected error" on the first message send.
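The mount-time bootstrap can be sketched roughly as follows; the function names (`mountWidget`, `widgetScriptSrc`) are illustrative, not the component's actual exports:

```javascript
// Pure part: build the widget <script> URL from the project id.
function widgetScriptSrc(projectId) {
  return `https://widget.lamatic.ai/chat-v2?projectId=${encodeURIComponent(projectId)}`;
}

// DOM part (runs inside useEffect in the real component): creates the
// root div with the data-* config attributes, then injects the module
// script so the widget can boot before the user opens the chat.
function mountWidget({ projectId, flowId, apiUrl }, doc = document) {
  const root = doc.createElement("div");
  root.id = "lamatic-chat-root";
  root.dataset.apiUrl = apiUrl;       // rendered as data-api-url
  root.dataset.flowId = flowId;       // rendered as data-flow-id
  root.dataset.projectId = projectId; // rendered as data-project-id
  doc.body.appendChild(root);

  const script = doc.createElement("script");
  script.type = "module";
  script.src = widgetScriptSrc(projectId);
  doc.body.appendChild(script);

  // Effect cleanup: remove both nodes on unmount.
  return () => {
    root.remove();
    script.remove();
  };
}
```

Returning the cleanup closure from the effect keeps repeated mounts (e.g. React Strict Mode double-invoking effects in development) from stacking duplicate widget roots.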
MIT — see LICENSE.
Built by Vijayshree Vaibhav for the Lamatic AgentKit Challenge.



