sohaibdevv/synapse-ai
Synapse - AI Conversational Chat

Synapse is a modern, responsive, and intuitive web-based chat application designed to showcase the capabilities of the Google Gemini API. It demonstrates how to integrate a powerful Large Language Model (LLM) into a user-friendly frontend interface, providing a seamless conversational experience.

Purpose

The primary goal of this project is to test and demonstrate the advanced features of the Google Gemini API, specifically the gemini-2.5-flash model. It aims to highlight:

  • Real-time Interaction: Streaming responses from the AI to create a fluid and dynamic conversation.
  • Rich Content Formatting: The AI's ability to generate and display formatted content like code blocks, lists, and other markdown elements.
  • Ease of Integration: How the @google/genai SDK can be cleanly integrated into a modern React application.
  • Response Quality: The quality, speed, and helpfulness of the responses generated by the Gemini model.

Features

  • Real-time Streaming: Messages from the AI are streamed token-by-token, providing immediate feedback.
  • Markdown Support: Renders bold text, italics, code blocks, and more for clear and structured responses.
  • Responsive Design: A clean and modern UI that works seamlessly on both desktop and mobile devices.
  • Error Handling: Provides clear feedback to the user in case of API errors.
  • Easy to Use: A simple, single-page interface focused on the conversation.
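The streaming behavior described above can be sketched as follows. This is not the project's actual code; a mock async generator stands in for the Gemini network stream so the consumption pattern is runnable on its own, and the `Chunk` type and function names are illustrative.

```typescript
// Minimal shape of a streamed response chunk (illustrative, not the SDK's type).
type Chunk = { text: string };

// Mock stream standing in for the model's token-by-token response.
async function* mockStream(): AsyncGenerator<Chunk> {
  for (const text of ["Hello", ", ", "world", "!"]) {
    yield { text };
  }
}

// Accumulate chunks, invoking a callback after each one so the UI can
// re-render the partial message immediately.
async function consumeStream(
  stream: AsyncGenerator<Chunk>,
  onUpdate: (partial: string) => void
): Promise<string> {
  let message = "";
  for await (const chunk of stream) {
    message += chunk.text;
    onUpdate(message); // e.g., a React state setter in the real app
  }
  return message;
}

consumeStream(mockStream(), (partial) => console.log(partial));
```

In a React component, `onUpdate` would typically be a state setter, so each chunk triggers a re-render of the in-progress message.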

Tech Stack

This project is built with a modern, lightweight tech stack, avoiding complex build configurations.

  • Frontend Library: React with TypeScript for a robust and type-safe component-based architecture.
  • AI Integration: Google Gemini API (@google/genai) using the gemini-2.5-flash model for fast and high-quality chat completions.
  • Styling: Tailwind CSS for a utility-first approach to building a custom and responsive design.
  • Module System: Native ES Modules with importmap for dependency management directly in the browser, using esm.sh as a CDN. This demonstrates a modern, buildless development approach.
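The buildless importmap setup described above could look roughly like this in index.html. The esm.sh URLs and package versions here are illustrative, not the project's exact pins:

```html
<!-- Buildless dependency management: the browser resolves bare module
     specifiers via this import map, served from the esm.sh CDN. -->
<script type="importmap">
{
  "imports": {
    "react": "https://esm.sh/react@19",
    "react-dom/client": "https://esm.sh/react-dom@19/client",
    "@google/genai": "https://esm.sh/@google/genai"
  }
}
</script>
<script type="module" src="/index.js"></script>
```

With this in place, application code can use plain `import { GoogleGenAI } from "@google/genai"` statements with no bundler step.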

Getting Started

To run this application, you need a valid Google Gemini API key.

  1. API Key: The application reads the API key from a process.env.API_KEY environment variable. Because browsers do not expose process.env natively, the execution environment must make this value available to the app (for example, by injecting it when the files are built or served).
  2. Serve the Files: Open the index.html file in a compatible web browser or serve it using a local static file server.

The application will initialize the chat session and be ready for interaction.
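The initialization described above can be sketched with the @google/genai SDK roughly as follows. This is an assumption-laden sketch, not the project's actual code: it assumes API_KEY has been injected into process.env by the execution environment (step 1), and the `ask` helper name is hypothetical.

```typescript
import { GoogleGenAI } from "@google/genai";

// Step 1: the key must be provided by the execution environment.
const apiKey = process.env.API_KEY;
if (!apiKey) {
  throw new Error("API_KEY is not set; configure it before serving the app.");
}

const ai = new GoogleGenAI({ apiKey });

// Create a chat session targeting the model the project uses.
const chat = ai.chats.create({ model: "gemini-2.5-flash" });

// Hypothetical helper: send a prompt and log the streamed reply as it arrives.
async function ask(prompt: string): Promise<void> {
  const stream = await chat.sendMessageStream({ message: prompt });
  let reply = "";
  for await (const chunk of stream) {
    reply += chunk.text ?? "";
    console.log(reply); // partial message after each chunk
  }
}
```

Creating the chat session is a local operation; no request is sent to the API until a message is actually dispatched.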
