Support for Custom LLM Providers (OpenAI Schema Compatible) #321

@antongulin

Description

Problem

Currently, the platform supports only a handful of specific LLM providers. I would love to see an option to add custom LLM providers.

Since the vast majority of emerging AI platforms and local inference servers (such as Groq, Together AI, OpenRouter, LM Studio, and vLLM) have adopted the standard OpenAI API schema, adding a generic "Custom Provider" option would be incredibly powerful.

Proposed Solution

Add a "Custom OpenAI-Compatible Provider" option in the LLM settings where users can simply define:

  1. Base URL (e.g., https://api.groq.com/openai/v1)
  2. API Key
  3. Model Name (e.g., llama3-70b-8192)

This would instantly unlock dozens of external providers and local models without the maintainers needing to build and maintain native integrations for each one.
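As a rough illustration of how little plumbing the feature needs, here is a minimal sketch of assembling an OpenAI-schema chat request from just the three proposed settings. All names here (`CustomProviderConfig`, `build_chat_request`) are hypothetical and not part of any existing codebase; the endpoint path and header format follow the standard OpenAI `POST /chat/completions` convention that the listed providers implement.

```python
from dataclasses import dataclass


@dataclass
class CustomProviderConfig:
    """The three user-supplied settings proposed above."""
    base_url: str  # e.g. "https://api.groq.com/openai/v1"
    api_key: str
    model: str     # e.g. "llama3-70b-8192"


def build_chat_request(cfg: CustomProviderConfig,
                       messages: list[dict]) -> tuple[str, dict, dict]:
    """Assemble the URL, headers, and JSON body for an
    OpenAI-compatible chat completions call."""
    # The chat completions endpoint lives at <base_url>/chat/completions.
    url = cfg.base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {cfg.api_key}",
        "Content-Type": "application/json",
    }
    body = {"model": cfg.model, "messages": messages}
    return url, headers, body
```

Because every provider on the list speaks this same schema, the only per-provider differences are the three config values; no provider-specific client code is needed.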

Alternatives Considered

No response

Additional Context

No response

Willing to Help Implement?

  • I would like to work/help with this

Metadata

Assignees

No one assigned

Labels

enhancement (New feature or request)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
