
Providers

LLM Client supports multiple LLM providers. Each provider has its own set of models and configuration options.

Supported Providers

  • OpenAI - GPT-4o, GPT-4o-mini, GPT-3.5-turbo
  • Groq - Llama 3.3 and Mixtral, with ultra-fast inference
  • Gemini - Google's Gemini Pro and Flash models
  • Ollama - Local LLMs (Llama 3, Phi-3, etc.)
  • Ollama Cloud - Access powerful models without a local GPU
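As a rough sketch of how per-provider configuration might be wired up, the snippet below maps each provider to a default model and the environment variable that would hold its API key. The dictionary keys, model identifiers, and environment-variable names are illustrative assumptions, not part of LLM Client's actual API; consult each provider's page for the exact values.

```python
import os

# Hypothetical per-provider configuration. Model names and env-var names
# are assumptions for illustration only.
PROVIDERS = {
    "openai": {"model": "gpt-4o-mini", "key_env": "OPENAI_API_KEY"},
    "groq":   {"model": "llama-3.3-70b-versatile", "key_env": "GROQ_API_KEY"},
    "gemini": {"model": "gemini-flash", "key_env": "GEMINI_API_KEY"},
    "ollama": {"model": "llama3", "key_env": None},  # local: no API key needed
}

def build_config(provider: str) -> dict:
    """Return a config dict for a provider, reading its key from the environment."""
    spec = PROVIDERS[provider]
    key_env = spec["key_env"]
    return {
        "provider": provider,
        "model": spec["model"],
        # Local providers such as Ollama need no key, so api_key stays None.
        "api_key": os.environ.get(key_env) if key_env else None,
    }
```

Keeping keys in environment variables (rather than hard-coding them) is the usual pattern regardless of which provider you choose.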

How to get API Keys

To use most providers, you need an API key. Refer to each provider's page for instructions on how to obtain one: