# Supported LLM Providers
Here is a list of all LLM chat providers available in the Otoroshi LLM Extension:
## Native providers
These providers have dedicated implementations with full support for their specific APIs:
- Anthropic
- Azure OpenAI
- Azure AI Foundry
- Cloud Temple 🇫🇷 🇪🇺
- Cloudflare
- Cohere
- Deepseek
- Gemini
- Groq
- Huggingface 🇫🇷 🇪🇺
- JLama (Local Java inference)
- Mistral 🇫🇷 🇪🇺
- Ollama (Local Models)
- OpenAI
- OpenAI Compatible (any OpenAI-compatible API)
- OVH AI Endpoints 🇫🇷 🇪🇺
- Scaleway 🇫🇷 🇪🇺
- X.ai
## OpenAI-compatible providers
These providers use an OpenAI-compatible API and are supported through the generic OpenAI compatibility layer:
- Abliteration
- AI/ML API
- Apertis
- AssemblyAI
- Cerebras
- Chutes
- CometAPI
- CompactifAI
- DeepInfra
- Empower
- Featherless AI
- Fireworks AI
- Friendli AI
- Galadriel
- GMI
- Helicone
- Hyperbolic
- Lambda AI
- LlamaGate
- Meta Llama API
- Minimax
- Morph
- Nano GPT
- Nebius AI Studio
- Novita AI
- Nscale
- Nvidia NIM
- OpenRouter
- Perplexity
- Poe
- SambaNova
- Sarvam
- Synthetic
- Together AI
- Venice AI
- Xiaomi Mimo
- Z.AI
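What "OpenAI-compatible" means in practice is that all of these providers expose the same chat-completions wire format; only the base URL and API key change from one provider to the next. The sketch below illustrates that idea with Python's standard library. The provider URLs and model names are illustrative assumptions, not values taken from the Otoroshi configuration.

```python
import json

# Illustrative base URLs: "api.example.com" is a hypothetical
# OpenAI-compatible provider, not a real endpoint.
OPENAI_COMPATIBLE_BASES = {
    "openai": "https://api.openai.com/v1",
    "example-provider": "https://api.example.com/v1",
}

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, str]:
    """Build the (url, json_body) pair for a chat-completions call.

    The body follows the OpenAI chat-completions format that every
    provider in this section accepts; swapping providers only changes
    the base_url passed in.
    """
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = build_chat_request(OPENAI_COMPATIBLE_BASES["openai"], "gpt-4o-mini", "Hello")
print(url)  # https://api.openai.com/v1/chat/completions
```

This is why a single generic compatibility layer can cover the whole list: the request shape is identical everywhere, so the extension only needs per-provider connection details.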
## Virtual providers
- Load Balancer - distributes requests across multiple providers (see Resilience)
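To illustrate what a load-balancing virtual provider does conceptually, here is a minimal round-robin sketch. It is a simplified assumption for illustration only: the provider names are placeholders, and the actual Load Balancer's strategies and configuration are described in the Resilience documentation.

```python
from itertools import cycle

# Placeholder provider names; the real Load Balancer fronts
# configured Otoroshi LLM providers, not plain strings.
providers = ["provider-a", "provider-b", "provider-c"]
_rotation = cycle(providers)

def route(prompt: str) -> tuple[str, str]:
    """Send each incoming request to the next provider in rotation."""
    return next(_rotation), prompt

# Three consecutive requests go to three different providers:
print([route("hi")[0] for _ in range(3)])  # ['provider-a', 'provider-b', 'provider-c']
```

The benefit over calling a single provider directly is that load and failures are spread across backends, which is the resilience property the Load Balancer provides.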