Arch provides first-class support for multiple LLM providers through native integrations and OpenAI-compatible interfaces. This guide covers the supported providers, their available chat models, and how to configure each one.
.. note::

   **Model Support:** Arch supports all chat models from each provider, not just the examples shown in this guide. The configurations below demonstrate common models for reference, but you can use any chat model available from your chosen provider.
Configuration Structure
-----------------------
All providers are configured in the ``llm_providers`` section of your ``arch_config.yaml`` file:
.. code-block:: yaml

   version: v0.1

   listeners:
     egress_traffic:
       address: 0.0.0.0
       port: 12000
       message_format: openai
       timeout: 30s

   llm_providers:
     # Provider configurations go here
     - model: provider/model-name
       access_key: $API_KEY
       # Additional provider-specific options
**Common Configuration Fields:**
- ``model``: Provider prefix and model name (format: ``provider/model-name``)
- ``access_key``: API key for authentication (supports environment variables)
- ``default``: Mark a model as the default (optional, boolean)
- ``name``: Custom name for the provider instance (optional)
- ``base_url``: Custom endpoint URL (required for some providers)
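
As a sketch of how the optional fields combine, the entry below names a provider instance and marks it as the default (the instance name here is illustrative, not a required value):

.. code-block:: yaml

   llm_providers:
     - name: primary-chat          # optional custom name for this instance
       model: openai/gpt-4o-mini
       access_key: $OPENAI_API_KEY
       default: true               # mark this model as the default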
Provider Categories
-------------------
**First-Class Providers**
Native integrations with built-in support for provider-specific features and authentication.
**OpenAI-Compatible Providers**
Any provider that implements the OpenAI API interface can be configured using custom endpoints.
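
As a sketch, such a provider is pointed at its endpoint via ``base_url``; the endpoint URL, model name, and environment variable below are placeholders, not values from any specific provider:

.. code-block:: yaml

   llm_providers:
     - name: my-compatible-provider
       model: some-provider/some-chat-model
       base_url: https://api.example.com/v1   # provider's OpenAI-compatible endpoint
       access_key: $PROVIDER_API_KEY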
Supported API Endpoints
------------------------
Arch supports the following standardized endpoints across providers:
.. list-table::
   :header-rows: 1
   :widths: 30 30 40

   * - Endpoint
     - Purpose
     - Supported Clients
   * - ``/v1/chat/completions``
     - OpenAI-style chat completions
     - OpenAI SDK, cURL, custom clients
   * - ``/v1/messages``
     - Anthropic-style messages
     - Anthropic SDK, cURL, custom clients
First-Class Providers
---------------------
OpenAI
~~~~~~
**Provider Prefix:** ``openai/``

**API Endpoint:** ``/v1/chat/completions``
**Authentication:** API Key - Get your OpenAI API key from `OpenAI Platform <https://platform.openai.com/api-keys>`_.
**Supported Chat Models:** All OpenAI chat models including GPT-5, GPT-4o, GPT-4, GPT-3.5-turbo, and all future releases.
.. list-table::
   :header-rows: 1
   :widths: 30 20 50

   * - Model Name
     - Model ID for Config
     - Description
   * - GPT-5
     - ``openai/gpt-5``
     - Next-generation model (use any model name from OpenAI's API)
   * - GPT-4o
     - ``openai/gpt-4o``
     - Latest multimodal model
   * - GPT-4o mini
     - ``openai/gpt-4o-mini``
     - Fast, cost-effective model
   * - GPT-4
     - ``openai/gpt-4``
     - High-capability reasoning model
   * - GPT-3.5 Turbo
     - ``openai/gpt-3.5-turbo``
     - Balanced performance and cost
   * - o3-mini
     - ``openai/o3-mini``
     - Reasoning-focused model (preview)
   * - o3
     - ``openai/o3``
     - Advanced reasoning model (preview)
**Configuration Examples:**
.. code-block:: yaml

   llm_providers:
     # Latest models (examples - use any OpenAI chat model)
     - model: openai/gpt-4o-mini
       access_key: $OPENAI_API_KEY
       default: true

     - model: openai/gpt-4o
       access_key: $OPENAI_API_KEY

     # Use any model name from OpenAI's API
     - model: openai/gpt-5
       access_key: $OPENAI_API_KEY
Anthropic
~~~~~~~~~
**Provider Prefix:** ``anthropic/``

**API Endpoint:** ``/v1/messages``
**Authentication:** API Key - Get your Anthropic API key from `Anthropic Console <https://console.anthropic.com/settings/keys>`_.
**Supported Chat Models:** All Anthropic Claude models including Claude Sonnet 4, Claude 3.5 Sonnet, Claude 3.5 Haiku, Claude 3 Opus, and all future releases.
.. list-table::
   :header-rows: 1
   :widths: 30 20 50

   * - Model Name
     - Model ID for Config
     - Description
   * - Claude Sonnet 4
     - ``anthropic/claude-sonnet-4``
     - Next-generation model (use any model name from Anthropic's API)
   * - Claude 3.5 Sonnet
     - ``anthropic/claude-3-5-sonnet-20241022``
     - Latest high-performance model
   * - Claude 3.5 Haiku
     - ``anthropic/claude-3-5-haiku-20241022``
     - Fast and efficient model
   * - Claude 3 Opus
     - ``anthropic/claude-3-opus-20240229``
     - Most capable model for complex tasks
   * - Claude 3 Sonnet
     - ``anthropic/claude-3-sonnet-20240229``
     - Balanced performance model
   * - Claude 3 Haiku
     - ``anthropic/claude-3-haiku-20240307``
     - Fastest model
**Configuration Examples:**
.. code-block:: yaml

   llm_providers:
     # Latest models (examples - use any Anthropic chat model)
     - model: anthropic/claude-3-5-sonnet-20241022
       access_key: $ANTHROPIC_API_KEY

     - model: anthropic/claude-3-5-haiku-20241022
       access_key: $ANTHROPIC_API_KEY

     # Use any model name from Anthropic's API
     - model: anthropic/claude-sonnet-4
       access_key: $ANTHROPIC_API_KEY
DeepSeek
~~~~~~~~
**Provider Prefix:** ``deepseek/``

**API Endpoint:** ``/v1/chat/completions``
**Authentication:** API Key - Get your DeepSeek API key from `DeepSeek Platform <https://platform.deepseek.com/api_keys>`_.
**Supported Chat Models:** All DeepSeek chat models including DeepSeek-Chat, DeepSeek-Coder, and all future releases.
.. list-table::
   :header-rows: 1
   :widths: 30 20 50

   * - Model Name
     - Model ID for Config
     - Description
   * - DeepSeek Chat
     - ``deepseek/deepseek-chat``
     - General purpose chat model
   * - DeepSeek Coder
     - ``deepseek/deepseek-coder``
     - Code-specialized model
**Configuration Examples:**
.. code-block:: yaml

   llm_providers:
     - model: deepseek/deepseek-chat
       access_key: $DEEPSEEK_API_KEY

     - model: deepseek/deepseek-coder
       access_key: $DEEPSEEK_API_KEY
Mistral AI
~~~~~~~~~~
**Provider Prefix:** ``mistral/``

**API Endpoint:** ``/v1/chat/completions``
**Authentication:** API Key - Get your Mistral API key from `Mistral AI Console <https://console.mistral.ai/api-keys/>`_.
**Supported Chat Models:** All Mistral chat models including Mistral Large, Mistral Small, Ministral, and all future releases.
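
**Configuration Examples:**

A configuration sketch following the same pattern as the providers above; the model names below are illustrative, so substitute any chat model from Mistral's API:

.. code-block:: yaml

   llm_providers:
     - model: mistral/mistral-large-latest
       access_key: $MISTRAL_API_KEY

     - model: mistral/mistral-small-latest
       access_key: $MISTRAL_API_KEY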
Azure OpenAI
~~~~~~~~~~~~
**Authentication:** API Key + Base URL - Get your Azure OpenAI API key from `Azure Portal <https://portal.azure.com/>`_ → Your OpenAI Resource → Keys and Endpoint.

**Supported Chat Models:** All Azure OpenAI chat models, including GPT-4o, GPT-4, and GPT-3.5-turbo, deployed in your Azure subscription.

Amazon Bedrock
~~~~~~~~~~~~~~
**Authentication:** AWS Bearer Token + Base URL - Get your API keys from `AWS Bedrock Console <https://console.aws.amazon.com/bedrock/>`_ → Discover → API Keys.

**Supported Chat Models:** All Amazon Bedrock foundation models, including Claude (Anthropic), Nova (Amazon), Llama (Meta), Mistral AI, and Cohere Command models.

Qwen
~~~~
**Authentication:** API Key + Base URL - Get your Qwen API key from `Qwen Portal <https://modelstudio.console.alibabacloud.com/>`_ → Your Qwen Resource → Keys and Endpoint.

**Supported Chat Models:** All Qwen chat models including Qwen3, Qwen3-Coder, and all future releases.