From d2157248649d8a3720d174113f82a8598629bb04 Mon Sep 17 00:00:00 2001
From: Musa
Date: Sun, 27 Jul 2025 09:26:12 -0700
Subject: [PATCH] Update llm_provider.rst (#543)

---
 docs/source/concepts/llm_provider.rst | 20 +++++++++++---------
 1 file changed, 11 insertions(+), 9 deletions(-)

diff --git a/docs/source/concepts/llm_provider.rst b/docs/source/concepts/llm_provider.rst
index 1c4b2aaf..eabdaa96 100644
--- a/docs/source/concepts/llm_provider.rst
+++ b/docs/source/concepts/llm_provider.rst
@@ -38,27 +38,29 @@ Adding custom LLM Provider
 We support any OpenAI compliant LLM for example mistral, openai, ollama etc. We also offer first class support for OpenAI, Anthropic, DeepSeek, Mistral, Groq, and Ollama based models.
 You can easily configure an LLM that communicates over the OpenAI API interface, by following the below guide.
 
-For example following code block shows you how to add an ollama-supported LLM in the `arch_config.yaml` file.
+For example, the following code block shows how to add an Ollama-supported LLM in the ``arch_config.yaml`` file.
 
 .. code-block:: yaml
 
-    llm_providers:
-      - model: some_custom_llm_provider/llama3.2
-        provider_interface: openai
-        base_url: http://host.docker.internal:11434
-And in the following code block shows you how to add mistral llm provider in the `arch_config.yaml` file.
+  llm_providers:
+    - model: some_custom_llm_provider/llama3.2
+      provider_interface: openai
+      base_url: http://host.docker.internal:11434
+
+And the following code block shows how to add a Mistral LLM provider in the ``arch_config.yaml`` file.
 
 .. code-block:: yaml
 
-    - name: mistral/ministral-3b-latest
-    access_key: $MISTRAL_API_KEY
+  llm_providers:
+    - name: mistral/ministral-3b-latest
+      access_key: $MISTRAL_API_KEY
 
 Example: Using the OpenAI Python SDK
 ------------------------------------
 
 .. code-block:: python
 
-    from openai import OpenAI
+   from openai import OpenAI
 
    # Initialize the Arch client
    client = OpenAI(base_url="http://127.0.0.1:2000/")
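
The patch's final hunk initializes an OpenAI SDK client against the local Arch endpoint but is cut off before any request is made. As a rough, standard-library-only illustration of what such a client sends to an OpenAI-compatible gateway, the sketch below builds the corresponding chat-completions request. The ``/v1/chat/completions`` path is an assumption based on the standard OpenAI wire format, and the model name is borrowed from the Mistral example above; neither is taken verbatim from this patch.

```python
import json

# The docs snippet points the client at a local Arch gateway on port 2000.
ARCH_BASE_URL = "http://127.0.0.1:2000"


def build_chat_request(model: str, user_message: str) -> tuple[str, dict, bytes]:
    """Build the URL, headers, and JSON body for an OpenAI-style chat call.

    This mirrors what the OpenAI SDK would send; the /v1/chat/completions
    route is the conventional OpenAI-compatible path (an assumption here).
    """
    url = f"{ARCH_BASE_URL}/v1/chat/completions"
    headers = {"Content-Type": "application/json"}
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, headers, json.dumps(payload).encode("utf-8")


# Build (but do not send) a request for the Mistral model configured above.
url, headers, body = build_chat_request("mistral/ministral-3b-latest", "Hello!")
print(url)  # http://127.0.0.1:2000/v1/chat/completions
```

Sending the built request is a one-liner with ``urllib.request`` or any HTTP client; the point of the sketch is that Arch only needs the standard ``model``/``messages`` payload shape to route the call to the configured provider.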