mirror of
https://github.com/katanemo/plano.git
synced 2026-04-28 02:23:56 +02:00
Update llm_provider.rst (#543)
This commit is contained in:
parent
ac3fb4cb5b
commit
d215724864
1 changed file with 11 additions and 9 deletions
@ -38,27 +38,29 @@ Adding custom LLM Provider
We support any OpenAI-compliant LLM, for example Mistral, OpenAI, Ollama, etc. We also offer first-class support for OpenAI, Anthropic, DeepSeek, Mistral, Groq, and Ollama based models.
You can easily configure an LLM that communicates over the OpenAI API interface by following the guide below.

For example, the following code block shows how to add an Ollama-supported LLM in the ``arch_config.yaml`` file.

.. code-block:: yaml

   llm_providers:
     - model: some_custom_llm_provider/llama3.2
       provider_interface: openai
       base_url: http://host.docker.internal:11434

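As the entries here suggest, model references follow a ``provider/model`` naming scheme. A minimal sketch of splitting such a reference (the helper name is hypothetical, and the scheme itself is inferred from the examples in this guide):

```python
def split_model_ref(ref: str) -> tuple[str, str]:
    """Split a 'provider/model' reference into its two parts (hypothetical helper)."""
    provider, _, model = ref.partition("/")
    return provider, model

# The reference from the llm_providers entry above.
print(split_model_ref("some_custom_llm_provider/llama3.2"))
# → ('some_custom_llm_provider', 'llama3.2')
```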
The following code block shows how to add a Mistral LLM provider in the ``arch_config.yaml`` file.

.. code-block:: yaml

   llm_providers:
     - name: mistral/ministral-3b-latest
       access_key: $MISTRAL_API_KEY

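The ``$MISTRAL_API_KEY`` value above is conventionally resolved from the environment at load time. A minimal sketch of that expansion pattern (the ``expand_env`` helper is hypothetical; whether Arch resolves references exactly this way is an assumption):

```python
import os
import re

def expand_env(value: str) -> str:
    """Replace $VAR references with environment values (hypothetical helper)."""
    return re.sub(
        r"\$([A-Za-z_][A-Za-z0-9_]*)",
        lambda m: os.environ.get(m.group(1), ""),
        value,
    )

os.environ["MISTRAL_API_KEY"] = "sk-example"  # stand-in value for illustration
print(expand_env("$MISTRAL_API_KEY"))  # → sk-example
```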
Example: Using the OpenAI Python SDK
------------------------------------
.. code-block:: python

   from openai import OpenAI

   # Initialize the Arch client.
   # Note: the OpenAI SDK expects an API key (via OPENAI_API_KEY or the
   # api_key argument); the actual provider credentials are configured
   # in arch_config.yaml.
   client = OpenAI(base_url="http://127.0.0.1:2000/")