From 21e57ab02833a95a277f536da6fd4dd43580aa4c Mon Sep 17 00:00:00 2001
From: adilhafeez
We support any OpenAI-compliant LLM, for example Mistral, OpenAI, Ollama, etc. We offer first-class support for OpenAI and Ollama. You can easily configure an LLM that communicates over the OpenAI API interface by following the guide below.
+For example, the following code block shows how to add an Ollama-supported LLM in the arch_config.yaml file.
+
+.. code-block:: yaml
+
+  - name: local-llama
+    provider_interface: openai
+    model: llama3.2
+    endpoint: host.docker.internal:11434
For example, the following code block shows how to add the Mistral LLM provider in the arch_config.yaml file.
+  - name: mistral-ai
+    provider_interface: openai
+    model: ministral-3b-latest
+    endpoint: api.mistral.ai:443
+    protocol: https
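To make the expected shape of a provider entry concrete, here is a minimal Python sketch that checks entries like the two shown above. The required/optional field split is an assumption for illustration; it is not validation logic from Arch itself.

```python
# Sketch: check that a provider entry carries the fields used in the
# YAML examples above. The required/optional split is an assumption.
REQUIRED = {"name", "provider_interface", "model", "endpoint"}
OPTIONAL = {"protocol"}

def validate_provider(entry: dict) -> bool:
    """Return True if the entry has all required keys and no unknown keys."""
    keys = set(entry)
    return REQUIRED <= keys and keys <= REQUIRED | OPTIONAL

providers = [
    {"name": "local-llama", "provider_interface": "openai",
     "model": "llama3.2", "endpoint": "host.docker.internal:11434"},
    {"name": "mistral-ai", "provider_interface": "openai",
     "model": "ministral-3b-latest", "endpoint": "api.mistral.ai:443",
     "protocol": "https"},
]
print(all(validate_provider(p) for p in providers))  # → True
```

Both example entries pass: the Ollama entry uses only the required fields, while the Mistral entry adds the optional `protocol` key.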
+from openai import OpenAI
@@ -238,6 +260,7 @@ make outbound LLM calls.