mirror of
https://github.com/katanemo/plano.git
synced 2026-05-09 07:42:43 +02:00
deploy: 38f7691163
This commit is contained in:
parent
2d2eccc7b4
commit
21e57ab028
2 changed files with 24 additions and 1 deletions
@@ -198,6 +198,28 @@ abstracts the complexities of integrating with different LLM providers, providin
calls, handling retries, managing rate limits, and ensuring seamless integration with cloud-based and on-premise
LLMs. Simply configure the details of the LLMs your application will use, and Arch offers a unified interface to
make outbound LLM calls.</p>
<section id="adding-custom-llm-provider">
<h2>Adding custom LLM Provider<a @click.prevent="window.navigator.clipboard.writeText($el.href); $el.setAttribute('data-tooltip', 'Copied!'); setTimeout(() => $el.setAttribute('data-tooltip', 'Copy link to this element'), 2000)" aria-label="Copy link to this element" class="headerlink" data-tooltip="Copy link to this element" href="#adding-custom-llm-provider" x-intersect.margin.0%.0%.-70%.0%="activeSection = '#adding-custom-llm-provider'"><svg height="1em" viewbox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><path d="M3.9 12c0-1.71 1.39-3.1 3.1-3.1h4V7H7c-2.76 0-5 2.24-5 5s2.24 5 5 5h4v-1.9H7c-1.71 0-3.1-1.39-3.1-3.1zM8 13h8v-2H8v2zm9-6h-4v1.9h4c1.71 0 3.1 1.39 3.1 3.1s-1.39 3.1-3.1 3.1h-4V17h4c2.76 0 5-2.24 5-5s-2.24-5-5-5z"></path></svg></a></h2>
<p>We support any OpenAI-compliant LLM, for example Mistral, OpenAI, or Ollama, and we offer first-class support for OpenAI and Ollama. You can easily configure an LLM that communicates over the OpenAI API interface by following the guide below.</p>
<p>For example, the following code block shows you how to add an Ollama-hosted LLM in the <cite>arch_config.yaml</cite> file.</p>
<div class="highlight-yaml notranslate"><div class="highlight"><pre><span></span><code>- name: local-llama
  provider_interface: openai
  model: llama3.2
  endpoint: host.docker.internal:11434
</code></pre></div>
</div>
<p>For example, the following code block shows you how to add the Mistral LLM provider in the <cite>arch_config.yaml</cite> file.</p>
<div class="highlight-yaml notranslate"><div class="highlight"><pre><span></span><code><span id="line-1"><span class="p p-Indicator">-</span><span class="w"> </span><span class="nt">name</span><span class="p">:</span><span class="w"> </span><span class="l l-Scalar l-Scalar-Plain">mistral-ai</span>
</span><span id="line-2"><span class="w"> </span><span class="nt">provider_interface</span><span class="p">:</span><span class="w"> </span><span class="l l-Scalar l-Scalar-Plain">openai</span>
</span><span id="line-3"><span class="w"> </span><span class="nt">model</span><span class="p">:</span><span class="w"> </span><span class="l l-Scalar l-Scalar-Plain">ministral-3b-latest</span>
</span><span id="line-4"><span class="w"> </span><span class="nt">endpoint</span><span class="p">:</span><span class="w"> </span><span class="l l-Scalar l-Scalar-Plain">api.mistral.ai:443</span>
</span><span id="line-5"><span class="w"> </span><span class="nt">protocol</span><span class="p">:</span><span class="w"> </span><span class="l l-Scalar l-Scalar-Plain">https</span>
</span></code></pre></div>
</div>
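<p>With providers configured, an application addresses a specific upstream by setting the OpenAI-style <cite>model</cite> field of a request to the provider's <cite>name</cite>. The sketch below is a minimal, hypothetical helper for building such a request payload; the provider names come from the two configuration examples above, and the routing-by-name behavior is an assumption about the gateway, not something this page specifies.</p>

```python
# Hypothetical sketch: build an OpenAI-compatible chat payload. The assumption
# (not stated on this page) is that the gateway routes the request to the
# provider whose `name` matches the payload's `model` field.
ARCH_PROVIDERS = {"local-llama", "mistral-ai"}  # names from arch_config.yaml above


def chat_payload(provider: str, prompt: str) -> dict:
    """Return an OpenAI-style chat.completions request body for a named provider."""
    if provider not in ARCH_PROVIDERS:
        raise ValueError(f"unknown provider: {provider}")
    return {
        "model": provider,
        "messages": [{"role": "user", "content": prompt}],
    }


print(chat_payload("mistral-ai", "Hello!"))
```

<p>The gateway then consults the matching entry's <cite>provider_interface</cite> and <cite>endpoint</cite> to dispatch the call; check the gateway documentation for the exact routing field it keys on.</p>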
</section>
<section id="example-using-the-openai-python-sdk">
<h2>Example: Using the OpenAI Python SDK<a @click.prevent="window.navigator.clipboard.writeText($el.href); $el.setAttribute('data-tooltip', 'Copied!'); setTimeout(() => $el.setAttribute('data-tooltip', 'Copy link to this element'), 2000)" aria-label="Copy link to this element" class="headerlink" data-tooltip="Copy link to this element" href="#example-using-the-openai-python-sdk" x-intersect.margin.0%.0%.-70%.0%="activeSection = '#example-using-the-openai-python-sdk'"><svg height="1em" viewbox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><path d="M3.9 12c0-1.71 1.39-3.1 3.1-3.1h4V7H7c-2.76 0-5 2.24-5 5s2.24 5 5 5h4v-1.9H7c-1.71 0-3.1-1.39-3.1-3.1zM8 13h8v-2H8v2zm9-6h-4v1.9h4c1.71 0 3.1 1.39 3.1 3.1s-1.39 3.1-3.1 3.1h-4V17h4c2.76 0 5-2.24 5-5s-2.24-5-5-5z"></path></svg></a></h2>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><code><span id="line-1"><span class="kn">from</span> <span class="nn">openai</span> <span class="kn">import</span> <span class="n">OpenAI</span>
@@ -238,6 +260,7 @@ make outbound LLM calls.</p>
</div></div><aside class="hidden text-sm xl:block" id="right-sidebar">
<div class="sticky top-16 -mt-10 max-h-[calc(100vh-5rem)] overflow-y-auto pt-6 space-y-2"><p class="font-medium">On this page</p>
<ul>
<li><a :data-current="activeSection === '#adding-custom-llm-provider'" class="reference internal" href="#adding-custom-llm-provider">Adding custom LLM Provider</a></li>
<li><a :data-current="activeSection === '#example-using-the-openai-python-sdk'" class="reference internal" href="#example-using-the-openai-python-sdk">Example: Using the OpenAI Python SDK</a></li>
</ul>
</div>
File diff suppressed because one or more lines are too long