Fix llm_routing provider element (#382)

* Fix llm_routing provider element

We replaced provider with provider_interface to make it clearer to developers which provider API/backend is in use. As part of that upgrade we dropped support for mistral as a provider value to encourage developers to migrate to provider_interface, but this demo was never updated and still used mistral via provider. This change fixes the demo by replacing provider with provider_interface.

Signed-off-by: Adil Hafeez <adil.hafeez@gmail.com>

* fix the path

* move

* add more details

* fix

* Apply suggestions from code review

* fix

* fix

---------

Signed-off-by: Adil Hafeez <adil.hafeez@gmail.com>
Author: Adil Hafeez, 2025-01-24 16:34:11 -08:00 (committed by GitHub)
parent 84af476c75
commit 2c67fa3bc0
GPG key ID: B5690EEEBB952194
13 changed files with 60 additions and 21 deletions


@@ -9,7 +9,7 @@ listener:
   # Centralized way to manage LLMs, manage keys, retry logic, failover and limits in a central way
   llm_providers:
     - name: OpenAI
-      provider: openai
+      provider_interface: openai
       access_key: $OPENAI_API_KEY
       model: gpt-4o
       default: true
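
For reference, a full llm_providers block after the rename might look like the sketch below. The OpenAI entry mirrors the hunk above; the Mistral entry is a hedged illustration based on the commit description (its name, key variable, and model are assumptions, not taken from the demo):

```yaml
llm_providers:
  - name: OpenAI
    provider_interface: openai      # renamed from: provider: openai
    access_key: $OPENAI_API_KEY
    model: gpt-4o
    default: true
  - name: Mistral                   # hypothetical entry; the demo's actual
    provider_interface: mistral     # fields may differ
    access_key: $MISTRAL_API_KEY
    model: mistral-large-latest
```

The point of the rename is that provider_interface names the wire API the gateway speaks to the backend, so a single interface value can front multiple hosted models.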