plano/tests/archgw/arch_config.yaml
Adil Hafeez 2c67fa3bc0
Fix llm_routing provider element (#382)
* Fix llm_routing provider element

We replaced provider with provider_interface to make it clearer to developers which provider API/backend is being used. During that upgrade we removed support for mistral under provider to encourage developers to start using provider_interface, but this demo was not updated and still used mistral through provider. This change fixes that by replacing provider with provider_interface.
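As a rough sketch, the fix amounts to renaming one key on each llm_providers entry. The entry below mirrors the gpt-4o-mini entry in this file; the "before" form follows from the description above and is illustrative, not copied from the old demo config.

# before (provider is no longer supported)
- name: gpt-4o-mini
  access_key: $OPENAI_API_KEY
  provider: openai
  model: gpt-4o-mini
  default: true

# after
- name: gpt-4o-mini
  access_key: $OPENAI_API_KEY
  provider_interface: openai
  model: gpt-4o-mini
  default: true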

Signed-off-by: Adil Hafeez <adil.hafeez@gmail.com>

* fix the path

* move

* add more details

* fix

* Apply suggestions from code review

* fix

* fix

---------

Signed-off-by: Adil Hafeez <adil.hafeez@gmail.com>
2025-01-24 16:34:11 -08:00


version: "0.1-beta"
listener:
address: 0.0.0.0
port: 10000
message_format: huggingface
connect_timeout: 0.005s
endpoints:
weather_forecast_service:
endpoint: host.docker.internal:51001
connect_timeout: 0.005s
llm_providers:
- name: gpt-4o-mini
access_key: $OPENAI_API_KEY
provider_interface: openai
model: gpt-4o-mini
default: true
- name: gpt-3.5-turbo-0125
access_key: $OPENAI_API_KEY
provider_interface: openai
model: gpt-3.5-turbo-0125
- name: gpt-4o
access_key: $OPENAI_API_KEY
provider_interface: openai
model: gpt-4o
system_prompt: |
You are a helpful assistant.
prompt_guards:
input_guards:
jailbreak:
on_exception:
message: Looks like you're curious about my abilities, but I can only provide assistance for weather forecasting.
prompt_targets:
- name: get_current_weather
description: Get current weather at a location.
parameters:
- name: location
description: The location to get the weather for
required: true
type: string
format: city, state
- name: days
description: the number of days for the request
required: true
type: string
endpoint:
name: weather_forecast_service
path: /weather
http_method: POST
- name: default_target
default: true
description: This is the default target for all unmatched prompts.
endpoint:
name: weather_forecast_service
path: /default_target
http_method: POST
system_prompt: |
You are a helpful assistant! Summarize the user's request and provide a helpful response.
    # if set to false, arch sends the response it received from this prompt target to the user
    # if set to true, arch forwards the response to the default LLM
    auto_llm_dispatch_on_response: false
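The commit message notes that mistral is now used through provider_interface rather than provider. A hypothetical entry following the same llm_providers schema as above might look like the sketch below; the access-key variable and model name are assumptions for illustration, not taken from this repository.

llm_providers:
  - name: mistral-large
    access_key: $MISTRAL_API_KEY        # assumed environment variable
    provider_interface: mistral         # assumes mistral is an accepted interface value, as the commit implies
    model: mistral-large-latest         # assumed model name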