plano/demos/samples_python/weather_forecast

Function calling

This demo shows how you can use Arch's core function calling capabilities.

Starting the demo

  1. Make sure the prerequisites are installed correctly

  2. Start Arch

  3. Run sh run_demo.sh

  4. Navigate to http://localhost:18080/

  5. Type in queries like "how is the weather?"
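If you prefer to script the interaction instead of typing queries into the UI, here is a minimal sketch. It assumes the gateway at http://localhost:18080 exposes an OpenAI-compatible /v1/chat/completions endpoint; the path and the model name are assumptions for illustration, so check arch_config.yaml for the actual listener and routing settings in your setup.

```python
import json
import urllib.request

GATEWAY_URL = "http://localhost:18080"  # demo address from this README


def build_chat_payload(query: str, model: str = "gpt-4o-mini") -> dict:
    """Build an OpenAI-style chat-completions payload for a demo query.

    The default model name is hypothetical; use whatever your
    arch_config.yaml routes to.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": query}],
    }


def ask_weather(query: str) -> str:
    """POST the query to the gateway and return the assistant's reply."""
    req = urllib.request.Request(
        f"{GATEWAY_URL}/v1/chat/completions",  # assumed OpenAI-compatible path
        data=json.dumps(build_chat_payload(query)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI chat-completions response shape
    return body["choices"][0]["message"]["content"]
```

With the demo running, `ask_weather("how is the weather?")` should return the same kind of answer the web UI shows.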

Observability

The Arch gateway publishes a stats endpoint at http://localhost:19901/stats. In this demo we use Prometheus to pull stats from Arch and Grafana to visualize them in a dashboard. To see the Grafana dashboard, follow the instructions below:

  1. Start Grafana and Prometheus using the following command
    docker compose --profile monitoring up

  2. Navigate to http://localhost:3000/ to open the Grafana UI (use admin/grafana as credentials)
  3. From the Grafana left nav, click "Dashboards" and select "Intelligent Gateway Overview" to view Arch gateway stats
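The stats endpoint serves plain text in Envoy's `name: value` format, so you can also scrape it directly without Grafana. A minimal sketch follows; the `filter_stats` helper and the stat names in its docstring are illustrative, not taken from this demo's dashboards.

```python
import urllib.request

STATS_URL = "http://localhost:19901/stats"  # endpoint from this README


def fetch_stats(url: str = STATS_URL) -> str:
    """Fetch the raw stats text from the Arch gateway."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode()


def filter_stats(stats_text: str, needle: str) -> dict:
    """Return {stat_name: value} for lines whose name contains `needle`.

    Each line looks like e.g. "http.ingress.downstream_rq_2xx: 42".
    """
    out = {}
    for line in stats_text.splitlines():
        if ":" not in line:
            continue
        name, _, value = line.partition(":")
        if needle in name:
            out[name.strip()] = value.strip()
    return out
```

For example, `filter_stats(fetch_stats(), "downstream_rq")` narrows the dump to request counters while the demo is handling traffic.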

Here is a sample interaction (screenshot omitted):

Tracing

To see a tracing dashboard, follow the instructions below:

  1. For Jaeger, you can either use the default run_demo.sh script or run the following command:
    sh run_demo.sh jaeger

  2. For Logfire, first make sure to add a LOGFIRE_API_KEY to the .env file. You can either use the default run_demo.sh script or run the following command:
    sh run_demo.sh logfire

  3. For Signoz, you can either use the default run_demo.sh script or run the following command:
    sh run_demo.sh signoz

If using Jaeger, navigate to http://localhost:16686/ to open the Jaeger UI

If using Signoz, navigate to http://localhost:3301/ to open the Signoz UI

If using Logfire, navigate to the Logfire dashboard associated with your write key
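If using Jaeger, you can also query traces programmatically instead of through the UI. The sketch below hits Jaeger's UI API at `/api/services` (an assumption about the deployed Jaeger version) to list services that have reported spans; `parse_services` is a hypothetical helper for the expected response shape `{"data": [...], "total": N}`.

```python
import json
import urllib.request

JAEGER_URL = "http://localhost:16686"  # Jaeger address from this README


def parse_services(body: dict) -> list:
    """Extract and sort service names from a Jaeger /api/services response."""
    return sorted(body.get("data") or [])


def list_traced_services(jaeger_url: str = JAEGER_URL) -> list:
    """Query Jaeger's UI API for the services that have reported traces."""
    with urllib.request.urlopen(f"{jaeger_url}/api/services") as resp:
        return parse_services(json.load(resp))
```

After sending a few queries through the demo, `list_traced_services()` should include the gateway's service name among the results.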

Stopping Demo

  1. To end the demo, run the following command:
    sh run_demo.sh down