Function calling
This demo shows how you can use Arch's core function calling capabilities.
Starting the demo

- Please make sure the prerequisites are installed correctly
- Start Arch:
  sh run_demo.sh
- Navigate to http://localhost:18080/
- You can type in queries like "how is the weather?"
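Besides the UI, you can exercise the gateway programmatically. Below is a minimal sketch in Python; note that the endpoint path and port are assumptions (the actual listener address is defined in arch_config.yaml), and `build_weather_query` / `ask_gateway` are helpers written for this example, not part of the demo:

```python
import json
import urllib.request

# The demo UI runs at http://localhost:18080/; whether the gateway accepts
# OpenAI-style chat requests at this URL depends on arch_config.yaml --
# the path and port below are placeholders, not taken from the demo.
GATEWAY_CHAT_URL = "http://localhost:18080/v1/chat/completions"


def build_weather_query(prompt: str) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask_gateway(prompt: str) -> str:
    """POST the query to the gateway and return the raw response body."""
    body = json.dumps(build_weather_query(prompt)).encode()
    req = urllib.request.Request(
        GATEWAY_CHAT_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()


if __name__ == "__main__":
    # Print the request body only; actually calling ask_gateway()
    # requires the demo to be running.
    print(json.dumps(build_weather_query("how is the weather?"), indent=2))
```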
Observability
Arch gateway publishes a stats endpoint at http://localhost:19901/stats. In this demo we use Prometheus to pull stats from Arch and Grafana to visualize them in a dashboard. To see the Grafana dashboard, follow the instructions below:
- Start Grafana and Prometheus using the following command:
  docker compose --profile monitoring up
- Navigate to http://localhost:3000/ to open the Grafana UI (use admin/grafana as credentials)
- From the Grafana left nav, click on Dashboards and select "Intelligent Gateway Overview" to view Arch gateway stats
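If you want the raw numbers instead of a dashboard, the stats endpoint serves plain-text `name: value` lines. A small sketch for pulling and filtering them; `parse_stats` is a helper written for this example, and the metric names in the sample are illustrative, not taken from the demo:

```python
import urllib.request

# Stats endpoint from this README.
STATS_URL = "http://localhost:19901/stats"


def parse_stats(text: str) -> dict:
    """Parse 'name: value' lines into a dict, keeping integer counters."""
    stats = {}
    for line in text.splitlines():
        name, sep, value = line.partition(": ")
        if sep and value.strip().isdigit():
            stats[name.strip()] = int(value)
    return stats


def fetch_stats() -> dict:
    """Fetch and parse stats from the running gateway."""
    with urllib.request.urlopen(STATS_URL) as resp:
        return parse_stats(resp.read().decode())


if __name__ == "__main__":
    # Offline demonstration with illustrative metric names; call
    # fetch_stats() instead once the demo is up.
    sample = "http.ingress.downstream_rq_total: 42\nserver.live: 1"
    print(parse_stats(sample))
```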
Here is a sample interaction:
Tracing
To see a tracing dashboard, follow the instructions below:
- For Jaeger, you can either use the default run_demo.sh script or run the following command:
sh run_demo.sh jaeger
- For Logfire, first make sure to add a LOGFIRE_API_KEY to the .env file. You can either use the default run_demo.sh script or run the following command:
sh run_demo.sh logfire
- For Signoz, you can either use the default run_demo.sh script or run the following command:
sh run_demo.sh signoz
- If using Jaeger, navigate to http://localhost:16686/ to open the Jaeger UI
- If using Signoz, navigate to http://localhost:3301/ to open the Signoz UI
- If using Logfire, view the dashboard in the Logfire project from which you obtained the write key
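The tracing options above can be summarized in a small lookup, e.g. for wiring into your own launcher script. The commands and URLs come from this README; Logfire is hosted, so it has no local UI:

```python
# Tracing backends supported by run_demo.sh and where to find their UIs.
# Commands and URLs are taken from this README; Logfire's dashboard is
# hosted, so its local UI entry is None.
TRACING_BACKENDS = {
    "jaeger": {"command": "sh run_demo.sh jaeger", "ui": "http://localhost:16686/"},
    "signoz": {"command": "sh run_demo.sh signoz", "ui": "http://localhost:3301/"},
    "logfire": {"command": "sh run_demo.sh logfire", "ui": None},
}


def ui_for(backend: str):
    """Return the local dashboard URL for a backend, or None if hosted."""
    return TRACING_BACKENDS[backend]["ui"]


if __name__ == "__main__":
    for name, info in TRACING_BACKENDS.items():
        print(name, "->", info["ui"] or "hosted Logfire dashboard")
```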
Stopping the demo
- To end the demo, run the following command:
sh run_demo.sh down