Run plano natively by default (#744)

This commit is contained in:
Adil Hafeez 2026-03-05 07:35:25 -08:00 committed by GitHub
parent 198c912202
commit f63d5de02c
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
56 changed files with 1557 additions and 256 deletions


@@ -3,7 +3,47 @@
Deployment
==========
-This guide shows how to deploy Plano directly using Docker without the ``plano`` CLI, including basic runtime checks for routing and health monitoring.
+Plano can be deployed in two ways: **natively** on the host (default) or inside a **Docker container**.
Native Deployment (Default)
---------------------------
Plano runs natively by default. On first run, pre-compiled binaries (Envoy, WASM plugins, brightstaff) are downloaded automatically and cached in ``~/.plano/``.
Supported platforms: Linux (x86_64, aarch64), macOS (Apple Silicon).
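A quick way to check whether the current host is on the supported list above (a small sketch; the ``uname`` strings are the usual values for these platforms):

.. code-block:: bash

   # Map this machine onto the supported-platform list:
   # Linux x86_64/aarch64, macOS Apple Silicon (Darwin-arm64)
   plat="$(uname -s)-$(uname -m)"
   case "$plat" in
     Linux-x86_64|Linux-aarch64|Darwin-arm64) supported=yes ;;
     *) supported=no ;;
   esac
   echo "$plat supported=$supported"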
Start Plano
~~~~~~~~~~~~
.. code-block:: bash

   planoai up plano_config.yaml
Options:
- ``--foreground`` — stay attached and stream logs (Ctrl+C to stop)
- ``--with-tracing`` — start a local OTLP trace collector
Runtime files (rendered configs, logs, PID file) are stored in ``~/.plano/run/``.
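For example, to stay attached and collect traces in a single run (assuming the two options can be combined):

.. code-block:: bash

   # Runs in the foreground with a local OTLP collector; Ctrl+C stops Plano.
   planoai up plano_config.yaml --foreground --with-tracing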
Stop Plano
~~~~~~~~~~
.. code-block:: bash

   planoai down
Build from Source (Developer)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
If you want to build from source instead of using pre-compiled binaries, you need:
- `Rust <https://rustup.rs>`_ with the ``wasm32-wasip1`` target
- OpenSSL dev headers (``libssl-dev`` on Debian/Ubuntu, ``openssl`` on macOS)
.. code-block:: bash

   planoai build --native
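Installing those prerequisites might look like the following (``rustup`` from the link above; package names as listed):

.. code-block:: bash

   # WASM target for the plugin build
   rustup target add wasm32-wasip1
   # OpenSSL development headers
   sudo apt-get install -y libssl-dev   # Debian/Ubuntu
   brew install openssl                 # macOS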
Docker Deployment
-----------------
@@ -53,6 +93,13 @@ Check container health and logs:
docker compose ps
docker compose logs -f plano
You can also drive the Docker deployment through the CLI:
.. code-block:: bash

   planoai up plano_config.yaml --docker
   planoai down --docker
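Note that the deployment mode affects how agent URLs are written: an agent listening on the host is ``localhost`` from a native Plano process, but is typically reached as ``host.docker.internal`` from inside a container. A config fragment illustrating the Docker-mode form (agent id and port follow the examples in this repo):

.. code-block:: yaml

   agents:
     - id: weather_agent
       url: http://host.docker.internal:10510   # host-side agent, as seen from inside the container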
Runtime Tests
-------------


@@ -2,9 +2,9 @@ version: v0.3.0
agents:
- id: weather_agent
-url: http://host.docker.internal:10510
+url: http://localhost:10510
- id: flight_agent
-url: http://host.docker.internal:10520
+url: http://localhost:10520
model_providers:
- model: openai/gpt-4o


@@ -2,16 +2,16 @@ version: v0.3.0
agents:
- id: rag_agent
-url: http://host.docker.internal:10505
+url: http://localhost:10505
filters:
- id: query_rewriter
-url: http://host.docker.internal:10501
+url: http://localhost:10501
# type: mcp # default is mcp
# transport: streamable-http # default is streamable-http
# tool: query_rewriter # default name is the filter id
- id: context_builder
-url: http://host.docker.internal:10502
+url: http://localhost:10502
model_providers:
- model: openai/gpt-4o-mini


@@ -4,15 +4,15 @@ version: v0.3.0
# External HTTP agents - API type is controlled by request path (/v1/responses, /v1/messages, /v1/chat/completions)
agents:
- id: weather_agent # Example agent for weather
-url: http://host.docker.internal:10510
+url: http://localhost:10510
- id: flight_agent # Example agent for flights
-url: http://host.docker.internal:10520
+url: http://localhost:10520
# MCP filters applied to requests/responses (e.g., input validation, query rewriting)
filters:
- id: input_guards # Example filter for input validation
-url: http://host.docker.internal:10500
+url: http://localhost:10500
# type: mcp (default)
# transport: streamable-http (default)
# tool: input_guards (default - same as filter id)


@@ -1,31 +1,31 @@
agents:
- id: weather_agent
-url: http://host.docker.internal:10510
+url: http://localhost:10510
- id: flight_agent
-url: http://host.docker.internal:10520
+url: http://localhost:10520
endpoints:
app_server:
connect_timeout: 0.005s
endpoint: 127.0.0.1
port: 80
flight_agent:
-endpoint: host.docker.internal
+endpoint: localhost
port: 10520
protocol: http
input_guards:
-endpoint: host.docker.internal
+endpoint: localhost
port: 10500
protocol: http
mistral_local:
endpoint: 127.0.0.1
port: 8001
weather_agent:
-endpoint: host.docker.internal
+endpoint: localhost
port: 10510
protocol: http
filters:
- id: input_guards
-url: http://host.docker.internal:10500
+url: http://localhost:10500
listeners:
- address: 0.0.0.0
agents:
@@ -130,6 +130,6 @@ prompt_targets:
required: true
type: int
tracing:
-opentracing_grpc_endpoint: http://host.docker.internal:4317
+opentracing_grpc_endpoint: http://localhost:4317
random_sampling: 100
version: v0.3.0