RowBoat Labs
Please visit https://www.rowboatlabs.com to learn more about RowBoat Labs
Agents
Overview
- RowBoat Agents is a multi-agent framework that powers agentic workflows. The best way to configure these workflows is via RowBoat Studio (UI), whose source code is at rowboatlabs/rowboat.
- The RowBoat Agents framework has been built upon OpenAI Swarm, with modifications and improvements. Please see the `NOTICE.md` file in this directory for attribution notes and more details. OpenAI Swarm is available under the MIT license as of the time of this writing.
- Multi-agent systems are typically implemented as graphs, where each agent is a node in the graph. At every turn of conversation, the graph is traversed based on a) the `state`, which contains the currently active agent and agent-level histories, and b) the current set of `messages`.
- RowBoat Agents is a stateless implementation of such a graph-based system (specifically, a DAG, or directed acyclic graph). The incoming request JSON (corresponding to a turn of conversation) is parsed to extract `messages`, `state` and the `workflow`.
- The `workflow` is a representation of the DAG containing agents, with each agent having a set of tools and connected children agents. See `tests/sample_requests/default_example.json` for an example of a complete request JSON from an upstream service.
- At each turn of conversation, the agent graph object is created from scratch. The graph is then run, which produces the next set of `messages` and `state`. The `messages` will be shown to the user by the upstream service. Additionally, if the `messages` contain tool calls, the upstream service must invoke the necessary tools and send the results back to the framework as the next turn.
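To make the stateless contract concrete, here is a hedged sketch of what a single turn's request payload might look like. Only the top-level keys `messages`, `state` and `workflow` are named in this README; every nested field below is an illustrative assumption, not the real schema — see `tests/sample_requests/default_example.json` for the actual format.

```python
# Hypothetical single-turn request payload. The nested key names
# (last_agent_name, agents, tools, children) are assumptions for
# illustration only; consult the sample request file for the real schema.
request = {
    "messages": [
        {"role": "user", "content": "I need help with my order"},
    ],
    "state": {
        # Which agent is currently active, plus agent-level histories
        "last_agent_name": "triage_agent",  # assumed key name
    },
    "workflow": {
        # DAG of agents: each agent has tools and connected children agents
        "agents": [
            {"name": "triage_agent", "tools": [], "children": ["orders_agent"]},
            {"name": "orders_agent", "tools": ["lookup_order"], "children": []},
        ],
    },
}
```

Because the framework is stateless, this one payload carries everything needed to run the turn; the response hands back the next `messages` and `state` for the upstream service to thread into the following request.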
Using the framework
Set up conda env
Standard conda env setup process:
```
conda create -n myenv python=3.12
conda activate myenv
```

- Note: python>=3.10 is required.
Install dependencies
Install either using poetry or using pip
If using poetry
```
pip install poetry
poetry install
```
If using pip
```
pip install -r requirements.txt
```
Set up .env file
Copy .env.example to .env and add your API keys
Run interactive test
```
python -m tests.interactive --config default_config.json --sample_request default_example.json --load_messages
```

- `--config`: Config JSON filename, under the `configs` folder
- `--sample_request`: Path to the sample request file, under the `tests/sample_requests` folder
- `--load_messages`: If set, the initial set of messages is additionally loaded from the sample request file. Otherwise, user input is required starting from the first message.
Set up app server
- For local testing:

```
flask --app src.app.main run --port=4040
```

- To set up the server on a remote host:

```
gunicorn -b 0.0.0.0:4040 src.app.main:app
```
Run test client
```
python -m tests.app_client --sample_request default_example.json
```

- `--sample_request`: Path to the sample request file, under the `tests/sample_requests` folder
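If you want to exercise the app server without the bundled test client, a turn can be posted with plain `urllib` from the standard library. Note that the endpoint path `/chat` below is an assumption — check the route actually registered in `src/app/main.py` before using this.

```python
import json
import urllib.request

def post_turn(payload: dict, url: str = "http://127.0.0.1:4040/chat") -> dict:
    """POST one turn's request JSON to the agents server and return the
    parsed JSON response. NOTE: the '/chat' route is a guess; the real
    route is defined in src/app/main.py."""
    data = json.dumps(payload).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For example, load `tests/sample_requests/default_example.json`, pass the parsed dict to `post_turn`, and inspect the returned messages and state.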
More details
Specifics
- Format: Uses OpenAI's messages format when passing messages.
- LLMs: Currently, only OpenAI LLMs (e.g. `gpt-4o`, `gpt-4o-mini`) are supported. It is easy to extend to other LLMs such as Claude, Gemini or self-hosted models.
- Responses: Here are some examples of responses that the framework can return:
- A list of one user-facing message
- A list of one or more tool calls
- A list of one user-facing message and one or more tool calls
- Errors: Errors are thrown as a tool call `raise_error` with the error message as the argument. Real-time error handling will be managed by the upstream service.
Limitations
- Does not support streaming currently.
- Cannot respond with multiple user-facing messages in the same turn.
Important directories and files
- `src/`: Contains all source code for the agents app
- `src/app/`: Contains the Flask app which exposes the framework as a service
- `src/graph/`: Contains logic to run every turn of the conversation
- `src/graph/core.py`: Core graph implementation which parses the workflow config, creates agents from it and runs the turn of conversation (through the `run_turn` function)
- `src/swarm/`: RowBoat's custom implementation of OpenAI Swarm, which is used by `src/graph/core.py`
- `tests/`: Contains sample requests, an interactive client and a test client which mocks an upstream service
- `configs/`: Contains graph configurations (changed infrequently)
- `tests/sample_requests/`: Contains sample request files for the agents app
High-level flow
- `app/main.py` receives the request JSON from an upstream service, parses it and sends it to `src/graph/core.py`
- `src/graph/core.py` creates the agent graph object from scratch and uses `src/swarm/core.py` to run the turn
- `src/swarm/core.py` runs the turn by performing the actual LLM calls and internal tool invocations to transition between agents
- `src/graph/core.py` returns the response messages and the new state to `app/main.py`, which relays them back to the upstream service
- The upstream service appends any new user messages to the message history and sends the messages back, along with the new state, to `app/main.py` as part of the next request. The process repeats until the upstream service completes its conversation with the user.
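The flow above can be sketched as a loop on the upstream side. This is a hedged illustration, not the framework's own code: `send_request`, `run_tool` and `get_user_input` are hypothetical caller-supplied hooks standing in for the service's HTTP call, tool executor and user channel, and the empty initial `state` is an assumed convention.

```python
def conversation_loop(first_user_message: str, workflow: dict,
                      send_request, run_tool, get_user_input):
    """Hedged sketch of the upstream side of the turn loop.
    send_request(payload) -> (messages, state) wraps the HTTP call to
    app/main.py; run_tool(tool_call) -> dict executes one tool call and
    returns a tool-result message; get_user_input() -> str | None reads
    the next user message (None ends the conversation)."""
    history = [{"role": "user", "content": first_user_message}]
    state = {}  # empty state on the first turn (assumed convention)
    while True:
        new_messages, state = send_request(
            {"messages": history, "state": state, "workflow": workflow}
        )
        history.extend(new_messages)
        tool_calls = [c for m in new_messages
                      for c in (m.get("tool_calls") or [])]
        if tool_calls:
            # Tool results go back to the framework as the next turn.
            history.extend(run_tool(c) for c in tool_calls)
        else:
            # User-facing message: get the next user input or stop.
            user_text = get_user_input()
            if user_text is None:
                return history
            history.append({"role": "user", "content": user_text})
```

The key property this illustrates is statelessness: the loop, not the framework, owns the growing message history and threads the returned `state` into each subsequent request.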