# 🤖 Agents

## 📝 Overview
- RowBoat Agents is a multi-agent framework that powers conversations using agentic workflows.
- Built on top of OpenAI Swarm with custom enhancements and improvements. Check the NOTICE for attribution and licensing details (MIT license).
## 🕸️ Graph-based Framework
- Multi-agent systems are represented as graphs, where each agent is a node in the graph.
- RowBoat Agents accepts Directed Acyclic Graph (DAG) workflows, which define agents, tools, and their connections.
- Configure workflows using the RowBoat Studio (UI) with the help of an AI copilot. Setup instructions can be found in the main README.
- The framework is stateless, meaning that it requires the upstream service to pass in the current `state` and `messages` in every turn.
- At each conversation turn:
  - The agents are initialized using the current `state`.
  - The graph is traversed based on `messages`, `state`, and `workflow`.
  - Response `messages` and a new `state` are generated.
  - If `messages` contain tool calls, the upstream service must invoke the necessary tools and send the tool results back to continue the interaction.
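Because the framework is stateless, the upstream service owns the conversation loop: it persists the message history and the latest state, and resends both every turn. A minimal sketch of that contract, where `run_turn` is a hypothetical stand-in for a call into the framework (not its real API surface):

```python
# Hypothetical sketch of the stateless turn contract. `run_turn` is a
# stand-in for invoking the framework; it is NOT the real API.

def run_turn(messages, state, workflow):
    """Placeholder: the framework would traverse the agent graph here."""
    reply = {"role": "assistant", "content": "ok"}
    # A new state is generated each turn; the caller must persist it.
    new_state = dict(state, turns=state.get("turns", 0) + 1)
    return [reply], new_state

# Upstream service: keeps the full history and the latest state,
# and passes both back in on every turn.
messages, state = [], {}
workflow = {"agents": [], "tools": []}  # illustrative shape only

messages.append({"role": "user", "content": "hello"})
responses, state = run_turn(messages, state, workflow)
messages.extend(responses)
```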
## 🗂️ Key Request and Response Fields
### 📤 Request

- `messages`: List of user messages
- `state`: Active agent state and histories
- `workflow`: Graph of agents, tools, and connections

Example JSON: `tests/sample_requests/default_example.json`
### 📥 Response

- `messages`: List of response messages (may contain tool calls)
- `state`: Updated state to pass in the next request (since the framework is stateless)

Example JSON: `tests/sample_responses/default_example.json`
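To make the field lists concrete, a request/response pair might be shaped roughly as follows. The values are purely illustrative; the authoritative schemas are the sample files under `tests/sample_requests/` and `tests/sample_responses/`:

```python
import json

# Illustrative request body (values made up for this sketch).
request = {
    "messages": [
        {"role": "user", "content": "What are your shipping times?"},
    ],
    "state": {},           # empty on the very first turn
    "workflow": {          # graph of agents, tools, and connections
        "agents": ["..."],
        "tools": ["..."],
    },
}

# The response mirrors it: response messages plus an updated state
# that must be echoed back in the next request.
response = {
    "messages": [{"role": "assistant", "content": "2-3 business days."}],
    "state": {"...": "..."},
}

print(json.dumps(request, indent=2))
```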
## 🛠️ Using the Framework
Ensure you are in this directory (`cd apps/agents` from the root of this repo) before running any of the commands below.
### ⚙️ Set Up Conda Environment

```shell
conda create -n myenv python=3.12
conda activate myenv
```

- Note: Python >= 3.10 is required
### 📦 Install Dependencies

If using poetry:

```shell
pip install poetry
poetry install
```

If using pip:

```shell
pip install -r requirements.txt
```
### 🔑 Set up .env file

Copy `.env.example` to `.env` and add your API keys.
### 🧪 Run interactive test

```shell
python -m tests.interactive --config default_config.json --sample_request default_example.json --load_messages
```

- `--config`: Config JSON filename, under the `configs` folder
- `--sample_request`: Path to the sample request file, under the `tests/sample_requests` folder
- `--load_messages`: If set, the initial set of messages is additionally loaded from the sample request file. Otherwise, user input is required starting from the first message.
### 🌐 Set up server

- First, add this directory to your `PYTHONPATH`:

  ```shell
  export PYTHONPATH=$PYTHONPATH:$(pwd)
  ```

- For local testing:

  ```shell
  flask --app src.app.main run --port=4040
  ```

- To set up the server on a remote machine:

  ```shell
  gunicorn -b 0.0.0.0:4040 src.app.main:app
  ```
### 🖥️ Run test client

```shell
python -m tests.app_client --sample_request default_example.json --api_key test
```

- `--sample_request`: Path to the sample request file, under the `tests/sample_requests` folder
- `--api_key`: API key to use for authentication. This is the same key as the one in the `.env` file.
## 📖 More details

### 🔍 Specifics
- Format: Uses OpenAI's messages format when passing messages.
- LLMs: Currently, only OpenAI LLMs (e.g. `gpt-4o`, `gpt-4o-mini`) are supported. Easy to expand to other LLMs like Claude, Gemini, or self-hosted models.
- Responses: Here are some examples of responses that the framework can return:
- A list of one user-facing message
- A list of one or more tool calls
- A list of one user-facing message and one or more tool calls
- ⚠️ Errors: Errors are thrown as a `raise_error` tool call, with the error message as the argument. Real-time error handling will be managed by the upstream service.
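Since errors surface as a `raise_error` tool call, an upstream service can detect them by scanning the response messages. A minimal sketch, assuming OpenAI-style tool-call messages; the `error` argument key is an assumption for illustration, not confirmed by this repo:

```python
import json

def find_error(messages):
    """Return the raise_error message text if present, else None.
    Assumes OpenAI-style tool calls; the `error` key is an assumption."""
    for msg in messages:
        for call in msg.get("tool_calls") or []:
            fn = call.get("function", {})
            if fn.get("name") == "raise_error":
                args = json.loads(fn.get("arguments", "{}"))
                return args.get("error")
    return None

# Example response containing a raise_error tool call.
responses = [{
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "raise_error",
                     "arguments": json.dumps({"error": "upstream timeout"})},
    }],
}]
print(find_error(responses))  # -> upstream timeout
```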
### 🗂️ Important directories and files

- `src/`: Contains all source code for the agents app
- `src/app/`: Contains the Flask app which exposes the framework as a service
- `src/graph/`: Contains logic to run every turn of the conversation
- `src/graph/core.py`: Core graph implementation which parses the workflow config, creates agents from it, and runs the turn of conversation (through the `run_turn` function)
- `src/swarm/`: RowBoat's custom implementation of OpenAI Swarm, which is used by `src/graph/core.py`
- `tests/`: Contains sample requests, an interactive client, and a test client which mocks an upstream service
- `configs/`: Contains graph configurations (changed infrequently)
- `tests/sample_requests/`: Contains sample request files for the agents app
### 🔄 High-level flow

- `app/main.py` receives the request JSON from an upstream service, parses it, and sends it to `src/graph/core.py`
- `src/graph/core.py` creates the agent graph object from scratch and uses `src/swarm/core.py` to run the turn
- `src/swarm/core.py` runs the turn by performing the actual LLM calls and internal tool invocations to transition between agents
- `src/graph/core.py` returns the response messages and the new state to `app/main.py`, which relays them back to the upstream service
- The upstream service appends any new user messages to the message history and sends the messages back along with the new state to `app/main.py` as part of the next request. The process repeats until the upstream service completes its conversation with the user.
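From the upstream service's point of view, this flow (including the tool-call round trip) can be sketched as below. `call_agents_service` and `execute_tool` are hypothetical stand-ins, and `stub_service` only mimics the request/response contract for demonstration:

```python
# Hypothetical upstream-service loop. The two callables stand in for
# (1) a POST to the agents app and (2) running a tool locally; neither
# is part of this repo's real API.

def drive_turn(call_agents_service, execute_tool, messages, state, workflow):
    """Keep calling the agents service until a turn needs no more tools."""
    while True:
        responses, state = call_agents_service(messages, state, workflow)
        messages.extend(responses)
        tool_calls = [c for m in responses
                      for c in (m.get("tool_calls") or [])]
        if not tool_calls:
            return messages, state  # user-facing reply: turn complete
        # Run each requested tool and feed the result back.
        for call in tool_calls:
            messages.append({"role": "tool",
                             "tool_call_id": call["id"],
                             "content": execute_tool(call)})

# Tiny in-memory stub: first response asks for a tool, second answers.
def stub_service(messages, state, workflow):
    if not any(m.get("role") == "tool" for m in messages):
        tool_call = {"id": "call_1", "type": "function",
                     "function": {"name": "lookup", "arguments": "{}"}}
        return [{"role": "assistant", "tool_calls": [tool_call]}], {"turn": 1}
    return [{"role": "assistant", "content": "done"}], {"turn": 2}

msgs, st = drive_turn(stub_service, lambda call: "result",
                      [{"role": "user", "content": "hi"}], {}, {})
print(msgs[-1]["content"])  # -> done
```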
## 🚫 Limitations
- Does not support streaming currently.
- Cannot respond with multiple user-facing messages in the same turn.
## RowBoat Labs

🌐 Visit RowBoat Labs to learn more!