- RowBoat Agents is a multi-agent framework that powers conversations using agentic workflows.
- Built on top of [OpenAI Swarm](https://github.com/openai/swarm) with custom enhancements and improvements. Check the `NOTICE.md` for attribution and licensing details (MIT license).
- RowBoat Agents accepts Directed Acyclic Graph (DAG) workflows, which define agents, tools, and their connections.
- Configure workflows using the RowBoat Studio (UI) with the help of an AI copilot. Setup instructions can be found in the [main README](https://github.com/rowboatlabs/rowboat/tree/dev).
- The framework is stateless: the upstream service must pass in the current `state` and `messages` on every turn.
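Because no state is held between turns, each request carries the full conversation context. Below is a minimal sketch of what a single turn's request payload might look like; the field names are illustrative assumptions, not the exact schema (see `tests/sample_requests` for real examples):

```python
# Hypothetical shape of one turn's request payload. Field names are
# assumptions for illustration; the actual schema may differ.
turn_request = {
    "messages": [  # full message history, in OpenAI's messages format
        {"role": "user", "content": "What is my order status?"},
    ],
    "state": {  # opaque state returned by the previous turn
        "last_agent_name": "Triage Agent",
    },
}

# The upstream service sends this payload every turn and stores the
# updated state and messages from the response for the next turn.
print(sorted(turn_request.keys()))  # → ['messages', 'state']
```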
- `--config`: Config JSON filename, under the `configs` folder
- `--sample_request`: Path to the sample request file, under the `tests/sample_requests` folder
- `--load_messages`: If set, additionally loads the initial set of messages from the sample request file; otherwise, user input is required starting from the first message.
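The flags above could be wired up with `argparse` roughly as follows; this is a sketch of the interface, and the defaults shown are assumptions rather than the framework's actual defaults:

```python
import argparse

# Minimal sketch of the CLI flags described above. Defaults are
# illustrative assumptions, not the framework's actual defaults.
parser = argparse.ArgumentParser(description="Interactive test client")
parser.add_argument("--config", default="default_config.json",
                    help="Config JSON filename, under the configs folder")
parser.add_argument("--sample_request", default="sample1.json",
                    help="Path to a sample request, under tests/sample_requests")
parser.add_argument("--load_messages", action="store_true",
                    help="Load initial messages from the sample request file")

# Example invocation: python client.py --config example.json --load_messages
args = parser.parse_args(["--config", "example.json", "--load_messages"])
print(args.config, args.load_messages)  # → example.json True
```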
- **Format**: Uses OpenAI's messages format when passing messages.
- **LLMs**: Currently, only OpenAI LLMs (e.g. gpt-4o, gpt-4o-mini) are supported, though support can easily be extended to other LLMs such as Claude, Gemini, or self-hosted models.
- **Responses**: Here are some examples of responses that the framework can return:
- ⚠️ **Errors**: Errors are thrown as a tool call `raise_error` with the error message as the argument. Real-time error handling will be managed by the upstream service.
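To illustrate the two conventions above, here is a sketch of a message history in OpenAI's messages format whose final assistant message surfaces an error via the `raise_error` tool call; the exact argument structure is an assumption for illustration:

```python
import json

# Example history in OpenAI's messages format. The final assistant
# message carries an error as a `raise_error` tool call; the argument
# shape here is an illustrative assumption.
messages = [
    {"role": "system", "content": "You are a helpful support agent."},
    {"role": "user", "content": "Transfer me to the Billing agent."},
    {
        "role": "assistant",
        "content": None,
        "tool_calls": [{
            "id": "call_err_1",
            "type": "function",
            "function": {
                "name": "raise_error",
                "arguments": json.dumps(
                    {"error": "Agent 'Billing' not found in workflow config"}
                ),
            },
        }],
    },
]

# The upstream service can detect and handle the error in real time:
call = messages[-1]["tool_calls"][0]["function"]
if call["name"] == "raise_error":
    print(json.loads(call["arguments"])["error"])
    # → Agent 'Billing' not found in workflow config
```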
- `src/`: Contains all source code for the agents app
- `src/app/`: Contains the Flask app which exposes the framework as a service
- `src/graph/`: Contains logic to run every turn of the conversation
- `src/graph/core.py`: Core graph implementation, which parses the workflow config, creates agents from it, and runs the turn of conversation (through the `run_turn` function)
- `src/swarm/`: RowBoat's custom implementation of OpenAI Swarm, used by `src/graph/core.py`
- `tests/`: Contains sample requests, an interactive client, and a test client which mocks an upstream service
- The upstream service appends any new user messages to the message history and sends the messages back, along with the new state, to `app/main.py` as part of the next request. This repeats until the upstream service completes its conversation with the user.
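This request/response loop can be sketched as follows, with a stubbed `run_turn` standing in for the call to the agents service (the real `run_turn` lives in `src/graph/core.py`; its signature and the state's contents are assumptions here):

```python
# Sketch of the upstream service's loop. `run_turn` is a stub standing
# in for the agents service; its real signature may differ.
def run_turn(messages, state):
    # Stub: echo the latest user message and carry state forward
    # with a simple turn counter (contents are illustrative).
    reply = {"role": "assistant",
             "content": f"Echo: {messages[-1]['content']}"}
    new_state = {**state, "turns": state.get("turns", 0) + 1}
    return [reply], new_state

messages, state = [], {}
for user_input in ["hello", "bye"]:
    # The upstream service appends the new user message to the history...
    messages.append({"role": "user", "content": user_input})
    # ...then sends messages + state to the service for the next turn
    # and folds the response back into its copies of both.
    new_messages, state = run_turn(messages, state)
    messages.extend(new_messages)

print(state["turns"], len(messages))  # → 2 4
```

Because the service is stateless, the upstream side owns both the message history and the `state` blob, and simply round-trips them on every turn.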