commit a0ec92cd15
nocxcloud-oss, 2026-04-15 19:30:22 +08:00 (committed by GitHub)
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
63 changed files with 3678 additions and 764 deletions

@@ -20,12 +20,16 @@ jobs:
- name: Install dependencies
run: npm ci
working-directory: apps/rowboat
- name: Verify Rowboat
run: npm run verify
working-directory: apps/rowboat
- name: Build Rowboat
run: npm run build
working-directory: apps/rowboat
build-rowboatx:
build-cli:
runs-on: ubuntu-latest
steps:
@@ -34,7 +38,7 @@ jobs:
- name: Setup Node.js
uses: actions/setup-node@v6
with:
cache-dependency-path: 'apps/rowboat/package-lock.json'
cache-dependency-path: 'apps/cli/package-lock.json'
node-version: '24'
cache: 'npm'
@@ -42,6 +46,32 @@ jobs:
run: npm ci
working-directory: apps/cli
- name: Build Rowboat
run: npm run build
working-directory: apps/cli
- name: Verify CLI
run: npm run verify
working-directory: apps/cli
test-desktop-core:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
- name: Setup pnpm
uses: pnpm/action-setup@v4
with:
version: 9
- name: Setup Node.js
uses: actions/setup-node@v6
with:
node-version: '24'
cache: 'pnpm'
cache-dependency-path: 'apps/x/pnpm-lock.yaml'
- name: Install dependencies
run: pnpm install --frozen-lockfile
working-directory: apps/x
- name: Verify desktop app
run: pnpm run verify
working-directory: apps/x

@@ -27,11 +27,8 @@ jobs:
run: npm ci
working-directory: apps/cli
# optional: run tests
# - run: npm test
- name: Build
run: npm run build
- name: Verify
run: npm run verify
working-directory: apps/cli
- name: Pack
@@ -40,4 +37,4 @@ jobs:
- name: Publish to npm
run: npm publish --access public
working-directory: apps/cli
working-directory: apps/cli

ARCHITECTURE.md Normal file (+89 lines)

@@ -0,0 +1,89 @@
# Architecture
This repository contains multiple Rowboat product surfaces. The quickest way to get oriented is to start from the table below instead of treating the repo as a single application.
## Product Map
| Surface | Path | Status | Purpose |
|---|---|---|---|
| Desktop app | `apps/x` | Primary | Local-first Electron app with Markdown memory, knowledge graph sync, and on-device workflows |
| Hosted web app | `apps/rowboat` | Active | Next.js platform with project-scoped agents, RAG, jobs, billing, and integrations |
| CLI/runtime | `apps/cli` | Active | Local HTTP runtime, workflow packaging, and npm-distributed `rowboatx` tooling |
| New frontend | `apps/rowboatx` | Active, evolving | Static Next.js UI that talks to the local runtime and shell-provided APIs |
| Docs | `apps/docs` | Active | Mintlify documentation site |
| Python SDK | `apps/python-sdk` | Supporting | Thin Python client for the hosted chat API |
| Experiments | `apps/experimental` | Experimental | Prototypes and one-off services not considered part of the core product |
## How The Pieces Fit Together
### `apps/x`
- Nested `pnpm` workspace for the desktop product.
- `apps/main` runs the Electron main process.
- `apps/preload` exposes the validated IPC bridge.
- `apps/renderer` contains the React UI.
- `packages/shared` holds shared schemas and IPC contracts.
- `packages/core` contains workspace, knowledge graph, agent, and integration logic.
### `apps/rowboat`
- Hosted or self-hosted Next.js application.
- Uses MongoDB, Redis, Qdrant, uploads storage, background workers, and external providers.
- Organized into `application`, `entities`, `infrastructure`, and `interface-adapters` layers.
### `apps/cli` + `apps/rowboatx`
- `apps/cli` provides the local API and runtime for runs, tools, permissions, and event streaming.
- `apps/rowboatx` is the browser UI that expects a runtime behind `/api/stream`, `/api/rowboat/*`, or a configured `window.config.apiBase`.
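The lookup described above can be sketched as a small helper; the function name and the same-origin fallback are illustrative assumptions, not shipped code:

```typescript
// Hypothetical sketch: resolve the base URL the static UI should call.
// `window.config.apiBase` is the override mentioned above; falling back
// to "" means requests go to the page's own origin (/api/stream, /api/rowboat/*).
type WindowConfig = { apiBase?: string };

function resolveApiBase(config?: WindowConfig): string {
  if (config?.apiBase) {
    return config.apiBase.replace(/\/+$/, ""); // drop trailing slashes
  }
  return "";
}
```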
## Shared Runtime Concepts
- Local data lives under `~/.rowboat` by default.
- The desktop product stores knowledge as Markdown files and maintains Git-backed history for those notes.
- The hosted app uses project-scoped data stores instead of the desktop Markdown vault.
- Both the desktop and hosted surfaces rely on model/provider abstraction, tool calling, and external integrations.
## Recommended Entry Points
- Working on desktop memory, sync, or Electron UX: start in `apps/x`
- Working on hosted APIs, jobs, RAG, or project management: start in `apps/rowboat`
- Working on local runtime, SSE events, or packaging flows: start in `apps/cli`
- Working on the newer dashboard UI for the local runtime: start in `apps/rowboatx`
## Common Commands
### Desktop app
```bash
cd apps/x
pnpm install
pnpm run verify
pnpm run dev
pnpm run test
```
### Hosted web app
```bash
cd apps/rowboat
npm install
npm run verify
npm run dev
```
### CLI runtime
```bash
cd apps/cli
npm install
npm run verify
npm run server
```
### Local runtime frontend
```bash
cd apps/rowboatx
npm install
npm run dev
```
## Contributor Rules Of Thumb
- Prefer `apps/x` when the change is local-first or knowledge-vault oriented.
- Prefer `apps/rowboat` when the change requires server-side persistence, auth, billing, or hosted APIs.
- Treat `apps/experimental` as non-core unless you are intentionally working on a prototype.
- When adding documentation, update the README closest to the surface you changed.

@@ -81,6 +81,18 @@ All API key files use the same format:
}
```
## Repository Map
This repository contains multiple Rowboat surfaces. Start with `ARCHITECTURE.md` if you are contributing or trying to understand which app owns a feature.
- `apps/x` - primary local-first Electron desktop app
- `apps/rowboat` - hosted Next.js platform and APIs
- `apps/cli` - local runtime and npm-distributed CLI package
- `apps/rowboatx` - newer frontend for the local runtime
- `apps/docs` - Mintlify docs site
- `apps/python-sdk` - Python API client
- `apps/experimental` - prototypes and non-core experiments
## What it does
Rowboat is a **local-first AI coworker** that can:

apps/cli/README.md Normal file (+45 lines)

@@ -0,0 +1,45 @@
# Rowboat CLI And Local Runtime
`apps/cli` contains the npm-distributed `@rowboatlabs/rowboatx` package and the local HTTP runtime used by the newer frontend.
## What Lives Here
- Hono server for runs, messages, permissions, and SSE streaming
- Model and MCP configuration repositories under `~/.rowboat`
- Workflow import and export helpers
- Packaged CLI entrypoint in `bin/app.js`
## Local Development
Install dependencies and run the verification suite (lint, typecheck, build, and tests):
```bash
npm install
npm run verify
```
Run the local server:
```bash
npm run server
```
## Key Commands
- `npm run build` - compile TypeScript into `dist/`
- `npm run lint` - run CLI lint checks
- `npm run typecheck` - run TypeScript checks without emitting
- `npm run server` - start the local Hono runtime
- `npm run verify` - run lint, typecheck, and tests together
- `npm run migrate-agents` - run bundled agent migration script
## Data Location
The CLI/runtime stores configuration and runtime state in `~/.rowboat` by default.
## Related Surfaces
- `apps/rowboatx` provides the newer frontend that talks to this runtime
- `apps/x` has its own local-first desktop runtime and is the primary desktop product
See the root `ARCHITECTURE.md` for the repo-level map.

@@ -0,0 +1,37 @@
import js from "@eslint/js";
import globals from "globals";
import tseslint from "typescript-eslint";
import { defineConfig, globalIgnores } from "eslint/config";
export default defineConfig([
globalIgnores(["dist/**"]),
{
files: ["src/**/*.ts"],
extends: [js.configs.recommended, ...tseslint.configs.recommended],
languageOptions: {
globals: {
...globals.node,
},
},
rules: {
"@typescript-eslint/no-explicit-any": "off",
"@typescript-eslint/no-unused-expressions": "off",
"@typescript-eslint/no-unused-vars": "off",
"no-case-declarations": "off",
"no-useless-escape": "off",
"prefer-const": "off",
},
},
{
files: ["test/**/*.mjs", "bin/**/*.js"],
extends: [js.configs.recommended],
languageOptions: {
globals: {
...globals.node,
},
},
rules: {
"no-unused-vars": "off",
},
},
]);

File diff suppressed because it is too large.

@@ -4,8 +4,11 @@
"main": "index.js",
"type": "module",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"lint": "eslint src test bin",
"typecheck": "tsc --noEmit",
"test": "npm run build && node ./test/run-tests.mjs",
"build": "rm -rf dist && tsc",
"verify": "npm run lint && npm run typecheck && npm test",
"server": "node dist/server.js",
"migrate-agents": "node dist/scripts/migrate-agents.js"
},
@@ -21,8 +24,12 @@
"license": "Apache-2.0",
"description": "",
"devDependencies": {
"@eslint/js": "^9.39.2",
"@types/node": "^24.9.1",
"@types/react": "^18.3.12",
"eslint": "^9.39.2",
"globals": "^16.5.0",
"typescript-eslint": "^8.50.1",
"typescript": "^5.9.3"
},
"dependencies": {

@@ -1,6 +1,5 @@
import { WorkDir } from "../config/config.js";
import fs from "fs/promises";
import { glob } from "node:fs/promises";
import path from "path";
import z from "zod";
import { Agent } from "./agents.js";
@@ -19,11 +18,32 @@ export interface IAgentsRepo {
export class FSAgentsRepo implements IAgentsRepo {
private readonly agentsDir = path.join(WorkDir, "agents");
private async listMarkdownFiles(dir: string, prefix: string = ""): Promise<string[]> {
const entries = await fs.readdir(dir, { withFileTypes: true });
const results: string[] = [];
for (const entry of entries) {
const relativePath = prefix ? path.posix.join(prefix, entry.name) : entry.name;
const absolutePath = path.join(dir, entry.name);
if (entry.isDirectory()) {
results.push(...await this.listMarkdownFiles(absolutePath, relativePath));
continue;
}
if (entry.isFile() && entry.name.endsWith(".md")) {
results.push(relativePath);
}
}
return results;
}
async list(): Promise<z.infer<typeof Agent>[]> {
const result: z.infer<typeof Agent>[] = [];
// list all md files in workdir/agents/
const matches = await Array.fromAsync(glob("**/*.md", { cwd: this.agentsDir }));
const matches = await this.listMarkdownFiles(this.agentsDir);
for (const file of matches) {
try {
const agent = await this.parseAgentMd(path.join(this.agentsDir, file));
@@ -79,16 +99,20 @@ export class FSAgentsRepo implements IAgentsRepo {
async create(agent: z.infer<typeof Agent>): Promise<void> {
const { instructions, ...rest } = agent;
const contents = `---\n${stringify(rest)}\n---\n${instructions}`;
await fs.writeFile(path.join(this.agentsDir, `${agent.name}.md`), contents);
const filePath = path.join(this.agentsDir, `${agent.name}.md`);
await fs.mkdir(path.dirname(filePath), { recursive: true });
await fs.writeFile(filePath, contents);
}
async update(id: string, agent: z.infer<typeof UpdateAgentSchema>): Promise<void> {
const { instructions, ...rest } = agent;
const contents = `---\n${stringify(rest)}\n---\n${instructions}`;
await fs.writeFile(path.join(this.agentsDir, `${id}.md`), contents);
const filePath = path.join(this.agentsDir, `${id}.md`);
await fs.mkdir(path.dirname(filePath), { recursive: true });
await fs.writeFile(filePath, contents);
}
async delete(id: string): Promise<void> {
await fs.unlink(path.join(this.agentsDir, `${id}.md`));
}
}
}

@@ -12,7 +12,7 @@ import { Agent } from "./agents/agents.js";
import { McpServerConfig, McpServerDefinition } from "./mcp/schema.js";
import { Example } from "./entities/example.js";
import { z } from "zod";
import { Flavor } from "./models/models.js";
import { Flavor } from "./models/schema.js";
import { examples } from "./examples/index.js";
import container from "./di/container.js";
import { IModelConfigRepo } from "./models/repo.js";

@@ -2,14 +2,29 @@ import path from "path";
import fs from "fs";
import { homedir } from "os";
// Resolve app root relative to compiled file location (dist/...)
export const WorkDir = path.join(homedir(), ".rowboat");
function resolveWorkDir(): string {
const configured = process.env.ROWBOAT_WORKDIR;
if (!configured) {
return path.join(homedir(), ".rowboat");
}
const expanded = configured === "~"
? homedir()
: (configured.startsWith("~/") || configured.startsWith("~\\"))
? path.join(homedir(), configured.slice(2))
: configured;
return path.resolve(expanded);
}
export const WorkDir = resolveWorkDir();
function ensureDirs() {
const ensure = (p: string) => { if (!fs.existsSync(p)) fs.mkdirSync(p, { recursive: true }); };
ensure(WorkDir);
ensure(path.join(WorkDir, "agents"));
ensure(path.join(WorkDir, "config"));
ensure(path.join(WorkDir, "runs"));
}
ensureDirs();
ensureDirs();
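For contributors reading the hunk above in isolation, here is the same resolution logic as a standalone, testable function (a sketch mirroring the diff, not the shipped module):

```typescript
import path from "node:path";
import { homedir } from "node:os";

// Mirrors the ROWBOAT_WORKDIR handling above: "~" alone, "~/..." and
// "~\..." expand against the home directory; any other value is
// resolved to an absolute path; unset falls back to ~/.rowboat.
function resolveWorkDir(configured?: string): string {
  if (!configured) {
    return path.join(homedir(), ".rowboat");
  }
  const expanded =
    configured === "~"
      ? homedir()
      : configured.startsWith("~/") || configured.startsWith("~\\")
        ? path.join(homedir(), configured.slice(2))
        : configured;
  return path.resolve(expanded);
}
```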

@@ -12,9 +12,10 @@ export interface IMcpConfigRepo {
export class FSMcpConfigRepo implements IMcpConfigRepo {
private readonly configPath = path.join(WorkDir, "config", "mcp.json");
private readonly initPromise: Promise<void>;
constructor() {
this.ensureDefaultConfig();
this.initPromise = this.ensureDefaultConfig();
}
private async ensureDefaultConfig(): Promise<void> {
@@ -25,18 +26,25 @@ export class FSMcpConfigRepo implements IMcpConfigRepo {
}
}
private async ensureInitialized(): Promise<void> {
await this.initPromise;
}
async getConfig(): Promise<z.infer<typeof McpServerConfig>> {
await this.ensureInitialized();
const config = await fs.readFile(this.configPath, "utf8");
return McpServerConfig.parse(JSON.parse(config));
}
async upsert(serverName: string, config: z.infer<typeof McpServerDefinition>): Promise<void> {
await this.ensureInitialized();
const conf = await this.getConfig();
conf.mcpServers[serverName] = config;
await fs.writeFile(this.configPath, JSON.stringify(conf, null, 2));
}
async delete(serverName: string): Promise<void> {
await this.ensureInitialized();
const conf = await this.getConfig();
delete conf.mcpServers[serverName];
await fs.writeFile(this.configPath, JSON.stringify(conf, null, 2));
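The constructor change above is an instance of the init-promise pattern: kick off async initialization once, store the promise, and have every public method await it so no caller can observe a half-initialized repo. A minimal sketch with a hypothetical class:

```typescript
// Hypothetical illustration of the init-promise pattern used by
// FSMcpConfigRepo above: init runs once, every method awaits it.
class LazyInitStore {
  private readonly initPromise: Promise<void>;
  private data: Record<string, string> = {};

  constructor() {
    this.initPromise = this.init(); // not awaited here; stored instead
  }

  private async init(): Promise<void> {
    // Stands in for writing a default config file to disk.
    this.data = { status: "ready" };
  }

  async get(key: string): Promise<string | undefined> {
    await this.initPromise; // safe even when called immediately
    return this.data[key];
  }
}
```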

@@ -8,33 +8,9 @@ import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { IModelConfigRepo } from "./repo.js";
import container from "../di/container.js";
import z from "zod";
import { Flavor, Provider, ModelConfig } from "./schema.js";
export const Flavor = z.enum([
"rowboat [free]",
"aigateway",
"anthropic",
"google",
"ollama",
"openai",
"openai-compatible",
"openrouter",
]);
export const Provider = z.object({
flavor: Flavor,
apiKey: z.string().optional(),
baseURL: z.string().optional(),
headers: z.record(z.string(), z.string()).optional(),
});
export const ModelConfig = z.object({
providers: z.record(z.string(), Provider),
defaults: z.object({
provider: z.string(),
model: z.string(),
}),
});
export { Flavor, Provider, ModelConfig };
const providerMap: Record<string, ProviderV2> = {};
@@ -116,4 +92,4 @@ export async function getProvider(name: string = ""): Promise<ProviderV2> {
throw new Error(`Provider ${name} not found`);
}
return providerMap[name];
}
}

@@ -1,4 +1,4 @@
import { ModelConfig, Provider } from "./models.js";
import { ModelConfig, Provider } from "./schema.js";
import { WorkDir } from "../config/config.js";
import fs from "fs/promises";
import path from "path";
@@ -25,9 +25,10 @@ const defaultConfig: z.infer<typeof ModelConfig> = {
export class FSModelConfigRepo implements IModelConfigRepo {
private readonly configPath = path.join(WorkDir, "config", "models.json");
private readonly initPromise: Promise<void>;
constructor() {
this.ensureDefaultConfig();
this.initPromise = this.ensureDefaultConfig();
}
private async ensureDefaultConfig(): Promise<void> {
@@ -38,12 +39,18 @@
}
}
private async ensureInitialized(): Promise<void> {
await this.initPromise;
}
async getConfig(): Promise<z.infer<typeof ModelConfig>> {
await this.ensureInitialized();
const config = await fs.readFile(this.configPath, "utf8");
return ModelConfig.parse(JSON.parse(config));
}
private async setConfig(config: z.infer<typeof ModelConfig>): Promise<void> {
await this.ensureInitialized();
await fs.writeFile(this.configPath, JSON.stringify(config, null, 2));
}
@@ -67,4 +74,4 @@ export class FSModelConfigRepo implements IModelConfigRepo {
};
await this.setConfig(conf);
}
}
}

@@ -0,0 +1,27 @@
import z from "zod";
export const Flavor = z.enum([
"rowboat [free]",
"aigateway",
"anthropic",
"google",
"ollama",
"openai",
"openai-compatible",
"openrouter",
]);
export const Provider = z.object({
flavor: Flavor,
apiKey: z.string().optional(),
baseURL: z.string().optional(),
headers: z.record(z.string(), z.string()).optional(),
});
export const ModelConfig = z.object({
providers: z.record(z.string(), Provider),
defaults: z.object({
provider: z.string(),
model: z.string(),
}),
});
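For reference, a document these schemas accept looks like the following; the provider names, key, and URL are placeholders, not defaults shipped by the CLI:

```typescript
// Hypothetical models.json that satisfies the ModelConfig shape above.
const exampleConfig = {
  providers: {
    openai: { flavor: "openai", apiKey: "sk-placeholder" },
    local: { flavor: "ollama", baseURL: "http://localhost:11434" },
  },
  defaults: { provider: "openai", model: "gpt-5.1" },
};

// One invariant the schema alone does not enforce: the default
// provider should name an entry in `providers`.
const defaultsAreConsistent =
  exampleConfig.defaults.provider in exampleConfig.providers;
```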

@@ -1,4 +1,4 @@
import { Run } from "./runs.js";
import { Run } from "./schema.js";
import z from "zod";
import { IMonotonicallyIncreasingIdGenerator } from "../application/lib/id-gen.js";
import { WorkDir } from "../config/config.js";
@@ -37,6 +37,7 @@ export class FSRunsRepo implements IRunsRepo {
}
async appendEvents(runId: string, events: z.infer<typeof RunEvent>[]): Promise<void> {
await fsp.mkdir(path.join(WorkDir, 'runs'), { recursive: true });
await fsp.appendFile(
path.join(WorkDir, 'runs', `${runId}.jsonl`),
events.map(event => JSON.stringify(event)).join("\n") + "\n"
@@ -141,4 +142,4 @@ export class FSRunsRepo implements IRunsRepo {
...(nextCursor ? { nextCursor } : {}),
};
}
}
}
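appendEvents above leans on the JSONL convention: one JSON document per line, so appends are a cheap file append and reads are a split-and-parse. A self-contained sketch with simplified event types:

```typescript
// JSONL encode/decode sketch matching the append format above:
// each event becomes one line, and the log always ends with "\n".
type RunEventLike = { type: string } & Record<string, unknown>;

function encodeEvents(events: RunEventLike[]): string {
  return events.map((e) => JSON.stringify(e)).join("\n") + "\n";
}

function decodeEvents(raw: string): RunEventLike[] {
  return raw
    .split("\n")
    .filter((line) => line.trim() !== "") // tolerate the trailing newline
    .map((line) => JSON.parse(line) as RunEventLike);
}
```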

@@ -5,6 +5,7 @@ import { AskHumanResponseEvent, RunEvent, ToolPermissionResponseEvent } from "..
import { CreateRunOptions, IRunsRepo } from "./repo.js";
import { IAgentRuntime } from "../agents/runtime.js";
import { IBus } from "../application/lib/bus.js";
import { Run } from "./schema.js";
export const ToolPermissionAuthorizePayload = ToolPermissionResponseEvent.pick({
subflow: true,
@@ -18,12 +19,7 @@ export const AskHumanResponsePayload = AskHumanResponseEvent.pick({
response: true,
});
export const Run = z.object({
id: z.string(),
createdAt: z.iso.datetime(),
agentId: z.string(),
log: z.array(RunEvent),
});
export { Run };
export async function createRun(opts: z.infer<typeof CreateRunOptions>): Promise<z.infer<typeof Run>> {
const repo = container.resolve<IRunsRepo>('runsRepo');
@@ -67,4 +63,4 @@ export async function replyToHumanInputRequest(runId: string, ev: z.infer<typeof
export async function stop(runId: string): Promise<void> {
throw new Error('Not implemented');
}
}

@@ -0,0 +1,10 @@
import z from "zod";
import { RunEvent } from "../entities/run-events.js";
export const Run = z.object({
id: z.string(),
createdAt: z.iso.datetime(),
agentId: z.string(),
log: z.array(RunEvent),
});

@@ -4,193 +4,208 @@ import { streamSSE } from 'hono/streaming'
import { describeRoute, validator, resolver, openAPIRouteHandler } from "hono-openapi"
import z from 'zod';
import container from './di/container.js';
import { executeTool, listServers, listTools } from "./mcp/mcp.js";
import { ListToolsResponse, McpServerDefinition, McpServerList } from "./mcp/schema.js";
import { IMcpConfigRepo } from './mcp/repo.js';
import { IModelConfigRepo } from './models/repo.js';
import { ModelConfig, Provider } from "./models/models.js";
import { IAgentsRepo } from "./agents/repo.js";
import { Agent } from "./agents/agents.js";
import { AskHumanResponsePayload, authorizePermission, createMessage, createRun, replyToHumanInputRequest, Run, stop, ToolPermissionAuthorizePayload } from './runs/runs.js';
import { IRunsRepo, CreateRunOptions, ListRunsResponse } from './runs/repo.js';
import { IBus } from './application/lib/bus.js';
import { cors } from 'hono/cors';
import { pathToFileURL } from 'node:url';
let id = 0;
export interface ServerDependencies {
createMessage(runId: string, message: string): Promise<string>;
authorizePermission(runId: string, payload: z.infer<typeof ToolPermissionAuthorizePayload>): Promise<void>;
replyToHumanInputRequest(runId: string, payload: z.infer<typeof AskHumanResponsePayload>): Promise<void>;
stop(runId: string): Promise<void>;
subscribeToEvents(listener: (event: unknown) => Promise<void>): Promise<() => void>;
}
const routes = new Hono()
.post(
'/runs/:runId/messages/new',
describeRoute({
summary: 'Create a new message',
description: 'Create a new message',
responses: {
200: {
description: 'Message created',
content: {
'application/json': {
schema: resolver(z.object({
messageId: z.string(),
})),
const defaultDependencies: ServerDependencies = {
createMessage,
authorizePermission,
replyToHumanInputRequest,
stop,
subscribeToEvents: async (listener) => {
const bus = container.resolve<IBus>('bus');
return bus.subscribe('*', listener);
},
};
export function createApp(deps: ServerDependencies = defaultDependencies): Hono {
const routes = new Hono()
.post(
'/runs/:runId/messages/new',
describeRoute({
summary: 'Create a new message',
description: 'Create a new message',
responses: {
200: {
description: 'Message created',
content: {
'application/json': {
schema: resolver(z.object({
messageId: z.string(),
})),
},
},
},
},
},
}),
validator('param', z.object({
runId: z.string(),
})),
validator('json', z.object({
message: z.string(),
})),
async (c) => {
const messageId = await createMessage(c.req.valid('param').runId, c.req.valid('json').message);
return c.json({
messageId,
});
}
)
.post(
'/runs/:runId/permissions/authorize',
describeRoute({
summary: 'Authorize permission',
description: 'Authorize a permission',
responses: {
200: {
description: 'Permission authorized',
content: {
'application/json': {
schema: resolver(z.object({
success: z.literal(true),
})),
},
}
},
},
}),
validator('param', z.object({
runId: z.string(),
})),
validator('json', ToolPermissionAuthorizePayload),
async (c) => {
const response = await authorizePermission(
c.req.valid('param').runId,
c.req.valid('json')
);
return c.json({
success: true,
});
}
)
.post(
'/runs/:runId/human-input-requests/:requestId/reply',
describeRoute({
summary: 'Reply to human input request',
description: 'Reply to a human input request',
responses: {
200: {
description: 'Human input request replied',
},
},
}),
validator('param', z.object({
runId: z.string(),
})),
validator('json', AskHumanResponsePayload),
async (c) => {
const response = await replyToHumanInputRequest(
c.req.valid('param').runId,
c.req.valid('json')
);
return c.json({
success: true,
});
}
)
.post(
'/runs/:runId/stop',
describeRoute({
summary: 'Stop run',
description: 'Stop a run',
responses: {
200: {
description: 'Run stopped',
},
},
}),
validator('param', z.object({
runId: z.string(),
})),
async (c) => {
const response = await stop(c.req.valid('param').runId);
return c.json({
success: true,
});
}
)
.get(
'/stream',
describeRoute({
summary: 'Subscribe to run events',
description: 'Subscribe to run events',
}),
async (c) => {
return streamSSE(c, async (stream) => {
const bus = container.resolve<IBus>('bus');
let id = 0;
let unsub: (() => void) | null = null;
let aborted = false;
stream.onAbort(() => {
aborted = true;
if (unsub) {
unsub();
}
}),
validator('param', z.object({
runId: z.string(),
})),
validator('json', z.object({
message: z.string(),
})),
async (c) => {
const messageId = await deps.createMessage(c.req.valid('param').runId, c.req.valid('json').message);
return c.json({
messageId,
});
}
)
.post(
'/runs/:runId/permissions/authorize',
describeRoute({
summary: 'Authorize permission',
description: 'Authorize a permission',
responses: {
200: {
description: 'Permission authorized',
content: {
'application/json': {
schema: resolver(z.object({
success: z.literal(true),
})),
},
}
},
},
}),
validator('param', z.object({
runId: z.string(),
})),
validator('json', ToolPermissionAuthorizePayload),
async (c) => {
await deps.authorizePermission(
c.req.valid('param').runId,
c.req.valid('json')
);
return c.json({
success: true,
});
}
)
.post(
'/runs/:runId/human-input-requests/:requestId/reply',
describeRoute({
summary: 'Reply to human input request',
description: 'Reply to a human input request',
responses: {
200: {
description: 'Human input request replied',
},
},
}),
validator('param', z.object({
runId: z.string(),
})),
validator('json', AskHumanResponsePayload),
async (c) => {
await deps.replyToHumanInputRequest(
c.req.valid('param').runId,
c.req.valid('json')
);
return c.json({
success: true,
});
}
)
.post(
'/runs/:runId/stop',
describeRoute({
summary: 'Stop run',
description: 'Stop a run',
responses: {
200: {
description: 'Run stopped',
},
},
}),
validator('param', z.object({
runId: z.string(),
})),
async (c) => {
await deps.stop(c.req.valid('param').runId);
return c.json({
success: true,
});
}
)
.get(
'/stream',
describeRoute({
summary: 'Subscribe to run events',
description: 'Subscribe to run events',
}),
async (c) => {
return streamSSE(c, async (stream) => {
let eventId = 0;
let unsub: (() => void) | null = null;
let aborted = false;
// Subscribe to your bus
unsub = await bus.subscribe('*', async (event) => {
if (aborted) return;
await stream.writeSSE({
data: JSON.stringify(event),
event: "message",
id: String(id++),
stream.onAbort(() => {
aborted = true;
if (unsub) {
unsub();
}
});
unsub = await deps.subscribeToEvents(async (event) => {
if (aborted) return;
await stream.writeSSE({
data: JSON.stringify(event),
event: "message",
id: String(eventId++),
});
});
while (!aborted) {
await stream.sleep(1000);
}
});
}
);
// Keep the function alive until the client disconnects
while (!aborted) {
await stream.sleep(1000); // any interval is fine
}
});
}
)
;
const app = new Hono()
.use("/*", cors())
.route("/", routes)
.get(
"/openapi.json",
openAPIRouteHandler(routes, {
documentation: {
info: {
title: "Hono",
version: "1.0.0",
description: "RowboatX API",
return new Hono()
.use("/*", cors())
.route("/", routes)
.get(
"/openapi.json",
openAPIRouteHandler(routes, {
documentation: {
info: {
title: "Hono",
version: "1.0.0",
description: "RowboatX API",
},
},
},
}),
);
}),
);
}
// export default app;
export const app = createApp();
serve({
fetch: app.fetch,
port: Number(process.env.PORT) || 3000,
});
export function startServer(port: number = Number(process.env.PORT) || 3000): void {
serve({
fetch: app.fetch,
port,
});
}
const isMain = process.argv[1] ? import.meta.url === pathToFileURL(process.argv[1]).href : false;
if (isMain) {
startServer();
}
// GET /skills
// POST /skills/new
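The /stream handler above writes standard Server-Sent Events frames via streamSSE. The wire format it produces can be sketched with a simplified encoder (an illustration, not the hono implementation):

```typescript
// Sketch of an SSE frame like the ones the /stream endpoint emits:
// "event:", "id:", and "data:" lines terminated by a blank line.
function formatSSEFrame(event: string, id: number, data: unknown): string {
  return `event: ${event}\nid: ${id}\ndata: ${JSON.stringify(data)}\n\n`;
}
```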

@@ -2,7 +2,7 @@ import { createParser } from "eventsource-parser";
import { Agent } from "../agents/agents.js";
import { AskHumanResponsePayload, Run, ToolPermissionAuthorizePayload } from "../runs/runs.js";
import { ListRunsResponse } from "../runs/repo.js";
import { ModelConfig } from "../models/models.js";
import { ModelConfig } from "../models/schema.js";
import { RunEvent } from "../entities/run-events.js";
import z from "zod";

@@ -0,0 +1,83 @@
import test from "node:test";
import assert from "node:assert/strict";
import fs from "node:fs/promises";
import path from "node:path";
import { WorkDir } from "../dist/config/config.js";
import { FSModelConfigRepo } from "../dist/models/repo.js";
import { FSMcpConfigRepo } from "../dist/mcp/repo.js";
import { FSAgentsRepo } from "../dist/agents/repo.js";
import { FSRunsRepo } from "../dist/runs/repo.js";
test("uses ROWBOAT_WORKDIR override and eagerly creates expected directories", async () => {
assert.equal(WorkDir, process.env.ROWBOAT_WORKDIR);
for (const dirName of ["agents", "config", "runs"]) {
const stats = await fs.stat(path.join(WorkDir, dirName));
assert.equal(stats.isDirectory(), true);
}
});
test("FSModelConfigRepo returns defaults on a fresh workspace", async () => {
const repo = new FSModelConfigRepo();
const config = await repo.getConfig();
assert.equal(config.defaults.provider, "openai");
assert.equal(config.defaults.model, "gpt-5.1");
assert.equal(config.providers.openai?.flavor, "openai");
});
test("FSMcpConfigRepo returns an empty config on a fresh workspace", async () => {
const repo = new FSMcpConfigRepo();
const config = await repo.getConfig();
assert.deepEqual(config, { mcpServers: {} });
});
test("FSAgentsRepo can create and read nested agent files", async () => {
const repo = new FSAgentsRepo();
await repo.create({
name: "team/copilot",
description: "Team helper",
provider: "openai",
model: "gpt-5.1",
instructions: "Be helpful.",
});
const fetched = await repo.fetch("team/copilot");
assert.equal(fetched.name, "team/copilot");
assert.equal(fetched.description, "Team helper");
assert.equal(fetched.instructions, "Be helpful.");
});
test("FSRunsRepo creates, fetches, and lists runs", async () => {
let nextId = 0;
const repo = new FSRunsRepo({
idGenerator: {
next: async () => `run-${++nextId}`,
},
});
const first = await repo.create({ agentId: "copilot" });
await repo.appendEvents(first.id, [{
type: "message",
runId: first.id,
subflow: [],
messageId: "msg-1",
message: {
role: "user",
content: "hello",
},
}]);
const second = await repo.create({ agentId: "planner" });
const fetched = await repo.fetch(first.id);
assert.equal(fetched.id, first.id);
assert.equal(fetched.agentId, "copilot");
assert.equal(fetched.log.length, 2);
assert.equal(fetched.log[1].type, "message");
const listed = await repo.list();
assert.deepEqual(listed.runs.map((run) => run.id), [second.id, first.id]);
});

@@ -0,0 +1,31 @@
import { mkdtemp, rm } from "node:fs/promises";
import { tmpdir } from "node:os";
import path from "node:path";
import { spawn } from "node:child_process";
import { fileURLToPath } from "node:url";
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const packageDir = path.resolve(__dirname, "..");
const tempRoot = await mkdtemp(path.join(tmpdir(), "rowboat-cli-test-"));
const testWorkDir = path.join(tempRoot, "workspace");
try {
const exitCode = await new Promise((resolve, reject) => {
const child = spawn(process.execPath, ["--test", "./test/repos.test.mjs", "./test/server.test.mjs"], {
cwd: packageDir,
stdio: "inherit",
env: {
...process.env,
ROWBOAT_WORKDIR: testWorkDir,
},
});
child.on("error", reject);
child.on("exit", (code) => resolve(code ?? 1));
});
process.exitCode = Number(exitCode);
} finally {
await rm(tempRoot, { recursive: true, force: true });
}

@@ -0,0 +1,131 @@
import test from "node:test";
import assert from "node:assert/strict";
import { createApp } from "../dist/server.js";
test("message endpoint creates a message and returns its id", async () => {
const calls = [];
const app = createApp({
createMessage: async (runId, message) => {
calls.push({ runId, message });
return "msg-123";
},
authorizePermission: async () => {},
replyToHumanInputRequest: async () => {},
stop: async () => {},
subscribeToEvents: async () => () => {},
});
const response = await app.request("/runs/run-1/messages/new", {
method: "POST",
headers: { "content-type": "application/json" },
body: JSON.stringify({ message: "hello" }),
});
assert.equal(response.status, 200);
assert.deepEqual(await response.json(), { messageId: "msg-123" });
assert.deepEqual(calls, [{ runId: "run-1", message: "hello" }]);
});
test("permission endpoint validates payload and calls dependency", async () => {
const calls = [];
const app = createApp({
createMessage: async () => "unused",
authorizePermission: async (runId, payload) => {
calls.push({ runId, payload });
},
replyToHumanInputRequest: async () => {},
stop: async () => {},
subscribeToEvents: async () => () => {},
});
const response = await app.request("/runs/run-2/permissions/authorize", {
method: "POST",
headers: { "content-type": "application/json" },
body: JSON.stringify({
subflow: ["child"],
toolCallId: "tool-1",
response: "approve",
}),
});
assert.equal(response.status, 200);
assert.deepEqual(await response.json(), { success: true });
assert.deepEqual(calls, [{
runId: "run-2",
payload: {
subflow: ["child"],
toolCallId: "tool-1",
response: "approve",
},
}]);
});
test("invalid message payload returns a validation error", async () => {
const app = createApp({
createMessage: async () => "unused",
authorizePermission: async () => {},
replyToHumanInputRequest: async () => {},
stop: async () => {},
subscribeToEvents: async () => () => {},
});
const response = await app.request("/runs/run-1/messages/new", {
method: "POST",
headers: { "content-type": "application/json" },
body: JSON.stringify({}),
});
assert.equal(response.status, 400);
});
test("openapi endpoint is exposed", async () => {
const app = createApp({
createMessage: async () => "unused",
authorizePermission: async () => {},
replyToHumanInputRequest: async () => {},
stop: async () => {},
subscribeToEvents: async () => () => {},
});
const response = await app.request("/openapi.json");
const body = await response.json();
assert.equal(response.status, 200);
assert.equal(body.info.title, "Hono");
assert.ok(body.paths["/runs/{runId}/messages/new"]);
});
test("stream endpoint emits SSE payloads and unsubscribes on cancel", async () => {
let listener;
let unsubscribed = false;
const app = createApp({
createMessage: async () => "unused",
authorizePermission: async () => {},
replyToHumanInputRequest: async () => {},
stop: async () => {},
subscribeToEvents: async (fn) => {
listener = fn;
return () => {
unsubscribed = true;
};
},
});
const response = await app.request("/stream");
assert.equal(response.status, 200);
assert.equal(response.headers.get("content-type"), "text/event-stream");
await listener({ type: "message", data: { hello: "world" } });
const reader = response.body.getReader();
const chunk = await reader.read();
const text = new TextDecoder().decode(chunk.value);
assert.match(text, /event: message/);
assert.match(text, /"hello":"world"/);
await reader.cancel();
await new Promise((resolve) => setTimeout(resolve, 0));
assert.equal(unsubscribed, true);
});


@@ -2,7 +2,7 @@
"$schema": "https://mintlify.com/docs.json",
"theme": "maple",
"name": "Rowboat",
"description": "Rowboat is an open-source platform for building multi-agent systems. It helps you orchestrate tools, RAG, memory, and deployable agents with ease.",
"description": "Rowboat is a local-first AI coworker with transparent Markdown memory, agent workflows, tools, and optional hosted services.",
"favicon": "/favicon.ico",
"colors": {
"primary": "#6366F1",
@@ -57,4 +57,4 @@
]
}
}


@@ -1,6 +1,16 @@
This is a [Next.js](https://nextjs.org/) project bootstrapped with [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app).
# Rowboat Web App
## Getting Started
`apps/rowboat` is the hosted or self-hosted Next.js application in this repository. It is the server-backed Rowboat surface with project-scoped conversations, data sources, jobs, integrations, billing hooks, and RAG infrastructure.
## What Lives Here
- Next.js 15 App Router application
- Project and conversation APIs under `app/api`
- Dependency injection container in `di/container.ts`
- Layered backend code under `src/application`, `src/entities`, `src/infrastructure`, and `src/interface-adapters`
- Background workers for jobs and RAG ingestion in `app/scripts`
## Local Development
Install dependencies:
@@ -8,29 +8,45 @@ Install dependencies:
npm install
```
First, run the development server:
Run the app:
```bash
npm run dev
```
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
Useful commands:
You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.
```bash
npm run verify
npm run build
npm run lint
npm run typecheck
npm run rag-worker
npm run jobs-worker
npm run setupQdrant
npm run deleteQdrant
```
This project uses [`next/font`](https://nextjs.org/docs/basic-features/font-optimization) to automatically optimize and load Inter, a custom Google Font.
## Infrastructure Dependencies
## Learn More
This app can use several backing services, depending on which features you enable:
To learn more about Next.js, take a look at the following resources:
- MongoDB for primary application data
- Redis for caching, pub/sub, and job coordination
- Qdrant for vector search
- Local uploads or S3 for document storage
- External model providers and integrations configured through environment variables
- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.
The root `docker-compose.yml` is the easiest way to see the expected service topology.
You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js/) - your feedback and contributions are welcome!
## Environment
## Deploy on Vercel
Start from the repository `.env.example` and add the services you need. Common feature flags include auth, RAG, uploads, scraping, billing, and chat widget support.
The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.
## Architectural Notes
Check out our [Next.js deployment documentation](https://nextjs.org/docs/deployment) for more details.
- Route handlers stay thin and resolve controllers from the DI container.
- Use cases and repositories are split by domain.
- Workers in `app/scripts` handle asynchronous processing such as document ingestion and recurring jobs.
If you are trying to understand where a feature belongs in the repo, read the root `ARCHITECTURE.md` first.


@@ -1,22 +1,40 @@
import { NextRequest } from "next/server";
import { z } from "zod";
import { ApiResponse } from "@/app/lib/types/api_types";
import { ApiRequest } from "@/app/lib/types/api_types";
import { PrefixLogger } from "../../../../lib/utils";
import { container } from "@/di/container";
import { IRunTurnController } from "@/src/interface-adapters/controllers/conversations/run-turn.controller";
type ChatRouteContext = { params: Promise<{ projectId: string }> };
interface LoggerLike {
log(message: string): void;
}
interface ChatRouteDependencies {
createLogger(requestId: string): LoggerLike;
resolveRunTurnController(): IRunTurnController | Promise<IRunTurnController>;
}
const defaultDependencies: ChatRouteDependencies = {
createLogger: (requestId) => new PrefixLogger(`${requestId}`),
resolveRunTurnController: async () => {
const { container } = await import("@/di/container");
return container.resolve<IRunTurnController>("runTurnController");
},
};
// get next turn / agent response
export async function POST(
req: NextRequest,
{ params }: { params: Promise<{ projectId: string }> }
export async function handlePostChat(
req: Request,
{ params }: ChatRouteContext,
deps: ChatRouteDependencies = defaultDependencies,
): Promise<Response> {
const { projectId } = await params;
const requestId = crypto.randomUUID();
const logger = new PrefixLogger(`${requestId}`);
const logger = deps.createLogger(requestId);
// parse and validate the request body
let data;
let data: z.infer<typeof ApiRequest>;
try {
const body = await req.json();
data = ApiRequest.parse(body);
@@ -26,7 +26,7 @@ export async function POST(
}
const { conversationId, messages, mockTools, stream } = data;
const runTurnController = container.resolve<IRunTurnController>("runTurnController");
const runTurnController = await deps.resolveRunTurnController();
// get assistant response
const response = await runTurnController.execute({
@@ -81,3 +99,7 @@
};
return Response.json(responseBody);
}
export async function POST(req: Request, context: ChatRouteContext): Promise<Response> {
return handlePostChat(req, context);
}


@@ -1,162 +1,14 @@
import { asClass, createContainer, InjectionMode } from "awilix";
import { createContainer, InjectionMode } from "awilix";
// Services
import { RedisPubSubService } from "@/src/infrastructure/services/redis.pub-sub.service";
import { S3UploadsStorageService } from "@/src/infrastructure/services/s3.uploads-storage.service";
import { LocalUploadsStorageService } from "@/src/infrastructure/services/local.uploads-storage.service";
import { RunConversationTurnUseCase } from "@/src/application/use-cases/conversations/run-conversation-turn.use-case";
import { MongoDBConversationsRepository } from "@/src/infrastructure/repositories/mongodb.conversations.repository";
import { RunCachedTurnController } from "@/src/interface-adapters/controllers/conversations/run-cached-turn.controller";
import { CreatePlaygroundConversationController } from "@/src/interface-adapters/controllers/conversations/create-playground-conversation.controller";
import { CreateConversationUseCase } from "@/src/application/use-cases/conversations/create-conversation.use-case";
import { RedisCacheService } from "@/src/infrastructure/services/redis.cache.service";
import { CreateCachedTurnUseCase } from "@/src/application/use-cases/conversations/create-cached-turn.use-case";
import { FetchCachedTurnUseCase } from "@/src/application/use-cases/conversations/fetch-cached-turn.use-case";
import { CreateCachedTurnController } from "@/src/interface-adapters/controllers/conversations/create-cached-turn.controller";
import { RunTurnController } from "@/src/interface-adapters/controllers/conversations/run-turn.controller";
import { RedisUsageQuotaPolicy } from "@/src/infrastructure/policies/redis.usage-quota.policy";
import { ProjectActionAuthorizationPolicy } from "@/src/application/policies/project-action-authorization.policy";
import { MongoDBProjectMembersRepository } from "@/src/infrastructure/repositories/mongodb.project-members.repository";
import { MongoDBApiKeysRepository } from "@/src/infrastructure/repositories/mongodb.api-keys.repository";
import { MongodbProjectsRepository } from "@/src/infrastructure/repositories/mongodb.projects.repository";
import { MongodbComposioTriggerDeploymentsRepository } from "@/src/infrastructure/repositories/mongodb.composio-trigger-deployments.repository";
import { CreateComposioTriggerDeploymentUseCase } from "@/src/application/use-cases/composio-trigger-deployments/create-composio-trigger-deployment.use-case";
import { ListComposioTriggerDeploymentsUseCase } from "@/src/application/use-cases/composio-trigger-deployments/list-composio-trigger-deployments.use-case";
import { FetchComposioTriggerDeploymentUseCase } from "@/src/application/use-cases/composio-trigger-deployments/fetch-composio-trigger-deployment.use-case";
import { DeleteComposioTriggerDeploymentUseCase } from "@/src/application/use-cases/composio-trigger-deployments/delete-composio-trigger-deployment.use-case";
import { ListComposioTriggerTypesUseCase } from "@/src/application/use-cases/composio-trigger-deployments/list-composio-trigger-types.use-case";
import { HandleCompsioWebhookRequestUseCase } from "@/src/application/use-cases/composio/webhook/handle-composio-webhook-request.use-case";
import { MongoDBJobsRepository } from "@/src/infrastructure/repositories/mongodb.jobs.repository";
import { CreateComposioTriggerDeploymentController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/create-composio-trigger-deployment.controller";
import { DeleteComposioTriggerDeploymentController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/delete-composio-trigger-deployment.controller";
import { ListComposioTriggerDeploymentsController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/list-composio-trigger-deployments.controller";
import { FetchComposioTriggerDeploymentController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/fetch-composio-trigger-deployment.controller";
import { ListComposioTriggerTypesController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/list-composio-trigger-types.controller";
import { HandleComposioWebhookRequestController } from "@/src/interface-adapters/controllers/composio/webhook/handle-composio-webhook-request.controller";
import { JobsWorker } from "@/src/application/workers/jobs.worker";
import { JobRulesWorker } from "@/src/application/workers/job-rules.worker";
import { ListJobsUseCase } from "@/src/application/use-cases/jobs/list-jobs.use-case";
import { ListJobsController } from "@/src/interface-adapters/controllers/jobs/list-jobs.controller";
import { ListConversationsUseCase } from "@/src/application/use-cases/conversations/list-conversations.use-case";
import { ListConversationsController } from "@/src/interface-adapters/controllers/conversations/list-conversations.controller";
import { FetchJobUseCase } from "@/src/application/use-cases/jobs/fetch-job.use-case";
import { FetchJobController } from "@/src/interface-adapters/controllers/jobs/fetch-job.controller";
import { FetchConversationUseCase } from "@/src/application/use-cases/conversations/fetch-conversation.use-case";
import { FetchConversationController } from "@/src/interface-adapters/controllers/conversations/fetch-conversation.controller";
// Projects
import { CreateProjectUseCase } from "@/src/application/use-cases/projects/create-project.use-case";
import { CreateProjectController } from "@/src/interface-adapters/controllers/projects/create-project.controller";
import { DeleteComposioConnectedAccountUseCase } from "@/src/application/use-cases/projects/delete-composio-connected-account.use-case";
import { DeleteComposioConnectedAccountController } from "@/src/interface-adapters/controllers/projects/delete-composio-connected-account.controller";
import { CreateComposioManagedConnectedAccountUseCase } from "@/src/application/use-cases/projects/create-composio-managed-connected-account.use-case";
import { CreateCustomConnectedAccountUseCase } from "@/src/application/use-cases/projects/create-custom-connected-account.use-case";
import { SyncConnectedAccountUseCase } from "@/src/application/use-cases/projects/sync-connected-account.use-case";
import { ListComposioToolkitsUseCase } from "@/src/application/use-cases/projects/list-composio-toolkits.use-case";
import { GetComposioToolkitUseCase } from "@/src/application/use-cases/projects/get-composio-toolkit.use-case";
import { ListComposioToolsUseCase } from "@/src/application/use-cases/projects/list-composio-tools.use-case";
import { AddCustomMcpServerUseCase } from "@/src/application/use-cases/projects/add-custom-mcp-server.use-case";
import { RemoveCustomMcpServerUseCase } from "@/src/application/use-cases/projects/remove-custom-mcp-server.use-case";
import { CreateComposioManagedConnectedAccountController } from "@/src/interface-adapters/controllers/projects/create-composio-managed-connected-account.controller";
import { CreateCustomConnectedAccountController } from "@/src/interface-adapters/controllers/projects/create-custom-connected-account.controller";
import { SyncConnectedAccountController } from "@/src/interface-adapters/controllers/projects/sync-connected-account.controller";
import { ListComposioToolkitsController } from "@/src/interface-adapters/controllers/projects/list-composio-toolkits.controller";
import { GetComposioToolkitController } from "@/src/interface-adapters/controllers/projects/get-composio-toolkit.controller";
import { ListComposioToolsController } from "@/src/interface-adapters/controllers/projects/list-composio-tools.controller";
import { AddCustomMcpServerController } from "@/src/interface-adapters/controllers/projects/add-custom-mcp-server.controller";
import { RemoveCustomMcpServerController } from "@/src/interface-adapters/controllers/projects/remove-custom-mcp-server.controller";
// Scheduled Job Rules
import { MongoDBScheduledJobRulesRepository } from "@/src/infrastructure/repositories/mongodb.scheduled-job-rules.repository";
import { CreateScheduledJobRuleUseCase } from "@/src/application/use-cases/scheduled-job-rules/create-scheduled-job-rule.use-case";
import { FetchScheduledJobRuleUseCase } from "@/src/application/use-cases/scheduled-job-rules/fetch-scheduled-job-rule.use-case";
import { ListScheduledJobRulesUseCase } from "@/src/application/use-cases/scheduled-job-rules/list-scheduled-job-rules.use-case";
import { DeleteScheduledJobRuleUseCase } from "@/src/application/use-cases/scheduled-job-rules/delete-scheduled-job-rule.use-case";
import { UpdateScheduledJobRuleUseCase } from "@/src/application/use-cases/scheduled-job-rules/update-scheduled-job-rule.use-case";
import { CreateScheduledJobRuleController } from "@/src/interface-adapters/controllers/scheduled-job-rules/create-scheduled-job-rule.controller";
import { FetchScheduledJobRuleController } from "@/src/interface-adapters/controllers/scheduled-job-rules/fetch-scheduled-job-rule.controller";
import { ListScheduledJobRulesController } from "@/src/interface-adapters/controllers/scheduled-job-rules/list-scheduled-job-rules.controller";
import { DeleteScheduledJobRuleController } from "@/src/interface-adapters/controllers/scheduled-job-rules/delete-scheduled-job-rule.controller";
import { UpdateScheduledJobRuleController } from "@/src/interface-adapters/controllers/scheduled-job-rules/update-scheduled-job-rule.controller";
// Recurring Job Rules
import { MongoDBRecurringJobRulesRepository } from "@/src/infrastructure/repositories/mongodb.recurring-job-rules.repository";
import { CreateRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/create-recurring-job-rule.use-case";
import { FetchRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/fetch-recurring-job-rule.use-case";
import { ListRecurringJobRulesUseCase } from "@/src/application/use-cases/recurring-job-rules/list-recurring-job-rules.use-case";
import { ToggleRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/toggle-recurring-job-rule.use-case";
import { DeleteRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/delete-recurring-job-rule.use-case";
import { UpdateRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/update-recurring-job-rule.use-case";
import { CreateRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/create-recurring-job-rule.controller";
import { FetchRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/fetch-recurring-job-rule.controller";
import { ListRecurringJobRulesController } from "@/src/interface-adapters/controllers/recurring-job-rules/list-recurring-job-rules.controller";
import { ToggleRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/toggle-recurring-job-rule.controller";
import { DeleteRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/delete-recurring-job-rule.controller";
import { UpdateRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/update-recurring-job-rule.controller";
// API Keys
import { CreateApiKeyUseCase } from "@/src/application/use-cases/api-keys/create-api-key.use-case";
import { ListApiKeysUseCase } from "@/src/application/use-cases/api-keys/list-api-keys.use-case";
import { DeleteApiKeyUseCase } from "@/src/application/use-cases/api-keys/delete-api-key.use-case";
import { CreateApiKeyController } from "@/src/interface-adapters/controllers/api-keys/create-api-key.controller";
import { ListApiKeysController } from "@/src/interface-adapters/controllers/api-keys/list-api-keys.controller";
import { DeleteApiKeyController } from "@/src/interface-adapters/controllers/api-keys/delete-api-key.controller";
// Data sources
import { MongoDBDataSourcesRepository } from "@/src/infrastructure/repositories/mongodb.data-sources.repository";
import { MongoDBDataSourceDocsRepository } from "@/src/infrastructure/repositories/mongodb.data-source-docs.repository";
import { CreateDataSourceUseCase } from "@/src/application/use-cases/data-sources/create-data-source.use-case";
import { FetchDataSourceUseCase } from "@/src/application/use-cases/data-sources/fetch-data-source.use-case";
import { ListDataSourcesUseCase } from "@/src/application/use-cases/data-sources/list-data-sources.use-case";
import { UpdateDataSourceUseCase } from "@/src/application/use-cases/data-sources/update-data-source.use-case";
import { DeleteDataSourceUseCase } from "@/src/application/use-cases/data-sources/delete-data-source.use-case";
import { ToggleDataSourceUseCase } from "@/src/application/use-cases/data-sources/toggle-data-source.use-case";
import { CreateDataSourceController } from "@/src/interface-adapters/controllers/data-sources/create-data-source.controller";
import { FetchDataSourceController } from "@/src/interface-adapters/controllers/data-sources/fetch-data-source.controller";
import { ListDataSourcesController } from "@/src/interface-adapters/controllers/data-sources/list-data-sources.controller";
import { UpdateDataSourceController } from "@/src/interface-adapters/controllers/data-sources/update-data-source.controller";
import { DeleteDataSourceController } from "@/src/interface-adapters/controllers/data-sources/delete-data-source.controller";
import { ToggleDataSourceController } from "@/src/interface-adapters/controllers/data-sources/toggle-data-source.controller";
import { AddDocsToDataSourceUseCase } from "@/src/application/use-cases/data-sources/add-docs-to-data-source.use-case";
import { ListDocsInDataSourceUseCase } from "@/src/application/use-cases/data-sources/list-docs-in-data-source.use-case";
import { DeleteDocFromDataSourceUseCase } from "@/src/application/use-cases/data-sources/delete-doc-from-data-source.use-case";
import { RecrawlWebDataSourceUseCase } from "@/src/application/use-cases/data-sources/recrawl-web-data-source.use-case";
import { GetUploadUrlsForFilesUseCase } from "@/src/application/use-cases/data-sources/get-upload-urls-for-files.use-case";
import { GetDownloadUrlForFileUseCase } from "@/src/application/use-cases/data-sources/get-download-url-for-file.use-case";
import { AddDocsToDataSourceController } from "@/src/interface-adapters/controllers/data-sources/add-docs-to-data-source.controller";
import { ListDocsInDataSourceController } from "@/src/interface-adapters/controllers/data-sources/list-docs-in-data-source.controller";
import { DeleteDocFromDataSourceController } from "@/src/interface-adapters/controllers/data-sources/delete-doc-from-data-source.controller";
import { RecrawlWebDataSourceController } from "@/src/interface-adapters/controllers/data-sources/recrawl-web-data-source.controller";
import { GetUploadUrlsForFilesController } from "@/src/interface-adapters/controllers/data-sources/get-upload-urls-for-files.controller";
import { GetDownloadUrlForFileController } from "@/src/interface-adapters/controllers/data-sources/get-download-url-for-file.controller";
import { DeleteProjectController } from "@/src/interface-adapters/controllers/projects/delete-project.controller";
import { DeleteProjectUseCase } from "@/src/application/use-cases/projects/delete-project.use-case";
import { ListProjectsUseCase } from "@/src/application/use-cases/projects/list-projects.use-case";
import { ListProjectsController } from "@/src/interface-adapters/controllers/projects/list-projects.controller";
import { FetchProjectUseCase } from "@/src/application/use-cases/projects/fetch-project.use-case";
import { FetchProjectController } from "@/src/interface-adapters/controllers/projects/fetch-project.controller";
import { RotateSecretUseCase } from "@/src/application/use-cases/projects/rotate-secret.use-case";
import { RotateSecretController } from "@/src/interface-adapters/controllers/projects/rotate-secret.controller";
import { UpdateWebhookUrlUseCase } from "@/src/application/use-cases/projects/update-webhook-url.use-case";
import { UpdateWebhookUrlController } from "@/src/interface-adapters/controllers/projects/update-webhook-url.controller";
import { UpdateProjectNameUseCase } from "@/src/application/use-cases/projects/update-project-name.use-case";
import { UpdateProjectNameController } from "@/src/interface-adapters/controllers/projects/update-project-name.controller";
import { UpdateDraftWorkflowUseCase } from "@/src/application/use-cases/projects/update-draft-workflow.use-case";
import { UpdateDraftWorkflowController } from "@/src/interface-adapters/controllers/projects/update-draft-workflow.controller";
import { UpdateLiveWorkflowUseCase } from "@/src/application/use-cases/projects/update-live-workflow.use-case";
import { UpdateLiveWorkflowController } from "@/src/interface-adapters/controllers/projects/update-live-workflow.controller";
import { RevertToLiveWorkflowUseCase } from "@/src/application/use-cases/projects/revert-to-live-workflow.use-case";
import { RevertToLiveWorkflowController } from "@/src/interface-adapters/controllers/projects/revert-to-live-workflow.controller";
// copilot
import { CreateCopilotCachedTurnUseCase } from "@/src/application/use-cases/copilot/create-copilot-cached-turn.use-case";
import { CreateCopilotCachedTurnController } from "@/src/interface-adapters/controllers/copilot/create-copilot-cached-turn.controller";
import { RunCopilotCachedTurnUseCase } from "@/src/application/use-cases/copilot/run-copilot-cached-turn.use-case";
import { RunCopilotCachedTurnController } from "@/src/interface-adapters/controllers/copilot/run-copilot-cached-turn.controller";
// users
import { MongoDBUsersRepository } from "@/src/infrastructure/repositories/mongodb.users.repository";
import { coreRegistrations } from "@/di/modules/core";
import { apiKeyRegistrations } from "@/di/modules/api-keys";
import { dataSourceRegistrations } from "@/di/modules/data-sources";
import { jobRegistrations } from "@/di/modules/jobs";
import { projectRegistrations } from "@/di/modules/projects";
import { composioRegistrations } from "@/di/modules/composio";
import { conversationRegistrations } from "@/di/modules/conversations";
import { copilotRegistrations } from "@/di/modules/copilot";
import { userRegistrations } from "@/di/modules/users";
export const container = createContainer({
injectionMode: InjectionMode.PROXY,
@@ -164,192 +16,13 @@ export const container = createContainer({
});
container.register({
// workers
// ---
jobsWorker: asClass(JobsWorker).singleton(),
jobRulesWorker: asClass(JobRulesWorker).singleton(),
// services
// ---
cacheService: asClass(RedisCacheService).singleton(),
pubSubService: asClass(RedisPubSubService).singleton(),
s3UploadsStorageService: asClass(S3UploadsStorageService).singleton(),
localUploadsStorageService: asClass(LocalUploadsStorageService).singleton(),
// policies
// ---
usageQuotaPolicy: asClass(RedisUsageQuotaPolicy).singleton(),
projectActionAuthorizationPolicy: asClass(ProjectActionAuthorizationPolicy).singleton(),
// projects
// ---
projectsRepository: asClass(MongodbProjectsRepository).singleton(),
// project members
// ---
projectMembersRepository: asClass(MongoDBProjectMembersRepository).singleton(),
// api keys
// ---
apiKeysRepository: asClass(MongoDBApiKeysRepository).singleton(),
createApiKeyUseCase: asClass(CreateApiKeyUseCase).singleton(),
listApiKeysUseCase: asClass(ListApiKeysUseCase).singleton(),
deleteApiKeyUseCase: asClass(DeleteApiKeyUseCase).singleton(),
createApiKeyController: asClass(CreateApiKeyController).singleton(),
listApiKeysController: asClass(ListApiKeysController).singleton(),
deleteApiKeyController: asClass(DeleteApiKeyController).singleton(),
// data sources
// ---
dataSourcesRepository: asClass(MongoDBDataSourcesRepository).singleton(),
dataSourceDocsRepository: asClass(MongoDBDataSourceDocsRepository).singleton(),
createDataSourceUseCase: asClass(CreateDataSourceUseCase).singleton(),
fetchDataSourceUseCase: asClass(FetchDataSourceUseCase).singleton(),
listDataSourcesUseCase: asClass(ListDataSourcesUseCase).singleton(),
updateDataSourceUseCase: asClass(UpdateDataSourceUseCase).singleton(),
deleteDataSourceUseCase: asClass(DeleteDataSourceUseCase).singleton(),
toggleDataSourceUseCase: asClass(ToggleDataSourceUseCase).singleton(),
createDataSourceController: asClass(CreateDataSourceController).singleton(),
fetchDataSourceController: asClass(FetchDataSourceController).singleton(),
listDataSourcesController: asClass(ListDataSourcesController).singleton(),
updateDataSourceController: asClass(UpdateDataSourceController).singleton(),
deleteDataSourceController: asClass(DeleteDataSourceController).singleton(),
toggleDataSourceController: asClass(ToggleDataSourceController).singleton(),
addDocsToDataSourceUseCase: asClass(AddDocsToDataSourceUseCase).singleton(),
listDocsInDataSourceUseCase: asClass(ListDocsInDataSourceUseCase).singleton(),
deleteDocFromDataSourceUseCase: asClass(DeleteDocFromDataSourceUseCase).singleton(),
recrawlWebDataSourceUseCase: asClass(RecrawlWebDataSourceUseCase).singleton(),
getUploadUrlsForFilesUseCase: asClass(GetUploadUrlsForFilesUseCase).singleton(),
getDownloadUrlForFileUseCase: asClass(GetDownloadUrlForFileUseCase).singleton(),
addDocsToDataSourceController: asClass(AddDocsToDataSourceController).singleton(),
listDocsInDataSourceController: asClass(ListDocsInDataSourceController).singleton(),
deleteDocFromDataSourceController: asClass(DeleteDocFromDataSourceController).singleton(),
recrawlWebDataSourceController: asClass(RecrawlWebDataSourceController).singleton(),
getUploadUrlsForFilesController: asClass(GetUploadUrlsForFilesController).singleton(),
getDownloadUrlForFileController: asClass(GetDownloadUrlForFileController).singleton(),
// jobs
// ---
jobsRepository: asClass(MongoDBJobsRepository).singleton(),
listJobsUseCase: asClass(ListJobsUseCase).singleton(),
listJobsController: asClass(ListJobsController).singleton(),
fetchJobUseCase: asClass(FetchJobUseCase).singleton(),
fetchJobController: asClass(FetchJobController).singleton(),
// scheduled job rules
// ---
scheduledJobRulesRepository: asClass(MongoDBScheduledJobRulesRepository).singleton(),
createScheduledJobRuleUseCase: asClass(CreateScheduledJobRuleUseCase).singleton(),
fetchScheduledJobRuleUseCase: asClass(FetchScheduledJobRuleUseCase).singleton(),
listScheduledJobRulesUseCase: asClass(ListScheduledJobRulesUseCase).singleton(),
updateScheduledJobRuleUseCase: asClass(UpdateScheduledJobRuleUseCase).singleton(),
deleteScheduledJobRuleUseCase: asClass(DeleteScheduledJobRuleUseCase).singleton(),
createScheduledJobRuleController: asClass(CreateScheduledJobRuleController).singleton(),
fetchScheduledJobRuleController: asClass(FetchScheduledJobRuleController).singleton(),
listScheduledJobRulesController: asClass(ListScheduledJobRulesController).singleton(),
updateScheduledJobRuleController: asClass(UpdateScheduledJobRuleController).singleton(),
deleteScheduledJobRuleController: asClass(DeleteScheduledJobRuleController).singleton(),
// recurring job rules
// ---
recurringJobRulesRepository: asClass(MongoDBRecurringJobRulesRepository).singleton(),
createRecurringJobRuleUseCase: asClass(CreateRecurringJobRuleUseCase).singleton(),
fetchRecurringJobRuleUseCase: asClass(FetchRecurringJobRuleUseCase).singleton(),
listRecurringJobRulesUseCase: asClass(ListRecurringJobRulesUseCase).singleton(),
toggleRecurringJobRuleUseCase: asClass(ToggleRecurringJobRuleUseCase).singleton(),
updateRecurringJobRuleUseCase: asClass(UpdateRecurringJobRuleUseCase).singleton(),
deleteRecurringJobRuleUseCase: asClass(DeleteRecurringJobRuleUseCase).singleton(),
createRecurringJobRuleController: asClass(CreateRecurringJobRuleController).singleton(),
fetchRecurringJobRuleController: asClass(FetchRecurringJobRuleController).singleton(),
listRecurringJobRulesController: asClass(ListRecurringJobRulesController).singleton(),
toggleRecurringJobRuleController: asClass(ToggleRecurringJobRuleController).singleton(),
updateRecurringJobRuleController: asClass(UpdateRecurringJobRuleController).singleton(),
deleteRecurringJobRuleController: asClass(DeleteRecurringJobRuleController).singleton(),
// projects
// ---
createProjectUseCase: asClass(CreateProjectUseCase).singleton(),
createProjectController: asClass(CreateProjectController).singleton(),
fetchProjectUseCase: asClass(FetchProjectUseCase).singleton(),
fetchProjectController: asClass(FetchProjectController).singleton(),
listProjectsUseCase: asClass(ListProjectsUseCase).singleton(),
listProjectsController: asClass(ListProjectsController).singleton(),
rotateSecretUseCase: asClass(RotateSecretUseCase).singleton(),
rotateSecretController: asClass(RotateSecretController).singleton(),
updateWebhookUrlUseCase: asClass(UpdateWebhookUrlUseCase).singleton(),
updateWebhookUrlController: asClass(UpdateWebhookUrlController).singleton(),
updateProjectNameUseCase: asClass(UpdateProjectNameUseCase).singleton(),
updateProjectNameController: asClass(UpdateProjectNameController).singleton(),
updateDraftWorkflowUseCase: asClass(UpdateDraftWorkflowUseCase).singleton(),
updateDraftWorkflowController: asClass(UpdateDraftWorkflowController).singleton(),
updateLiveWorkflowUseCase: asClass(UpdateLiveWorkflowUseCase).singleton(),
updateLiveWorkflowController: asClass(UpdateLiveWorkflowController).singleton(),
revertToLiveWorkflowUseCase: asClass(RevertToLiveWorkflowUseCase).singleton(),
revertToLiveWorkflowController: asClass(RevertToLiveWorkflowController).singleton(),
deleteProjectUseCase: asClass(DeleteProjectUseCase).singleton(),
deleteProjectController: asClass(DeleteProjectController).singleton(),
deleteComposioConnectedAccountController: asClass(DeleteComposioConnectedAccountController).singleton(),
deleteComposioConnectedAccountUseCase: asClass(DeleteComposioConnectedAccountUseCase).singleton(),
createComposioManagedConnectedAccountUseCase: asClass(CreateComposioManagedConnectedAccountUseCase).singleton(),
createComposioManagedConnectedAccountController: asClass(CreateComposioManagedConnectedAccountController).singleton(),
createCustomConnectedAccountUseCase: asClass(CreateCustomConnectedAccountUseCase).singleton(),
createCustomConnectedAccountController: asClass(CreateCustomConnectedAccountController).singleton(),
syncConnectedAccountUseCase: asClass(SyncConnectedAccountUseCase).singleton(),
syncConnectedAccountController: asClass(SyncConnectedAccountController).singleton(),
listComposioToolkitsUseCase: asClass(ListComposioToolkitsUseCase).singleton(),
listComposioToolkitsController: asClass(ListComposioToolkitsController).singleton(),
getComposioToolkitUseCase: asClass(GetComposioToolkitUseCase).singleton(),
getComposioToolkitController: asClass(GetComposioToolkitController).singleton(),
listComposioToolsUseCase: asClass(ListComposioToolsUseCase).singleton(),
listComposioToolsController: asClass(ListComposioToolsController).singleton(),
addCustomMcpServerUseCase: asClass(AddCustomMcpServerUseCase).singleton(),
addCustomMcpServerController: asClass(AddCustomMcpServerController).singleton(),
removeCustomMcpServerUseCase: asClass(RemoveCustomMcpServerUseCase).singleton(),
removeCustomMcpServerController: asClass(RemoveCustomMcpServerController).singleton(),
// composio
// ---
handleCompsioWebhookRequestUseCase: asClass(HandleCompsioWebhookRequestUseCase).singleton(),
handleComposioWebhookRequestController: asClass(HandleComposioWebhookRequestController).singleton(),
// composio trigger deployments
// ---
composioTriggerDeploymentsRepository: asClass(MongodbComposioTriggerDeploymentsRepository).singleton(),
listComposioTriggerTypesUseCase: asClass(ListComposioTriggerTypesUseCase).singleton(),
createComposioTriggerDeploymentUseCase: asClass(CreateComposioTriggerDeploymentUseCase).singleton(),
listComposioTriggerDeploymentsUseCase: asClass(ListComposioTriggerDeploymentsUseCase).singleton(),
fetchComposioTriggerDeploymentUseCase: asClass(FetchComposioTriggerDeploymentUseCase).singleton(),
deleteComposioTriggerDeploymentUseCase: asClass(DeleteComposioTriggerDeploymentUseCase).singleton(),
createComposioTriggerDeploymentController: asClass(CreateComposioTriggerDeploymentController).singleton(),
deleteComposioTriggerDeploymentController: asClass(DeleteComposioTriggerDeploymentController).singleton(),
listComposioTriggerDeploymentsController: asClass(ListComposioTriggerDeploymentsController).singleton(),
fetchComposioTriggerDeploymentController: asClass(FetchComposioTriggerDeploymentController).singleton(),
listComposioTriggerTypesController: asClass(ListComposioTriggerTypesController).singleton(),
// conversations
// ---
conversationsRepository: asClass(MongoDBConversationsRepository).singleton(),
createConversationUseCase: asClass(CreateConversationUseCase).singleton(),
createCachedTurnUseCase: asClass(CreateCachedTurnUseCase).singleton(),
fetchCachedTurnUseCase: asClass(FetchCachedTurnUseCase).singleton(),
runConversationTurnUseCase: asClass(RunConversationTurnUseCase).singleton(),
listConversationsUseCase: asClass(ListConversationsUseCase).singleton(),
fetchConversationUseCase: asClass(FetchConversationUseCase).singleton(),
createPlaygroundConversationController: asClass(CreatePlaygroundConversationController).singleton(),
createCachedTurnController: asClass(CreateCachedTurnController).singleton(),
runCachedTurnController: asClass(RunCachedTurnController).singleton(),
runTurnController: asClass(RunTurnController).singleton(),
listConversationsController: asClass(ListConversationsController).singleton(),
fetchConversationController: asClass(FetchConversationController).singleton(),
// copilot
// ---
createCopilotCachedTurnUseCase: asClass(CreateCopilotCachedTurnUseCase).singleton(),
createCopilotCachedTurnController: asClass(CreateCopilotCachedTurnController).singleton(),
runCopilotCachedTurnUseCase: asClass(RunCopilotCachedTurnUseCase).singleton(),
runCopilotCachedTurnController: asClass(RunCopilotCachedTurnController).singleton(),
// users
// ---
usersRepository: asClass(MongoDBUsersRepository).singleton(),
...coreRegistrations,
...projectRegistrations,
...apiKeyRegistrations,
...dataSourceRegistrations,
...jobRegistrations,
...composioRegistrations,
...conversationRegistrations,
...copilotRegistrations,
...userRegistrations,
});
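Every module extracted below follows the same shape: export a plain object of name-to-resolver registrations built with `asClass(X).singleton()`, then merge the modules into the container by object spread, as in the `...coreRegistrations` block above. A minimal self-contained sketch of that pattern (toy `Container` and `asClass`, hypothetical class names — not the actual awilix API surface):

```typescript
// Toy stand-ins for awilix's asClass(...).singleton() and container.register,
// illustrating why repeated resolves of a singleton return the same instance.

type Resolver<T> = { resolve(): T };

class Container {
  private registrations: Record<string, Resolver<unknown>> = {};

  // merge a module's registration object into the container
  register(regs: Record<string, Resolver<unknown>>): void {
    Object.assign(this.registrations, regs);
  }

  resolve<T>(name: string): T {
    const reg = this.registrations[name];
    if (!reg) throw new Error(`Unknown registration: ${name}`);
    return reg.resolve() as T;
  }
}

// mimic asClass(X).singleton(): construct once, cache the instance
function asClass<T>(ctor: new () => T): { singleton(): Resolver<T> } {
  return {
    singleton() {
      let cached: T | undefined;
      return {
        resolve(): T {
          if (cached === undefined) cached = new ctor();
          return cached;
        },
      };
    },
  };
}

// hypothetical classes standing in for real use cases / controllers
class ListJobsUseCase {}
class ListJobsController {}

const jobRegistrations = {
  listJobsUseCase: asClass(ListJobsUseCase).singleton(),
  listJobsController: asClass(ListJobsController).singleton(),
};

const container = new Container();
container.register({ ...jobRegistrations });

const a = container.resolve<ListJobsUseCase>("listJobsUseCase");
const b = container.resolve<ListJobsUseCase>("listJobsUseCase");
console.log(a === b); // same cached instance
```

In the real code, awilix's `createContainer().register({ ...coreRegistrations, ... })` plays the role of the toy `Container` here; the per-domain registration objects keep the container file small and let each domain own its wiring.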

@@ -0,0 +1,19 @@
import { asClass } from "awilix";
import { MongoDBApiKeysRepository } from "@/src/infrastructure/repositories/mongodb.api-keys.repository";
import { CreateApiKeyUseCase } from "@/src/application/use-cases/api-keys/create-api-key.use-case";
import { ListApiKeysUseCase } from "@/src/application/use-cases/api-keys/list-api-keys.use-case";
import { DeleteApiKeyUseCase } from "@/src/application/use-cases/api-keys/delete-api-key.use-case";
import { CreateApiKeyController } from "@/src/interface-adapters/controllers/api-keys/create-api-key.controller";
import { ListApiKeysController } from "@/src/interface-adapters/controllers/api-keys/list-api-keys.controller";
import { DeleteApiKeyController } from "@/src/interface-adapters/controllers/api-keys/delete-api-key.controller";
export const apiKeyRegistrations = {
apiKeysRepository: asClass(MongoDBApiKeysRepository).singleton(),
createApiKeyUseCase: asClass(CreateApiKeyUseCase).singleton(),
listApiKeysUseCase: asClass(ListApiKeysUseCase).singleton(),
deleteApiKeyUseCase: asClass(DeleteApiKeyUseCase).singleton(),
createApiKeyController: asClass(CreateApiKeyController).singleton(),
listApiKeysController: asClass(ListApiKeysController).singleton(),
deleteApiKeyController: asClass(DeleteApiKeyController).singleton(),
};

@@ -0,0 +1,31 @@
import { asClass } from "awilix";
import { MongodbComposioTriggerDeploymentsRepository } from "@/src/infrastructure/repositories/mongodb.composio-trigger-deployments.repository";
import { CreateComposioTriggerDeploymentUseCase } from "@/src/application/use-cases/composio-trigger-deployments/create-composio-trigger-deployment.use-case";
import { ListComposioTriggerDeploymentsUseCase } from "@/src/application/use-cases/composio-trigger-deployments/list-composio-trigger-deployments.use-case";
import { FetchComposioTriggerDeploymentUseCase } from "@/src/application/use-cases/composio-trigger-deployments/fetch-composio-trigger-deployment.use-case";
import { DeleteComposioTriggerDeploymentUseCase } from "@/src/application/use-cases/composio-trigger-deployments/delete-composio-trigger-deployment.use-case";
import { ListComposioTriggerTypesUseCase } from "@/src/application/use-cases/composio-trigger-deployments/list-composio-trigger-types.use-case";
import { HandleCompsioWebhookRequestUseCase } from "@/src/application/use-cases/composio/webhook/handle-composio-webhook-request.use-case";
import { CreateComposioTriggerDeploymentController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/create-composio-trigger-deployment.controller";
import { DeleteComposioTriggerDeploymentController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/delete-composio-trigger-deployment.controller";
import { ListComposioTriggerDeploymentsController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/list-composio-trigger-deployments.controller";
import { FetchComposioTriggerDeploymentController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/fetch-composio-trigger-deployment.controller";
import { ListComposioTriggerTypesController } from "@/src/interface-adapters/controllers/composio-trigger-deployments/list-composio-trigger-types.controller";
import { HandleComposioWebhookRequestController } from "@/src/interface-adapters/controllers/composio/webhook/handle-composio-webhook-request.controller";
export const composioRegistrations = {
handleCompsioWebhookRequestUseCase: asClass(HandleCompsioWebhookRequestUseCase).singleton(),
handleComposioWebhookRequestController: asClass(HandleComposioWebhookRequestController).singleton(),
composioTriggerDeploymentsRepository: asClass(MongodbComposioTriggerDeploymentsRepository).singleton(),
listComposioTriggerTypesUseCase: asClass(ListComposioTriggerTypesUseCase).singleton(),
createComposioTriggerDeploymentUseCase: asClass(CreateComposioTriggerDeploymentUseCase).singleton(),
listComposioTriggerDeploymentsUseCase: asClass(ListComposioTriggerDeploymentsUseCase).singleton(),
fetchComposioTriggerDeploymentUseCase: asClass(FetchComposioTriggerDeploymentUseCase).singleton(),
deleteComposioTriggerDeploymentUseCase: asClass(DeleteComposioTriggerDeploymentUseCase).singleton(),
createComposioTriggerDeploymentController: asClass(CreateComposioTriggerDeploymentController).singleton(),
deleteComposioTriggerDeploymentController: asClass(DeleteComposioTriggerDeploymentController).singleton(),
listComposioTriggerDeploymentsController: asClass(ListComposioTriggerDeploymentsController).singleton(),
fetchComposioTriggerDeploymentController: asClass(FetchComposioTriggerDeploymentController).singleton(),
listComposioTriggerTypesController: asClass(ListComposioTriggerTypesController).singleton(),
};

@@ -0,0 +1,31 @@
import { asClass } from "awilix";
import { MongoDBConversationsRepository } from "@/src/infrastructure/repositories/mongodb.conversations.repository";
import { CreateConversationUseCase } from "@/src/application/use-cases/conversations/create-conversation.use-case";
import { CreateCachedTurnUseCase } from "@/src/application/use-cases/conversations/create-cached-turn.use-case";
import { FetchCachedTurnUseCase } from "@/src/application/use-cases/conversations/fetch-cached-turn.use-case";
import { RunConversationTurnUseCase } from "@/src/application/use-cases/conversations/run-conversation-turn.use-case";
import { ListConversationsUseCase } from "@/src/application/use-cases/conversations/list-conversations.use-case";
import { FetchConversationUseCase } from "@/src/application/use-cases/conversations/fetch-conversation.use-case";
import { CreatePlaygroundConversationController } from "@/src/interface-adapters/controllers/conversations/create-playground-conversation.controller";
import { CreateCachedTurnController } from "@/src/interface-adapters/controllers/conversations/create-cached-turn.controller";
import { RunCachedTurnController } from "@/src/interface-adapters/controllers/conversations/run-cached-turn.controller";
import { RunTurnController } from "@/src/interface-adapters/controllers/conversations/run-turn.controller";
import { ListConversationsController } from "@/src/interface-adapters/controllers/conversations/list-conversations.controller";
import { FetchConversationController } from "@/src/interface-adapters/controllers/conversations/fetch-conversation.controller";
export const conversationRegistrations = {
conversationsRepository: asClass(MongoDBConversationsRepository).singleton(),
createConversationUseCase: asClass(CreateConversationUseCase).singleton(),
createCachedTurnUseCase: asClass(CreateCachedTurnUseCase).singleton(),
fetchCachedTurnUseCase: asClass(FetchCachedTurnUseCase).singleton(),
runConversationTurnUseCase: asClass(RunConversationTurnUseCase).singleton(),
listConversationsUseCase: asClass(ListConversationsUseCase).singleton(),
fetchConversationUseCase: asClass(FetchConversationUseCase).singleton(),
createPlaygroundConversationController: asClass(CreatePlaygroundConversationController).singleton(),
createCachedTurnController: asClass(CreateCachedTurnController).singleton(),
runCachedTurnController: asClass(RunCachedTurnController).singleton(),
runTurnController: asClass(RunTurnController).singleton(),
listConversationsController: asClass(ListConversationsController).singleton(),
fetchConversationController: asClass(FetchConversationController).singleton(),
};

@@ -0,0 +1,13 @@
import { asClass } from "awilix";
import { CreateCopilotCachedTurnUseCase } from "@/src/application/use-cases/copilot/create-copilot-cached-turn.use-case";
import { RunCopilotCachedTurnUseCase } from "@/src/application/use-cases/copilot/run-copilot-cached-turn.use-case";
import { CreateCopilotCachedTurnController } from "@/src/interface-adapters/controllers/copilot/create-copilot-cached-turn.controller";
import { RunCopilotCachedTurnController } from "@/src/interface-adapters/controllers/copilot/run-copilot-cached-turn.controller";
export const copilotRegistrations = {
createCopilotCachedTurnUseCase: asClass(CreateCopilotCachedTurnUseCase).singleton(),
createCopilotCachedTurnController: asClass(CreateCopilotCachedTurnController).singleton(),
runCopilotCachedTurnUseCase: asClass(RunCopilotCachedTurnUseCase).singleton(),
runCopilotCachedTurnController: asClass(RunCopilotCachedTurnController).singleton(),
};

@@ -0,0 +1,21 @@
import { asClass } from "awilix";
import { RedisPubSubService } from "@/src/infrastructure/services/redis.pub-sub.service";
import { S3UploadsStorageService } from "@/src/infrastructure/services/s3.uploads-storage.service";
import { LocalUploadsStorageService } from "@/src/infrastructure/services/local.uploads-storage.service";
import { RedisCacheService } from "@/src/infrastructure/services/redis.cache.service";
import { RedisUsageQuotaPolicy } from "@/src/infrastructure/policies/redis.usage-quota.policy";
import { ProjectActionAuthorizationPolicy } from "@/src/application/policies/project-action-authorization.policy";
import { JobsWorker } from "@/src/application/workers/jobs.worker";
import { JobRulesWorker } from "@/src/application/workers/job-rules.worker";
export const coreRegistrations = {
jobsWorker: asClass(JobsWorker).singleton(),
jobRulesWorker: asClass(JobRulesWorker).singleton(),
cacheService: asClass(RedisCacheService).singleton(),
pubSubService: asClass(RedisPubSubService).singleton(),
s3UploadsStorageService: asClass(S3UploadsStorageService).singleton(),
localUploadsStorageService: asClass(LocalUploadsStorageService).singleton(),
usageQuotaPolicy: asClass(RedisUsageQuotaPolicy).singleton(),
projectActionAuthorizationPolicy: asClass(ProjectActionAuthorizationPolicy).singleton(),
};

@@ -0,0 +1,57 @@
import { asClass } from "awilix";
import { MongoDBDataSourcesRepository } from "@/src/infrastructure/repositories/mongodb.data-sources.repository";
import { MongoDBDataSourceDocsRepository } from "@/src/infrastructure/repositories/mongodb.data-source-docs.repository";
import { CreateDataSourceUseCase } from "@/src/application/use-cases/data-sources/create-data-source.use-case";
import { FetchDataSourceUseCase } from "@/src/application/use-cases/data-sources/fetch-data-source.use-case";
import { ListDataSourcesUseCase } from "@/src/application/use-cases/data-sources/list-data-sources.use-case";
import { UpdateDataSourceUseCase } from "@/src/application/use-cases/data-sources/update-data-source.use-case";
import { DeleteDataSourceUseCase } from "@/src/application/use-cases/data-sources/delete-data-source.use-case";
import { ToggleDataSourceUseCase } from "@/src/application/use-cases/data-sources/toggle-data-source.use-case";
import { AddDocsToDataSourceUseCase } from "@/src/application/use-cases/data-sources/add-docs-to-data-source.use-case";
import { ListDocsInDataSourceUseCase } from "@/src/application/use-cases/data-sources/list-docs-in-data-source.use-case";
import { DeleteDocFromDataSourceUseCase } from "@/src/application/use-cases/data-sources/delete-doc-from-data-source.use-case";
import { RecrawlWebDataSourceUseCase } from "@/src/application/use-cases/data-sources/recrawl-web-data-source.use-case";
import { GetUploadUrlsForFilesUseCase } from "@/src/application/use-cases/data-sources/get-upload-urls-for-files.use-case";
import { GetDownloadUrlForFileUseCase } from "@/src/application/use-cases/data-sources/get-download-url-for-file.use-case";
import { CreateDataSourceController } from "@/src/interface-adapters/controllers/data-sources/create-data-source.controller";
import { FetchDataSourceController } from "@/src/interface-adapters/controllers/data-sources/fetch-data-source.controller";
import { ListDataSourcesController } from "@/src/interface-adapters/controllers/data-sources/list-data-sources.controller";
import { UpdateDataSourceController } from "@/src/interface-adapters/controllers/data-sources/update-data-source.controller";
import { DeleteDataSourceController } from "@/src/interface-adapters/controllers/data-sources/delete-data-source.controller";
import { ToggleDataSourceController } from "@/src/interface-adapters/controllers/data-sources/toggle-data-source.controller";
import { AddDocsToDataSourceController } from "@/src/interface-adapters/controllers/data-sources/add-docs-to-data-source.controller";
import { ListDocsInDataSourceController } from "@/src/interface-adapters/controllers/data-sources/list-docs-in-data-source.controller";
import { DeleteDocFromDataSourceController } from "@/src/interface-adapters/controllers/data-sources/delete-doc-from-data-source.controller";
import { RecrawlWebDataSourceController } from "@/src/interface-adapters/controllers/data-sources/recrawl-web-data-source.controller";
import { GetUploadUrlsForFilesController } from "@/src/interface-adapters/controllers/data-sources/get-upload-urls-for-files.controller";
import { GetDownloadUrlForFileController } from "@/src/interface-adapters/controllers/data-sources/get-download-url-for-file.controller";
export const dataSourceRegistrations = {
dataSourcesRepository: asClass(MongoDBDataSourcesRepository).singleton(),
dataSourceDocsRepository: asClass(MongoDBDataSourceDocsRepository).singleton(),
createDataSourceUseCase: asClass(CreateDataSourceUseCase).singleton(),
fetchDataSourceUseCase: asClass(FetchDataSourceUseCase).singleton(),
listDataSourcesUseCase: asClass(ListDataSourcesUseCase).singleton(),
updateDataSourceUseCase: asClass(UpdateDataSourceUseCase).singleton(),
deleteDataSourceUseCase: asClass(DeleteDataSourceUseCase).singleton(),
toggleDataSourceUseCase: asClass(ToggleDataSourceUseCase).singleton(),
createDataSourceController: asClass(CreateDataSourceController).singleton(),
fetchDataSourceController: asClass(FetchDataSourceController).singleton(),
listDataSourcesController: asClass(ListDataSourcesController).singleton(),
updateDataSourceController: asClass(UpdateDataSourceController).singleton(),
deleteDataSourceController: asClass(DeleteDataSourceController).singleton(),
toggleDataSourceController: asClass(ToggleDataSourceController).singleton(),
addDocsToDataSourceUseCase: asClass(AddDocsToDataSourceUseCase).singleton(),
listDocsInDataSourceUseCase: asClass(ListDocsInDataSourceUseCase).singleton(),
deleteDocFromDataSourceUseCase: asClass(DeleteDocFromDataSourceUseCase).singleton(),
recrawlWebDataSourceUseCase: asClass(RecrawlWebDataSourceUseCase).singleton(),
getUploadUrlsForFilesUseCase: asClass(GetUploadUrlsForFilesUseCase).singleton(),
getDownloadUrlForFileUseCase: asClass(GetDownloadUrlForFileUseCase).singleton(),
addDocsToDataSourceController: asClass(AddDocsToDataSourceController).singleton(),
listDocsInDataSourceController: asClass(ListDocsInDataSourceController).singleton(),
deleteDocFromDataSourceController: asClass(DeleteDocFromDataSourceController).singleton(),
recrawlWebDataSourceController: asClass(RecrawlWebDataSourceController).singleton(),
getUploadUrlsForFilesController: asClass(GetUploadUrlsForFilesController).singleton(),
getDownloadUrlForFileController: asClass(GetDownloadUrlForFileController).singleton(),
};

@@ -0,0 +1,63 @@
import { asClass } from "awilix";
import { MongoDBJobsRepository } from "@/src/infrastructure/repositories/mongodb.jobs.repository";
import { MongoDBScheduledJobRulesRepository } from "@/src/infrastructure/repositories/mongodb.scheduled-job-rules.repository";
import { MongoDBRecurringJobRulesRepository } from "@/src/infrastructure/repositories/mongodb.recurring-job-rules.repository";
import { ListJobsUseCase } from "@/src/application/use-cases/jobs/list-jobs.use-case";
import { FetchJobUseCase } from "@/src/application/use-cases/jobs/fetch-job.use-case";
import { ListJobsController } from "@/src/interface-adapters/controllers/jobs/list-jobs.controller";
import { FetchJobController } from "@/src/interface-adapters/controllers/jobs/fetch-job.controller";
import { CreateScheduledJobRuleUseCase } from "@/src/application/use-cases/scheduled-job-rules/create-scheduled-job-rule.use-case";
import { FetchScheduledJobRuleUseCase } from "@/src/application/use-cases/scheduled-job-rules/fetch-scheduled-job-rule.use-case";
import { ListScheduledJobRulesUseCase } from "@/src/application/use-cases/scheduled-job-rules/list-scheduled-job-rules.use-case";
import { DeleteScheduledJobRuleUseCase } from "@/src/application/use-cases/scheduled-job-rules/delete-scheduled-job-rule.use-case";
import { UpdateScheduledJobRuleUseCase } from "@/src/application/use-cases/scheduled-job-rules/update-scheduled-job-rule.use-case";
import { CreateScheduledJobRuleController } from "@/src/interface-adapters/controllers/scheduled-job-rules/create-scheduled-job-rule.controller";
import { FetchScheduledJobRuleController } from "@/src/interface-adapters/controllers/scheduled-job-rules/fetch-scheduled-job-rule.controller";
import { ListScheduledJobRulesController } from "@/src/interface-adapters/controllers/scheduled-job-rules/list-scheduled-job-rules.controller";
import { DeleteScheduledJobRuleController } from "@/src/interface-adapters/controllers/scheduled-job-rules/delete-scheduled-job-rule.controller";
import { UpdateScheduledJobRuleController } from "@/src/interface-adapters/controllers/scheduled-job-rules/update-scheduled-job-rule.controller";
import { CreateRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/create-recurring-job-rule.use-case";
import { FetchRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/fetch-recurring-job-rule.use-case";
import { ListRecurringJobRulesUseCase } from "@/src/application/use-cases/recurring-job-rules/list-recurring-job-rules.use-case";
import { ToggleRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/toggle-recurring-job-rule.use-case";
import { DeleteRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/delete-recurring-job-rule.use-case";
import { UpdateRecurringJobRuleUseCase } from "@/src/application/use-cases/recurring-job-rules/update-recurring-job-rule.use-case";
import { CreateRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/create-recurring-job-rule.controller";
import { FetchRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/fetch-recurring-job-rule.controller";
import { ListRecurringJobRulesController } from "@/src/interface-adapters/controllers/recurring-job-rules/list-recurring-job-rules.controller";
import { ToggleRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/toggle-recurring-job-rule.controller";
import { DeleteRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/delete-recurring-job-rule.controller";
import { UpdateRecurringJobRuleController } from "@/src/interface-adapters/controllers/recurring-job-rules/update-recurring-job-rule.controller";
export const jobRegistrations = {
jobsRepository: asClass(MongoDBJobsRepository).singleton(),
listJobsUseCase: asClass(ListJobsUseCase).singleton(),
listJobsController: asClass(ListJobsController).singleton(),
fetchJobUseCase: asClass(FetchJobUseCase).singleton(),
fetchJobController: asClass(FetchJobController).singleton(),
scheduledJobRulesRepository: asClass(MongoDBScheduledJobRulesRepository).singleton(),
createScheduledJobRuleUseCase: asClass(CreateScheduledJobRuleUseCase).singleton(),
fetchScheduledJobRuleUseCase: asClass(FetchScheduledJobRuleUseCase).singleton(),
listScheduledJobRulesUseCase: asClass(ListScheduledJobRulesUseCase).singleton(),
updateScheduledJobRuleUseCase: asClass(UpdateScheduledJobRuleUseCase).singleton(),
deleteScheduledJobRuleUseCase: asClass(DeleteScheduledJobRuleUseCase).singleton(),
createScheduledJobRuleController: asClass(CreateScheduledJobRuleController).singleton(),
fetchScheduledJobRuleController: asClass(FetchScheduledJobRuleController).singleton(),
listScheduledJobRulesController: asClass(ListScheduledJobRulesController).singleton(),
updateScheduledJobRuleController: asClass(UpdateScheduledJobRuleController).singleton(),
deleteScheduledJobRuleController: asClass(DeleteScheduledJobRuleController).singleton(),
recurringJobRulesRepository: asClass(MongoDBRecurringJobRulesRepository).singleton(),
createRecurringJobRuleUseCase: asClass(CreateRecurringJobRuleUseCase).singleton(),
fetchRecurringJobRuleUseCase: asClass(FetchRecurringJobRuleUseCase).singleton(),
listRecurringJobRulesUseCase: asClass(ListRecurringJobRulesUseCase).singleton(),
toggleRecurringJobRuleUseCase: asClass(ToggleRecurringJobRuleUseCase).singleton(),
updateRecurringJobRuleUseCase: asClass(UpdateRecurringJobRuleUseCase).singleton(),
deleteRecurringJobRuleUseCase: asClass(DeleteRecurringJobRuleUseCase).singleton(),
createRecurringJobRuleController: asClass(CreateRecurringJobRuleController).singleton(),
fetchRecurringJobRuleController: asClass(FetchRecurringJobRuleController).singleton(),
listRecurringJobRulesController: asClass(ListRecurringJobRulesController).singleton(),
toggleRecurringJobRuleController: asClass(ToggleRecurringJobRuleController).singleton(),
updateRecurringJobRuleController: asClass(UpdateRecurringJobRuleController).singleton(),
deleteRecurringJobRuleController: asClass(DeleteRecurringJobRuleController).singleton(),
};

@@ -0,0 +1,85 @@
import { asClass } from "awilix";
import { MongodbProjectsRepository } from "@/src/infrastructure/repositories/mongodb.projects.repository";
import { MongoDBProjectMembersRepository } from "@/src/infrastructure/repositories/mongodb.project-members.repository";
import { CreateProjectUseCase } from "@/src/application/use-cases/projects/create-project.use-case";
import { DeleteComposioConnectedAccountUseCase } from "@/src/application/use-cases/projects/delete-composio-connected-account.use-case";
import { CreateComposioManagedConnectedAccountUseCase } from "@/src/application/use-cases/projects/create-composio-managed-connected-account.use-case";
import { CreateCustomConnectedAccountUseCase } from "@/src/application/use-cases/projects/create-custom-connected-account.use-case";
import { SyncConnectedAccountUseCase } from "@/src/application/use-cases/projects/sync-connected-account.use-case";
import { ListComposioToolkitsUseCase } from "@/src/application/use-cases/projects/list-composio-toolkits.use-case";
import { GetComposioToolkitUseCase } from "@/src/application/use-cases/projects/get-composio-toolkit.use-case";
import { ListComposioToolsUseCase } from "@/src/application/use-cases/projects/list-composio-tools.use-case";
import { AddCustomMcpServerUseCase } from "@/src/application/use-cases/projects/add-custom-mcp-server.use-case";
import { RemoveCustomMcpServerUseCase } from "@/src/application/use-cases/projects/remove-custom-mcp-server.use-case";
import { DeleteProjectUseCase } from "@/src/application/use-cases/projects/delete-project.use-case";
import { ListProjectsUseCase } from "@/src/application/use-cases/projects/list-projects.use-case";
import { FetchProjectUseCase } from "@/src/application/use-cases/projects/fetch-project.use-case";
import { RotateSecretUseCase } from "@/src/application/use-cases/projects/rotate-secret.use-case";
import { UpdateWebhookUrlUseCase } from "@/src/application/use-cases/projects/update-webhook-url.use-case";
import { UpdateProjectNameUseCase } from "@/src/application/use-cases/projects/update-project-name.use-case";
import { UpdateDraftWorkflowUseCase } from "@/src/application/use-cases/projects/update-draft-workflow.use-case";
import { UpdateLiveWorkflowUseCase } from "@/src/application/use-cases/projects/update-live-workflow.use-case";
import { RevertToLiveWorkflowUseCase } from "@/src/application/use-cases/projects/revert-to-live-workflow.use-case";
import { CreateProjectController } from "@/src/interface-adapters/controllers/projects/create-project.controller";
import { DeleteComposioConnectedAccountController } from "@/src/interface-adapters/controllers/projects/delete-composio-connected-account.controller";
import { CreateComposioManagedConnectedAccountController } from "@/src/interface-adapters/controllers/projects/create-composio-managed-connected-account.controller";
import { CreateCustomConnectedAccountController } from "@/src/interface-adapters/controllers/projects/create-custom-connected-account.controller";
import { SyncConnectedAccountController } from "@/src/interface-adapters/controllers/projects/sync-connected-account.controller";
import { ListComposioToolkitsController } from "@/src/interface-adapters/controllers/projects/list-composio-toolkits.controller";
import { GetComposioToolkitController } from "@/src/interface-adapters/controllers/projects/get-composio-toolkit.controller";
import { ListComposioToolsController } from "@/src/interface-adapters/controllers/projects/list-composio-tools.controller";
import { AddCustomMcpServerController } from "@/src/interface-adapters/controllers/projects/add-custom-mcp-server.controller";
import { RemoveCustomMcpServerController } from "@/src/interface-adapters/controllers/projects/remove-custom-mcp-server.controller";
import { DeleteProjectController } from "@/src/interface-adapters/controllers/projects/delete-project.controller";
import { ListProjectsController } from "@/src/interface-adapters/controllers/projects/list-projects.controller";
import { FetchProjectController } from "@/src/interface-adapters/controllers/projects/fetch-project.controller";
import { RotateSecretController } from "@/src/interface-adapters/controllers/projects/rotate-secret.controller";
import { UpdateWebhookUrlController } from "@/src/interface-adapters/controllers/projects/update-webhook-url.controller";
import { UpdateProjectNameController } from "@/src/interface-adapters/controllers/projects/update-project-name.controller";
import { UpdateDraftWorkflowController } from "@/src/interface-adapters/controllers/projects/update-draft-workflow.controller";
import { UpdateLiveWorkflowController } from "@/src/interface-adapters/controllers/projects/update-live-workflow.controller";
import { RevertToLiveWorkflowController } from "@/src/interface-adapters/controllers/projects/revert-to-live-workflow.controller";
export const projectRegistrations = {
projectsRepository: asClass(MongodbProjectsRepository).singleton(),
projectMembersRepository: asClass(MongoDBProjectMembersRepository).singleton(),
createProjectUseCase: asClass(CreateProjectUseCase).singleton(),
createProjectController: asClass(CreateProjectController).singleton(),
fetchProjectUseCase: asClass(FetchProjectUseCase).singleton(),
fetchProjectController: asClass(FetchProjectController).singleton(),
listProjectsUseCase: asClass(ListProjectsUseCase).singleton(),
listProjectsController: asClass(ListProjectsController).singleton(),
rotateSecretUseCase: asClass(RotateSecretUseCase).singleton(),
rotateSecretController: asClass(RotateSecretController).singleton(),
updateWebhookUrlUseCase: asClass(UpdateWebhookUrlUseCase).singleton(),
updateWebhookUrlController: asClass(UpdateWebhookUrlController).singleton(),
updateProjectNameUseCase: asClass(UpdateProjectNameUseCase).singleton(),
updateProjectNameController: asClass(UpdateProjectNameController).singleton(),
updateDraftWorkflowUseCase: asClass(UpdateDraftWorkflowUseCase).singleton(),
updateDraftWorkflowController: asClass(UpdateDraftWorkflowController).singleton(),
updateLiveWorkflowUseCase: asClass(UpdateLiveWorkflowUseCase).singleton(),
updateLiveWorkflowController: asClass(UpdateLiveWorkflowController).singleton(),
revertToLiveWorkflowUseCase: asClass(RevertToLiveWorkflowUseCase).singleton(),
revertToLiveWorkflowController: asClass(RevertToLiveWorkflowController).singleton(),
deleteProjectUseCase: asClass(DeleteProjectUseCase).singleton(),
deleteProjectController: asClass(DeleteProjectController).singleton(),
deleteComposioConnectedAccountController: asClass(DeleteComposioConnectedAccountController).singleton(),
deleteComposioConnectedAccountUseCase: asClass(DeleteComposioConnectedAccountUseCase).singleton(),
createComposioManagedConnectedAccountUseCase: asClass(CreateComposioManagedConnectedAccountUseCase).singleton(),
createComposioManagedConnectedAccountController: asClass(CreateComposioManagedConnectedAccountController).singleton(),
createCustomConnectedAccountUseCase: asClass(CreateCustomConnectedAccountUseCase).singleton(),
createCustomConnectedAccountController: asClass(CreateCustomConnectedAccountController).singleton(),
syncConnectedAccountUseCase: asClass(SyncConnectedAccountUseCase).singleton(),
syncConnectedAccountController: asClass(SyncConnectedAccountController).singleton(),
listComposioToolkitsUseCase: asClass(ListComposioToolkitsUseCase).singleton(),
listComposioToolkitsController: asClass(ListComposioToolkitsController).singleton(),
getComposioToolkitUseCase: asClass(GetComposioToolkitUseCase).singleton(),
getComposioToolkitController: asClass(GetComposioToolkitController).singleton(),
listComposioToolsUseCase: asClass(ListComposioToolsUseCase).singleton(),
listComposioToolsController: asClass(ListComposioToolsController).singleton(),
addCustomMcpServerUseCase: asClass(AddCustomMcpServerUseCase).singleton(),
addCustomMcpServerController: asClass(AddCustomMcpServerController).singleton(),
removeCustomMcpServerUseCase: asClass(RemoveCustomMcpServerUseCase).singleton(),
removeCustomMcpServerController: asClass(RemoveCustomMcpServerController).singleton(),
};
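Every entry above follows the same shape: `asClass(X).singleton()` registers a class so the container constructs it lazily and then hands the same instance to every consumer. The toy container below only illustrates that contract (the app itself uses awilix, and `MongodbProjectsRepositoryStub` is a stand-in, not the real repository):

```typescript
// A stand-in for a registered class such as MongodbProjectsRepository.
class MongodbProjectsRepositoryStub {
  readonly createdAt = Date.now();
}

type Factory<T> = () => T;

// Wrap a factory so it constructs at most once — the `.singleton()` lifetime.
function singleton<T>(factory: Factory<T>): Factory<T> {
  let instance: T | undefined;
  return () => (instance ??= factory());
}

const registrations = {
  projectsRepository: singleton(() => new MongodbProjectsRepositoryStub()),
};

// Repeated resolution yields the same instance, as `.singleton()` guarantees.
const a = registrations.projectsRepository();
const b = registrations.projectsRepository();
```

With transient lifetimes each resolution would construct a fresh instance instead; singletons are the right default here because repositories, use cases, and controllers are stateless service objects.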

View file

@ -0,0 +1,7 @@
import { asClass } from "awilix";
import { MongoDBUsersRepository } from "@/src/infrastructure/repositories/mongodb.users.repository";
export const userRegistrations = {
usersRepository: asClass(MongoDBUsersRepository).singleton(),
};

View file

@ -8,6 +8,9 @@
"build": "next build",
"start": "next start",
"lint": "next lint",
"typecheck": "tsc --noEmit",
"test": "node --import tsx --test --test-force-exit test/*.test.ts",
"verify": "npm run lint && npm run typecheck && npm test",
"setupQdrant": "tsx app/scripts/setup_qdrant.ts",
"deleteQdrant": "tsx app/scripts/delete_qdrant.ts",
"rag-worker": "tsx app/scripts/rag-worker.ts",

View file

@ -4,7 +4,6 @@ import { IProjectsRepository } from "../../repositories/projects.repository.inte
import { IUsageQuotaPolicy } from "../../policies/usage-quota.policy.interface";
import { BadRequestError, BillingError } from "@/src/entities/errors/common";
import { IProjectMembersRepository } from "../../repositories/project-members.repository.interface";
import { authorize, getCustomerForUserId } from "@/app/lib/billing";
import { USE_BILLING } from "@/app/lib/feature_flags";
import { Project } from "@/src/entities/models/project";
import { Workflow } from "@/app/lib/types/workflow_types";
@ -58,6 +57,8 @@ export class CreateProjectUseCase implements ICreateProjectUseCase {
// Check billing auth
if (USE_BILLING) {
const { authorize, getCustomerForUserId } = await import("@/app/lib/billing");
// get billing customer id for project
const customer = await getCustomerForUserId(request.userId);
if (!customer) {

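The hunk above replaces a static import of the billing module with a dynamic `await import(...)` guarded by the `USE_BILLING` flag, so deployments with billing disabled never load that code. A minimal sketch of the same pattern, with `node:crypto` standing in for the billing module and a hard-coded flag standing in for the real feature-flag module:

```typescript
const USE_BILLING = true; // assumption: normally read from a feature-flag module

async function getCustomerIdForUser(userId: string): Promise<string | null> {
  if (!USE_BILLING) return null;
  // Deferred until first use, mirroring `await import("@/app/lib/billing")`.
  const { createHash } = await import("node:crypto");
  return createHash("sha256").update(userId).digest("hex");
}
```

Besides skipping the load entirely when the flag is off, keeping the import inside the guard lets bundlers split the gated code into a separate chunk.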
View file

@ -0,0 +1,107 @@
import test from "node:test";
import assert from "node:assert/strict";
import { handlePostChat } from "../app/api/v1/[projectId]/chat/route";
test("returns 400 for invalid request bodies", async () => {
const response = await handlePostChat(
new Request("http://localhost/api/v1/proj/chat", {
method: "POST",
headers: { "content-type": "application/json" },
body: JSON.stringify({ invalid: true }),
}),
{ params: Promise.resolve({ projectId: "proj" }) },
{
createLogger: () => ({ log() {} }),
resolveRunTurnController: () => ({
execute: async () => {
throw new Error("should not be called");
},
} as any),
},
);
assert.equal(response.status, 400);
assert.deepEqual(await response.json(), { error: "Invalid request" });
});
test("returns json for non-streaming responses", async () => {
const calls: Array<Record<string, unknown>> = [];
const response = await handlePostChat(
new Request("http://localhost/api/v1/proj/chat", {
method: "POST",
headers: {
"content-type": "application/json",
Authorization: "Bearer test-key",
},
body: JSON.stringify({
messages: [{ role: "user", content: "hello" }],
stream: false,
}),
}),
{ params: Promise.resolve({ projectId: "proj-1" }) },
{
createLogger: () => ({ log() {} }),
resolveRunTurnController: () => ({
execute: async (input: any) => {
calls.push(input);
return {
conversationId: "conv-1",
turn: { output: [{ role: "assistant", content: "hi" }] },
};
},
} as any),
},
);
assert.equal(response.status, 200);
assert.deepEqual(await response.json(), {
conversationId: "conv-1",
turn: { output: [{ role: "assistant", content: "hi" }] },
});
assert.deepEqual(calls, [{
caller: "api",
apiKey: "test-key",
projectId: "proj-1",
input: {
messages: [{ role: "user", content: "hello" }],
mockTools: undefined,
},
conversationId: undefined,
stream: false,
}]);
});
test("returns SSE for streaming responses", async () => {
async function* makeStream() {
yield { type: "text-delta", delta: "hello" };
yield { type: "done" };
}
const response = await handlePostChat(
new Request("http://localhost/api/v1/proj/chat", {
method: "POST",
headers: { "content-type": "application/json" },
body: JSON.stringify({
messages: [{ role: "user", content: "stream please" }],
stream: true,
}),
}),
{ params: Promise.resolve({ projectId: "proj-stream" }) },
{
createLogger: () => ({ log() {} }),
resolveRunTurnController: () => ({
execute: async () => ({
conversationId: "conv-stream",
stream: makeStream(),
}),
} as any),
},
);
assert.equal(response.status, 200);
assert.equal(response.headers.get("Content-Type"), "text/event-stream");
const text = await response.text();
assert.match(text, /event: message/);
assert.match(text, /"delta":"hello"/);
});

View file

@ -0,0 +1,364 @@
import test from "node:test";
import assert from "node:assert/strict";
import { BadRequestError, NotFoundError } from "../src/entities/errors/common";
import { RunTurnController } from "../src/interface-adapters/controllers/conversations/run-turn.controller";
import { CreateCachedTurnUseCase } from "../src/application/use-cases/conversations/create-cached-turn.use-case";
import { CreateProjectController } from "../src/interface-adapters/controllers/projects/create-project.controller";
import { CreateProjectUseCase } from "../src/application/use-cases/projects/create-project.use-case";
const isoNow = "2026-01-01T00:00:00.000Z";
function createUserMessage(content: string) {
return { role: "user" as const, content };
}
function createTurn() {
return {
id: "turn-1",
reason: { type: "chat" as const },
input: { messages: [createUserMessage("hello")], mockTools: undefined },
output: [{ role: "assistant" as const, content: "hi", agentName: null, responseType: "external" as const }],
createdAt: isoNow,
};
}
test("RunTurnController creates a conversation and returns the completed turn", async () => {
const createdRequests: unknown[] = [];
const runRequests: unknown[] = [];
const turn = createTurn();
const controller = new RunTurnController({
createConversationUseCase: {
execute: async (input: unknown) => {
createdRequests.push(input);
return {
id: "conv-1",
projectId: "proj-1",
workflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
reason: { type: "chat" as const },
isLiveWorkflow: false,
createdAt: isoNow,
};
},
},
runConversationTurnUseCase: {
execute: async function* (input: unknown) {
runRequests.push(input);
yield { type: "done" as const, conversationId: "conv-1", turn };
},
},
});
const result = await controller.execute({
caller: "user",
userId: "user-1",
projectId: "proj-1",
input: { messages: [createUserMessage("hello")] },
stream: false,
});
assert.deepEqual(createdRequests, [{
caller: "user",
userId: "user-1",
apiKey: undefined,
projectId: "proj-1",
reason: { type: "chat" },
}]);
assert.deepEqual(runRequests, [{
caller: "user",
userId: "user-1",
apiKey: undefined,
conversationId: "conv-1",
reason: { type: "chat" },
input: { messages: [createUserMessage("hello")] },
}]);
assert.deepEqual(result, {
conversationId: "conv-1",
turn,
});
});
test("RunTurnController returns a stream without creating a conversation when one is supplied", async () => {
async function* makeStream() {
yield { type: "message" as const, data: createUserMessage("hello") };
}
const controller = new RunTurnController({
createConversationUseCase: {
execute: async () => {
throw new Error("should not create a conversation");
},
},
runConversationTurnUseCase: {
execute: () => makeStream(),
},
});
const result = await controller.execute({
caller: "api",
apiKey: "key-1",
projectId: "proj-1",
conversationId: "conv-existing",
input: { messages: [createUserMessage("hello")] },
stream: true,
});
assert.equal(result.conversationId, "conv-existing");
assert.ok("stream" in result);
});
test("CreateCachedTurnUseCase rejects missing conversations", async () => {
const useCase = new CreateCachedTurnUseCase({
cacheService: {
get: async () => null,
set: async () => {},
delete: async () => false,
},
conversationsRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => null,
list: async () => ({ items: [], nextCursor: null }),
addTurn: async () => { throw new Error("unused"); },
deleteByProjectId: async () => {},
},
usageQuotaPolicy: {
assertAndConsumeProjectAction: async () => {},
assertAndConsumeRunJobAction: async () => {},
},
projectActionAuthorizationPolicy: {
authorize: async () => {},
},
});
await assert.rejects(
() => useCase.execute({
caller: "user",
userId: "user-1",
conversationId: "missing",
input: { messages: [createUserMessage("hello")] },
}),
NotFoundError,
);
});
test("CreateCachedTurnUseCase authorizes, consumes quota, and stores the payload", async () => {
const calls: Record<string, unknown>[] = [];
const useCase = new CreateCachedTurnUseCase({
cacheService: {
get: async () => null,
set: async (key, value, ttl) => {
calls.push({ key, value, ttl });
},
delete: async () => false,
},
conversationsRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => ({
id: "conv-1",
projectId: "proj-1",
workflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
reason: { type: "chat" as const },
isLiveWorkflow: false,
createdAt: isoNow,
}),
list: async () => ({ items: [], nextCursor: null }),
addTurn: async () => { throw new Error("unused"); },
deleteByProjectId: async () => {},
},
usageQuotaPolicy: {
assertAndConsumeProjectAction: async (projectId) => {
calls.push({ quotaProjectId: projectId });
},
assertAndConsumeRunJobAction: async () => {},
},
projectActionAuthorizationPolicy: {
authorize: async (input) => {
calls.push({ authorization: input });
},
},
});
const result = await useCase.execute({
caller: "api",
apiKey: "api-key",
conversationId: "conv-1",
input: { messages: [createUserMessage("hello")], mockTools: { calc: "42" } },
});
assert.equal(typeof result.key, "string");
assert.ok(result.key.length > 0);
assert.deepEqual(calls[0], {
authorization: {
caller: "api",
userId: undefined,
apiKey: "api-key",
projectId: "proj-1",
},
});
assert.deepEqual(calls[1], { quotaProjectId: "proj-1" });
assert.deepEqual(calls[2], {
key: `turn-${result.key}`,
value: JSON.stringify({
conversationId: "conv-1",
input: { messages: [createUserMessage("hello")], mockTools: { calc: "42" } },
}),
ttl: 600,
});
});
test("CreateProjectController validates input before delegating", async () => {
const calls: unknown[] = [];
const controller = new CreateProjectController({
createProjectUseCase: {
execute: async (input: unknown) => {
calls.push(input);
return {
id: "550e8400-e29b-41d4-a716-446655440000",
name: "Assistant 1",
createdAt: isoNow,
createdByUserId: "user-1",
secret: "secret",
draftWorkflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
liveWorkflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
};
},
},
});
await assert.rejects(
() => controller.execute({ userId: "user-1" } as any),
BadRequestError,
);
const result = await controller.execute({
userId: "user-1",
data: {
mode: { template: "default" },
},
});
assert.equal(result.name, "Assistant 1");
assert.deepEqual(calls, [{
userId: "user-1",
data: {
mode: { template: "default" },
},
}]);
});
test("CreateProjectUseCase builds a default template project and membership", async () => {
const createCalls: unknown[] = [];
const memberCalls: unknown[] = [];
const quotaCalls: string[] = [];
const useCase = new CreateProjectUseCase({
projectsRepository: {
countCreatedProjects: async () => 0,
create: async (data: any) => {
createCalls.push(data);
return {
id: "550e8400-e29b-41d4-a716-446655440000",
name: data.name,
createdAt: isoNow,
createdByUserId: data.createdByUserId,
secret: data.secret,
draftWorkflow: { ...data.workflow, lastUpdatedAt: isoNow },
liveWorkflow: { ...data.workflow, lastUpdatedAt: isoNow },
};
},
fetch: async () => null,
listProjects: async () => ({ items: [], nextCursor: null }),
addComposioConnectedAccount: async () => { throw new Error("unused"); },
deleteComposioConnectedAccount: async () => false,
addCustomMcpServer: async () => { throw new Error("unused"); },
deleteCustomMcpServer: async () => false,
updateSecret: async () => { throw new Error("unused"); },
updateWebhookUrl: async () => { throw new Error("unused"); },
updateName: async () => { throw new Error("unused"); },
updateDraftWorkflow: async () => { throw new Error("unused"); },
updateLiveWorkflow: async () => { throw new Error("unused"); },
delete: async () => false,
},
projectMembersRepository: {
create: async (data) => {
memberCalls.push(data);
return {
id: "member-1",
userId: data.userId,
projectId: data.projectId,
createdAt: isoNow,
lastUpdatedAt: isoNow,
};
},
findByUserId: async () => ({ items: [], nextCursor: null }),
deleteByProjectId: async () => {},
exists: async () => true,
},
usageQuotaPolicy: {
assertAndConsumeProjectAction: async (projectId) => {
quotaCalls.push(projectId);
},
assertAndConsumeRunJobAction: async () => {},
},
});
const result = await useCase.execute({
userId: "user-1",
data: {
mode: { template: "default" },
},
});
assert.equal(result.name, "Assistant 1");
assert.equal(createCalls.length, 1);
assert.deepEqual(memberCalls, [{
projectId: "550e8400-e29b-41d4-a716-446655440000",
userId: "user-1",
}]);
assert.deepEqual(quotaCalls, ["550e8400-e29b-41d4-a716-446655440000"]);
assert.equal((createCalls[0] as any).name, "Assistant 1");
assert.equal((createCalls[0] as any).createdByUserId, "user-1");
assert.equal(typeof (createCalls[0] as any).secret, "string");
assert.equal((createCalls[0] as any).workflow.startAgent, "");
});
test("CreateProjectUseCase rejects invalid workflow JSON", async () => {
const useCase = new CreateProjectUseCase({
projectsRepository: {
countCreatedProjects: async () => 0,
create: async () => { throw new Error("should not create"); },
fetch: async () => null,
listProjects: async () => ({ items: [], nextCursor: null }),
addComposioConnectedAccount: async () => { throw new Error("unused"); },
deleteComposioConnectedAccount: async () => false,
addCustomMcpServer: async () => { throw new Error("unused"); },
deleteCustomMcpServer: async () => false,
updateSecret: async () => { throw new Error("unused"); },
updateWebhookUrl: async () => { throw new Error("unused"); },
updateName: async () => { throw new Error("unused"); },
updateDraftWorkflow: async () => { throw new Error("unused"); },
updateLiveWorkflow: async () => { throw new Error("unused"); },
delete: async () => false,
},
projectMembersRepository: {
create: async () => { throw new Error("unused"); },
findByUserId: async () => ({ items: [], nextCursor: null }),
deleteByProjectId: async () => {},
exists: async () => true,
},
usageQuotaPolicy: {
assertAndConsumeProjectAction: async () => {},
assertAndConsumeRunJobAction: async () => {},
},
});
await assert.rejects(
() => useCase.execute({
userId: "user-1",
data: {
mode: { workflowJson: "{" },
},
}),
BadRequestError,
);
});

View file

@ -0,0 +1,398 @@
import test from "node:test";
import assert from "node:assert/strict";
import { QuotaExceededError } from "../src/entities/errors/common";
import { JobsWorker } from "../src/application/workers/jobs.worker";
import { JobRulesWorker } from "../src/application/workers/job-rules.worker";
const isoNow = "2026-01-01T00:00:00.000Z";
function userMessage(content: string) {
return { role: "user" as const, content };
}
function createJob(id: string) {
return {
id,
reason: { type: "scheduled_job_rule" as const, ruleId: "rule-1" },
projectId: "proj-1",
input: { messages: [userMessage("hello")] },
workerId: null,
lastWorkerId: null,
status: "pending" as const,
createdAt: isoNow,
};
}
async function waitFor(condition: () => boolean, timeoutMs: number = 100): Promise<void> {
const start = Date.now();
while (!condition()) {
if (Date.now() - start > timeoutMs) {
throw new Error("Timed out waiting for async worker state");
}
await new Promise((resolve) => setTimeout(resolve, 5));
}
}
test("JobsWorker processes subscription-delivered jobs end to end", async () => {
const updates: Array<{ id: string; data: unknown }> = [];
const released: string[] = [];
const conversations: unknown[] = [];
const runInputs: unknown[] = [];
let subscriptionHandler: ((message: string) => void) | null = null;
const worker = new JobsWorker({
jobsRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => null,
poll: async () => null,
lock: async (id: string) => createJob(id),
update: async (id: string, data: any) => {
updates.push({ id, data });
return { ...createJob(id), ...data };
},
release: async (id: string) => {
released.push(id);
},
list: async () => ({ items: [], nextCursor: null }),
deleteByProjectId: async () => {},
},
projectsRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => ({
id: "proj-1",
name: "Project",
createdAt: isoNow,
createdByUserId: "user-1",
secret: "secret",
draftWorkflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
liveWorkflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
}),
countCreatedProjects: async () => 0,
listProjects: async () => ({ items: [], nextCursor: null }),
addComposioConnectedAccount: async () => { throw new Error("unused"); },
deleteComposioConnectedAccount: async () => false,
addCustomMcpServer: async () => { throw new Error("unused"); },
deleteCustomMcpServer: async () => false,
updateSecret: async () => { throw new Error("unused"); },
updateWebhookUrl: async () => { throw new Error("unused"); },
updateName: async () => { throw new Error("unused"); },
updateDraftWorkflow: async () => { throw new Error("unused"); },
updateLiveWorkflow: async () => { throw new Error("unused"); },
delete: async () => false,
},
createConversationUseCase: {
execute: async (input: unknown) => {
conversations.push(input);
return {
id: "conv-1",
projectId: "proj-1",
workflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
reason: { type: "job", jobId: "job-1" },
isLiveWorkflow: true,
createdAt: isoNow,
};
},
},
runConversationTurnUseCase: {
execute: async function* (input: unknown) {
runInputs.push(input);
yield {
type: "done" as const,
conversationId: "conv-1",
turn: {
id: "turn-1",
reason: { type: "job" as const, jobId: "job-1" },
input: { messages: [userMessage("hello")] },
output: [{ role: "assistant" as const, content: "done", agentName: null, responseType: "external" as const }],
createdAt: isoNow,
},
};
},
},
pubSubService: {
publish: async () => {},
subscribe: async (_channel: string, handler: (message: string) => void) => {
subscriptionHandler = handler;
return { unsubscribe: async () => {} };
},
},
usageQuotaPolicy: {
assertAndConsumeProjectAction: async () => {},
assertAndConsumeRunJobAction: async () => {},
},
});
await worker.run();
if (!subscriptionHandler) {
throw new Error("subscription handler was not registered");
}
const handler = subscriptionHandler as (message: string) => void;
handler("job-1");
await waitFor(() => updates.length === 1 && runInputs.length === 1 && released.length === 1);
await worker.stop();
assert.deepEqual(conversations, [{
caller: "job_worker",
projectId: "proj-1",
reason: { type: "job", jobId: "job-1" },
isLiveWorkflow: true,
}]);
assert.equal(runInputs.length, 1);
assert.equal((runInputs[0] as any).caller, "job_worker");
assert.equal((runInputs[0] as any).conversationId, "conv-1");
assert.deepEqual((runInputs[0] as any).reason, { type: "job", jobId: "job-1" });
assert.deepEqual((runInputs[0] as any).input, { messages: [userMessage("hello")] });
assert.deepEqual(updates, [{
id: "job-1",
data: {
status: "completed",
output: {
conversationId: "conv-1",
turnId: "turn-1",
},
},
}]);
assert.deepEqual(released, ["job-1"]);
});
test("JobsWorker marks quota-exceeded jobs as failed with a user-facing error", async () => {
const updates: Array<{ id: string; data: unknown }> = [];
const released: string[] = [];
const worker = new JobsWorker({
jobsRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => null,
poll: async () => null,
lock: async (id: string) => createJob(id),
update: async (id: string, data: any) => {
updates.push({ id, data });
return { ...createJob(id), ...data };
},
release: async (id: string) => {
released.push(id);
},
list: async () => ({ items: [], nextCursor: null }),
deleteByProjectId: async () => {},
},
projectsRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => ({
id: "proj-1",
name: "Project",
createdAt: isoNow,
createdByUserId: "user-1",
secret: "secret",
draftWorkflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
liveWorkflow: { agents: [], prompts: [], tools: [], pipelines: [], startAgent: "", lastUpdatedAt: isoNow },
}),
countCreatedProjects: async () => 0,
listProjects: async () => ({ items: [], nextCursor: null }),
addComposioConnectedAccount: async () => { throw new Error("unused"); },
deleteComposioConnectedAccount: async () => false,
addCustomMcpServer: async () => { throw new Error("unused"); },
deleteCustomMcpServer: async () => false,
updateSecret: async () => { throw new Error("unused"); },
updateWebhookUrl: async () => { throw new Error("unused"); },
updateName: async () => { throw new Error("unused"); },
updateDraftWorkflow: async () => { throw new Error("unused"); },
updateLiveWorkflow: async () => { throw new Error("unused"); },
delete: async () => false,
},
createConversationUseCase: {
execute: async () => { throw new Error("should not create conversation"); },
},
runConversationTurnUseCase: {
execute: async function* () {
yield* [] as Array<{ type: "error"; error: string }>;
throw new Error("should not run");
},
},
pubSubService: {
publish: async () => {},
subscribe: async () => ({ unsubscribe: async () => {} }),
},
usageQuotaPolicy: {
assertAndConsumeProjectAction: async () => {},
assertAndConsumeRunJobAction: async () => {
throw new QuotaExceededError("No credits left");
},
},
});
await (worker as any).processJob(createJob("job-2"));
assert.deepEqual(updates, [{
id: "job-2",
data: {
status: "failed",
output: {
error: "No credits left",
},
},
}]);
assert.deepEqual(released, ["job-2"]);
});
test("JobRulesWorker creates jobs from scheduled and recurring rules and publishes them", async () => {
const createdJobs: unknown[] = [];
const published: Array<{ channel: string; message: string }> = [];
const scheduledUpdates: Array<{ id: string; data: unknown }> = [];
const scheduledReleases: string[] = [];
const recurringReleases: string[] = [];
const worker = new JobRulesWorker({
scheduledJobRulesRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => null,
poll: async () => null,
update: async (id: string, data: unknown) => {
scheduledUpdates.push({ id, data });
return {
id,
projectId: "proj-1",
input: { messages: [userMessage("hello")] },
nextRunAt: isoNow,
workerId: null,
lastWorkerId: null,
status: "triggered" as const,
output: (data as any).output,
createdAt: isoNow,
};
},
updateRule: async () => { throw new Error("unused"); },
release: async (id: string) => {
scheduledReleases.push(id);
return {
id,
projectId: "proj-1",
input: { messages: [userMessage("hello")] },
nextRunAt: isoNow,
workerId: null,
lastWorkerId: null,
status: "triggered" as const,
createdAt: isoNow,
};
},
list: async () => ({ items: [], nextCursor: null }),
delete: async () => false,
deleteByProjectId: async () => {},
},
recurringJobRulesRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => null,
poll: async () => null,
release: async (id: string) => {
recurringReleases.push(id);
return {
id,
projectId: "proj-1",
input: { messages: [userMessage("hello")] },
cron: "* * * * *",
nextRunAt: isoNow,
workerId: null,
lastWorkerId: null,
disabled: false,
createdAt: isoNow,
};
},
list: async () => ({ items: [], nextCursor: null }),
toggle: async () => { throw new Error("unused"); },
update: async () => { throw new Error("unused"); },
delete: async () => false,
deleteByProjectId: async () => {},
},
jobsRepository: {
create: async (data: any) => {
createdJobs.push(data);
return {
id: `job-${createdJobs.length}`,
...data,
workerId: null,
lastWorkerId: null,
status: "pending" as const,
createdAt: isoNow,
};
},
fetch: async () => null,
poll: async () => null,
lock: async () => { throw new Error("unused"); },
update: async () => { throw new Error("unused"); },
release: async () => {},
list: async () => ({ items: [], nextCursor: null }),
deleteByProjectId: async () => {},
},
projectsRepository: {
create: async () => { throw new Error("unused"); },
fetch: async () => null,
countCreatedProjects: async () => 0,
listProjects: async () => ({ items: [], nextCursor: null }),
addComposioConnectedAccount: async () => { throw new Error("unused"); },
deleteComposioConnectedAccount: async () => false,
addCustomMcpServer: async () => { throw new Error("unused"); },
deleteCustomMcpServer: async () => false,
updateSecret: async () => { throw new Error("unused"); },
updateWebhookUrl: async () => { throw new Error("unused"); },
updateName: async () => { throw new Error("unused"); },
updateDraftWorkflow: async () => { throw new Error("unused"); },
updateLiveWorkflow: async () => { throw new Error("unused"); },
delete: async () => false,
},
pubSubService: {
publish: async (channel: string, message: string) => {
published.push({ channel, message });
},
subscribe: async () => ({ unsubscribe: async () => {} }),
},
});
await (worker as any).processScheduledRule({
id: "scheduled-1",
projectId: "proj-1",
input: { messages: [userMessage("scheduled")] },
nextRunAt: isoNow,
workerId: null,
lastWorkerId: null,
status: "pending",
createdAt: isoNow,
});
await (worker as any).processRecurringRule({
id: "recurring-1",
projectId: "proj-1",
input: { messages: [userMessage("recurring")] },
cron: "* * * * *",
nextRunAt: isoNow,
workerId: null,
lastWorkerId: null,
disabled: false,
createdAt: isoNow,
});
assert.deepEqual(createdJobs, [
{
reason: { type: "scheduled_job_rule", ruleId: "scheduled-1" },
projectId: "proj-1",
input: { messages: [userMessage("scheduled")] },
},
{
reason: { type: "recurring_job_rule", ruleId: "recurring-1" },
projectId: "proj-1",
input: { messages: [userMessage("recurring")] },
},
]);
assert.deepEqual(published, [
{ channel: "new_jobs", message: "job-1" },
{ channel: "new_jobs", message: "job-2" },
]);
assert.deepEqual(scheduledUpdates, [{
id: "scheduled-1",
data: {
output: { jobId: "job-1" },
status: "triggered",
},
}]);
assert.deepEqual(scheduledReleases, ["scheduled-1"]);
assert.deepEqual(recurringReleases, ["recurring-1"]);
});

View file

@ -1,36 +1,38 @@
This is a [Next.js](https://nextjs.org) project bootstrapped with [`create-next-app`](https://nextjs.org/docs/app/api-reference/cli/create-next-app).
# RowboatX Frontend
## Getting Started
`apps/rowboatx` is the newer frontend for the local Rowboat runtime. It is a Next.js UI that renders chat, artifacts, tools, and resource views on top of a runtime provided by `apps/cli` or another host shell.
First, run the development server:
## What Lives Here
- Main chat/dashboard page in `app/page.tsx`
- Shared UI primitives and AI-oriented components under `components/`
- Static export configuration in `next.config.ts`
## Runtime Expectations
This frontend is not self-contained. It expects the following to exist at runtime:
- `window.config.apiBase` for direct backend requests
- `/api/stream` for SSE run events
- `/api/rowboat/*` endpoints for local resource browsing and editing
In practice, this means the UI is meant to be served by a shell or proxy that also provides the local runtime APIs.
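The `/api/stream` endpoint delivers run events as server-sent events: frames separated by a blank line, each carrying `event:` and `data:` fields. A minimal parser sketch for that wire format follows; the event name (`message`) and JSON payload shape are assumptions based on the chat-route tests in this commit, not a documented contract:

```typescript
type SseFrame = { event: string; data: string };

function parseSseFrames(raw: string): SseFrame[] {
  const frames: SseFrame[] = [];
  // Frames are separated by a blank line.
  for (const chunk of raw.split("\n\n")) {
    let event = "message"; // SSE default when no `event:` field is present
    const data: string[] = [];
    for (const line of chunk.split("\n")) {
      if (line.startsWith("event: ")) event = line.slice(7);
      else if (line.startsWith("data: ")) data.push(line.slice(6));
    }
    // Per the SSE spec, a frame with no data dispatches nothing.
    if (data.length > 0) frames.push({ event, data: data.join("\n") });
  }
  return frames;
}

const sample =
  'event: message\ndata: {"type":"text-delta","delta":"hello"}\n\n' +
  'event: message\ndata: {"type":"done"}\n\n';
const frames = parseSseFrames(sample); // two frames
```

A production consumer would buffer partial chunks from a `fetch` body stream before splitting on frame boundaries; this sketch assumes the full text is already available.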
## Local Development
```bash
npm install
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
Build the static export:
You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.
```bash
npm run build
```
This project uses [`next/font`](https://nextjs.org/docs/app/building-your-application/optimizing/fonts) to automatically optimize and load [Geist](https://vercel.com/font), a new font family for Vercel.
## Notes For Contributors
## Learn More
To learn more about Next.js, take a look at the following resources:
- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.
You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js) - your feedback and contributions are welcome!
## Deploy on Vercel
The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.
Check out our [Next.js deployment documentation](https://nextjs.org/docs/app/building-your-application/deploying) for more details.
- Changes here should preserve the assumption that the backend lives outside this app.
- If you add a new runtime endpoint, document the expected contract in the host surface that provides it.
- For repo-level ownership and status, see the root `ARCHITECTURE.md`.

apps/x/README.md Normal file
View file

@ -0,0 +1,54 @@
# Rowboat Desktop App
`apps/x` is the primary local-first Rowboat desktop product. It is a nested `pnpm` workspace that packages the Electron app, renderer, preload bridge, shared contracts, and core knowledge/runtime logic.
## Workspace Layout
- `apps/main` - Electron main process
- `apps/renderer` - React and Vite renderer UI
- `apps/preload` - validated IPC bridge
- `packages/shared` - shared schemas and IPC contracts
- `packages/core` - workspace, knowledge graph, integrations, agents, and background services
## Local Development
Install dependencies:
```bash
pnpm install
```
Build shared dependencies used by the app:
```bash
npm run deps
```
Run the desktop app in development:
```bash
npm run dev
```
Useful verification commands:
```bash
npm run lint
npm run typecheck
npm run test
npm run verify
```
## Build Notes
- `npm run deps` builds `shared`, `core`, and `preload`
- `apps/main` bundles the Electron main process with esbuild for packaging
- The renderer uses Vite and hot reloads during development
## Local Data Model
- Default work directory: `~/.rowboat`
- Knowledge is stored as Markdown
- Knowledge note history is Git-backed for transparent local versioning
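The default work directory can be overridden with `ROWBOAT_WORKDIR` (the core test runner relies on this for isolation). A simplified sketch of that resolution logic, mirroring the shape of the resolver in `packages/core` (the real one also normalizes and validates further):

```typescript
import path from "node:path";
import { homedir } from "node:os";

// An explicit ROWBOAT_WORKDIR wins; otherwise fall back to ~/.rowboat.
// Accepting env as a parameter keeps the sketch easy to test.
function resolveWorkDir(env: NodeJS.ProcessEnv = process.env): string {
  const configured = env.ROWBOAT_WORKDIR;
  if (configured && configured.trim().length > 0) {
    // Normalize to an absolute path so boundary checks behave consistently.
    return path.resolve(configured);
  }
  return path.join(homedir(), ".rowboat");
}

console.log(resolveWorkDir({ ROWBOAT_WORKDIR: "/tmp/rb-test" })); // → /tmp/rb-test
```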
If you are new to the repo, read the root `ARCHITECTURE.md` before making cross-surface changes.

View file

@ -151,7 +151,7 @@ export async function initiateConnection(toolkitSlug: string): Promise<{
// Set up callback server
const timeoutRef: { current: NodeJS.Timeout | null } = { current: null };
let callbackHandled = false;
const { server } = await createAuthServer(8081, async (_callbackUrl) => {
const { server } = await createAuthServer(8081, async () => {
// Guard against duplicate callbacks (browser may send multiple requests)
if (callbackHandled) return;
callbackHandled = true;

View file

@ -1,73 +1,25 @@
# Desktop Renderer
This package contains the React and Vite renderer for the Electron desktop app in `apps/x`.
## Responsibilities
- Render the desktop UI
- Talk to the Electron preload bridge instead of Node APIs directly
- Display workspace, chat, notes, graph, and other local-first product surfaces
## Commands
```bash
npm run dev
npm run build
npm run lint
```
Run these from `apps/x/apps/renderer` when working only on the renderer, or use `apps/x` and run `npm run dev` to launch the full desktop stack.
## Constraints
- Assume `nodeIntegration` is disabled in the renderer
- Use the preload IPC bridge for privileged operations
- Keep shared contracts in `packages/shared` when a renderer and the main process need the same schema
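Per the second constraint, payloads crossing the preload IPC bridge should be validated before renderer code consumes them. The real bridge uses zod schemas from `packages/shared`; the following is a dependency-free sketch of the same idea, with the payload shape and names invented for illustration:

```typescript
// Illustrative result shape, standing in for a schema from packages/shared.
interface WriteFileResult {
  ok: true;
  bytesWritten: number;
}

// Narrowing type guard: the renderer never trusts a raw IPC payload.
function isWriteFileResult(raw: unknown): raw is WriteFileResult {
  if (typeof raw !== "object" || raw === null) return false;
  const candidate = raw as Partial<WriteFileResult>;
  return candidate.ok === true && typeof candidate.bytesWritten === "number";
}

// The bridge throws at the boundary instead of letting bad data propagate.
function parseWriteFileResult(raw: unknown): WriteFileResult {
  if (!isWriteFileResult(raw)) throw new Error("Invalid WriteFileResult payload");
  return raw;
}

console.log(parseWriteFileResult({ ok: true, bytesWritten: 42 }).bytesWritten); // → 42
```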

View file

@ -7,10 +7,19 @@
"dev": "npm run deps && concurrently -k \"npm:renderer\" \"npm:main\"",
"renderer": "cd apps/renderer && npm run dev",
"shared": "cd packages/shared && npm run build",
"shared:typecheck": "cd packages/shared && npx tsc --noEmit",
"core": "cd packages/core && npm run build",
"core:typecheck": "cd packages/core && npx tsc --noEmit",
"core:test": "cd packages/core && npm test",
"preload": "cd apps/preload && npm run build",
"preload:typecheck": "cd apps/preload && npx tsc --noEmit",
"deps": "npm run shared && npm run core && npm run preload",
"main": "wait-on http://localhost:5173 && cd apps/main && npm run build && npm run start",
"main:typecheck": "cd apps/main && npx tsc --noEmit",
"renderer:typecheck": "cd apps/renderer && npx tsc -p tsconfig.app.json --noEmit && npx tsc -p tsconfig.node.json --noEmit",
"typecheck": "npm run shared:typecheck && npm run core:typecheck && npm run deps && npm run preload:typecheck && npm run main:typecheck && npm run renderer:typecheck",
"test": "npm run shared && npm run core:test",
"verify": "npm run lint && npm run typecheck && npm run test",
"lint": "eslint .",
"lint:fix": "eslint . --fix"
},
@ -26,4 +35,4 @@
"typescript-eslint": "^8.50.1",
"wait-on": "^9.0.3"
}
}
}

View file

@ -6,7 +6,8 @@
"types": "./dist/index.d.ts",
"scripts": {
"build": "rm -rf dist && tsc",
"dev": "tsc -w"
"dev": "tsc -w",
"test": "npm run build && node ./test/run-tests.mjs"
},
"dependencies": {
"@ai-sdk/anthropic": "^2.0.63",

View file

@ -561,7 +561,7 @@ export const BuiltinTools: z.infer<typeof BuiltinToolsSchema> = {
count: matches.length,
tool: 'ripgrep',
};
} catch (rgError) {
} catch {
// Fallback to basic grep if ripgrep not available or failed
const grepArgs = [
'-rn',

View file

@ -1,7 +1,6 @@
import path from "path";
import fs from "fs";
import { homedir } from "os";
import { fileURLToPath } from "url";
function resolveWorkDir(): string {
const configured = process.env.ROWBOAT_WORKDIR;
@ -23,10 +22,6 @@ function resolveWorkDir(): string {
// Normalize to an absolute path so workspace boundary checks behave consistently.
export const WorkDir = resolveWorkDir();
// Get the directory of this file (for locating bundled assets)
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
function ensureDirs() {
const ensure = (p: string) => { if (!fs.existsSync(p)) fs.mkdirSync(p, { recursive: true }); };
ensure(WorkDir);

View file

@ -390,9 +390,6 @@ export function analyzeEmailsAndRecommend(): AnalysisResult {
let reason: string;
const totalHumanSenders = lowWouldCreate;
const noiseRatio = uniqueSenders.size > 0
? (newsletterSenders.size + automatedSenders.size) / uniqueSenders.size
: 0;
const consumerRatio = totalHumanSenders > 0
? consumerServiceSenders.size / totalHumanSenders
: 0;

View file

@ -193,13 +193,16 @@ function extractConversationMessages(runFilePath: string): { role: string; text:
// --- Wait for agent run completion ---
async function waitForRunCompletion(runId: string): Promise<void> {
return new Promise(async (resolve) => {
const unsubscribe = await bus.subscribe('*', async (event) => {
return new Promise((resolve, reject) => {
let unsubscribe = () => {};
bus.subscribe('*', async (event) => {
if (event.type === 'run-processing-end' && event.runId === runId) {
unsubscribe();
resolve();
}
});
}).then((nextUnsubscribe) => {
unsubscribe = nextUnsubscribe;
}).catch(reject);
});
}
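The rewritten helper above avoids handing an async executor to `new Promise` (which would swallow subscription errors) and threads the unsubscribe function through explicitly. A self-contained sketch of the same flow with an in-memory stand-in for `bus` (the bus implementation and names here are illustrative):

```typescript
type BusEvent = { type: string; runId: string };
type Handler = (event: BusEvent) => void;

// In-memory stand-in for the real event bus: subscribe resolves to an
// unsubscribe function, mirroring the `bus.subscribe('*', ...)` contract.
function makeBus() {
  const handlers = new Set<Handler>();
  return {
    subscribe: async (_pattern: string, handler: Handler): Promise<() => void> => {
      handlers.add(handler);
      return () => { handlers.delete(handler); };
    },
    emit: (event: BusEvent): void => {
      for (const handler of handlers) handler(event);
    },
  };
}

// Same shape as the fixed helper: the executor stays synchronous, and the
// unsubscribe function is filled in once the subscription promise settles.
function waitForRunCompletion(bus: ReturnType<typeof makeBus>, runId: string): Promise<void> {
  return new Promise((resolve, reject) => {
    let unsubscribe: () => void = () => {};
    bus.subscribe('*', (event) => {
      if (event.type === 'run-processing-end' && event.runId === runId) {
        unsubscribe();
        resolve();
      }
    }).then((nextUnsubscribe) => {
      unsubscribe = nextUnsubscribe;
    }).catch(reject);
  });
}

const bus = makeBus();
const done = waitForRunCompletion(bus, 'run-1').then(() => console.log('run-1 completed'));
setTimeout(() => bus.emit({ type: 'run-processing-end', runId: 'run-1' }), 10);
```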

View file

@ -189,13 +189,16 @@ async function readFileContents(filePaths: string[]): Promise<{ path: string; co
* Wait for a run to complete by listening for run-processing-end event
*/
async function waitForRunCompletion(runId: string): Promise<void> {
return new Promise(async (resolve) => {
const unsubscribe = await bus.subscribe('*', async (event) => {
return new Promise((resolve, reject) => {
let unsubscribe = () => {};
bus.subscribe('*', async (event) => {
if (event.type === 'run-processing-end' && event.runId === runId) {
unsubscribe();
resolve();
}
});
}).then((nextUnsubscribe) => {
unsubscribe = nextUnsubscribe;
}).catch(reject);
});
}

View file

@ -43,7 +43,7 @@ function pathToSlug(url: string): string {
const parsed = new URL(url);
const p = parsed.pathname + (parsed.search || '');
if (!p || p === '/') return 'index';
let slug = p.replace(/[^a-zA-Z0-9]+/g, '_').replace(/^_|_$/g, '');
const slug = p.replace(/[^a-zA-Z0-9]+/g, '_').replace(/^_|_$/g, '');
return slug.substring(0, 80) || 'index';
} catch {
return 'index';
@ -184,12 +184,13 @@ function saveConfig(config: Config): void {
fs.writeFileSync(CONFIG_FILE, JSON.stringify(config, null, 2), 'utf-8');
}
function validateConfig(data: any): data is Config {
function validateConfig(data: unknown): data is Config {
if (typeof data !== 'object' || data === null) return false;
if (data.mode !== 'all' && data.mode !== 'ask') return false;
if (!Array.isArray(data.whitelist)) return false;
if (!Array.isArray(data.blacklist)) return false;
if (typeof data.enabled !== 'boolean') return false;
const candidate = data as Partial<Config>;
if (candidate.mode !== 'all' && candidate.mode !== 'ask') return false;
if (!Array.isArray(candidate.whitelist)) return false;
if (!Array.isArray(candidate.blacklist)) return false;
if (typeof candidate.enabled !== 'boolean') return false;
return true;
}

View file

@ -133,13 +133,16 @@ function scanDirectoryRecursive(dir: string): string[] {
* Wait for a run to complete by listening for run-processing-end event
*/
async function waitForRunCompletion(runId: string): Promise<void> {
return new Promise(async (resolve) => {
const unsubscribe = await bus.subscribe('*', async (event) => {
return new Promise((resolve, reject) => {
let unsubscribe = () => {};
bus.subscribe('*', async (event) => {
if (event.type === 'run-processing-end' && event.runId === runId) {
unsubscribe();
resolve();
}
});
}).then((nextUnsubscribe) => {
unsubscribe = nextUnsubscribe;
}).catch(reject);
});
}

View file

@ -66,13 +66,16 @@ function getUnlabeledEmails(state: LabelingState): string[] {
* Wait for a run to complete by listening for run-processing-end event
*/
async function waitForRunCompletion(runId: string): Promise<void> {
return new Promise(async (resolve) => {
const unsubscribe = await bus.subscribe('*', async (event) => {
return new Promise((resolve, reject) => {
let unsubscribe = () => {};
bus.subscribe('*', async (event) => {
if (event.type === 'run-processing-end' && event.runId === runId) {
unsubscribe();
resolve();
}
});
}).then((nextUnsubscribe) => {
unsubscribe = nextUnsubscribe;
}).catch(reject);
});
}

View file

@ -348,24 +348,6 @@ async function performSync(syncDir: string, lookbackDays: number) {
// --- Composio-based Sync ---
interface ComposioCalendarState {
last_sync: string; // ISO string
}
function loadComposioState(stateFile: string): ComposioCalendarState | null {
if (fs.existsSync(stateFile)) {
try {
const data = JSON.parse(fs.readFileSync(stateFile, 'utf-8'));
if (data.last_sync) {
return { last_sync: data.last_sync };
}
} catch (e) {
console.error('[Calendar] Failed to load composio state:', e);
}
}
return null;
}
function saveComposioState(stateFile: string, lastSync: string): void {
fs.writeFileSync(stateFile, JSON.stringify({ last_sync: lastSync }, null, 2));
}

View file

@ -79,13 +79,16 @@ function getUntaggedNotes(state: NoteTaggingState): string[] {
* Wait for a run to complete by listening for run-processing-end event
*/
async function waitForRunCompletion(runId: string): Promise<void> {
return new Promise(async (resolve) => {
const unsubscribe = await bus.subscribe('*', async (event) => {
return new Promise((resolve, reject) => {
let unsubscribe = () => {};
bus.subscribe('*', async (event) => {
if (event.type === 'run-processing-end' && event.runId === runId) {
unsubscribe();
resolve();
}
});
}).then((nextUnsubscribe) => {
unsubscribe = nextUnsubscribe;
}).catch(reject);
});
}

View file

@ -22,13 +22,16 @@ const PREBUILT_DIR = path.join(WorkDir, 'pre-built');
* Wait for a run to complete by listening for run-processing-end event
*/
async function waitForRunCompletion(runId: string): Promise<void> {
return new Promise(async (resolve) => {
const unsubscribe = await bus.subscribe('*', async (event) => {
return new Promise((resolve, reject) => {
let unsubscribe = () => {};
bus.subscribe('*', async (event) => {
if (event.type === 'run-processing-end' && event.runId === runId) {
unsubscribe();
resolve();
}
});
}).then((nextUnsubscribe) => {
unsubscribe = nextUnsubscribe;
}).catch(reject);
});
}
@ -89,8 +92,6 @@ Process new items and use the user context above to identify yourself when draft
* Check all agents and run those that are due
*/
async function checkAndRunAgents(): Promise<void> {
const config = loadConfig();
for (const agentName of PREBUILT_AGENTS) {
try {
if (shouldRunAgent(agentName)) {
@ -149,7 +150,7 @@ export async function init(): Promise<void> {
* Manually trigger an agent run (useful for testing)
*/
export async function triggerAgent(agentName: string): Promise<void> {
if (!PREBUILT_AGENTS.includes(agentName as any)) {
if (!PREBUILT_AGENTS.includes(agentName as (typeof PREBUILT_AGENTS)[number])) {
throw new Error(`Unknown agent: ${agentName}. Available: ${PREBUILT_AGENTS.join(', ')}`);
}
await runAgent(agentName);
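The `includes` cast in the hunk above narrows a plain string against a readonly tuple without the old `as any` escape hatch. A standalone illustration of the pattern (agent names invented for the example):

```typescript
// A readonly tuple of known agent names, in the spirit of PREBUILT_AGENTS.
const PREBUILT_AGENTS = ['email-labeler', 'note-tagger'] as const;

function assertKnownAgent(agentName: string): void {
  // `includes` on a readonly tuple only accepts the tuple's element type,
  // so the cast widens the check to arbitrary strings while staying typed.
  if (!PREBUILT_AGENTS.includes(agentName as (typeof PREBUILT_AGENTS)[number])) {
    throw new Error(`Unknown agent: ${agentName}. Available: ${PREBUILT_AGENTS.join(', ')}`);
  }
}

assertKnownAgent('note-tagger'); // passes silently
console.log('ok'); // → ok
```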

View file

@ -1,12 +1,12 @@
import chokidar, { type FSWatcher } from 'chokidar';
import fs from 'node:fs/promises';
import { workspace } from '@x/shared';
import { ensureWorkspaceRoot, absToRelPosix } from './workspace.js';
import { WorkDir } from '../config/config.js';
import { WorkspaceChangeEvent } from 'packages/shared/dist/workspace.js';
import z from 'zod';
import { Stats } from 'node:fs';
export type WorkspaceChangeCallback = (event: z.infer<typeof WorkspaceChangeEvent>) => void;
export type WorkspaceChangeCallback = (event: z.infer<typeof workspace.WorkspaceChangeEvent>) => void;
/**
* Create a workspace watcher

View file

@ -3,7 +3,6 @@ import type { Stats } from 'node:fs';
import path from 'node:path';
import { workspace } from '@x/shared';
import { z } from 'zod';
import { RemoveOptions, WriteFileOptions, WriteFileResult } from 'packages/shared/dist/workspace.js';
import { WorkDir } from '../config/config.js';
import { rewriteWikiLinksForRenamedKnowledgeFile } from './wiki-link-rewrite.js';
import { commitAll } from '../knowledge/version_history.js';
@ -237,8 +236,8 @@ function scheduleKnowledgeCommit(filename: string): void {
export async function writeFile(
relPath: string,
data: string,
opts?: z.infer<typeof WriteFileOptions>
): Promise<z.infer<typeof WriteFileResult>> {
opts?: z.infer<typeof workspace.WriteFileOptions>
): Promise<z.infer<typeof workspace.WriteFileResult>> {
const filePath = resolveWorkspacePath(relPath);
const encoding = opts?.encoding || 'utf8';
const atomic = opts?.atomic !== false; // default true
@ -381,7 +380,7 @@ export async function copy(
export async function remove(
relPath: string,
opts?: z.infer<typeof RemoveOptions>
opts?: z.infer<typeof workspace.RemoveOptions>
): Promise<{ ok: true }> {
const filePath = resolveWorkspacePath(relPath);
const trash = opts?.trash !== false; // default true

View file

@ -0,0 +1,31 @@
import { mkdtemp, rm } from "node:fs/promises";
import { tmpdir } from "node:os";
import path from "node:path";
import { spawn } from "node:child_process";
import { fileURLToPath } from "node:url";
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const packageDir = path.resolve(__dirname, "..");
const tempRoot = await mkdtemp(path.join(tmpdir(), "rowboat-core-test-"));
const testWorkDir = path.join(tempRoot, "workspace");
try {
const exitCode = await new Promise((resolve, reject) => {
const child = spawn(process.execPath, ["--test", "./test/workspace-path-safety.test.mjs"], {
cwd: packageDir,
stdio: "inherit",
env: {
...process.env,
ROWBOAT_WORKDIR: testWorkDir,
},
});
child.on("error", reject);
child.on("exit", (code) => resolve(code ?? 1));
});
process.exitCode = Number(exitCode);
} finally {
await rm(tempRoot, { recursive: true, force: true });
}

View file

@ -0,0 +1,44 @@
import test from "node:test";
import assert from "node:assert/strict";
import path from "node:path";
import { WorkDir } from "../dist/config/config.js";
import {
absToRelPosix,
assertSafeRelPath,
resolveWorkspacePath,
} from "../dist/workspace/workspace.js";
test("uses ROWBOAT_WORKDIR override for test isolation", () => {
assert.equal(WorkDir, process.env.ROWBOAT_WORKDIR);
});
test("assertSafeRelPath allows simple relative paths", () => {
assert.doesNotThrow(() => assertSafeRelPath("notes/today.md"));
});
test("assertSafeRelPath rejects absolute paths", () => {
assert.throws(() => assertSafeRelPath("/tmp/notes.md"), /Absolute paths are not allowed/);
});
test("assertSafeRelPath rejects traversal attempts", () => {
assert.throws(() => assertSafeRelPath("../notes.md"), /Path traversal/);
assert.throws(() => assertSafeRelPath("notes/../secret.md"), /Path traversal|Invalid path/);
});
test("resolveWorkspacePath returns the configured root for empty path", () => {
assert.equal(resolveWorkspacePath(""), WorkDir);
});
test("resolveWorkspacePath resolves safe relative paths inside the workspace", () => {
assert.equal(resolveWorkspacePath("knowledge/alpha.md"), path.join(WorkDir, "knowledge", "alpha.md"));
});
test("absToRelPosix returns POSIX relative paths inside the workspace", () => {
const absolutePath = path.join(WorkDir, "knowledge", "nested", "alpha.md");
assert.equal(absToRelPosix(absolutePath), "knowledge/nested/alpha.md");
});
test("absToRelPosix rejects paths outside the workspace", () => {
assert.equal(absToRelPosix("/tmp/outside.md"), null);
});

View file

@ -1,12 +1,14 @@
#!/bin/bash
set -e
# build rowboatx next.js app
# Prepare the frontend and local runtime embedded by the desktop shell.
# Build the RowboatX Next.js frontend.
(cd apps/rowboatx && \
npm install && \
npm run build)
# build rowboat server
# Build the local CLI/runtime service.
(cd apps/cli && \
npm install && \
npm run build)
npm run build)