mirror of
https://github.com/MODSetter/SurfSense.git
synced 2026-05-08 15:22:39 +02:00
feat: initialize agent and claude skill libraries with comprehensive knowledge bases, workflow templates, and implementation artifacts.
This commit is contained in:
parent 956d8c6322, commit b35b4337bb
2028 changed files with 565614 additions and 0 deletions
.serena/memories/project_overview.md (new file, 21 lines)

# SurfSense - Project Overview

## Purpose
Open-source alternative to NotebookLM — personal knowledge base with AI chat, 25+ external connectors (Google Drive, Notion, Jira, Slack...), real-time multiplayer, desktop app, podcast/video generation.

## Tech Stack
- **Backend**: Python 3.12, FastAPI, Celery (Redis broker), PostgreSQL (pgvector), Alembic, LiteLLM, LangGraph, uv package manager
- **Frontend**: Next.js 16 (Turbopack), React 19, TypeScript, Tailwind v4, Jotai, @rocicorp/zero (real-time sync), pnpm
- **Real-time**: zero-cache (rocicorp/zero:0.26.2) → Postgres logical replication
- **Services (Docker)**: PostgreSQL pgvector, Redis, SearXNG, pgAdmin, zero-cache
- **Desktop**: Electron (surfsense_desktop/)
- **Browser Extension**: surfsense_browser_extension/

## Architecture
```
surfsense_backend/           - FastAPI + Celery workers
surfsense_web/               - Next.js frontend
surfsense_desktop/           - Electron desktop app
surfsense_browser_extension/ - Browser extension
docker/                      - docker-compose.dev.yml & .env
```

.serena/memories/suggested_commands.md (new file, 73 lines)

# SurfSense - Suggested Commands

## Docker Services (Infrastructure)
```bash
# Start all infra services (db + pgadmin)
cd /Users/luisphan/Documents/GitHub/SurfSense
docker compose -f docker/docker-compose.dev.yml --env-file docker/.env up -d db pgadmin

# Start zero-cache standalone (BE & FE run locally)
docker run -d \
  --name surfsense-zero-cache \
  --network surfsense-dev_default \
  --add-host "host.docker.internal:host-gateway" \
  -p 4848:4848 \
  -v surfsense-dev-zero-cache:/data \
  -e ZERO_UPSTREAM_DB="postgresql://postgres:postgres@surfsense-dev-db-1:5432/surfsense?sslmode=disable" \
  -e ZERO_CVR_DB="postgresql://postgres:postgres@surfsense-dev-db-1:5432/surfsense?sslmode=disable" \
  -e ZERO_CHANGE_DB="postgresql://postgres:postgres@surfsense-dev-db-1:5432/surfsense?sslmode=disable" \
  -e ZERO_REPLICA_FILE="/data/zero.db" \
  -e ZERO_ADMIN_PASSWORD="surfsense-zero-admin" \
  -e ZERO_APP_PUBLICATIONS="zero_publication" \
  -e ZERO_NUM_SYNC_WORKERS="4" \
  -e ZERO_UPSTREAM_MAX_CONNS="20" \
  -e ZERO_CVR_MAX_CONNS="30" \
  -e ZERO_QUERY_URL="http://host.docker.internal:3000/api/zero/query" \
  -e ZERO_MUTATE_URL="http://host.docker.internal:3000/api/zero/mutate" \
  rocicorp/zero:0.26.2

# SearXNG: reuse mrholmes-searxng on port 8888
# Redis: reuse local redis-server on localhost:6379/1
```

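The three `ZERO_*_DB` variables in the `docker run` command above all carry the same DSN, which is easy to let drift when editing by hand. A minimal sketch of a helper that generates the `-e` flags from a single source of truth (the values simply mirror the command above; nothing here is a zero-cache API):

```python
# Build the -e flags for the zero-cache container from one shared DSN,
# so the three ZERO_*_DB variables can never drift apart.
DSN = "postgresql://postgres:postgres@surfsense-dev-db-1:5432/surfsense?sslmode=disable"

def zero_cache_env(dsn: str) -> dict[str, str]:
    return {
        "ZERO_UPSTREAM_DB": dsn,
        "ZERO_CVR_DB": dsn,
        "ZERO_CHANGE_DB": dsn,
        "ZERO_REPLICA_FILE": "/data/zero.db",
        "ZERO_ADMIN_PASSWORD": "surfsense-zero-admin",
        "ZERO_APP_PUBLICATIONS": "zero_publication",
        "ZERO_NUM_SYNC_WORKERS": "4",
        "ZERO_UPSTREAM_MAX_CONNS": "20",
        "ZERO_CVR_MAX_CONNS": "30",
        "ZERO_QUERY_URL": "http://host.docker.internal:3000/api/zero/query",
        "ZERO_MUTATE_URL": "http://host.docker.internal:3000/api/zero/mutate",
    }

def as_flags(env: dict[str, str]) -> list[str]:
    # Each entry becomes a separate `-e KEY=VALUE` argument pair for docker run.
    return [arg for k, v in env.items() for arg in ("-e", f"{k}={v}")]

flags = as_flags(zero_cache_env(DSN))
```

Pass `flags` to `docker run` via `subprocess`, or just print them to paste into the shell.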
## Backend (FastAPI)
```bash
cd /Users/luisphan/Documents/GitHub/SurfSense/surfsense_backend
uv sync                          # install deps
uv run alembic upgrade head      # run migrations
uv run python main.py --reload   # start dev server on port 8001
# OR Celery worker:
uv run celery -A app.celery_app worker -Q surfsense --loglevel=info
```

## Frontend (Next.js)
```bash
cd /Users/luisphan/Documents/GitHub/SurfSense/surfsense_web
pnpm install
pnpm dev          # http://localhost:3000

# DB commands (drizzle)
pnpm db:generate
pnpm db:migrate
pnpm db:studio    # Drizzle Studio UI
```

## Ports Summary
| Service        | Port | Notes |
|----------------|------|-------|
| PostgreSQL     | 5432 | docker |
| pgAdmin        | 5050 | docker, http://localhost:5050 |
| Redis (local)  | 6379 | db=1 for SurfSense |
| SearXNG        | 8888 | shared mrholmes-searxng |
| zero-cache     | 4848 | docker standalone |
| Backend        | 8001 | port 8000 used by chainlens |
| Frontend       | 3000 | Next.js |

## Health Checks
```bash
curl http://localhost:8001/health     # {"status":"ok"}
curl http://localhost:3000            # HTML
nc -z localhost 4848 && echo OK       # zero-cache
redis-cli ping                        # PONG
docker ps --filter "name=surfsense"   # docker services
```

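The per-service checks above can also be run in one pass. A minimal stdlib-only sketch that TCP-probes every port from the summary table (the port list copies the table; `localhost` and the timeout are assumptions, adjust to taste):

```python
import socket

# Ports from the summary table above (host assumed to be localhost).
PORTS = {
    "PostgreSQL": 5432,
    "pgAdmin": 5050,
    "Redis": 6379,
    "SearXNG": 8888,
    "zero-cache": 4848,
    "Backend": 8001,
    "Frontend": 3000,
}

def is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """TCP connect probe, equivalent to `nc -z host port`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, port in PORTS.items():
        status = "OK" if is_open("localhost", port) else "DOWN"
        print(f"{name:<12} :{port}  {status}")
```

A connect probe only proves the port is listening, not that the service is healthy; keep the `curl /health` checks for that.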
.serena/memories/trollllm-integration.md (new file, 45 lines)

# TrollLLM — SurfSense Integration Guide

## Base URL
- OpenAI-compatible endpoint: `https://chat.trollllm.xyz/v1`
- Anthropic-compatible endpoint: `https://chat.trollllm.xyz` (no /v1)

## Exact model list (names must match 100%)
| Model ID | Provider | Notes |
|---|---|---|
| `claude-haiku-4.5` | Anthropic | Speed |
| `claude-sonnet-4` | Anthropic | Balanced |
| `claude-sonnet-4.5` | Anthropic | Balanced |
| `claude-sonnet-4.6` | Anthropic | Balanced |
| `claude-opus-4.5` | Anthropic | Reasoning |
| `claude-opus-4.6` | Anthropic | Reasoning |
| `gemini-3-flash-preview` | Google | Speed (**NOT** gemini-3-flash) |
| `gemini-3.1-pro-preview` | Google | Multimodal |
| `gpt-5.2` | OpenAI | Reasoning |
| `gpt-5.4` | OpenAI | Reasoning |
| `gpt-5.2-codex` | OpenAI | Code |
| `gpt-5.3-codex` | OpenAI | Code |

## How to add a model to SurfSense (the right way)

### Option 1 — Provider = OPENAI (recommended, works for every model)
- **LLM Provider**: `OPENAI`
- **Model Name**: exact name from the table above (e.g. `claude-sonnet-4.6`)
- **API Key**: TrollLLM API key
- **API Base URL**: `https://chat.trollllm.xyz/v1`

### Option 2 — Custom Provider (LiteLLM format)
- **LLM Provider**: `Custom Provider`
- **Custom Provider Name**: anything (e.g. `trollllm`)
- **Model Name**: must carry the `openai/` prefix → `openai/claude-sonnet-4.6`
- **API Key**: TrollLLM API key
- **API Base URL**: `https://chat.trollllm.xyz/v1`

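The naming rules for the two options are the main source of mistakes, so they are worth encoding. A minimal illustrative sketch (the model set copies the table above; the helper is not part of SurfSense):

```python
# Exact model IDs from the table above; anything else will fail upstream.
KNOWN_MODELS = {
    "claude-haiku-4.5", "claude-sonnet-4", "claude-sonnet-4.5", "claude-sonnet-4.6",
    "claude-opus-4.5", "claude-opus-4.6",
    "gemini-3-flash-preview", "gemini-3.1-pro-preview",
    "gpt-5.2", "gpt-5.4", "gpt-5.2-codex", "gpt-5.3-codex",
}

def model_name(model: str, custom_provider: bool = False) -> str:
    """Return the name to paste into SurfSense, or raise on a known pitfall."""
    if model not in KNOWN_MODELS:
        raise ValueError(f"unknown TrollLLM model: {model!r} (names must match exactly)")
    # Option 2 (Custom Provider / LiteLLM) needs the openai/ prefix; Option 1 does not.
    return f"openai/{model}" if custom_provider else model
```

So `model_name("claude-sonnet-4.6", custom_provider=True)` yields `openai/claude-sonnet-4.6`, while `model_name("gemini-3-flash")` raises instead of silently producing a model that does not exist.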
## Common mistakes
1. **Wrong model name**: `gemini-3-flash` ❌ → must be `gemini-3-flash-preview` ✅
2. **Custom Provider missing prefix**: `gemini-3-flash-preview` ❌ → `openai/gemini-3-flash-preview` ✅
3. **Wrong base URL**: `https://trollllm.xyz/v1` ❌ → `https://chat.trollllm.xyz/v1` ✅

## Special notes
- TrollLLM requires a `User-Agent` header to get past Cloudflare, but SurfSense/LiteLLM usually sets this header automatically.
- When using the Anthropic SDK format: use the `x-api-key` header instead of `Authorization: Bearer`.
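The auth difference between the two endpoint flavours can be made concrete. A hedged sketch of the headers each wire format conventionally expects (header names follow the standard OpenAI/Anthropic conventions and have not been verified against TrollLLM itself; the User-Agent value is a placeholder):

```python
def trollllm_headers(api_key: str, wire_format: str = "openai") -> dict[str, str]:
    """Request headers for TrollLLM's two endpoint flavours.

    openai    -> https://chat.trollllm.xyz/v1,       Authorization: Bearer
    anthropic -> https://chat.trollllm.xyz (no /v1), x-api-key
    """
    common = {
        "Content-Type": "application/json",
        # TrollLLM wants a User-Agent to pass Cloudflare; most HTTP clients set one anyway.
        "User-Agent": "surfsense-dev/0.1",
    }
    if wire_format == "openai":
        return {**common, "Authorization": f"Bearer {api_key}"}
    if wire_format == "anthropic":
        # Anthropic-style APIs also expect an anthropic-version header.
        return {**common, "x-api-key": api_key, "anthropic-version": "2023-06-01"}
    raise ValueError(f"unknown wire format: {wire_format!r}")
```

Useful when debugging a 401: if the Anthropic-style endpoint is answering, check that the key went into `x-api-key` and not `Authorization`.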