SurfSense/surfsense_backend/app/agents/new_chat
PR Bot 760aa38225 feat: complete MiniMax LLM provider integration
Add full MiniMax provider support across the entire stack:

Backend:
- Add MINIMAX to LiteLLMProvider enum in db.py
- Add MINIMAX mapping to all provider_map dicts in llm_service.py,
  llm_router_service.py, and llm_config.py
- Add Alembic migration (rev 106) for PostgreSQL enum
- Add MiniMax M2.5 example in global_llm_config.example.yaml
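The backend wiring above can be sketched roughly as follows. Only `LiteLLMProvider`, `MINIMAX`, and the `provider_map` name come from this change; the other enum members, the map's exact shape, and the `"minimax"` prefix string are illustrative assumptions, not the repository's actual code:

```python
from enum import Enum


class LiteLLMProvider(str, Enum):
    """Sketch of the provider enum in db.py (most members elided)."""
    OPENAI = "OPENAI"
    ANTHROPIC = "ANTHROPIC"
    MINIMAX = "MINIMAX"  # member added by this change


# Illustrative provider_map entry of the kind added to llm_service.py,
# llm_router_service.py, and llm_config.py: enum member -> litellm prefix.
provider_map: dict[LiteLLMProvider, str] = {
    LiteLLMProvider.OPENAI: "openai",
    LiteLLMProvider.ANTHROPIC: "anthropic",
    LiteLLMProvider.MINIMAX: "minimax",  # prefix value is an assumption
}


def litellm_model_name(provider: LiteLLMProvider, model: str) -> str:
    """Build a '<provider>/<model>' identifier in the style litellm expects."""
    return f"{provider_map[provider]}/{model}"
```

The accompanying Alembic migration (rev 106) would typically amend the PostgreSQL enum type with something like `ALTER TYPE ... ADD VALUE 'MINIMAX'`, since Postgres enum members cannot be added via an ordinary column change.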

Frontend:
- Add MiniMax to LLM_PROVIDERS enum with apiBase
- Add MiniMax-M2.5 and MiniMax-M2.5-highspeed to LLM_MODELS
- Add MINIMAX to Zod validation schema
- Add MiniMax SVG icon and wire up in provider-icons

Docs:
- Add MiniMax setup guide in chinese-llm-setup.md

MiniMax exposes an OpenAI-compatible API (https://api.minimax.io/v1),
with models supporting context windows of up to 204K tokens.
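Because the API is OpenAI-compatible, a request is just a standard chat-completions payload sent to that base URL. A minimal sketch of the request shape (payload only, no network call; the model names come from this change, the endpoint path and message schema are the standard OpenAI ones, and `build_chat_request` is a hypothetical helper):

```python
def build_chat_request(
    model: str,
    user_message: str,
    api_base: str = "https://api.minimax.io/v1",
) -> dict:
    """Assemble an OpenAI-style chat-completions request for MiniMax."""
    return {
        "url": f"{api_base}/chat/completions",
        "json": {
            "model": model,  # e.g. "MiniMax-M2.5" or "MiniMax-M2.5-highspeed"
            "messages": [{"role": "user", "content": user_message}],
        },
    }


req = build_chat_request("MiniMax-M2.5", "Hello!")
```

Any OpenAI-compatible client can consume such a payload by pointing its base URL at the MiniMax endpoint, which is why the integration only needs enum entries and provider-map wiring rather than a bespoke client.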

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-13 07:27:47 +08:00
tools feat: refactor agent tools management and add UI integration 2026-03-10 17:36:26 -07:00
__init__.py feat: migrated to surfsense deep agent 2025-12-23 01:16:25 -08:00
chat_deepagent.py feat: refactor agent tools management and add UI integration 2026-03-10 17:36:26 -07:00
checkpointer.py feat: implement connection pooling for AsyncPostgresSaver in checkpointer 2026-02-05 17:32:43 -08:00
context.py organize deepagent codebase 2025-12-20 18:35:39 +02:00
llm_config.py feat: complete MiniMax LLM provider integration 2026-03-13 07:27:47 +08:00
sandbox.py feat: enhance caching mechanisms to prevent memory leaks 2026-02-27 17:56:00 -08:00
system_prompt.py feat: refactor agent tools management and add UI integration 2026-03-10 17:36:26 -07:00
utils.py inject tools at runtime 2025-12-20 18:35:39 +02:00