mirror of
https://github.com/MODSetter/SurfSense.git
synced 2026-04-25 00:36:31 +02:00
feat: complete MiniMax LLM provider integration
Add full MiniMax provider support across the entire stack.

Backend:
- Add MINIMAX to LiteLLMProvider enum in db.py
- Add MINIMAX mapping to all provider_map dicts in llm_service.py, llm_router_service.py, and llm_config.py
- Add Alembic migration (rev 106) for PostgreSQL enum
- Add MiniMax M2.5 example in global_llm_config.example.yaml

Frontend:
- Add MiniMax to LLM_PROVIDERS enum with apiBase
- Add MiniMax-M2.5 and MiniMax-M2.5-highspeed to LLM_MODELS
- Add MINIMAX to Zod validation schema
- Add MiniMax SVG icon and wire up in provider-icons

Docs:
- Add MiniMax setup guide in chinese-llm-setup.md

MiniMax uses an OpenAI-compatible API (https://api.minimax.io/v1) with models supporting up to a 204K context window.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
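The provider_map additions described above can be sketched as follows. This is a minimal illustration, not the actual SurfSense code: the enum member and dict keys come from the commit message, but the dict shape and helper function are hypothetical. Because MiniMax exposes an OpenAI-compatible API, it can reuse the generic OpenAI-style routing with a custom api_base.

```python
# Sketch of the provider-map wiring described in the commit message.
# LiteLLMProvider.MINIMAX and the MINIMAX mapping are from the commit;
# the dict layout and build_litellm_model() helper are illustrative only.
from enum import Enum


class LiteLLMProvider(str, Enum):
    OPENAI = "OPENAI"
    MINIMAX = "MINIMAX"  # new enum member added by this commit


# Map each provider to a litellm model-name prefix and its default API base.
provider_map = {
    LiteLLMProvider.OPENAI: {
        "prefix": "openai",
        "api_base": "https://api.openai.com/v1",
    },
    LiteLLMProvider.MINIMAX: {
        # OpenAI-compatible endpoint, so the "openai" prefix is reused.
        "prefix": "openai",
        "api_base": "https://api.minimax.io/v1",
    },
}


def build_litellm_model(provider: LiteLLMProvider, model_name: str) -> str:
    """Build the litellm model string, e.g. 'openai/MiniMax-M2.5'."""
    return f"{provider_map[provider]['prefix']}/{model_name}"


print(build_litellm_model(LiteLLMProvider.MINIMAX, "MiniMax-M2.5"))
# openai/MiniMax-M2.5
```

A call through LiteLLM would then pass this model string together with the provider's `api_base` and the user's API key.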
This commit is contained in: parent 54e56e1b9a, commit 760aa38225
13 changed files with 123 additions and 2 deletions
global_llm_config.example.yaml:

@@ -183,6 +183,23 @@ global_llm_configs:
     use_default_system_instructions: true
     citations_enabled: true

+  # Example: MiniMax M2.5 - High-performance with 204K context window
+  - id: -8
+    name: "Global MiniMax M2.5"
+    description: "MiniMax M2.5 with 204K context window and competitive pricing"
+    provider: "MINIMAX"
+    model_name: "MiniMax-M2.5"
+    api_key: "your-minimax-api-key-here"
+    api_base: "https://api.minimax.io/v1"
+    rpm: 60
+    tpm: 100000
+    litellm_params:
+      temperature: 1.0  # MiniMax requires temperature in (0.0, 1.0], cannot be 0
+      max_tokens: 4000
+    system_instructions: ""
+    use_default_system_instructions: true
+    citations_enabled: true
+
 # =============================================================================
 # Image Generation Configuration
 # =============================================================================
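The temperature comment in the example config above encodes a MiniMax-specific constraint: temperature must lie in the half-open interval (0.0, 1.0], i.e. strictly greater than 0 and at most 1.0. A small check like the one below could enforce it; this is a sketch with illustrative names, not SurfSense's actual validation layer.

```python
# Sketch: enforce the MiniMax temperature constraint noted in the example
# config -- temperature must be in (0.0, 1.0], so 0 is rejected.
# Function and variable names here are illustrative, not from SurfSense.

minimax_config = {
    "provider": "MINIMAX",
    "model_name": "MiniMax-M2.5",
    "litellm_params": {"temperature": 1.0, "max_tokens": 4000},
}


def is_valid_minimax_temperature(temperature: float) -> bool:
    """Return True iff temperature is in (0.0, 1.0]."""
    return 0.0 < temperature <= 1.0


def validate_config(config: dict) -> bool:
    """Apply the MiniMax temperature rule to a parsed config entry."""
    if config.get("provider") != "MINIMAX":
        return True  # the constraint only applies to MiniMax
    return is_valid_minimax_temperature(config["litellm_params"]["temperature"])


print(validate_config(minimax_config))          # True
print(is_valid_minimax_temperature(0.0))        # False: 0 is not allowed
```

Running this kind of check when loading global_llm_config.example.yaml would surface a bad temperature at startup instead of as a provider-side API error.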