mirror of
https://github.com/0xMassi/webclaw.git
synced 2026-04-25 00:06:21 +02:00
- Update CLAUDE.md: provider chain, LLM modules section, CLI examples
- Update env.example: add GEMINI_MODEL, reorder providers (Gemini first)
- Update noxa-llm/src/lib.rs crate doc comment
46 lines
1.2 KiB
Text
# ============================================
# Noxa Configuration
# Copy to .env and fill in your values
# ============================================

# --- LLM Providers ---

# Gemini CLI (primary provider — requires `gemini` binary on PATH)
# GEMINI_MODEL=gemini-2.5-pro # defaults to gemini-2.5-pro
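# Quick sanity check that the CLI is installed before relying on it
# (`command -v` is standard POSIX shell; the binary name comes from the
# note above):
#   command -v gemini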

# Ollama (fallback; local inference)
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=qwen3:8b
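# To verify the two settings above from a shell (assumes a local Ollama
# install; `/api/version` and `ollama pull` are standard Ollama commands):
#   curl http://localhost:11434/api/version
#   ollama pull qwen3:8b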

# OpenAI (optional cloud fallback)
# OPENAI_API_KEY=your-openai-key
# OPENAI_BASE_URL=https://api.openai.com/v1  # default
# OPENAI_MODEL=gpt-4o-mini  # default

# Anthropic (optional cloud fallback)
# ANTHROPIC_API_KEY=your-anthropic-key
# ANTHROPIC_MODEL=claude-sonnet-4-20250514  # default

# --- Proxy ---

# Single proxy
# NOXA_PROXY=http://user:pass@host:port

# Proxy file (one per line: host:port:user:pass)
# NOXA_PROXY_FILE=/path/to/proxies.txt
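# Example proxies.txt contents, following the host:port:user:pass format
# above (hosts and credentials here are placeholders):
#   192.0.2.10:8080:alice:secret1
#   192.0.2.11:3128:bob:secret2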

# --- Server (noxa-server only) ---
# NOXA_PORT=3000
# NOXA_HOST=0.0.0.0
# NOXA_AUTH_KEY=your-auth-key
# NOXA_MAX_CONCURRENCY=50
# NOXA_JOB_TTL_SECS=3600
# NOXA_MAX_JOBS=100
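# Example one-off launch with overrides; the `noxa-server` binary name is
# assumed from the section title above:
#   NOXA_PORT=8080 NOXA_AUTH_KEY=changeme noxa-server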

# --- CLI LLM overrides ---
# NOXA_LLM_PROVIDER=ollama
# NOXA_LLM_MODEL=qwen3:8b
# NOXA_LLM_BASE_URL=http://localhost:11434
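# Example: point the CLI at a remote Ollama instance for a single run (the
# `noxa` binary name and host are assumptions; the variables are the ones
# documented above):
#   NOXA_LLM_PROVIDER=ollama NOXA_LLM_BASE_URL=http://gpu-box:11434 NOXA_LLM_MODEL=qwen3:8b noxa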

# --- Logging ---
# NOXA_LOG=info
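# If this follows standard Rust log-level filtering (an assumption), the
# valid levels would be error, warn, info, debug, trace, e.g.:
#   NOXA_LOG=debug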