webclaw/config.example.json
Jacob Magar adf4b6ba55 feat(llm): add Gemini CLI provider as primary; set qwen3.5:9b as default Ollama model
- Add GeminiCliProvider: shells out to `gemini -p` with --output-format json,
  injection-safe prompt passing, MCP server suppression via temp workdir,
  6-slot concurrency semaphore, 60s subprocess deadline
- Add --llm-provider, --llm-model, --llm-base-url CLI flags for per-call overrides
- Provider chain: Gemini CLI → OpenAI → Ollama → Anthropic
- Move LLM timing to dispatch layer (LLM: Xs on stderr)
- Default Ollama model: qwen3:8b → qwen3.5:9b (benchmark shows better schema extraction)
- Add noxa mcp subcommand
- Add docs/reports/llm-benchmark-2026-04-11.md (Gemini vs qwen3.5:4b vs qwen3.5:9b)
- Bump version 0.3.11 → 0.4.0

Co-authored-by: Claude <claude@anthropic.com>
2026-04-12 00:52:53 -04:00

{
  "$schema": "./config.schema.json",
  "_doc": [
    "Copy to config.json and remove fields you don't need.",
    "Secrets (api_key, proxy, webhook, llm_base_url) go in .env, NOT here.",
    "BOOL FLAG LIMITATION: once set to true here, a flag cannot be overridden",
    "to false from the CLI for a single run (no --no-flag support). Use",
    "NOXA_CONFIG=/dev/null on the command line to bypass this config entirely.",
    "LLM provider/model are optional overrides. Leave them unset to keep the",
    "Gemini -> OpenAI -> Ollama -> Anthropic fallback chain intact.",
    "on_change is intentionally absent; it must remain a CLI-only flag.",
    "Unknown fields are silently ignored, so this file works across noxa versions.",
    "Set output_dir to write results to files instead of stdout."
  ],
  "format": "markdown",
  "browser": "chrome",
  "timeout": 30,
  "pdf_mode": "auto",
  "metadata": false,
  "verbose": false,
  "only_main_content": false,
  "include_selectors": [],
  "exclude_selectors": ["nav", "footer", ".sidebar", ".cookie-banner"],
  "depth": 1,
  "max_pages": 20,
  "concurrency": 5,
  "delay": 100,
  "path_prefix": null,
  "include_paths": [],
  "exclude_paths": ["/changelog/*", "/blog/*", "/releases/*"],
  "use_sitemap": false
}
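
Following the `_doc` advice to copy this file and remove fields you don't need, a trimmed-down `config.json` might keep only the fields you actually change. This is an illustrative sketch, not a recommended configuration: the values below (the 60-second timeout, the `./out` output directory) are assumptions, and `output_dir` is included only because the `_doc` notes mention it.

```json
{
  "$schema": "./config.schema.json",
  "format": "markdown",
  "timeout": 60,
  "exclude_selectors": ["nav", "footer"],
  "output_dir": "./out"
}
```

Since unknown fields are silently ignored, a minimal file like this should remain valid across noxa versions; secrets still belong in `.env`, not here.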