Commit graph

3 commits

Jacob Magar
420a1d7522 feat(noxa-9fw.2): make gemini cli the primary llm backend
- ProviderChain::default() order: Gemini CLI -> OpenAI -> Ollama -> Anthropic
- Add --llm-provider gemini arm to build_llm_provider() in noxa-cli
- Update unknown-provider error to mention gemini
- Update empty-chain error messages in CLI and MCP to mention gemini CLI
- Update MCP startup warn! to list gemini CLI as first option
2026-04-11 07:32:24 -04:00
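The ordering and CLI-arm changes above can be sketched as follows. This is a minimal illustration, not the repository's code: `ProviderKind`, the `order` field, and `parse_provider` are hypothetical names standing in for `ProviderChain::default()` and the `build_llm_provider()` arm the commit describes.

```rust
/// Hypothetical stand-in for the provider variants named in the commit.
#[derive(Debug, PartialEq, Clone, Copy)]
enum ProviderKind {
    GeminiCli,
    OpenAi,
    Ollama,
    Anthropic,
}

struct ProviderChain {
    order: Vec<ProviderKind>,
}

impl Default for ProviderChain {
    fn default() -> Self {
        // Fallback order per the commit: Gemini CLI -> OpenAI -> Ollama -> Anthropic.
        Self {
            order: vec![
                ProviderKind::GeminiCli,
                ProviderKind::OpenAi,
                ProviderKind::Ollama,
                ProviderKind::Anthropic,
            ],
        }
    }
}

/// Sketch of a `--llm-provider` match arm; the unknown-provider error
/// now mentions gemini, per the commit.
fn parse_provider(name: &str) -> Result<ProviderKind, String> {
    match name {
        "gemini" => Ok(ProviderKind::GeminiCli),
        "openai" => Ok(ProviderKind::OpenAi),
        "ollama" => Ok(ProviderKind::Ollama),
        "anthropic" => Ok(ProviderKind::Anthropic),
        other => Err(format!(
            "unknown provider '{other}' (expected gemini, openai, ollama, or anthropic)"
        )),
    }
}
```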
Jacob Magar
d800c37bfd feat(noxa-9fw.1): add gemini cli provider adapter
- Add LlmError::Subprocess(#[from] io::Error) and LlmError::Timeout variants
- Implement GeminiCliProvider: new(model) -> Self matching OllamaProvider pattern
- Prompts passed exclusively via stdin (Stdio::piped), never as CLI args
- 30s subprocess timeout via tokio::time::timeout to prevent hung processes
- 6-slot Semaphore to bound concurrent subprocess spawns in MCP context
- Stderr captured and included (first 500 bytes) in non-zero exit errors
- is_available(): pure `gemini --version` PATH check, no live inference
- GEMINI_MODEL env override; default model gemini-2.5-pro
- strip_thinking_tags + strip_code_fences applied to stdout output
2026-04-11 07:30:41 -04:00
Jacob Magar
8674b60b4e chore: rebrand webclaw to noxa
2026-04-11 00:10:38 -04:00