docs(noxa-9fw.4): describe gemini cli as primary llm backend

- Update CLAUDE.md: provider chain, LLM modules section, CLI examples
- Update env.example: add GEMINI_MODEL, reorder providers (Gemini first)
- Update noxa-llm/src/lib.rs crate doc comment
Jacob Magar 2026-04-11 07:36:19 -04:00
parent 993fd6c45d
commit af304eda7f
3 changed files with 20 additions and 10 deletions


@@ -1,8 +1,9 @@
-/// noxa-llm: LLM integration with local-first hybrid architecture.
+/// noxa-llm: LLM integration with Gemini-CLI-first hybrid architecture.
 ///
-/// Provider chain tries Ollama (local) first, falls back to OpenAI, then Anthropic.
-/// Provides schema-based extraction, prompt extraction, and summarization
-/// on top of noxa-core's content pipeline.
+/// Provider chain: Gemini CLI (primary) → OpenAI → Ollama → Anthropic.
+/// Gemini CLI requires the `gemini` binary on PATH; GEMINI_MODEL env var sets the model.
+/// Provides schema-validated extraction (with one retry on parse failure),
+/// prompt extraction, and summarization on top of noxa-core's content pipeline.
 pub mod chain;
 pub mod clean;
 pub mod error;
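The updated doc comment describes a first-success fallback chain (Gemini CLI → OpenAI → Ollama → Anthropic). A minimal sketch of that pattern, assuming hypothetical names (`Provider`, `complete_with_fallback`) that are illustrative only and not part of noxa-llm's actual API:

```rust
// Hypothetical sketch of the first-success fallback chain described in the
// doc comment; the real noxa-llm provider trait and types are not shown here.

#[derive(Debug, Clone, Copy, PartialEq)]
enum Provider {
    GeminiCli,
    OpenAi,
    Ollama,
    Anthropic,
}

/// Try each provider in priority order, returning the first success.
/// On total failure, surface the last provider's error.
fn complete_with_fallback(
    order: &[Provider],
    call: impl Fn(Provider) -> Result<String, String>,
) -> Result<(Provider, String), String> {
    let mut last_err = String::from("no providers configured");
    for &p in order {
        match call(p) {
            Ok(text) => return Ok((p, text)),
            Err(e) => last_err = e,
        }
    }
    Err(last_err)
}

fn main() {
    let order = [
        Provider::GeminiCli,
        Provider::OpenAi,
        Provider::Ollama,
        Provider::Anthropic,
    ];
    // Simulate the primary being unavailable (e.g. `gemini` not on PATH):
    // the chain should fall through to the next provider.
    let result = complete_with_fallback(&order, |p| match p {
        Provider::GeminiCli => Err("gemini binary not found".to_string()),
        Provider::OpenAi => Ok("openai response".to_string()),
        _ => Ok("other response".to_string()),
    });
    assert_eq!(result, Ok((Provider::OpenAi, "openai response".to_string())));
    println!("fell back to {:?}", result.unwrap().0);
}
```

The closure stands in for a real per-provider request; in the crate itself the equivalent logic would live behind the `chain` module's provider abstraction.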