mirror of
https://github.com/0xMassi/webclaw.git
synced 2026-05-13 17:02:36 +02:00
docs(noxa-9fw.4): describe gemini cli as primary llm backend
- Update CLAUDE.md: provider chain, LLM modules section, CLI examples
- Update env.example: add GEMINI_MODEL, reorder providers (Gemini first)
- Update noxa-llm/src/lib.rs crate doc comment
This commit is contained in:
parent 993fd6c45d
commit af304eda7f
3 changed files with 20 additions and 10 deletions
noxa-llm/src/lib.rs
@@ -1,8 +1,9 @@
-/// noxa-llm: LLM integration with local-first hybrid architecture.
+/// noxa-llm: LLM integration with Gemini-CLI-first hybrid architecture.
 ///
-/// Provider chain tries Ollama (local) first, falls back to OpenAI, then Anthropic.
-/// Provides schema-based extraction, prompt extraction, and summarization
-/// on top of noxa-core's content pipeline.
+/// Provider chain: Gemini CLI (primary) → OpenAI → Ollama → Anthropic.
+/// Gemini CLI requires the `gemini` binary on PATH; GEMINI_MODEL env var sets the model.
+/// Provides schema-validated extraction (with one retry on parse failure),
+/// prompt extraction, and summarization on top of noxa-core's content pipeline.
 pub mod chain;
 pub mod clean;
 pub mod error;
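The updated doc comment describes a fall-through provider chain: each backend is tried in priority order until one is usable. A minimal sketch of that ordering follows; all names here are illustrative, not the real noxa-llm `chain` module API.

```rust
// Sketch of the fall-through ordering from the updated crate docs:
// Gemini CLI (primary) -> OpenAI -> Ollama -> Anthropic.
// Types and function names are hypothetical.

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Provider {
    GeminiCli,
    OpenAi,
    Ollama,
    Anthropic,
}

/// Priority order stated in the new doc comment.
const CHAIN: [Provider; 4] = [
    Provider::GeminiCli,
    Provider::OpenAi,
    Provider::Ollama,
    Provider::Anthropic,
];

/// Return the first provider the probe reports as usable, or None if all fail.
fn first_usable(is_usable: impl Fn(Provider) -> bool) -> Option<Provider> {
    CHAIN.into_iter().find(|&p| is_usable(p))
}

fn main() {
    // E.g. when the `gemini` binary is not on PATH, the chain falls back to OpenAI.
    let picked = first_usable(|p| p != Provider::GeminiCli);
    println!("{picked:?}"); // Some(OpenAi)
}
```

In the real crate the probe would be an actual availability check (binary on PATH, API key set, daemon reachable) rather than a closure, but the selection logic is the same first-match scan.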