apunkt/plano (mirror of https://github.com/katanemo/plano.git, synced 2026-05-10 08:12:48 +02:00)
plano/crates/brightstaff/src/lib.rs @ fb04cbb2ce (7 lines, 98 B, Rust)
Introduce brightstaff, a new terminal service for LLM routing (#477)
2025-05-19 09:59:22 -07:00
pub mod handlers;
pub mod router;
Introduce signals change (#655)
* adding support for signals
* reducing false positives for signals like positive interaction
* adding docs. Still need to fix the messages list, but waiting on PR #621
* Improve frustration detection: normalize contractions and refine punctuation
* Further refine test cases with longer messages
* minor doc changes
* fixing echo statement for build
* fixing the messages construction and using the trait for signals
* update signals docs
* fixed some minor doc changes
* added more tests and fixed documentation. PR 100% ready
* made fixes based on PR comments
* Optimize latency: 1. replace sliding window approach with trigram containment check 2. add code to pre-compute ngrams for patterns
* removed some debug statements to make tests easier to read
* PR comments to make ObservableStreamProcessor accept optional Vec<Messages>
* fixed PR comments
Co-authored-by: Salman Paracha <salmanparacha@MacBook-Pro-342.local>
Co-authored-by: MeiyuZhong <mariazhong9612@gmail.com>
Co-authored-by: nehcgs <54548843+nehcgs@users.noreply.github.com>
2026-01-07 11:20:44 -08:00
pub mod signals;
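The signals commit (#655) mentions replacing a sliding-window scan with a trigram containment check over pre-computed n-grams to cut latency. A minimal sketch of that general technique, with entirely hypothetical names (the actual `signals` module's types are not shown on this page):

```rust
use std::collections::HashSet;

/// Character trigrams for one pattern phrase, pre-computed once at startup
/// so the hot path only intersects hash sets.
struct PatternTrigrams {
    trigrams: HashSet<[char; 3]>,
}

impl PatternTrigrams {
    fn new(pattern: &str) -> Self {
        Self { trigrams: trigrams_of(pattern) }
    }

    /// Containment score: fraction of the pattern's trigrams found in `text`.
    /// Near 1.0 means the pattern almost certainly occurs (possibly with
    /// small edits), so no sliding-window substring scan is needed.
    fn containment(&self, text: &str) -> f64 {
        if self.trigrams.is_empty() {
            return 0.0;
        }
        let text_trigrams = trigrams_of(text);
        let shared = self.trigrams.intersection(&text_trigrams).count();
        shared as f64 / self.trigrams.len() as f64
    }
}

/// Lowercase the input and collect every 3-character window.
fn trigrams_of(s: &str) -> HashSet<[char; 3]> {
    let chars: Vec<char> = s.to_lowercase().chars().collect();
    chars.windows(3).map(|w| [w[0], w[1], w[2]]).collect()
}

fn main() {
    let pattern = PatternTrigrams::new("this is not working");
    let score = pattern.containment("ugh, this still is not working at all");
    assert!(score > 0.8);
    println!("containment = {score:.2}");
}
```

The pre-computation matters because the pattern set is fixed while user messages vary: each incoming message is tokenized into trigrams once and checked against every pattern with set intersections, rather than re-scanning the message per pattern.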
enable state management for v1/responses (#631)
* first commit with tests to enable state management via memory
* fixed logs to follow the conversational flow a bit better
* added support for supabase
* added the state_storage_v1_responses flag, and use that to store state appropriately
* cleaned up logs and fixed issue with connectivity for llm gateway in weather forecast demo
* fixed mixed inputs from openai v1/responses api (#632)
  * fixed mixed inputs from openai v1/responses api
  * removing tracing from model-alias-routing
  * handling additional input types from openairs
* resolving PR comments
Co-authored-by: Salman Paracha <salmanparacha@MacBook-Pro-342.local>
2025-12-17 12:18:38 -08:00
pub mod state;
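Commit #631 describes state for v1/responses stored behind a flag, with both an in-memory backend and Supabase support. One common way to structure that is a storage trait with swappable backends; the sketch below is a guess at the shape, with hypothetical names, not the `state` module's actual API:

```rust
use std::collections::HashMap;

/// Hypothetical abstraction over where v1/responses conversation state lives.
/// A remote backend (e.g. Supabase) would implement the same trait.
trait StateStorage {
    fn save(&mut self, response_id: &str, state: String);
    fn load(&self, response_id: &str) -> Option<String>;
}

/// In-memory backend: simplest implementation, useful for tests and
/// single-process deployments.
struct MemoryStorage {
    entries: HashMap<String, String>,
}

impl MemoryStorage {
    fn new() -> Self {
        Self { entries: HashMap::new() }
    }
}

impl StateStorage for MemoryStorage {
    fn save(&mut self, response_id: &str, state: String) {
        self.entries.insert(response_id.to_string(), state);
    }

    fn load(&self, response_id: &str) -> Option<String> {
        self.entries.get(response_id).cloned()
    }
}

fn main() {
    let mut store = MemoryStorage::new();
    store.save("resp_1", "previous turn summary".to_string());
    assert_eq!(store.load("resp_1").as_deref(), Some("previous turn summary"));
    assert!(store.load("resp_2").is_none());
}
```

Keeping the trait narrow (save/load keyed by response id) is what makes the `state_storage_v1_responses` flag cheap to honor: the caller picks a backend at startup and the request path never branches on it again.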
Improve end to end tracing (#628)
* adding canonical tracing support via bright-staff
* improved formatting for tools in the traces
* removing anthropic from the currency exchange demo
* using Envoy to transport traces, not calling OTEL directly
* moving otel collector cluster outside tracing if/else
* minor fixes to not write to the OTEL collector if tracing is disabled
* fixed PR comments and added more trace attributes
* more fixes based on PR comments
* more clean up based on PR comments
Co-authored-by: Salman Paracha <salmanparacha@MacBook-Pro-342.local>
2025-12-11 15:21:57 -08:00
pub mod tracing;
trim conversation if it exceeds the max limit of what the router model can handle (#488)
2025-05-27 20:28:22 -07:00
pub mod utils;
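Commit #488 trims the conversation when it exceeds what the router model can handle. A minimal sketch of that idea, assuming the usual drop-oldest-first policy and a crude character-based token estimate; names and the 4-chars-per-token heuristic are hypothetical, not taken from brightstaff:

```rust
/// A single chat turn: role plus content.
#[derive(Clone, Debug, PartialEq)]
struct Message {
    role: String,
    content: String,
}

/// Rough token estimate (hypothetical heuristic: ~4 characters per token).
/// A real implementation would use the router model's tokenizer.
fn approx_tokens(m: &Message) -> usize {
    (m.role.len() + m.content.len()) / 4 + 1
}

/// Drop the oldest messages until the conversation fits `max_tokens`,
/// always keeping the most recent message so the router has something to act on.
fn trim_conversation(mut messages: Vec<Message>, max_tokens: usize) -> Vec<Message> {
    while messages.len() > 1
        && messages.iter().map(approx_tokens).sum::<usize>() > max_tokens
    {
        messages.remove(0); // discard the oldest turn first
    }
    messages
}

fn main() {
    let msgs = vec![
        Message { role: "user".into(), content: "aaaaaaaaaaaa".into() },
        Message { role: "assistant".into(), content: "bbbb".into() },
        Message { role: "user".into(), content: "cccc".into() },
    ];
    let trimmed = trim_conversation(msgs, 8);
    assert_eq!(trimmed.len(), 2);
    assert_eq!(trimmed.last().unwrap().content, "cccc");
    println!("kept {} messages", trimmed.len());
}
```

Dropping from the front preserves the most recent context, which is what a routing decision depends on; a variant could instead pin the first (system) message and trim from the second onward.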