feat(v2.0.5): Intentional Amnesia — active forgetting via top-down inhibitory control

First AI memory system to model forgetting as a neuroscience-grounded
PROCESS rather than passive decay. Adds the `suppress` MCP tool (#24),
Rac1 cascade worker, migration V10, and dashboard forgetting indicators.

Based on:
- Anderson, Hanslmayr & Quaegebeur (2025), Nat Rev Neurosci — right
  lateral PFC as the domain-general inhibitory controller;
  suppression-induced forgetting (SIF) compounds with each stopping
  attempt.
- Cervantes-Sandoval et al. (2020), Front Cell Neurosci PMC7477079 —
  Rac1 GTPase as the active synaptic destabilization mechanism.

What's new:
* `suppress` MCP tool — each call increments `suppression_count` and
  subtracts a `0.15 × count` penalty (saturating at 80%) from
  retrieval scores during hybrid search. Distinct from delete
  (removes) and demote (one-shot).
* Rac1 cascade worker — a background sweep piggybacks on the 6h
  consolidation loop, walks `memory_connections` edges from
  recently-suppressed seeds, and applies attenuated FSRS decay to
  co-activated neighbors. You don't just forget Jake — you fade
  the café, the roommate, the birthday.
* 24h labile window — reversible via `suppress({id, reverse: true})`
  within 24 hours. Matches Nader reconsolidation semantics.
* Migration V10 — additive-only (`suppression_count`, `suppressed_at`
  + partial indices). All v2.0.x DBs upgrade seamlessly on first launch.
* Dashboard: `ForgettingIndicator.svelte` pulses when suppressions
  are active. 3D graph nodes dim to 20% opacity when suppressed.
  New WebSocket events: `MemorySuppressed`, `MemoryUnsuppressed`,
  `Rac1CascadeSwept`. Heartbeat carries `suppressed_count`.
* Search pipeline: SIF penalty inserted into the accessibility stage
  so it stacks on top of passive FSRS decay.
* Tool count bumped 23 → 24. Cognitive modules 29 → 30.
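The compounding SIF penalty above can be pictured with a small sketch. The function name and the multiplicative form are illustrative assumptions — the changelog doesn't say whether the saturating penalty is applied as a flat subtraction or as a score multiplier, and the real search-pipeline API may differ:

```rust
/// Hypothetical sketch of the compounding SIF penalty: each suppression
/// adds 0.15 to the penalty, which saturates at 0.80 (an 80% reduction).
fn apply_sif_penalty(score: f64, suppression_count: u32) -> f64 {
    let penalty = (0.15 * f64::from(suppression_count)).min(0.80);
    score * (1.0 - penalty)
}

fn main() {
    // One suppression trims 15%; from six suppressions onward the cap holds.
    println!("{:.2}", apply_sif_penalty(1.0, 1)); // 0.85
    println!("{:.2}", apply_sif_penalty(1.0, 9)); // 0.20
}
```

However the penalty is wired in, the saturation point matters: it guarantees a suppressed memory never scores at zero, which is what keeps `memory.get(id)` meaningful.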

Memories persist — they are INHIBITED, not erased. `memory.get(id)`
returns full content through any number of suppressions. The 24h
labile window is a grace period for regret.
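The 24h labile window reduces to a timestamp comparison. A minimal std-only sketch — the helper name is hypothetical, and the actual check presumably lives in the `suppress` tool handler against the `suppressed_at` column:

```rust
use std::time::{Duration, SystemTime};

const LABILE_WINDOW: Duration = Duration::from_secs(24 * 60 * 60);

/// Hypothetical check: `suppress({id, reverse: true})` is honored only
/// while the suppression is still labile, i.e. within 24h of `suppressed_at`.
fn can_reverse(suppressed_at: SystemTime, now: SystemTime) -> bool {
    match now.duration_since(suppressed_at) {
        Ok(elapsed) => elapsed <= LABILE_WINDOW,
        Err(_) => false, // suppressed_at is in the future: treat as not reversible
    }
}
```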

Also fixes issue #31 (buggy dashboard graph view), a companion UI
bug discovered during the v2.0.5 audit cycle:

* Root cause: node glow `SpriteMaterial` had no `map`, so
  `THREE.Sprite` rendered as a solid-coloured 1×1 plane. Additive
  blending + `UnrealBloomPass(0.8, 0.4, 0.85)` amplified the square
  edges into hard-edged glowing cubes.
* Fix: shared 128×128 radial-gradient `CanvasTexture` singleton used
  as the sprite map. Retuned bloom to `(0.55, 0.6, 0.2)`. Halved fog
  density (0.008 → 0.0035). Edges bumped from dark navy `0x4a4a7a`
  to brand violet `0x8b5cf6` with higher opacity. Added explicit
  `scene.background` and a 2000-point starfield for depth.
* 21 regression tests added in `ui-fixes.test.ts`, locking in every
  invariant (shared texture singleton, `depthWrite: false`, ×6 scale,
  bloom magic numbers via source regex, starfield presence).

Tests: 1,284 Rust (+47) + 171 Vitest (+21) = 1,455 total, 0 failed
Clippy: clean across all targets, zero warnings
Release binary: 22.6MB, `cargo build --release -p vestige-mcp` green
Versions: workspace aligned at 2.0.5 across all 6 crates/packages

Closes #31
Author: Sam Valladares 2026-04-14 17:30:30 -05:00
parent 95bde93b49
commit 8178beb961
359 changed files with 8277 additions and 3416 deletions


@@ -18,7 +18,9 @@ use vestige_core::{IngestInput, Storage};
#[command(author = "samvallad33")]
#[command(version = env!("CARGO_PKG_VERSION"))]
#[command(about = "CLI for the Vestige cognitive memory system")]
#[command(long_about = "Vestige is a cognitive memory system based on 130 years of memory research.\n\nIt implements FSRS-6, spreading activation, synaptic tagging, and more.")]
#[command(
long_about = "Vestige is a cognitive memory system based on 130 years of memory research.\n\nIt implements FSRS-6, spreading activation, synaptic tagging, and more."
)]
struct Cli {
#[command(subcommand)]
command: Commands,
@@ -171,21 +173,49 @@ fn run_stats(show_tagging: bool, show_states: bool) -> anyhow::Result<()> {
// Basic stats
println!("{}: {}", "Total Memories".white().bold(), stats.total_nodes);
println!("{}: {}", "Due for Review".white().bold(), stats.nodes_due_for_review);
println!("{}: {:.1}%", "Average Retention".white().bold(), stats.average_retention * 100.0);
println!("{}: {:.2}", "Average Storage Strength".white().bold(), stats.average_storage_strength);
println!("{}: {:.2}", "Average Retrieval Strength".white().bold(), stats.average_retrieval_strength);
println!("{}: {}", "With Embeddings".white().bold(), stats.nodes_with_embeddings);
println!(
"{}: {}",
"Due for Review".white().bold(),
stats.nodes_due_for_review
);
println!(
"{}: {:.1}%",
"Average Retention".white().bold(),
stats.average_retention * 100.0
);
println!(
"{}: {:.2}",
"Average Storage Strength".white().bold(),
stats.average_storage_strength
);
println!(
"{}: {:.2}",
"Average Retrieval Strength".white().bold(),
stats.average_retrieval_strength
);
println!(
"{}: {}",
"With Embeddings".white().bold(),
stats.nodes_with_embeddings
);
if let Some(model) = &stats.embedding_model {
println!("{}: {}", "Embedding Model".white().bold(), model);
}
if let Some(oldest) = stats.oldest_memory {
println!("{}: {}", "Oldest Memory".white().bold(), oldest.format("%Y-%m-%d %H:%M:%S"));
println!(
"{}: {}",
"Oldest Memory".white().bold(),
oldest.format("%Y-%m-%d %H:%M:%S")
);
}
if let Some(newest) = stats.newest_memory {
println!("{}: {}", "Newest Memory".white().bold(), newest.format("%Y-%m-%d %H:%M:%S"));
println!(
"{}: {}",
"Newest Memory".white().bold(),
newest.format("%Y-%m-%d %H:%M:%S")
);
}
// Embedding coverage
@@ -194,7 +224,11 @@ fn run_stats(show_tagging: bool, show_states: bool) -> anyhow::Result<()> {
} else {
0.0
};
println!("{}: {:.1}%", "Embedding Coverage".white().bold(), embedding_coverage);
println!(
"{}: {:.1}%",
"Embedding Coverage".white().bold(),
embedding_coverage
);
// Tagging distribution (retention levels)
if show_tagging {
@@ -205,9 +239,18 @@ fn run_stats(show_tagging: bool, show_states: bool) -> anyhow::Result<()> {
let total = memories.len();
if total > 0 {
let high = memories.iter().filter(|m| m.retention_strength >= 0.7).count();
let medium = memories.iter().filter(|m| m.retention_strength >= 0.4 && m.retention_strength < 0.7).count();
let low = memories.iter().filter(|m| m.retention_strength < 0.4).count();
let high = memories
.iter()
.filter(|m| m.retention_strength >= 0.7)
.count();
let medium = memories
.iter()
.filter(|m| m.retention_strength >= 0.4 && m.retention_strength < 0.7)
.count();
let low = memories
.iter()
.filter(|m| m.retention_strength < 0.4)
.count();
print_distribution_bar("High (>=70%)", high, total, "green");
print_distribution_bar("Medium (40-70%)", medium, total, "yellow");
@@ -220,7 +263,10 @@ fn run_stats(show_tagging: bool, show_states: bool) -> anyhow::Result<()> {
// State distribution
if show_states {
println!();
println!("{}", "=== Cognitive State Distribution ===".magenta().bold());
println!(
"{}",
"=== Cognitive State Distribution ===".magenta().bold()
);
let memories = storage.get_all_nodes(500, 0)?;
let total = memories.len();
@@ -248,7 +294,9 @@ fn run_stats(show_tagging: bool, show_states: bool) -> anyhow::Result<()> {
}
/// Compute cognitive state distribution for memories
fn compute_state_distribution(memories: &[vestige_core::KnowledgeNode]) -> (usize, usize, usize, usize) {
fn compute_state_distribution(
memories: &[vestige_core::KnowledgeNode],
) -> (usize, usize, usize, usize) {
let mut active = 0;
let mut dormant = 0;
let mut silent = 0;
@@ -297,10 +345,7 @@ fn print_distribution_bar(label: &str, count: usize, total: usize, color: &str)
println!(
" {:15} [{:30}] {:>4} ({:>5.1}%)",
label,
colored_bar,
count,
percentage
label, colored_bar, count, percentage
);
}
@@ -332,8 +377,16 @@ fn run_health() -> anyhow::Result<()> {
println!("{}: {}", "Status".white().bold(), colored_status);
println!("{}: {}", "Total Memories".white(), stats.total_nodes);
println!("{}: {}", "Due for Review".white(), stats.nodes_due_for_review);
println!("{}: {:.1}%", "Average Retention".white(), stats.average_retention * 100.0);
println!(
"{}: {}",
"Due for Review".white(),
stats.nodes_due_for_review
);
println!(
"{}: {:.1}%",
"Average Retention".white(),
stats.average_retention * 100.0
);
// Embedding coverage
let embedding_coverage = if stats.total_nodes > 0 {
@@ -341,15 +394,27 @@ fn run_health() -> anyhow::Result<()> {
} else {
0.0
};
println!("{}: {:.1}%", "Embedding Coverage".white(), embedding_coverage);
println!("{}: {}", "Embedding Service".white(),
if storage.is_embedding_ready() { "Ready".green() } else { "Not Ready".red() });
println!(
"{}: {:.1}%",
"Embedding Coverage".white(),
embedding_coverage
);
println!(
"{}: {}",
"Embedding Service".white(),
if storage.is_embedding_ready() {
"Ready".green()
} else {
"Not Ready".red()
}
);
// Warnings
let mut warnings = Vec::new();
if stats.average_retention < 0.5 && stats.total_nodes > 0 {
warnings.push("Low average retention - consider running consolidation or reviewing memories");
warnings
.push("Low average retention - consider running consolidation or reviewing memories");
}
if stats.nodes_due_for_review > 10 {
@@ -376,7 +441,8 @@ fn run_health() -> anyhow::Result<()> {
let mut recommendations = Vec::new();
if status == "CRITICAL" {
recommendations.push("CRITICAL: Many memories have very low retention. Review important memories.");
recommendations
.push("CRITICAL: Many memories have very low retention. Review important memories.");
}
if stats.nodes_due_for_review > 5 {
@@ -384,7 +450,8 @@ fn run_health() -> anyhow::Result<()> {
}
if stats.nodes_with_embeddings < stats.total_nodes {
recommendations.push("Run 'vestige consolidate' to generate embeddings for better semantic search.");
recommendations
.push("Run 'vestige consolidate' to generate embeddings for better semantic search.");
}
if stats.total_nodes > 100 && stats.average_retention < 0.7 {
@@ -398,8 +465,16 @@ fn run_health() -> anyhow::Result<()> {
println!();
println!("{}", "Recommendations:".cyan().bold());
for rec in &recommendations {
let icon = if rec.starts_with("CRITICAL") { "!".red().bold() } else { ">".cyan() };
let text = if rec.starts_with("CRITICAL") { rec.red().to_string() } else { rec.to_string() };
let icon = if rec.starts_with("CRITICAL") {
"!".red().bold()
} else {
">".cyan()
};
let text = if rec.starts_with("CRITICAL") {
rec.red().to_string()
} else {
rec.to_string()
};
println!(" {} {}", icon, text);
}
@@ -416,11 +491,27 @@ fn run_consolidate() -> anyhow::Result<()> {
let storage = Storage::new(None)?;
let result = storage.run_consolidation()?;
println!("{}: {}", "Nodes Processed".white().bold(), result.nodes_processed);
println!("{}: {}", "Nodes Promoted".white().bold(), result.nodes_promoted);
println!(
"{}: {}",
"Nodes Processed".white().bold(),
result.nodes_processed
);
println!(
"{}: {}",
"Nodes Promoted".white().bold(),
result.nodes_promoted
);
println!("{}: {}", "Nodes Pruned".white().bold(), result.nodes_pruned);
println!("{}: {}", "Decay Applied".white().bold(), result.decay_applied);
println!("{}: {}", "Embeddings Generated".white().bold(), result.embeddings_generated);
println!(
"{}: {}",
"Decay Applied".white().bold(),
result.decay_applied
);
println!(
"{}: {}",
"Embeddings Generated".white().bold(),
result.embeddings_generated
);
println!("{}: {}ms", "Duration".white().bold(), result.duration_ms);
println!();
@@ -523,7 +614,11 @@ fn run_restore(backup_path: PathBuf) -> anyhow::Result<()> {
let stats = storage.get_stats()?;
println!();
println!("{}: {}", "Total Nodes".white(), stats.total_nodes);
println!("{}: {}", "With Embeddings".white(), stats.nodes_with_embeddings);
println!(
"{}: {}",
"With Embeddings".white(),
stats.nodes_with_embeddings
);
Ok(())
}
@@ -581,9 +676,10 @@ fn run_backup(output: PathBuf) -> anyhow::Result<()> {
// Create parent directories if needed
if let Some(parent) = output.parent()
&& !parent.exists() {
std::fs::create_dir_all(parent)?;
}
&& !parent.exists()
{
std::fs::create_dir_all(parent)?;
}
// Copy the database file
println!("Copying database...");
@@ -630,8 +726,9 @@ fn run_export(
// Parse since date if provided
let since_date = match &since {
Some(date_str) => {
let naive = NaiveDate::parse_from_str(date_str, "%Y-%m-%d")
.map_err(|e| anyhow::anyhow!("Invalid date '{}': {}. Use YYYY-MM-DD format.", date_str, e))?;
let naive = NaiveDate::parse_from_str(date_str, "%Y-%m-%d").map_err(|e| {
anyhow::anyhow!("Invalid date '{}': {}. Use YYYY-MM-DD format.", date_str, e)
})?;
Some(
naive
.and_hms_opt(0, 0, 0)
@@ -645,7 +742,12 @@ fn run_export(
// Parse tags filter
let tag_filter: Vec<String> = tags
.as_deref()
.map(|t| t.split(',').map(|s| s.trim().to_string()).filter(|s| !s.is_empty()).collect())
.map(|t| {
t.split(',')
.map(|s| s.trim().to_string())
.filter(|s| !s.is_empty())
.collect()
})
.unwrap_or_default();
let storage = Storage::new(None)?;
@@ -657,9 +759,10 @@ fn run_export(
.filter(|node| {
// Date filter
if let Some(ref since_dt) = since_date
&& node.created_at < *since_dt {
return false;
}
&& node.created_at < *since_dt
{
return false;
}
// Tag filter: node must contain ALL specified tags
if !tag_filter.is_empty() {
for tag in &tag_filter {
@@ -689,9 +792,10 @@ fn run_export(
// Create parent directories if needed
if let Some(parent) = output.parent()
&& !parent.exists() {
std::fs::create_dir_all(parent)?;
}
&& !parent.exists()
{
std::fs::create_dir_all(parent)?;
}
let file = std::fs::File::create(&output)?;
let mut writer = BufWriter::new(file);
@@ -770,7 +874,11 @@ fn run_gc(
})
.collect();
println!("{}: {}", "Min retention threshold".white().bold(), min_retention);
println!(
"{}: {}",
"Min retention threshold".white().bold(),
min_retention
);
if let Some(max_days) = max_age_days {
println!("{}: {} days", "Max age".white().bold(), max_days);
}
@@ -783,7 +891,10 @@ fn run_gc(
if candidates.is_empty() {
println!();
println!("{}", "No memories match the garbage collection criteria.".green());
println!(
"{}",
"No memories match the garbage collection criteria.".green()
);
return Ok(());
}
@@ -853,7 +964,12 @@ fn run_gc(
Ok(true) => deleted += 1,
Ok(false) => errors += 1, // node was already gone
Err(e) => {
eprintln!(" {} Failed to delete {}: {}", "ERR".red(), &node.id[..8], e);
eprintln!(
" {} Failed to delete {}: {}",
"ERR".red(),
&node.id[..8],
e
);
errors += 1;
}
}
@@ -960,7 +1076,10 @@ fn run_ingest(
fn run_dashboard(port: u16, open_browser: bool) -> anyhow::Result<()> {
println!("{}", "=== Vestige Dashboard ===".cyan().bold());
println!();
println!("Starting dashboard at {}...", format!("http://127.0.0.1:{}", port).cyan());
println!(
"Starting dashboard at {}...",
format!("http://127.0.0.1:{}", port).cyan()
);
let storage = Storage::new(None)?;
@@ -1025,8 +1144,19 @@ fn run_serve(port: u16, with_dashboard: bool, dashboard_port: u16) -> anyhow::Re
let dc = Arc::clone(&cognitive);
let dtx = event_tx.clone();
tokio::spawn(async move {
match vestige_mcp::dashboard::start_background_with_event_tx(ds, Some(dc), dtx, dashboard_port).await {
Ok(_) => println!(" {} Dashboard: http://127.0.0.1:{}", ">".cyan(), dashboard_port),
match vestige_mcp::dashboard::start_background_with_event_tx(
ds,
Some(dc),
dtx,
dashboard_port,
)
.await
{
Ok(_) => println!(
" {} Dashboard: http://127.0.0.1:{}",
">".cyan(),
dashboard_port
),
Err(e) => eprintln!(" {} Dashboard failed: {}", "!".yellow(), e),
}
});
@@ -1037,7 +1167,12 @@ fn run_serve(port: u16, with_dashboard: bool, dashboard_port: u16) -> anyhow::Re
.map_err(|e| anyhow::anyhow!("Failed to create auth token: {}", e))?;
let bind = std::env::var("VESTIGE_HTTP_BIND").unwrap_or_else(|_| "127.0.0.1".to_string());
println!(" {} HTTP transport: http://{}:{}/mcp", ">".cyan(), bind, port);
println!(
" {} HTTP transport: http://{}:{}/mcp",
">".cyan(),
bind,
port
);
println!(" {} Auth token: {}...", ">".cyan(), &token[..8]);
println!();
println!("{}", "Press Ctrl+C to stop.".dimmed());


@@ -65,7 +65,12 @@ fn main() -> anyhow::Result<()> {
match storage.ingest(input) {
Ok(_node) => {
success_count += 1;
println!("[{}/{}] OK: {}", i + 1, total, truncate(&memory.content, 60));
println!(
"[{}/{}] OK: {}",
i + 1,
total,
truncate(&memory.content, 60)
);
}
Err(e) => {
println!("[{}/{}] FAIL: {}", i + 1, total, e);
@@ -73,7 +78,10 @@ fn main() -> anyhow::Result<()> {
}
}
println!("\nRestore complete: {}/{} memories restored", success_count, total);
println!(
"\nRestore complete: {}/{} memories restored",
success_count, total
);
// Show stats
let stats = storage.get_stats()?;