feat(v2.0.5): Intentional Amnesia — active forgetting via top-down inhibitory control

First AI memory system to model forgetting as a neuroscience-grounded
PROCESS rather than passive decay. Adds the `suppress` MCP tool (#24),
Rac1 cascade worker, migration V10, and dashboard forgetting indicators.

Based on:
- Anderson, Hanslmayr & Quaegebeur (2025), Nat Rev Neurosci — right
  lateral PFC as the domain-general inhibitory controller; SIF
  compounds with each stopping attempt.
- Cervantes-Sandoval et al. (2020), Front Cell Neurosci PMC7477079 —
  Rac1 GTPase as the active synaptic destabilization mechanism.

What's new:
* `suppress` MCP tool — each call increments `suppression_count` and
  subtracts a `0.15 × count` penalty (saturating at 80%) from
  retrieval scores during hybrid search. Distinct from delete
  (which removes) and demote (which is one-shot).
* Rac1 cascade worker — a background sweep piggybacking on the 6h
  consolidation loop: it walks `memory_connections` edges from
  recently-suppressed seeds and applies attenuated FSRS decay to
  co-activated neighbors. You don't just forget Jake — you fade
  the café, the roommate, the birthday.
* 24h labile window — reversible via `suppress({id, reverse: true})`
  within 24 hours. Matches Nader reconsolidation semantics.
* Migration V10 — additive-only (`suppression_count`, `suppressed_at`
  + partial indices). All v2.0.x DBs upgrade seamlessly on first launch.
* Dashboard: `ForgettingIndicator.svelte` pulses when suppressions
  are active. 3D graph nodes dim to 20% opacity when suppressed.
  New WebSocket events: `MemorySuppressed`, `MemoryUnsuppressed`,
  `Rac1CascadeSwept`. Heartbeat carries `suppressed_count`.
* Search pipeline: SIF penalty inserted into the accessibility stage
  so it stacks on top of passive FSRS decay.
* Tool count bumped 23 → 24. Cognitive modules 29 → 30.
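
The compounding SIF penalty described above can be sketched as follows.
This is a minimal illustration, not the actual vestige API: the function
names are invented, and since the commit doesn't say whether the
`0.15 × count` penalty is absolute or multiplicative, this sketch assumes
it is a fraction of the retrieval score.

```rust
// Hypothetical sketch of the suppression-induced forgetting (SIF) penalty:
// penalty = min(0.15 * suppression_count, 0.80), applied as a fraction
// of the retrieval score. Names are illustrative, not the real API.
fn sif_penalty(suppression_count: u32) -> f64 {
    (0.15 * suppression_count as f64).min(0.80)
}

fn apply_sif(retrieval_score: f64, suppression_count: u32) -> f64 {
    // Saturation means a memory is never fully silenced by suppression alone.
    retrieval_score * (1.0 - sif_penalty(suppression_count))
}
```

Because the penalty saturates at 80%, a heavily suppressed memory still
retains 20% of its score floor, which is what keeps it retrievable via
`memory.get(id)` rather than effectively erased.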
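
The Rac1 cascade sweep can likewise be sketched as a bounded breadth-first
walk. Everything here is an assumption for illustration — the attenuation
constant, the hop limit, the base decay step, and the `rac1_sweep` name are
not taken from the actual worker; only the shape (seeds → edge walk →
attenuated neighbor decay) comes from the bullet above.

```rust
use std::collections::{HashMap, HashSet, VecDeque};

// Illustrative sketch of the Rac1 cascade: from recently suppressed seed
// nodes, walk connection edges and apply an attenuated decay to each
// co-activated neighbor, weakening further with each hop. Seeds themselves
// are assumed to receive full decay elsewhere (in the suppress path).
fn rac1_sweep(
    edges: &HashMap<u32, Vec<u32>>,    // adjacency: node id -> connected ids
    seeds: &[u32],                     // recently suppressed memories
    stability: &mut HashMap<u32, f64>, // FSRS stability per node
) {
    const ATTENUATION: f64 = 0.5; // assumed per-hop attenuation
    const MAX_HOPS: u32 = 2;      // assumed cascade radius
    const BASE_DECAY: f64 = 0.2;  // assumed base decay step

    let mut seen: HashSet<u32> = seeds.iter().copied().collect();
    let mut queue: VecDeque<(u32, u32)> = seeds.iter().map(|&s| (s, 0)).collect();

    while let Some((node, hops)) = queue.pop_front() {
        if hops >= MAX_HOPS {
            continue;
        }
        for &next in edges.get(&node).into_iter().flatten() {
            if seen.insert(next) {
                // Decay shrinks geometrically with distance from the seed.
                let factor = ATTENUATION.powi((hops + 1) as i32);
                if let Some(s) = stability.get_mut(&next) {
                    *s *= 1.0 - BASE_DECAY * factor;
                }
                queue.push_back((next, hops + 1));
            }
        }
    }
}
```

Under these assumed constants, a direct neighbor of a seed loses 10% of
its stability and a two-hop neighbor loses 5% — "fading the café" without
touching unrelated memories.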

Memories persist — they are INHIBITED, not erased. `memory.get(id)`
returns full content through any number of suppressions. The 24h
labile window is a grace period for regret.

Also fixes issue #31 (broken dashboard graph view), a companion UI
bug discovered during the v2.0.5 audit cycle:

* Root cause: node glow `SpriteMaterial` had no `map`, so
  `THREE.Sprite` rendered as a solid-coloured 1×1 plane. Additive
  blending + `UnrealBloomPass(0.8, 0.4, 0.85)` amplified the square
  edges into hard-edged glowing cubes.
* Fix: shared 128×128 radial-gradient `CanvasTexture` singleton used
  as the sprite map. Retuned bloom to `(0.55, 0.6, 0.2)`. Halved fog
  density (0.008 → 0.0035). Edges bumped from dark navy `0x4a4a7a`
  to brand violet `0x8b5cf6` with higher opacity. Added explicit
  `scene.background` and a 2000-point starfield for depth.
* 21 regression tests added in `ui-fixes.test.ts`, locking in every
  invariant (shared texture singleton, `depthWrite: false`, ×6 scale,
  bloom magic numbers via source regex, starfield presence).

Tests: 1,284 Rust (+47) + 171 Vitest (+21) = 1,455 total, 0 failed
Clippy: clean across all targets, zero warnings
Release binary: 22.6MB, `cargo build --release -p vestige-mcp` green
Versions: workspace aligned at 2.0.5 across all 6 crates/packages

Closes #31
Sam Valladares 2026-04-14 17:30:30 -05:00
parent 95bde93b49
commit 8178beb961
359 changed files with 8277 additions and 3416 deletions

@@ -73,19 +73,23 @@ pub async fn execute_promote(
     // Validate UUID
     uuid::Uuid::parse_str(&args.id).map_err(|_| "Invalid node ID format".to_string())?;

     // Get node before for comparison
-    let before = storage.get_node(&args.id).map_err(|e| e.to_string())?
+    let before = storage
+        .get_node(&args.id)
+        .map_err(|e| e.to_string())?
         .ok_or_else(|| format!("Node not found: {}", args.id))?;

-    let node = storage.promote_memory(&args.id).map_err(|e| e.to_string())?;
+    let node = storage
+        .promote_memory(&args.id)
+        .map_err(|e| e.to_string())?;

     // ====================================================================
     // COGNITIVE FEEDBACK PIPELINE (promote)
     // ====================================================================
     if let Ok(mut cog) = cognitive.try_lock() {
         // 5A. Reward signal — record positive outcome
-        cog.reward_signal.record_outcome(&args.id, OutcomeType::Helpful);
+        cog.reward_signal
+            .record_outcome(&args.id, OutcomeType::Helpful);

         // 5B. Importance tracking — mark as helpful retrieval
         cog.importance_tracker.on_retrieved(&args.id, true);
@@ -143,9 +147,10 @@ pub async fn execute_demote(
     // Validate UUID
     uuid::Uuid::parse_str(&args.id).map_err(|_| "Invalid node ID format".to_string())?;

     // Get node before for comparison
-    let before = storage.get_node(&args.id).map_err(|e| e.to_string())?
+    let before = storage
+        .get_node(&args.id)
+        .map_err(|e| e.to_string())?
         .ok_or_else(|| format!("Node not found: {}", args.id))?;

     let node = storage.demote_memory(&args.id).map_err(|e| e.to_string())?;
@@ -155,7 +160,8 @@ pub async fn execute_demote(
     // ====================================================================
     if let Ok(mut cog) = cognitive.try_lock() {
         // 5A. Reward signal — record negative outcome
-        cog.reward_signal.record_outcome(&args.id, OutcomeType::NotHelpful);
+        cog.reward_signal
+            .record_outcome(&args.id, OutcomeType::NotHelpful);

         // 5B. Importance tracking — mark as unhelpful retrieval
         cog.importance_tracker.on_retrieved(&args.id, false);
@@ -237,8 +243,9 @@ pub async fn execute_request_feedback(
     // Validate UUID
     uuid::Uuid::parse_str(&args.id).map_err(|_| "Invalid node ID format".to_string())?;

-    let node = storage.get_node(&args.id).map_err(|e| e.to_string())?
+    let node = storage
+        .get_node(&args.id)
+        .map_err(|e| e.to_string())?
         .ok_or_else(|| format!("Node not found: {}", args.id))?;

     // Truncate content for display
@@ -319,10 +326,12 @@ mod tests {
         assert_eq!(schema["type"], "object");
         assert!(schema["properties"]["id"].is_object());
         assert!(schema["properties"]["reason"].is_object());
-        assert!(schema["required"]
-            .as_array()
-            .unwrap()
-            .contains(&serde_json::json!("id")));
+        assert!(
+            schema["required"]
+                .as_array()
+                .unwrap()
+                .contains(&serde_json::json!("id"))
+        );
     }

     #[test]
@@ -330,10 +339,12 @@ mod tests {
         let schema = demote_schema();
         assert_eq!(schema["type"], "object");
         assert!(schema["properties"]["id"].is_object());
-        assert!(schema["required"]
-            .as_array()
-            .unwrap()
-            .contains(&serde_json::json!("id")));
+        assert!(
+            schema["required"]
+                .as_array()
+                .unwrap()
+                .contains(&serde_json::json!("id"))
+        );
     }

     #[test]
@@ -342,10 +353,12 @@ mod tests {
         assert_eq!(schema["type"], "object");
         assert!(schema["properties"]["id"].is_object());
         assert!(schema["properties"]["context"].is_object());
-        assert!(schema["required"]
-            .as_array()
-            .unwrap()
-            .contains(&serde_json::json!("id")));
+        assert!(
+            schema["required"]
+                .as_array()
+                .unwrap()
+                .contains(&serde_json::json!("id"))
+        );
     }

     // === PROMOTE TESTS ===
@@ -370,8 +383,7 @@ mod tests {
     #[tokio::test]
     async fn test_promote_nonexistent_node_fails() {
         let (storage, _dir) = test_storage().await;
-        let args =
-            serde_json::json!({ "id": "00000000-0000-0000-0000-000000000000" });
+        let args = serde_json::json!({ "id": "00000000-0000-0000-0000-000000000000" });
         let result = execute_promote(&storage, &test_cognitive(), Some(args)).await;
         assert!(result.is_err());
         assert!(result.unwrap_err().contains("Node not found"));
@@ -454,8 +466,7 @@ mod tests {
     #[tokio::test]
     async fn test_demote_nonexistent_node_fails() {
         let (storage, _dir) = test_storage().await;
-        let args =
-            serde_json::json!({ "id": "00000000-0000-0000-0000-000000000000" });
+        let args = serde_json::json!({ "id": "00000000-0000-0000-0000-000000000000" });
        let result = execute_demote(&storage, &test_cognitive(), Some(args)).await;
         assert!(result.is_err());
         assert!(result.unwrap_err().contains("Node not found"));
@@ -510,8 +521,7 @@ mod tests {
     #[tokio::test]
     async fn test_request_feedback_nonexistent_node_fails() {
         let (storage, _dir) = test_storage().await;
-        let args =
-            serde_json::json!({ "id": "00000000-0000-0000-0000-000000000000" });
+        let args = serde_json::json!({ "id": "00000000-0000-0000-0000-000000000000" });
         let result = execute_request_feedback(&storage, Some(args)).await;
         assert!(result.is_err());
     }