feat(v2.0.5): Intentional Amnesia — active forgetting via top-down inhibitory control

First AI memory system to model forgetting as a neuroscience-grounded
PROCESS rather than passive decay. Adds the `suppress` MCP tool (#24),
Rac1 cascade worker, migration V10, and dashboard forgetting indicators.

Based on:
- Anderson, Hanslmayr & Quaegebeur (2025), Nat Rev Neurosci — right
  lateral PFC as the domain-general inhibitory controller; SIF
  compounds with each stopping attempt.
- Cervantes-Sandoval et al. (2020), Front Cell Neurosci PMC7477079 —
  Rac1 GTPase as the active synaptic destabilization mechanism.

What's new:
* `suppress` MCP tool — each call compounds `suppression_count` and
  subtracts a `0.15 × count` penalty (saturating at 80%) from
  retrieval scores during hybrid search. Distinct from delete
  (removes) and demote (one-shot).
* Rac1 cascade worker — background sweep piggybacks the 6h
  consolidation loop, walks `memory_connections` edges from
  recently-suppressed seeds, applies attenuated FSRS decay to
  co-activated neighbors. You don't just forget Jake — you fade
  the café, the roommate, the birthday.
* 24h labile window — reversible via `suppress({id, reverse: true})`
  within 24 hours. Matches Nader reconsolidation semantics.
* Migration V10 — additive-only (`suppression_count`, `suppressed_at`
  + partial indices). All v2.0.x DBs upgrade seamlessly on first launch.
* Dashboard: `ForgettingIndicator.svelte` pulses when suppressions
  are active. 3D graph nodes dim to 20% opacity when suppressed.
  New WebSocket events: `MemorySuppressed`, `MemoryUnsuppressed`,
  `Rac1CascadeSwept`. Heartbeat carries `suppressed_count`.
* Search pipeline: SIF penalty inserted into the accessibility stage
  so it stacks on top of passive FSRS decay.
* Tool count bumped 23 → 24. Cognitive modules 29 → 30.
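As a rough sketch of how the compounding penalty composes with passive decay: the 0.15-per-attempt slope and the 80% ceiling come from the notes above, but the function names, the subtract-then-clamp composition, and the FSRS input are illustrative assumptions, not the shipped vestige-core API.

```rust
/// Sketch of the compounding SIF penalty described above. Constants are
/// from the release notes; everything else is a hypothetical reading.
const SIF_PENALTY_PER_ATTEMPT: f64 = 0.15;
const SIF_PENALTY_CEILING: f64 = 0.80;

fn sif_penalty(suppression_count: u32) -> f64 {
    (SIF_PENALTY_PER_ATTEMPT * suppression_count as f64).min(SIF_PENALTY_CEILING)
}

/// Accessibility stage: the SIF penalty stacks on top of passive FSRS decay.
fn accessibility_score(fsrs_retrievability: f64, suppression_count: u32) -> f64 {
    (fsrs_retrievability - sif_penalty(suppression_count)).max(0.0)
}

fn main() {
    assert_eq!(sif_penalty(1), 0.15); // one suppression: 15% penalty
    assert_eq!(sif_penalty(6), 0.80); // six or more: saturates at 80%
    // A well-consolidated memory suppressed twice still loses 0.30.
    assert!((accessibility_score(0.9, 2) - 0.6).abs() < 1e-9);
}
```

Whether the shipped penalty subtracts from the blended hybrid score or scales it is not spelled out above; subtract-then-clamp is just one consistent reading of "0.15 × count, saturating at 80%".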

Memories persist — they are INHIBITED, not erased. `memory.get(id)`
returns full content through any number of suppressions. The 24h
labile window is a grace period for regret.
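The cascade behavior can be pictured as a bounded walk over `memory_connections` from recently suppressed seeds. Everything in this sketch — the function name, the adjacency and stability shapes, and the 0.5 attenuation factor — is a hypothetical illustration of the mechanism, not the shipped worker.

```rust
use std::collections::{HashMap, HashSet};

/// Hypothetical sketch of the Rac1 cascade sweep. The real worker runs
/// inside the 6h consolidation loop; the graph, the FSRS stability map,
/// and the flat attenuation factor here are simplified stand-ins.
fn rac1_cascade_sweep(
    edges: &HashMap<String, Vec<String>>, // memory_connections adjacency
    suppressed_seeds: &[String],
    stability: &mut HashMap<String, f64>, // FSRS stability per memory id
    attenuation: f64,                     // e.g. 0.5 = half-strength decay
) -> usize {
    let seeds: HashSet<&String> = suppressed_seeds.iter().collect();
    let mut swept = 0;
    for seed in suppressed_seeds {
        let Some(neighbors) = edges.get(seed) else { continue };
        for neighbor in neighbors {
            if seeds.contains(neighbor) {
                continue; // seeds already carry the direct SIF penalty
            }
            if let Some(s) = stability.get_mut(neighbor) {
                *s *= attenuation; // fade the café, the roommate, the birthday
                swept += 1;
            }
        }
    }
    swept
}

fn main() {
    let edges = HashMap::from([(
        "jake".to_string(),
        vec!["cafe".to_string(), "birthday".to_string()],
    )]);
    let mut stability =
        HashMap::from([("cafe".to_string(), 10.0), ("birthday".to_string(), 4.0)]);
    let swept = rac1_cascade_sweep(&edges, &["jake".to_string()], &mut stability, 0.5);
    assert_eq!(swept, 2);
    assert_eq!(stability["cafe"], 5.0); // attenuated, not erased
    assert_eq!(stability["birthday"], 2.0);
}
```

A real sweep would also respect the 24h labile window and apply FSRS-aware attenuated decay rather than a flat multiplier.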

Also fixes issue #31 (buggy dashboard graph view), a companion UI bug
discovered during the v2.0.5 audit cycle:

* Root cause: node glow `SpriteMaterial` had no `map`, so
  `THREE.Sprite` rendered as a solid-coloured 1×1 plane. Additive
  blending + `UnrealBloomPass(0.8, 0.4, 0.85)` amplified the square
  edges into hard-edged glowing cubes.
* Fix: shared 128×128 radial-gradient `CanvasTexture` singleton used
  as the sprite map. Retuned bloom to `(0.55, 0.6, 0.2)`. Halved fog
  density (0.008 → 0.0035). Edges bumped from dark navy `0x4a4a7a`
  to brand violet `0x8b5cf6` with higher opacity. Added explicit
  `scene.background` and a 2000-point starfield for depth.
* 21 regression tests added in `ui-fixes.test.ts`, locking in every
  invariant (shared texture singleton, `depthWrite: false`, ×6 sprite
  scale, bloom magic numbers via source regex, starfield presence).

Tests: 1,284 Rust (+47) + 171 Vitest (+21) = 1,455 total, 0 failed
Clippy: clean across all targets, zero warnings
Release binary: 22.6MB, `cargo build --release -p vestige-mcp` green
Versions: workspace aligned at 2.0.5 across all 6 crates/packages

Closes #31
author Sam Valladares 2026-04-14 17:30:30 -05:00
parent 95bde93b49
commit 8178beb961
359 changed files with 8277 additions and 3416 deletions


@@ -271,16 +271,11 @@ macro_rules! assert_search_count {
 #[macro_export]
 macro_rules! assert_search_order {
     ($results:expr, $expected_first:expr) => {
-        assert!(
-            !$results.is_empty(),
-            "Expected non-empty search results"
-        );
+        assert!(!$results.is_empty(), "Expected non-empty search results");
         assert_eq!(
-            $results[0].id,
-            $expected_first,
+            $results[0].id, $expected_first,
             "Expected first result to be {}, got {}",
-            $expected_first,
-            $results[0].id
+            $expected_first, $results[0].id
         );
     };
 }


@@ -6,9 +6,9 @@
 //! - Database snapshots and restoration
 //! - Concurrent test isolation
-use vestige_core::{KnowledgeNode, Rating, Storage};
 use std::path::PathBuf;
 use tempfile::TempDir;
+use vestige_core::{KnowledgeNode, Rating, Storage};
 /// Helper to create IngestInput (works around non_exhaustive)
 #[allow(clippy::too_many_arguments)]
@@ -107,10 +107,7 @@ impl TestDatabaseManager {
     /// Get the number of nodes in the database
     pub fn node_count(&self) -> i64 {
-        self.storage
-            .get_stats()
-            .map(|s| s.total_nodes)
-            .unwrap_or(0)
+        self.storage.get_stats().map(|s| s.total_nodes).unwrap_or(0)
     }
     // ========================================================================
@@ -257,10 +254,7 @@ impl TestDatabaseManager {
     /// Take a snapshot of current database state
     pub fn take_snapshot(&mut self) {
-        let nodes = self
-            .storage
-            .get_all_nodes(10000, 0)
-            .unwrap_or_default();
+        let nodes = self.storage.get_all_nodes(10000, 0).unwrap_or_default();
         self.snapshot = Some(nodes);
     }
@@ -322,8 +316,8 @@ impl TestDatabaseManager {
         let _ = std::fs::remove_file(&self.db_path);
         // Recreate storage
-        self.storage = Storage::new(Some(self.db_path.clone()))
-            .expect("Failed to recreate storage");
+        self.storage =
+            Storage::new(Some(self.db_path.clone())).expect("Failed to recreate storage");
     }
 }


@@ -183,7 +183,13 @@ impl TestDataFactory {
     /// Create a batch of memories
     pub fn create_batch(storage: &mut Storage, count: usize) -> Vec<String> {
-        Self::create_batch_with_config(storage, BatchConfig { count, ..Default::default() })
+        Self::create_batch_with_config(
+            storage,
+            BatchConfig {
+                count,
+                ..Default::default()
+            },
+        )
     }
     /// Create a batch with custom configuration
@@ -212,9 +218,15 @@ impl TestDataFactory {
         let (valid_from, valid_until) = if config.with_temporal {
             let now = Utc::now();
             if i % 3 == 0 {
-                (Some(now - Duration::days(30)), Some(now + Duration::days(30)))
+                (
+                    Some(now - Duration::days(30)),
+                    Some(now + Duration::days(30)),
+                )
             } else if i % 3 == 1 {
-                (Some(now - Duration::days(60)), Some(now - Duration::days(30)))
+                (
+                    Some(now - Duration::days(60)),
+                    Some(now - Duration::days(30)),
+                )
             } else {
                 (None, None)
             }
@@ -273,12 +285,7 @@ impl TestDataFactory {
         }
         // Emotional memory (decay should be affected by sentiment)
-        let emotional = Self::create_emotional_memory(
-            storage,
-            "Important life event",
-            0.9,
-            0.95,
-        );
+        let emotional = Self::create_emotional_memory(storage, "Important life event", 0.9, 0.95);
         if let Some(node) = emotional {
             metadata.insert("emotional".to_string(), node.id.clone());
             ids.push(node.id);
@@ -445,12 +452,8 @@ impl TestDataFactory {
         }
         // No bounds (always valid)
-        if let Some(node) = Self::create_temporal_memory(
-            storage,
-            "Always valid memory",
-            None,
-            None,
-        ) {
+        if let Some(node) = Self::create_temporal_memory(storage, "Always valid memory", None, None)
+        {
             metadata.insert("always_valid".to_string(), node.id.clone());
             ids.push(node.id);
         }
@@ -469,8 +472,15 @@ impl TestDataFactory {
     /// Get a random node type
     pub fn random_node_type(seed: usize) -> &'static str {
         const TYPES: [&str; 9] = [
-            "fact", "concept", "procedure", "event", "relationship",
-            "quote", "code", "question", "insight",
+            "fact",
+            "concept",
+            "procedure",
+            "event",
+            "relationship",
+            "quote",
+            "code",
+            "question",
+            "insight",
         ];
         TYPES[seed % TYPES.len()]
     }
@@ -478,10 +488,26 @@ impl TestDataFactory {
     /// Generate lorem ipsum-like content
     pub fn lorem_content(words: usize, seed: usize) -> String {
         const WORDS: [&str; 20] = [
-            "the", "memory", "learning", "knowledge", "algorithm",
-            "data", "system", "process", "function", "method",
-            "class", "object", "variable", "constant", "type",
-            "structure", "pattern", "design", "architecture", "code",
+            "the",
+            "memory",
+            "learning",
+            "knowledge",
+            "algorithm",
+            "data",
+            "system",
+            "process",
+            "function",
+            "method",
+            "class",
+            "object",
+            "variable",
+            "constant",
+            "type",
+            "structure",
+            "pattern",
+            "design",
+            "architecture",
+            "code",
         ];
         (0..words)
@@ -493,8 +519,16 @@ impl TestDataFactory {
     /// Generate tags
     pub fn generate_tags(count: usize, seed: usize) -> Vec<String> {
         const TAGS: [&str; 10] = [
-            "important", "review", "todo", "concept", "fact",
-            "code", "note", "idea", "question", "reference",
+            "important",
+            "review",
+            "todo",
+            "concept",
+            "fact",
+            "code",
+            "note",
+            "idea",
+            "question",
+            "reference",
         ];
         (0..count)


@@ -145,7 +145,11 @@ impl MockEmbeddingService {
         // Map word to a sparse set of dimensions
         for i in 0..16 {
             let dim = ((word_hash >> (i * 4)) as usize) % MOCK_EMBEDDING_DIM;
-            let sign = if (word_hash >> (i + 48)) & 1 == 0 { 1.0 } else { -1.0 };
+            let sign = if (word_hash >> (i + 48)) & 1 == 0 {
+                1.0
+            } else {
+                -1.0
+            };
             let magnitude = ((word_hash >> (i * 2)) as f32 % 100.0) / 100.0 + 0.5;
             embedding[dim] += sign * magnitude;
         }
@@ -342,9 +346,15 @@ mod tests {
         let query = service.embed("programming code");
         let candidates = vec![
-            ("doc1".to_string(), service.embed("python programming language")),
+            (
+                "doc1".to_string(),
+                service.embed("python programming language"),
+            ),
             ("doc2".to_string(), service.embed("cooking recipes")),
-            ("doc3".to_string(), service.embed("software development code")),
+            (
+                "doc3".to_string(),
+                service.embed("software development code"),
+            ),
         ];
         let result = service.find_most_similar(&query, &candidates);