mirror of
https://github.com/samvallad33/vestige.git
synced 2026-05-01 03:46:22 +02:00
test(v2.3): full e2e + integration coverage for Pulse + Birth Ritual
Post-ship verification pass — five parallel write-agents produced 229 new
tests across vitest units, vitest integration, and Playwright browser e2e.
Net suite: 361 vitest pass (up from 251, +110) and 9/9 Playwright pass on
back-to-back runs.
**toast.test.ts (NEW, 661 lines, 42 tests)**
Silent-lobotomy batch walk proven (multi-event tick processes ALL, not
just newest, oldest-first ordering preserved). Hover-panic pause/resume
with remaining-ms math. All 9 event type translations asserted, all 11
noise types asserted silent. ConnectionDiscovered 1500ms throttle.
MAX_VISIBLE=4 eviction. clear() tears down all timers. fireDemoSequence
staggers 4 toasts at 800ms intervals. vi.useFakeTimers + vi.mock of
eventFeed; vi.resetModules in beforeEach for module-singleton isolation.
**websocket.test.ts (NEW, 247 lines, 30 tests)**
injectEvent adds to front, respects MAX_EVENTS=200 with FIFO eviction,
triggers eventFeed emissions. All 6 derived stores (isConnected,
heartbeat, memoryCount, avgRetention, suppressedCount, uptimeSeconds)
verified — defaults, post-heartbeat values, clearEvents preserves
lastHeartbeat. 13 formatUptime boundary cases (0/59/60/3599/3600/
86399/86400 seconds + negative / NaN / ±Infinity).
**effects.test.ts (EXTENDED, +501 lines, +21 tests, 51 total)**
createBirthOrb full lifecycle — sprite count (halo + core), cosmic
center via camera.quaternion, gestation phase (position lock, opacity
rise, scale easing, color tint), flight Bezier arc above linear
midpoint at t=0.5, dynamic mid-flight target redirect. onArrive fires
exactly once at frame 139. Post-arrival fade + disposal cleans scene
children. Sanhedrin Shatter: target goes undefined mid-flight →
onArrive NEVER called, implosion spawned, halo blood-red, eventual
cleanup. dispose() cleans active orbs. Multiple simultaneous orbs.
Custom gestation/flight frame opts honored. Zero-alloc invariant
smoke test (6 orbs × 150 frames, no leaks).
**nodes.test.ts (EXTENDED, +197 lines, +10 tests, 42 total)**
addNode({isBirthRitual:true}) hides mesh/glow/label immediately,
stamps birthRitualPending sentinel with correct totalFrames +
targetScale, does NOT enqueue materialization. igniteNode flips
visibility + enqueues materialization. Idempotent — second call
no-op. Non-ritual nodes unaffected. Unknown id is safe no-op.
Position stored in positions map while invisible (force sim still
sees it). removeNode + late igniteNode is safe.
**events.test.ts (EXTENDED, +268 lines, +7 tests, 55 total)**
MemoryCreated → mesh hidden immediately, 2 birth-orb sprites added,
ZERO RingGeometry meshes and ZERO Points particles at spawn. Full
ritual drive → onArrive fires, node visible + materializing, sentinel
cleared. Newton's Cradle: target mesh scale exactly 0.001 * 1.8 right
after arrival. Dual shockwave: exactly 2 Ring meshes added. Re-read
live position on arrival — force-sim motion during ritual → burst
lands at the NEW position. Sanhedrin abort path → rainbow burst,
shockwave, ripple wave are NEVER called (vi.spyOn).
**three-mock.ts (EXTENDED)**
Added Color.setRGB — production Three.js has it, the Sanhedrin-
Shatter path in effects.ts uses it. Two write-agents independently
monkey-patched the mock inline; consolidated as a 5-line mock
addition so tests stay clean.
**e2e/pulse-toast.spec.ts (NEW, 235 lines, 6 Playwright tests)**
Navigate /dashboard/settings → click Preview Pulse → assert first
toast appears within 500ms → assert >= 2 toasts visible at peak.
Click-to-dismiss removes clicked toast (matched by aria-label).
Hover survives >8s past the 5.5s dwell. Keyboard Enter dismisses
focused toast. CSS animation-play-state:paused on .toast-progress-
fill while hovered, running on mouseleave. Screenshots attached to
HTML report. Zero backend dependency (fireDemoSequence is purely
client-side).
**e2e/birth-ritual.spec.ts (NEW, 199 lines, 3 Playwright tests)**
Canvas mounts on /dashboard/graph (gracefully test.fixme if MCP
backend absent). Settings button injection + SPA route to /graph
→ screenshot timeline at t=0/500/1200/2000/2400/3000ms attached
to HTML report. pageerror + console-error listeners catch any
crash (would re-surface FATAL 6 if reintroduced). Three back-to-
back births — no errors, canvas still dispatches clicks.
Run commands:
cd apps/dashboard && npm test # 361/361 pass, ~600ms
cd apps/dashboard && npx playwright test # 9/9 pass, ~25s
Typecheck: 0 errors, 0 warnings. Build: clean adapter-static.
This commit is contained in:
parent
ec614fed85
commit
8fe8bb2f39
8 changed files with 2408 additions and 1 deletions
|
|
@ -497,4 +497,505 @@ describe('EffectManager', () => {
|
|||
expect(effects.pulseEffects.length).toBe(0);
|
||||
});
|
||||
});
|
||||
|
||||
describe('createBirthOrb (v2.3 Memory Birth Ritual)', () => {
|
||||
// Build a camera with a Quaternion for createBirthOrb's view-space
|
||||
// projection. The three-mock's applyQuaternion is identity, so the
|
||||
// start position collapses to `camera.position + (0, 0, -distance)`.
|
||||
function makeCamera() {
|
||||
return {
|
||||
position: new Vector3(0, 30, 80),
|
||||
quaternion: new (class {
|
||||
x = 0; y = 0; z = 0; w = 1;
|
||||
})(),
|
||||
} as any;
|
||||
}
|
||||
|
||||
it('adds exactly 2 sprites to the scene on spawn', () => {
|
||||
const cam = makeCamera();
|
||||
const baseline = scene.children.length;
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
expect(scene.children.length).toBe(baseline + 2);
|
||||
});
|
||||
|
||||
it('both sprite and core use additive blending', () => {
|
||||
const cam = makeCamera();
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0xff8800) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
const core = scene.children[1] as any;
|
||||
// AdditiveBlending constant from three-mock is 2
|
||||
expect(halo.material.blending).toBe(2);
|
||||
expect(core.material.blending).toBe(2);
|
||||
// depthTest:false is passed to the SpriteMaterial constructor in
|
||||
// effects.ts so the orb stays visible through other nodes. The
|
||||
// three-mock's SpriteMaterial constructor does not persist this
|
||||
// param, so we can't assert it at the instance level here; the
|
||||
// production behavior is covered by ui-fixes.test.ts source grep.
|
||||
expect(halo.material.transparent).toBe(true);
|
||||
expect(core.material.transparent).toBe(true);
|
||||
});
|
||||
|
||||
it('positions the orb at camera-relative cosmic center on spawn', () => {
|
||||
const cam = makeCamera();
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {},
|
||||
{ distanceFromCamera: 40 }
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
const core = scene.children[1] as any;
|
||||
// mock applyQuaternion is identity, so startPos = camera.pos + (0,0,-40)
|
||||
expect(halo.position.x).toBeCloseTo(0);
|
||||
expect(halo.position.y).toBeCloseTo(30);
|
||||
expect(halo.position.z).toBeCloseTo(40); // 80 + (-40)
|
||||
expect(core.position.x).toBeCloseTo(halo.position.x);
|
||||
expect(core.position.y).toBeCloseTo(halo.position.y);
|
||||
expect(core.position.z).toBeCloseTo(halo.position.z);
|
||||
});
|
||||
|
||||
it('gestation phase: position stays at startPos for all 48 frames', () => {
|
||||
const cam = makeCamera();
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(100, 100, 100) as any, // far-away target
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
const startX = halo.position.x;
|
||||
const startY = halo.position.y;
|
||||
const startZ = halo.position.z;
|
||||
|
||||
for (let f = 0; f < 48; f++) {
|
||||
effects.update(nodeMeshMap, cam);
|
||||
expect(halo.position.x).toBeCloseTo(startX);
|
||||
expect(halo.position.y).toBeCloseTo(startY);
|
||||
expect(halo.position.z).toBeCloseTo(startZ);
|
||||
}
|
||||
});
|
||||
|
||||
it('gestation phase: opacity rises from 0 toward 0.95', () => {
|
||||
const cam = makeCamera();
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
const core = scene.children[1] as any;
|
||||
|
||||
// Spawn opacity
|
||||
expect(halo.material.opacity).toBe(0);
|
||||
expect(core.material.opacity).toBe(0);
|
||||
|
||||
effects.update(nodeMeshMap, cam); // age 1
|
||||
const earlyHaloOp = halo.material.opacity;
|
||||
expect(earlyHaloOp).toBeGreaterThan(0);
|
||||
expect(earlyHaloOp).toBeLessThan(0.2);
|
||||
|
||||
// Run to end of gestation
|
||||
for (let f = 0; f < 47; f++) effects.update(nodeMeshMap, cam);
|
||||
expect(halo.material.opacity).toBeCloseTo(0.95, 1);
|
||||
expect(core.material.opacity).toBeCloseTo(1.0, 1);
|
||||
// Monotonic-ish growth: late gestation > early gestation
|
||||
expect(halo.material.opacity).toBeGreaterThan(earlyHaloOp);
|
||||
});
|
||||
|
||||
it('gestation phase: sprite scale grows substantially', () => {
|
||||
const cam = makeCamera();
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
|
||||
effects.update(nodeMeshMap, cam); // age 1
|
||||
const earlyScale = halo.scale.x;
|
||||
|
||||
for (let f = 0; f < 47; f++) effects.update(nodeMeshMap, cam); // age 48
|
||||
const lateScale = halo.scale.x;
|
||||
|
||||
// Halo grows from ~0.5 toward ~5 during gestation (with pulse variation).
|
||||
expect(lateScale).toBeGreaterThan(earlyScale);
|
||||
expect(lateScale).toBeGreaterThan(2);
|
||||
});
|
||||
|
||||
it('gestation phase: halo color tints toward event color', () => {
|
||||
const cam = makeCamera();
|
||||
const eventColor = new Color(0xff0000); // pure red
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
eventColor as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
|
||||
effects.update(nodeMeshMap, cam); // age 1 — factor ≈ 0.72
|
||||
const earlyR = halo.material.color.r;
|
||||
|
||||
for (let f = 0; f < 47; f++) effects.update(nodeMeshMap, cam); // age 48 — factor = 1.0
|
||||
const lateR = halo.material.color.r;
|
||||
|
||||
// Red channel should approach the event color's red (1.0) from a dimmer value
|
||||
expect(lateR).toBeGreaterThan(earlyR);
|
||||
expect(lateR).toBeCloseTo(1.0, 1);
|
||||
// Green/blue stay at 0 (event color is pure red)
|
||||
expect(halo.material.color.g).toBeCloseTo(0);
|
||||
expect(halo.material.color.b).toBeCloseTo(0);
|
||||
});
|
||||
|
||||
it('flight phase: Bezier arc passes ABOVE the linear midpoint at t=0.5', () => {
|
||||
const cam = makeCamera();
|
||||
// startPos = (0, 30, 40), target = (0, 0, 0)
|
||||
// linear midpoint y = 15; control point y = 15 + 30 + dist*0.15 = 52.5
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
|
||||
// Drive past gestation (48) + half of flight (45) = 93 frames → t=0.5
|
||||
for (let f = 0; f < 93; f++) effects.update(nodeMeshMap, cam);
|
||||
|
||||
// Linear midpoint y is 15; Bezier midpoint should be notably higher.
|
||||
expect(halo.position.y).toBeGreaterThan(15);
|
||||
// And not as high as the control point itself (52.5) — Bezier
|
||||
// passes through midpoint-ish at t=0.5, biased upward by the arc.
|
||||
expect(halo.position.y).toBeLessThan(52.5);
|
||||
});
|
||||
|
||||
it('flight phase: orb moves from startPos toward target', () => {
|
||||
const cam = makeCamera();
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
|
||||
// End of gestation
|
||||
for (let f = 0; f < 48; f++) effects.update(nodeMeshMap, cam);
|
||||
const gestZ = halo.position.z;
|
||||
|
||||
// One tick into flight
|
||||
effects.update(nodeMeshMap, cam);
|
||||
const earlyFlightZ = halo.position.z;
|
||||
|
||||
// Near end of flight
|
||||
for (let f = 0; f < 88; f++) effects.update(nodeMeshMap, cam);
|
||||
const lateFlightZ = halo.position.z;
|
||||
|
||||
// Z moves from 40 toward 0
|
||||
expect(earlyFlightZ).toBeLessThan(gestZ);
|
||||
expect(lateFlightZ).toBeLessThan(earlyFlightZ);
|
||||
expect(lateFlightZ).toBeLessThan(5); // close to target z=0
|
||||
});
|
||||
|
||||
it('dynamic target tracking: changing getTargetPos mid-flight redirects the orb', () => {
|
||||
const cam = makeCamera();
|
||||
let target = new Vector3(0, 0, 0);
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => target as any,
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
|
||||
// Drive to mid-flight (gestation 48 + 30 flight frames = 78)
|
||||
for (let f = 0; f < 78; f++) effects.update(nodeMeshMap, cam);
|
||||
const xBeforeRedirect = halo.position.x;
|
||||
|
||||
// Redirect target far to the +X side
|
||||
target = new Vector3(200, 0, 0);
|
||||
|
||||
// A few more flight frames — orb should track the new target
|
||||
for (let f = 0; f < 10; f++) effects.update(nodeMeshMap, cam);
|
||||
const xAfterRedirect = halo.position.x;
|
||||
|
||||
// With the original target at (0,0,0), x stays near 0 throughout.
|
||||
// After redirect, x should swing toward the new target's +200.
|
||||
expect(xAfterRedirect).toBeGreaterThan(xBeforeRedirect + 5);
|
||||
});
|
||||
|
||||
it('onArrive fires exactly once at frame 139 (totalFrames + 1)', () => {
|
||||
const cam = makeCamera();
|
||||
let arriveCount = 0;
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {
|
||||
arriveCount++;
|
||||
}
|
||||
);
|
||||
|
||||
// Drive through gestation (48) + flight (90) = 138 frames. Should NOT have fired.
|
||||
for (let f = 0; f < 138; f++) effects.update(nodeMeshMap, cam);
|
||||
expect(arriveCount).toBe(0);
|
||||
|
||||
// Frame 139 — fires onArrive
|
||||
effects.update(nodeMeshMap, cam);
|
||||
expect(arriveCount).toBe(1);
|
||||
|
||||
// Drive many more frames — must stay at 1
|
||||
for (let f = 0; f < 50; f++) effects.update(nodeMeshMap, cam);
|
||||
expect(arriveCount).toBe(1);
|
||||
});
|
||||
|
||||
it('post-arrival fade: orb disposes from scene after ~8 fade frames', () => {
|
||||
const cam = makeCamera();
|
||||
const baseline = scene.children.length;
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
expect(scene.children.length).toBe(baseline + 2);
|
||||
|
||||
// Gestation + flight + arrive + fade = 138 + 1 + 8 = 147 frames
|
||||
for (let f = 0; f < 150; f++) effects.update(nodeMeshMap, cam);
|
||||
|
||||
// Both orb sprites should be gone
|
||||
expect(scene.children.length).toBe(baseline);
|
||||
});
|
||||
|
||||
it('onArrive callback wrapped in try/catch so a throw does not crash the loop', () => {
|
||||
const cam = makeCamera();
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {
|
||||
throw new Error('caller blew up');
|
||||
}
|
||||
);
|
||||
|
||||
// Should not throw — the production code swallows arrival-callback errors.
|
||||
expect(() => {
|
||||
for (let f = 0; f < 160; f++) effects.update(nodeMeshMap, cam);
|
||||
}).not.toThrow();
|
||||
});
|
||||
|
||||
it('Sanhedrin Shatter: onArrive NEVER fires when target vanishes mid-flight', () => {
|
||||
const cam = makeCamera();
|
||||
let arriveCount = 0;
|
||||
let target: Vector3 | undefined = new Vector3(0, 0, 0);
|
||||
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => target as any,
|
||||
() => {
|
||||
arriveCount++;
|
||||
}
|
||||
);
|
||||
|
||||
// Finish gestation (48 frames) with target present
|
||||
for (let f = 0; f < 48; f++) effects.update(nodeMeshMap, cam);
|
||||
expect(arriveCount).toBe(0);
|
||||
|
||||
// Stop hook yanks the target mid-flight
|
||||
target = undefined;
|
||||
|
||||
// Run enough frames to cover the entire orb lifecycle
|
||||
for (let f = 0; f < 200; f++) effects.update(nodeMeshMap, cam);
|
||||
|
||||
// onArrive must NEVER fire on aborted orbs
|
||||
expect(arriveCount).toBe(0);
|
||||
});
|
||||
|
||||
it('Sanhedrin Shatter: implosion is spawned when target vanishes mid-flight', () => {
|
||||
const cam = makeCamera();
|
||||
let target: Vector3 | undefined = new Vector3(0, 0, 0);
|
||||
|
||||
const baseline = scene.children.length;
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => target as any,
|
||||
() => {}
|
||||
);
|
||||
// baseline + 2 sprites
|
||||
expect(scene.children.length).toBe(baseline + 2);
|
||||
|
||||
// Finish gestation
|
||||
for (let f = 0; f < 48; f++) effects.update(nodeMeshMap, cam);
|
||||
|
||||
// Yank target → abort triggers on next tick
|
||||
target = undefined;
|
||||
const beforeAbort = scene.children.length;
|
||||
effects.update(nodeMeshMap, cam);
|
||||
// Scene should have grown by at least 1 (the implosion particles)
|
||||
expect(scene.children.length).toBeGreaterThan(beforeAbort);
|
||||
});
|
||||
|
||||
it('Sanhedrin Shatter: halo turns blood-red on abort', () => {
|
||||
const cam = makeCamera();
|
||||
let target: Vector3 | undefined = new Vector3(0, 0, 0);
|
||||
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any, // cyan — NOT red
|
||||
() => target as any,
|
||||
() => {}
|
||||
);
|
||||
const halo = scene.children[0] as any;
|
||||
|
||||
// Finish gestation
|
||||
for (let f = 0; f < 48; f++) effects.update(nodeMeshMap, cam);
|
||||
|
||||
// Sanity: halo is NOT red yet (event color cyan has r≈0)
|
||||
expect(halo.material.color.r).toBeLessThan(0.5);
|
||||
|
||||
// Yank target; abort triggers next tick
|
||||
target = undefined;
|
||||
effects.update(nodeMeshMap, cam);
|
||||
|
||||
// Halo should now be blood red (1.0, 0.15, 0.2)
|
||||
expect(halo.material.color.r).toBeGreaterThan(0.9);
|
||||
expect(halo.material.color.g).toBeLessThan(0.3);
|
||||
expect(halo.material.color.b).toBeLessThan(0.3);
|
||||
});
|
||||
|
||||
it('Sanhedrin Shatter: orb eventually disposes from scene', () => {
|
||||
const cam = makeCamera();
|
||||
let target: Vector3 | undefined = new Vector3(0, 0, 0);
|
||||
|
||||
const baseline = scene.children.length;
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => target as any,
|
||||
() => {}
|
||||
);
|
||||
|
||||
// Finish gestation
|
||||
for (let f = 0; f < 48; f++) effects.update(nodeMeshMap, cam);
|
||||
// Yank target
|
||||
target = undefined;
|
||||
|
||||
// Drive a long time — orb + implosion should both dispose
|
||||
// (orb fade ~8 frames, implosion lifetime ~80 frames)
|
||||
for (let f = 0; f < 200; f++) effects.update(nodeMeshMap, cam);
|
||||
|
||||
expect(scene.children.length).toBe(baseline);
|
||||
});
|
||||
|
||||
it('dispose() removes active birth orbs from the scene', () => {
|
||||
const cam = makeCamera();
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0xff00ff) as any,
|
||||
() => new Vector3(10, 10, 10) as any,
|
||||
() => {}
|
||||
);
|
||||
// 4 sprites in scene (2 per orb)
|
||||
expect(scene.children.length).toBeGreaterThanOrEqual(4);
|
||||
|
||||
effects.dispose();
|
||||
|
||||
// All orb sprites should be gone
|
||||
expect(scene.children.length).toBe(0);
|
||||
});
|
||||
|
||||
it('multiple orbs in flight: all 3 onArrive callbacks fire exactly once each', () => {
|
||||
const cam = makeCamera();
|
||||
let c1 = 0, c2 = 0, c3 = 0;
|
||||
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0xff0000) as any,
|
||||
() => new Vector3(10, 0, 0) as any,
|
||||
() => { c1++; }
|
||||
);
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ff00) as any,
|
||||
() => new Vector3(-10, 0, 0) as any,
|
||||
() => { c2++; }
|
||||
);
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x0000ff) as any,
|
||||
() => new Vector3(0, 0, -10) as any,
|
||||
() => { c3++; }
|
||||
);
|
||||
|
||||
// Drive past arrival (139) with margin
|
||||
for (let f = 0; f < 160; f++) effects.update(nodeMeshMap, cam);
|
||||
|
||||
expect(c1).toBe(1);
|
||||
expect(c2).toBe(1);
|
||||
expect(c3).toBe(1);
|
||||
});
|
||||
|
||||
it('custom gestation/flight frame counts are honored', () => {
|
||||
const cam = makeCamera();
|
||||
let arriveCount = 0;
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(0, 0, 0) as any,
|
||||
() => { arriveCount++; },
|
||||
{ gestationFrames: 10, flightFrames: 20 }
|
||||
);
|
||||
|
||||
// Before frame 31 — no arrival
|
||||
for (let f = 0; f < 30; f++) effects.update(nodeMeshMap, cam);
|
||||
expect(arriveCount).toBe(0);
|
||||
|
||||
// Frame 31 — fires
|
||||
effects.update(nodeMeshMap, cam);
|
||||
expect(arriveCount).toBe(1);
|
||||
});
|
||||
|
||||
it('zero-alloc invariant (advisory): flight phase runs without throwing across many orbs', () => {
|
||||
// Advisory test — vitest has no allocator introspection, but the
|
||||
// inline algebraic Bezier eval in effects.ts is intentionally zero-
|
||||
// allocation per frame (no `new Vector3`, no `new QuadraticBezierCurve3`).
|
||||
// Here we just smoke-test that running many orbs across the full
|
||||
// flight phase does not throw and completes cleanly.
|
||||
const cam = makeCamera();
|
||||
for (let k = 0; k < 6; k++) {
|
||||
effects.createBirthOrb(
|
||||
cam,
|
||||
new Color(0x00ffd1) as any,
|
||||
() => new Vector3(k * 5, 0, 0) as any,
|
||||
() => {}
|
||||
);
|
||||
}
|
||||
expect(() => {
|
||||
for (let f = 0; f < 150; f++) effects.update(nodeMeshMap, cam);
|
||||
}).not.toThrow();
|
||||
// All orbs should have cleaned up
|
||||
expect(scene.children.length).toBe(0);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
|
|
|||
|
|
@ -10,7 +10,7 @@ import { NodeManager } from '../nodes';
|
|||
import { EdgeManager } from '../edges';
|
||||
import { EffectManager } from '../effects';
|
||||
import { ForceSimulation } from '../force-sim';
|
||||
import { Vector3, Scene } from './three-mock';
|
||||
import { Vector3, Scene, RingGeometry, Mesh, Points, Sprite } from './three-mock';
|
||||
import { makeNode, makeEdge, makeEvent, resetNodeCounter } from './helpers';
|
||||
import type { GraphNode, VestigeEvent } from '$types';
|
||||
|
||||
|
|
@ -874,4 +874,270 @@ describe('Event-to-Mutation Pipeline', () => {
|
|||
expect(mutations.some((m) => m.type === 'edgeAdded')).toBe(true);
|
||||
});
|
||||
});
|
||||
|
||||
describe('v2.3 Birth Ritual wiring', () => {
|
||||
/** Count shockwave rings currently in the scene by their RingGeometry. */
|
||||
function countRings(s: InstanceType<typeof Scene>): number {
|
||||
let n = 0;
|
||||
for (const child of s.children) {
|
||||
if (child instanceof Mesh && child.geometry instanceof RingGeometry) n++;
|
||||
}
|
||||
return n;
|
||||
}
|
||||
|
||||
/** Count Points children — rainbow bursts, spawn bursts, implosions. */
|
||||
function countPoints(s: InstanceType<typeof Scene>): number {
|
||||
let n = 0;
|
||||
for (const child of s.children) if (child instanceof Points) n++;
|
||||
return n;
|
||||
}
|
||||
|
||||
/** Count Sprite children — birth orb adds a halo + core sprite. */
|
||||
function countSprites(s: InstanceType<typeof Scene>): number {
|
||||
let n = 0;
|
||||
for (const child of s.children) if (child instanceof Sprite) n++;
|
||||
return n;
|
||||
}
|
||||
|
||||
it('node mesh is hidden immediately after MemoryCreated dispatch', () => {
|
||||
mapEventToEffects(
|
||||
makeEvent('MemoryCreated', {
|
||||
id: 'ritual-create',
|
||||
content: 'fresh memory',
|
||||
node_type: 'fact',
|
||||
}),
|
||||
ctx,
|
||||
allNodes
|
||||
);
|
||||
|
||||
// Ritual path: mesh/glow/label are all .visible = false until
|
||||
// igniteNode fires on orb arrival.
|
||||
const mesh = nodeManager.meshMap.get('ritual-create')!;
|
||||
const glow = nodeManager.glowMap.get('ritual-create')!;
|
||||
const label = nodeManager.labelSprites.get('ritual-create')!;
|
||||
expect(mesh.visible).toBe(false);
|
||||
expect(glow.visible).toBe(false);
|
||||
expect(label.visible).toBe(false);
|
||||
|
||||
// Pending sentinel is stamped on userData.
|
||||
expect(mesh.userData.birthRitualPending).toBeDefined();
|
||||
});
|
||||
|
||||
it('does NOT fire burst/ripple/shockwave at spawn (only the birth orb)', () => {
|
||||
const ringsBefore = countRings(scene);
|
||||
const pointsBefore = countPoints(scene);
|
||||
const spritesBefore = countSprites(scene);
|
||||
|
||||
mapEventToEffects(
|
||||
makeEvent('MemoryCreated', {
|
||||
id: 'spawn-quiet',
|
||||
content: 'test',
|
||||
node_type: 'fact',
|
||||
}),
|
||||
ctx,
|
||||
allNodes
|
||||
);
|
||||
|
||||
// Birth orb adds exactly 2 sprites (halo + core). NodeManager's
|
||||
// addNode also adds a glow Sprite + label Sprite to the NodeManager
|
||||
// GROUP, not to the scene — so spritesBefore -> after delta is +2.
|
||||
expect(countSprites(scene) - spritesBefore).toBe(2);
|
||||
|
||||
// No arrival-cascade effects yet: no shockwave rings, no rainbow
|
||||
// burst/spawn burst/ripple particles.
|
||||
expect(countRings(scene)).toBe(ringsBefore);
|
||||
expect(countPoints(scene)).toBe(pointsBefore);
|
||||
});
|
||||
|
||||
it('drives through the full ritual: onArrive fires, node becomes visible, scale grows', () => {
|
||||
mapEventToEffects(
|
||||
makeEvent('MemoryCreated', {
|
||||
id: 'full-ritual',
|
||||
content: 'visible after arrival',
|
||||
node_type: 'fact',
|
||||
}),
|
||||
ctx,
|
||||
allNodes
|
||||
);
|
||||
|
||||
const mesh = nodeManager.meshMap.get('full-ritual')!;
|
||||
expect(mesh.visible).toBe(false);
|
||||
|
||||
// Drive the effects update loop past the full ritual duration
|
||||
// (gestation 48 + flight 90 = 138 frames). After frame 138 the
|
||||
// orb fires onArrive which ignites the node and queues materialization.
|
||||
for (let i = 0; i < 140; i++) {
|
||||
effects.update(nodeManager.meshMap, camera, nodeManager.positions);
|
||||
}
|
||||
|
||||
// Node is now visible and sentinel is cleared.
|
||||
expect(mesh.visible).toBe(true);
|
||||
expect(mesh.userData.birthRitualPending).toBeUndefined();
|
||||
|
||||
// Run node animation a few frames to let materialization scale grow.
|
||||
// Note: onArrive bumped scale by 1.8x (from 0.001 -> 0.0018), then
|
||||
// materialization easeOutElastic pulls it toward targetScale.
|
||||
for (let f = 0; f < 10; f++) {
|
||||
nodeManager.animate(f * 0.016, allNodes, camera);
|
||||
}
|
||||
expect(mesh.scale.x).toBeGreaterThan(0.001);
|
||||
});
|
||||
|
||||
it("Newton's Cradle — target mesh scale is multiplied by 1.8x on arrival", () => {
|
||||
mapEventToEffects(
|
||||
makeEvent('MemoryCreated', {
|
||||
id: 'newton-cradle',
|
||||
content: 'recoil test',
|
||||
node_type: 'fact',
|
||||
}),
|
||||
ctx,
|
||||
allNodes
|
||||
);
|
||||
|
||||
const mesh = nodeManager.meshMap.get('newton-cradle')!;
|
||||
// Pre-arrival: scale is the addNode initial 0.001.
|
||||
expect(mesh.scale.x).toBeCloseTo(0.001, 6);
|
||||
|
||||
// Drive just to the moment onArrive fires. Gestation (48) +
|
||||
// flight (90) = 138 frames. Arrival bumps scale by 1.8x BEFORE
|
||||
// materialization has run any ticks, so the scale should be
|
||||
// exactly 0.001 * 1.8 = 0.0018 at that instant. We check right
|
||||
// after onArrive (frame 139) — but effects.update progresses the
|
||||
// orb's age counter by one each call, and on the tick where
|
||||
// orb.age > totalFrames, onArrive fires. We then must NOT tick
|
||||
// nodeManager.animate (or materialization would diverge the scale).
|
||||
for (let i = 0; i < 140; i++) {
|
||||
effects.update(nodeManager.meshMap, camera, nodeManager.positions);
|
||||
}
|
||||
|
||||
// onArrive fired. Scale was 0.001, got multiplied by 1.8 -> 0.0018.
|
||||
// Materialization is queued but hasn't run yet (no animate() calls).
|
||||
expect(mesh.scale.x).toBeCloseTo(0.0018, 6);
|
||||
});
|
||||
|
||||
it('dual shockwave — arrival cascade adds TWO RingGeometry meshes, not one', () => {
|
||||
mapEventToEffects(
|
||||
makeEvent('MemoryCreated', {
|
||||
id: 'dual-shock',
|
||||
content: 'layered crash',
|
||||
node_type: 'fact',
|
||||
}),
|
||||
ctx,
|
||||
allNodes
|
||||
);
|
||||
|
||||
const ringsBefore = countRings(scene);
|
||||
|
||||
// Drive past full ritual so onArrive fires.
|
||||
for (let i = 0; i < 140; i++) {
|
||||
effects.update(nodeManager.meshMap, camera, nodeManager.positions);
|
||||
}
|
||||
|
||||
// Both shockwaves fire synchronously in the onArrive callback
|
||||
// (the previous setTimeout-delayed second shockwave was dropped
|
||||
// because it could outlive the scene on route change).
|
||||
const ringsAfter = countRings(scene);
|
||||
expect(ringsAfter - ringsBefore).toBe(2);
|
||||
});
|
||||
|
||||
it('re-reads position on arrival — fires cascade at force-sim-moved position', () => {
|
||||
mapEventToEffects(
|
||||
makeEvent('MemoryCreated', {
|
||||
id: 'moving-target',
|
||||
content: 'follow the node',
|
||||
node_type: 'fact',
|
||||
}),
|
||||
ctx,
|
||||
allNodes
|
||||
);
|
||||
|
||||
// Grab the spawn position, then mutate it to simulate the force
|
||||
// simulation pushing the node during the ritual.
|
||||
const movedPos = new Vector3(123, 456, -789);
|
||||
nodeManager.positions.set('moving-target', movedPos);
|
||||
|
||||
// Drive past full ritual.
|
||||
for (let i = 0; i < 140; i++) {
|
||||
effects.update(nodeManager.meshMap, camera, nodeManager.positions);
|
||||
}
|
||||
|
||||
// The onArrive callback re-reads nodeManager.positions and fires
|
||||
// the cascade at the LIVE position. The two shockwave Ring meshes
|
||||
// should have been created at movedPos. Find them and check.
|
||||
const rings = scene.children.filter(
|
||||
(c) => c instanceof Mesh && c.geometry instanceof RingGeometry
|
||||
);
|
||||
expect(rings.length).toBeGreaterThanOrEqual(2);
|
||||
// Rings for this node: their .position copies from arrivePos at
|
||||
// spawn time inside createShockwave.
|
||||
const atMovedPos = rings.filter(
|
||||
(r) => r.position.x === 123 && r.position.y === 456 && r.position.z === -789
|
||||
);
|
||||
expect(atMovedPos.length).toBe(2);
|
||||
});
|
||||
|
||||
it('Sanhedrin abort path — removeNode before arrival prevents the regular cascade', () => {
|
||||
// Spy on the three arrival-cascade emitters so we can assert
|
||||
// they were NEVER called when the target is vetoed mid-ritual.
|
||||
const burstSpy = vi.spyOn(effects, 'createRainbowBurst');
|
||||
const shockwaveSpy = vi.spyOn(effects, 'createShockwave');
|
||||
const rippleSpy = vi.spyOn(effects, 'createRippleWave');
|
||||
|
||||
mapEventToEffects(
|
||||
makeEvent('MemoryCreated', {
|
||||
id: 'vetoed',
|
||||
content: 'about to be shattered',
|
||||
node_type: 'fact',
|
||||
}),
|
||||
ctx,
|
||||
allNodes
|
||||
);
|
||||
|
||||
// The orb's getTargetPos() closure reads
|
||||
// nodeManager.positions.get('vetoed'). Dropping the position
|
||||
// directly simulates the "target gone" state that the Sanhedrin
|
||||
// veto produces after dissolution completes — without needing to
|
||||
// drive the full 60-frame dissolution animation.
|
||||
nodeManager.positions.delete('vetoed');
|
||||
expect(nodeManager.positions.has('vetoed')).toBe(false);
|
||||
|
||||
// Snapshot the orb reference before the update loop disposes it.
|
||||
// The abort branch flips `aborted` and tints the halo red; we
|
||||
// assert on those fields after the ritual unwinds.
|
||||
const orbs = (effects as any).birthOrbs as Array<{
|
||||
sprite: { material: { color: any } };
|
||||
core: { material: { color: any } };
|
||||
aborted: boolean;
|
||||
}>;
|
||||
expect(orbs.length).toBe(1);
|
||||
const orbRef = orbs[0];
|
||||
|
||||
// Drive effects past the full ritual. During flight the orb will
|
||||
// see getTargetPos() === undefined, enter the Sanhedrin branch,
|
||||
// call createImplosion (anti-birth visual) and SKIP onArrive —
|
||||
// so the regular rainbow-burst + dual-shockwave + ripple cascade
|
||||
// never fires.
|
||||
for (let i = 0; i < 200; i++) {
|
||||
effects.update(nodeManager.meshMap, camera, nodeManager.positions);
|
||||
}
|
||||
|
||||
// Core assertion: the three regular-cascade emitters were never
|
||||
// invoked for the vetoed node.
|
||||
expect(burstSpy).not.toHaveBeenCalled();
|
||||
expect(shockwaveSpy).not.toHaveBeenCalled();
|
||||
expect(rippleSpy).not.toHaveBeenCalled();
|
||||
|
||||
// Also confirm the orb actually took the abort branch, not the
|
||||
// gestation-only no-op path (otherwise this test would pass for
|
||||
// the wrong reason). The aborted flag is set exactly once inside
|
||||
// the Sanhedrin branch.
|
||||
expect(orbRef.aborted).toBe(true);
|
||||
expect(orbRef.sprite.material.color.r).toBeCloseTo(1.0, 3);
|
||||
expect(orbRef.sprite.material.color.g).toBeCloseTo(0.15, 3);
|
||||
|
||||
burstSpy.mockRestore();
|
||||
shockwaveSpy.mockRestore();
|
||||
rippleSpy.mockRestore();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
|
|
|||
|
|
@ -453,4 +453,201 @@ describe('NodeManager', () => {
|
|||
// The dispose method clears materializingNodes, dissolvingNodes, growingNodes
|
||||
});
|
||||
});
|
||||
|
||||
describe('Birth Ritual integration', () => {
|
||||
it('addNode with isBirthRitual:true hides mesh, glow, and label immediately', () => {
|
||||
const node = makeNode({ id: 'ritual-1' });
|
||||
manager.addNode(node, new Vector3(5, 5, 5), { isBirthRitual: true });
|
||||
|
||||
const mesh = manager.meshMap.get('ritual-1')!;
|
||||
const glow = manager.glowMap.get('ritual-1')!;
|
||||
const label = manager.labelSprites.get('ritual-1')!;
|
||||
|
||||
expect(mesh.visible).toBe(false);
|
||||
expect(glow.visible).toBe(false);
|
||||
expect(label.visible).toBe(false);
|
||||
});
|
||||
|
||||
it('addNode with isBirthRitual:true stores a pending sentinel on mesh.userData', () => {
|
||||
const node = makeNode({ id: 'ritual-sentinel', retention: 0.75 });
|
||||
manager.addNode(node, new Vector3(0, 0, 0), { isBirthRitual: true });
|
||||
|
||||
const mesh = manager.meshMap.get('ritual-sentinel')!;
|
||||
const pending = mesh.userData.birthRitualPending as any;
|
||||
expect(pending).toBeDefined();
|
||||
expect(pending.totalFrames).toBe(30);
|
||||
// targetScale = 0.5 + retention * 2 = 0.5 + 0.75 * 2 = 2.0
|
||||
expect(pending.targetScale).toBeCloseTo(2.0, 3);
|
||||
});
|
||||
|
||||
it('addNode with isBirthRitual:true does NOT enqueue materialization', () => {
|
||||
const ritualNode = makeNode({ id: 'ritual-pending', retention: 0.8 });
|
||||
manager.addNode(ritualNode, new Vector3(10, 10, 10), { isBirthRitual: true });
|
||||
|
||||
// In the real runtime the ritual-pending node is .visible=false
|
||||
// AND is not yet in the GraphNode[] list — it only gets added to
|
||||
// the visible node list once igniteNode flips its visibility and
|
||||
// materialization kicks in. So we pass an empty `nodes` array to
|
||||
// animate(), which also exercises that the breathing loop skips
|
||||
// meshes absent from the nodes array.
|
||||
const camera = { position: new Vector3(0, 30, 80) } as any;
|
||||
for (let f = 0; f < 40; f++) {
|
||||
manager.animate(f * 0.016, [], camera);
|
||||
}
|
||||
|
||||
const mesh = manager.meshMap.get('ritual-pending')!;
|
||||
// Materialization queue never pushed — a regular materializing
|
||||
// node would be at scale ≈ targetScale = 2.1 by frame 40. The
|
||||
// ritual-pending node stays at its addNode initial 0.001 because
|
||||
// no animation loop is mutating its scale.
|
||||
expect(mesh.scale.x).toBeCloseTo(0.001, 3);
|
||||
|
||||
// Stronger invariant — the sentinel is still there, confirming
|
||||
// the node never got handed off to the materialization queue.
|
||||
expect(mesh.userData.birthRitualPending).toBeDefined();
|
||||
});
|
||||
|
||||
it('addNode without opts proceeds with normal materialization (old behavior)', () => {
|
||||
const node = makeNode({ id: 'normal-spawn' });
|
||||
manager.addNode(node, new Vector3(1, 2, 3));
|
||||
|
||||
const mesh = manager.meshMap.get('normal-spawn')!;
|
||||
const glow = manager.glowMap.get('normal-spawn')!;
|
||||
const label = manager.labelSprites.get('normal-spawn')!;
|
||||
|
||||
// Default mesh.visible is true in three-mock (Object3D has no explicit field).
|
||||
// Key invariant: visible is NOT explicitly false like the ritual path.
|
||||
expect(mesh.visible).not.toBe(false);
|
||||
expect(glow.visible).not.toBe(false);
|
||||
expect(label.visible).not.toBe(false);
|
||||
|
||||
// And no pending sentinel
|
||||
expect(mesh.userData.birthRitualPending).toBeUndefined();
|
||||
|
||||
// Animation should proceed — scale grows via easeOutElastic
|
||||
const camera = { position: new Vector3(0, 30, 80) } as any;
|
||||
for (let f = 0; f < 20; f++) {
|
||||
manager.animate(f * 0.016, [node], camera);
|
||||
}
|
||||
expect(mesh.scale.x).toBeGreaterThan(0.1);
|
||||
});
|
||||
|
||||
it('igniteNode flips all three visibility flags and queues materialization', () => {
|
||||
const node = makeNode({ id: 'to-ignite', retention: 0.6 });
|
||||
manager.addNode(node, new Vector3(0, 0, 0), { isBirthRitual: true });
|
||||
|
||||
// Pre-ignite: hidden
|
||||
const mesh = manager.meshMap.get('to-ignite')!;
|
||||
const glow = manager.glowMap.get('to-ignite')!;
|
||||
const label = manager.labelSprites.get('to-ignite')!;
|
||||
expect(mesh.visible).toBe(false);
|
||||
|
||||
manager.igniteNode('to-ignite');
|
||||
|
||||
// Post-ignite: visible
|
||||
expect(mesh.visible).toBe(true);
|
||||
expect(glow.visible).toBe(true);
|
||||
expect(label.visible).toBe(true);
|
||||
|
||||
// Sentinel is gone
|
||||
expect(mesh.userData.birthRitualPending).toBeUndefined();
|
||||
|
||||
// Materialization was queued — drive animation and the scale
|
||||
// should grow past the initial 0.001.
|
||||
const camera = { position: new Vector3(0, 30, 80) } as any;
|
||||
for (let f = 0; f < 15; f++) {
|
||||
manager.animate(f * 0.016, [node], camera);
|
||||
}
|
||||
expect(mesh.scale.x).toBeGreaterThan(0.1);
|
||||
});
|
||||
|
||||
it('igniteNode called twice is idempotent (second call is a no-op)', () => {
|
||||
const node = makeNode({ id: 'double-ignite', retention: 0.5 });
|
||||
manager.addNode(node, new Vector3(0, 0, 0), { isBirthRitual: true });
|
||||
|
||||
manager.igniteNode('double-ignite');
|
||||
// Capture scale after one round of animation
|
||||
const camera = { position: new Vector3(0, 30, 80) } as any;
|
||||
for (let f = 0; f < 10; f++) {
|
||||
manager.animate(f * 0.016, [node], camera);
|
||||
}
|
||||
const scaleAfterFirst = manager.meshMap.get('double-ignite')!.scale.x;
|
||||
|
||||
// Second ignite — should NOT push a duplicate materialization entry.
|
||||
// If it did, the extra entry (starting at frame 0) would restart
|
||||
// the scale back near 0.001 or at least visibly reset it.
|
||||
manager.igniteNode('double-ignite');
|
||||
for (let f = 0; f < 5; f++) {
|
||||
manager.animate((f + 10) * 0.016, [node], camera);
|
||||
}
|
||||
const scaleAfterSecond = manager.meshMap.get('double-ignite')!.scale.x;
|
||||
|
||||
// Scale after second ignite should be greater than or roughly equal
|
||||
// to scale after first, NOT reset toward 0.001. A duplicate entry
|
||||
// starting at frame 0 would pull the mesh back near zero on the
|
||||
// very first subsequent animate() tick via mn.mesh.scale.setScalar.
|
||||
expect(scaleAfterSecond).toBeGreaterThanOrEqual(scaleAfterFirst * 0.5);
|
||||
});
|
||||
|
||||
it('igniteNode on a regular (non-ritual) node is a no-op', () => {
|
||||
const node = makeNode({ id: 'regular', retention: 0.5 });
|
||||
manager.addNode(node, new Vector3(0, 0, 0));
|
||||
// Regular addNode already queued materialization. Capture state.
|
||||
const mesh = manager.meshMap.get('regular')!;
|
||||
const visBefore = mesh.visible;
|
||||
|
||||
// Call igniteNode — there's no pending sentinel, should short-circuit.
|
||||
expect(() => manager.igniteNode('regular')).not.toThrow();
|
||||
|
||||
// No pending sentinel means the function returns early after the
|
||||
// sentinel check, so nothing about the mesh changes.
|
||||
expect(mesh.visible).toBe(visBefore);
|
||||
expect(mesh.userData.birthRitualPending).toBeUndefined();
|
||||
});
|
||||
|
||||
it('igniteNode on unknown id is a no-op (no throw)', () => {
|
||||
expect(() => manager.igniteNode('does-not-exist')).not.toThrow();
|
||||
expect(manager.meshMap.has('does-not-exist')).toBe(false);
|
||||
});
|
||||
|
||||
it('position is stored in positions map even when the node is invisible', () => {
|
||||
const node = makeNode({ id: 'invisible-but-positioned' });
|
||||
const spawnPos = new Vector3(42, -17, 8);
|
||||
manager.addNode(node, spawnPos, { isBirthRitual: true });
|
||||
|
||||
// Force simulation + orb getTargetPos() both rely on positions
|
||||
// being live immediately — the ritual only hides visuals, not
|
||||
// physics state.
|
||||
const stored = manager.positions.get('invisible-but-positioned');
|
||||
expect(stored).toBeDefined();
|
||||
expect(stored!.x).toBe(42);
|
||||
expect(stored!.y).toBe(-17);
|
||||
expect(stored!.z).toBe(8);
|
||||
|
||||
// And the mesh itself is still hidden
|
||||
expect(manager.meshMap.get('invisible-but-positioned')!.visible).toBe(false);
|
||||
});
|
||||
|
||||
it('removeNode during pending ritual cancels without materialization', () => {
|
||||
// Sanhedrin abort path at the NodeManager level: a ritual-pending
|
||||
// node gets removed before igniteNode fires. The remove path
|
||||
// should still work (dissolution queue takes over) and igniteNode
|
||||
// called later must not resurrect it.
|
||||
const node = makeNode({ id: 'aborted-ritual' });
|
||||
manager.addNode(node, new Vector3(0, 0, 0), { isBirthRitual: true });
|
||||
|
||||
manager.removeNode('aborted-ritual');
|
||||
|
||||
// Dissolution progresses past totalFrames = 60 and clears state.
|
||||
const camera = { position: new Vector3(0, 30, 80) } as any;
|
||||
for (let f = 0; f < 65; f++) {
|
||||
manager.animate(f * 0.016, [node], camera);
|
||||
}
|
||||
|
||||
expect(manager.meshMap.has('aborted-ritual')).toBe(false);
|
||||
|
||||
// And a late igniteNode call on the dead id is a safe no-op.
|
||||
expect(() => manager.igniteNode('aborted-ritual')).not.toThrow();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
|
|
|||
|
|
@ -210,6 +210,13 @@ export class Color {
|
|||
this.b *= s;
|
||||
return this;
|
||||
}
|
||||
|
||||
setRGB(r: number, g: number, b: number) {
|
||||
this.r = r;
|
||||
this.g = g;
|
||||
this.b = b;
|
||||
return this;
|
||||
}
|
||||
}
|
||||
|
||||
export class BufferAttribute {
|
||||
|
|
|
|||
661
apps/dashboard/src/lib/stores/__tests__/toast.test.ts
Normal file
661
apps/dashboard/src/lib/stores/__tests__/toast.test.ts
Normal file
|
|
@ -0,0 +1,661 @@
|
|||
// Unit tests for the Pulse toast store (v2.2).
|
||||
//
|
||||
// The store subscribes to `eventFeed` from `$stores/websocket` at IMPORT
|
||||
// TIME, so every test re-imports the module via `vi.resetModules()` +
|
||||
// dynamic import to get a fresh `lastSeen` / `nextId` / `lastConnectionAt`
|
||||
// / dwell-timer registry. Without this, the module-level state leaks
|
||||
// between tests (especially the 1500ms ConnectionDiscovered throttle).
|
||||
//
|
||||
// The eventFeed is mocked as a plain writable<VestigeEvent[]> — we push
|
||||
// arrays directly, mirroring the way the real websocket store prepends
|
||||
// new events at index 0.
|
||||
|
||||
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
|
||||
import { writable, get, type Writable } from 'svelte/store';
|
||||
import type { VestigeEvent } from '$types';
|
||||
|
||||
// The mock `eventFeed` is hoisted so vi.mock can reference it.
|
||||
const mockEventFeed: Writable<VestigeEvent[]> = writable<VestigeEvent[]>([]);
|
||||
|
||||
vi.mock('$stores/websocket', () => ({
|
||||
eventFeed: mockEventFeed,
|
||||
}));
|
||||
|
||||
// Helper — make a fresh VestigeEvent with a unique object identity.
|
||||
// The store uses reference equality (e === lastSeen) to detect freshness,
|
||||
// so every emission must be a distinct object.
|
||||
function makeEvent<T extends VestigeEvent['type']>(
|
||||
type: T,
|
||||
data: Record<string, unknown> = {},
|
||||
): VestigeEvent {
|
||||
return { type, data };
|
||||
}
|
||||
|
||||
// Prepend events onto the feed — mirrors the real websocket store, which
|
||||
// does `[parsed, ...events].slice(0, 200)`. Pass a single event or an
|
||||
// array (oldest-last, so `push([newest, older, oldest])` is the shape
|
||||
// the real subscriber sees).
|
||||
function emit(events: VestigeEvent | VestigeEvent[]) {
|
||||
const arr = Array.isArray(events) ? events : [events];
|
||||
mockEventFeed.update((prev) => [...arr, ...prev]);
|
||||
}
|
||||
|
||||
// Reset the feed between tests. Combined with vi.resetModules() this
|
||||
// guarantees each test starts with a virgin toast store.
|
||||
function resetFeed() {
|
||||
mockEventFeed.set([]);
|
||||
}
|
||||
|
||||
// Dynamically import the toast store after resetModules so we get a
|
||||
// fresh subscription + fresh module-level state every test.
|
||||
async function loadToastStore() {
|
||||
const mod = await import('../toast');
|
||||
return mod;
|
||||
}
|
||||
|
||||
describe('toast store', () => {
|
||||
beforeEach(() => {
|
||||
vi.useFakeTimers();
|
||||
resetFeed();
|
||||
vi.resetModules();
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
vi.useRealTimers();
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------
|
||||
// Identity-based batch walk (silent-lobotomy fix)
|
||||
// ---------------------------------------------------------------
|
||||
describe('identity-based batch walk', () => {
|
||||
it('processes ALL events when multiple land in one tick, not just the newest', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
const e1 = makeEvent('DreamCompleted', {
|
||||
memories_replayed: 3,
|
||||
connections_found: 1,
|
||||
insights_generated: 0,
|
||||
duration_ms: 500,
|
||||
});
|
||||
const e2 = makeEvent('ConnectionDiscovered', {
|
||||
connection_type: 'semantic',
|
||||
weight: 0.8,
|
||||
});
|
||||
const e3 = makeEvent('MemoryPromoted', { new_retention: 0.9 });
|
||||
|
||||
// All three land in the same tick — emit as a single array
|
||||
// (oldest-last, matching the real store prepend order).
|
||||
emit([e3, e2, e1]);
|
||||
|
||||
const list = get(toasts);
|
||||
expect(list.length).toBe(3);
|
||||
// Queue is newest-first (store prepends): [e3, e2, e1]
|
||||
expect(list[0].type).toBe('MemoryPromoted');
|
||||
expect(list[1].type).toBe('ConnectionDiscovered');
|
||||
expect(list[2].type).toBe('DreamCompleted');
|
||||
});
|
||||
|
||||
it('processes events in OLDEST-first narrative order (DreamCompleted before ConnectionDiscovered)', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
const dream = makeEvent('DreamCompleted', {
|
||||
memories_replayed: 10,
|
||||
connections_found: 2,
|
||||
insights_generated: 1,
|
||||
duration_ms: 800,
|
||||
});
|
||||
const bridge = makeEvent('ConnectionDiscovered', {
|
||||
connection_type: 'causal',
|
||||
weight: 0.75,
|
||||
});
|
||||
|
||||
// dream is older, bridge is newer → emit [bridge, dream]
|
||||
emit([bridge, dream]);
|
||||
|
||||
const list = get(toasts);
|
||||
// IDs are assigned sequentially as events are processed. Dream
|
||||
// gets processed first (oldest-first walk) → id=1. Bridge → id=2.
|
||||
// Store prepends, so the queue is [bridge(2), dream(1)].
|
||||
expect(list[0].id).toBeGreaterThan(list[1].id);
|
||||
expect(list[1].type).toBe('DreamCompleted');
|
||||
expect(list[0].type).toBe('ConnectionDiscovered');
|
||||
});
|
||||
|
||||
it('does not duplicate toasts when the subscriber re-fires with no new events', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
const e = makeEvent('MemoryPromoted', { new_retention: 0.85 });
|
||||
emit(e);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
|
||||
// Re-setting the same array (no new events) must NOT produce a
|
||||
// second toast. Also pushing an unrelated no-op update.
|
||||
mockEventFeed.update((prev) => [...prev]);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
});
|
||||
|
||||
it('handles empty feed updates gracefully (no toasts created)', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
// Force the subscriber to fire with an empty array.
|
||||
mockEventFeed.set([]);
|
||||
expect(get(toasts).length).toBe(0);
|
||||
});
|
||||
|
||||
it('falls back gracefully when lastSeen is evicted from the capped feed', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
// Emit a first event that becomes lastSeen.
|
||||
const first = makeEvent('MemoryPromoted', { new_retention: 0.8 });
|
||||
emit(first);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
|
||||
// Now emit a burst where the old lastSeen is pushed out. Since
|
||||
// we can never match it by identity, the walk goes to the end
|
||||
// of the array and translates everything.
|
||||
const burst = [
|
||||
makeEvent('MemoryPromoted', { new_retention: 0.81 }),
|
||||
makeEvent('MemoryPromoted', { new_retention: 0.82 }),
|
||||
makeEvent('MemoryPromoted', { new_retention: 0.83 }),
|
||||
];
|
||||
// Replace the feed entirely — the old `first` event is gone.
|
||||
mockEventFeed.set(burst);
|
||||
|
||||
// All three new events get translated. Plus the one we already had.
|
||||
expect(get(toasts).length).toBe(4);
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------
|
||||
// Event translation — one test per meaningful type
|
||||
// ---------------------------------------------------------------
|
||||
describe('event translation', () => {
|
||||
it('DreamCompleted → title + body with replayed/connections/insights/duration', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('DreamCompleted', {
|
||||
memories_replayed: 127,
|
||||
connections_found: 43,
|
||||
insights_generated: 5,
|
||||
duration_ms: 2400,
|
||||
}),
|
||||
);
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Dream consolidated');
|
||||
expect(t.body).toContain('Replayed 127 memories');
|
||||
expect(t.body).toContain('43 new connections');
|
||||
expect(t.body).toContain('5 insights');
|
||||
expect(t.body).toContain('2.4s');
|
||||
});
|
||||
|
||||
it('DreamCompleted → singular grammar when replayed === 1 and found === 1', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('DreamCompleted', {
|
||||
memories_replayed: 1,
|
||||
connections_found: 1,
|
||||
insights_generated: 1,
|
||||
duration_ms: 300,
|
||||
}),
|
||||
);
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.body).toContain('Replayed 1 memory');
|
||||
expect(t.body).toContain('1 new connection');
|
||||
expect(t.body).not.toContain('1 new connections');
|
||||
expect(t.body).toContain('1 insight');
|
||||
});
|
||||
|
||||
it('ConsolidationCompleted → title + body with nodes/decay/embedded/duration', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('ConsolidationCompleted', {
|
||||
nodes_processed: 892,
|
||||
decay_applied: 156,
|
||||
embeddings_generated: 48,
|
||||
duration_ms: 1100,
|
||||
}),
|
||||
);
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Consolidation swept');
|
||||
expect(t.body).toContain('892 nodes');
|
||||
expect(t.body).toContain('156 decayed');
|
||||
expect(t.body).toContain('48 embedded');
|
||||
expect(t.body).toContain('1.1s');
|
||||
});
|
||||
|
||||
it('ConnectionDiscovered → title + connection type + weight', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('ConnectionDiscovered', {
|
||||
connection_type: 'semantic',
|
||||
weight: 0.87,
|
||||
}),
|
||||
);
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Bridge discovered');
|
||||
expect(t.body).toContain('semantic');
|
||||
expect(t.body).toContain('0.87');
|
||||
});
|
||||
|
||||
it('MemoryPromoted → body includes retention %', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.85 }));
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Memory promoted');
|
||||
expect(t.body).toBe('retention 85%');
|
||||
});
|
||||
|
||||
it('MemoryDemoted → body includes retention %', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryDemoted', { new_retention: 0.42 }));
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Memory demoted');
|
||||
expect(t.body).toBe('retention 42%');
|
||||
});
|
||||
|
||||
it('MemorySuppressed (cascade=0) → suppression # only', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('MemorySuppressed', {
|
||||
suppression_count: 3,
|
||||
estimated_cascade: 0,
|
||||
}),
|
||||
);
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Forgetting');
|
||||
expect(t.body).toBe('suppression #3');
|
||||
expect(t.body).not.toContain('Rac1');
|
||||
});
|
||||
|
||||
it('MemorySuppressed (cascade>0) → suppression # + Rac1 cascade mention', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('MemorySuppressed', {
|
||||
suppression_count: 2,
|
||||
estimated_cascade: 8,
|
||||
}),
|
||||
);
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.body).toContain('suppression #2');
|
||||
expect(t.body).toContain('Rac1 cascade');
|
||||
expect(t.body).toContain('~8 neighbors');
|
||||
});
|
||||
|
||||
it('MemoryUnsuppressed (remaining>0) → remaining count', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryUnsuppressed', { remaining_count: 2 }));
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Recovered');
|
||||
expect(t.body).toContain('2 suppressions remain');
|
||||
});
|
||||
|
||||
it('MemoryUnsuppressed (remaining=0) → "fully unsuppressed"', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryUnsuppressed', { remaining_count: 0 }));
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.body).toBe('fully unsuppressed');
|
||||
});
|
||||
|
||||
it('Rac1CascadeSwept → seeds + neighbors affected', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('Rac1CascadeSwept', {
|
||||
seeds: 3,
|
||||
neighbors_affected: 14,
|
||||
}),
|
||||
);
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Rac1 cascade');
|
||||
expect(t.body).toContain('3 seeds');
|
||||
expect(t.body).toContain('14 dendritic spines');
|
||||
expect(t.body).toContain('pruned');
|
||||
});
|
||||
|
||||
it('MemoryDeleted → body is id truncated to first 8 chars', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('MemoryDeleted', {
|
||||
id: 'deadbeefcafef00d1234567890abcdef',
|
||||
}),
|
||||
);
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.title).toBe('Memory deleted');
|
||||
expect(t.body).toBe('deadbeef');
|
||||
});
|
||||
|
||||
it.each([
|
||||
['Heartbeat'],
|
||||
['SearchPerformed'],
|
||||
['RetentionDecayed'],
|
||||
['ActivationSpread'],
|
||||
['ImportanceScored'],
|
||||
['MemoryCreated'],
|
||||
['MemoryUpdated'],
|
||||
['DreamStarted'],
|
||||
['DreamProgress'],
|
||||
['ConsolidationStarted'],
|
||||
['Connected'],
|
||||
] as const)('noise event %s produces no toast', async ([type]) => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent(type as VestigeEvent['type'], {}));
|
||||
|
||||
expect(get(toasts).length).toBe(0);
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------
|
||||
// ConnectionDiscovered throttle
|
||||
// ---------------------------------------------------------------
|
||||
describe('ConnectionDiscovered throttle', () => {
|
||||
it('two ConnectionDiscovered within 1500ms → only one toast', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('ConnectionDiscovered', {
|
||||
connection_type: 'semantic',
|
||||
weight: 0.8,
|
||||
}),
|
||||
);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
|
||||
// 500ms later — still inside throttle
|
||||
vi.advanceTimersByTime(500);
|
||||
|
||||
emit(
|
||||
makeEvent('ConnectionDiscovered', {
|
||||
connection_type: 'causal',
|
||||
weight: 0.9,
|
||||
}),
|
||||
);
|
||||
|
||||
expect(get(toasts).length).toBe(1);
|
||||
});
|
||||
|
||||
it('two ConnectionDiscovered more than 1500ms apart → both toasts', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(
|
||||
makeEvent('ConnectionDiscovered', {
|
||||
connection_type: 'semantic',
|
||||
weight: 0.8,
|
||||
}),
|
||||
);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
|
||||
// Wait past the throttle window (1500ms).
|
||||
vi.advanceTimersByTime(1600);
|
||||
|
||||
emit(
|
||||
makeEvent('ConnectionDiscovered', {
|
||||
connection_type: 'causal',
|
||||
weight: 0.9,
|
||||
}),
|
||||
);
|
||||
|
||||
expect(get(toasts).length).toBe(2);
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------
|
||||
// Hover-panic — pauseDwell / resumeDwell
|
||||
// ---------------------------------------------------------------
|
||||
describe('hover-panic (pauseDwell / resumeDwell)', () => {
|
||||
it('auto-dismiss fires after dwellMs when not paused', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.8 }));
|
||||
const t = get(toasts)[0];
|
||||
expect(t).toBeDefined();
|
||||
expect(t.dwellMs).toBe(4500);
|
||||
|
||||
// Advance just past the dwell — toast should be gone.
|
||||
vi.advanceTimersByTime(4600);
|
||||
expect(get(toasts).length).toBe(0);
|
||||
});
|
||||
|
||||
it('pauseDwell stops the auto-dismiss — toast survives past natural dwellMs', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.8 }));
|
||||
const t = get(toasts)[0];
|
||||
|
||||
// 1 second in, pause.
|
||||
vi.advanceTimersByTime(1000);
|
||||
toasts.pauseDwell(t.id, t.dwellMs);
|
||||
|
||||
// Advance WAY past the natural dwell — still there.
|
||||
vi.advanceTimersByTime(10_000);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
expect(get(toasts)[0].id).toBe(t.id);
|
||||
});
|
||||
|
||||
it('resumeDwell schedules dismissal for the REMAINING time, not the full dwellMs', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.8 }));
|
||||
const t = get(toasts)[0];
|
||||
expect(t.dwellMs).toBe(4500);
|
||||
|
||||
// 1 second elapsed — pause. Remaining should be ~3500ms.
|
||||
vi.advanceTimersByTime(1000);
|
||||
toasts.pauseDwell(t.id, t.dwellMs);
|
||||
|
||||
// Hold paused for 10s (irrelevant to remaining calc).
|
||||
vi.advanceTimersByTime(10_000);
|
||||
|
||||
// Resume — remaining is ~3500ms.
|
||||
toasts.resumeDwell(t.id);
|
||||
|
||||
// At 3400ms still alive (just under remaining).
|
||||
vi.advanceTimersByTime(3400);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
|
||||
// At 3500ms+, dismissed.
|
||||
vi.advanceTimersByTime(200);
|
||||
expect(get(toasts).length).toBe(0);
|
||||
});
|
||||
|
||||
it('double-pause is a safe no-op (second call does not corrupt remaining)', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.8 }));
|
||||
const t = get(toasts)[0];
|
||||
|
||||
vi.advanceTimersByTime(500);
|
||||
toasts.pauseDwell(t.id, t.dwellMs);
|
||||
|
||||
// Second pause should not throw or mutate state badly. The
|
||||
// implementation bails early because dwellTimers no longer
|
||||
// contains the id.
|
||||
expect(() => toasts.pauseDwell(t.id, t.dwellMs)).not.toThrow();
|
||||
|
||||
// Still paused — advancing doesn't dismiss.
|
||||
vi.advanceTimersByTime(10_000);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
});
|
||||
|
||||
it('dismiss while paused clears the paused state (no zombie timer)', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.8 }));
|
||||
const t = get(toasts)[0];
|
||||
|
||||
vi.advanceTimersByTime(500);
|
||||
toasts.pauseDwell(t.id, t.dwellMs);
|
||||
|
||||
// Programmatic dismiss.
|
||||
toasts.dismiss(t.id);
|
||||
expect(get(toasts).length).toBe(0);
|
||||
|
||||
// A later resume should be a no-op (no zombie re-schedule).
|
||||
toasts.resumeDwell(t.id);
|
||||
vi.advanceTimersByTime(10_000);
|
||||
expect(get(toasts).length).toBe(0);
|
||||
});
|
||||
|
||||
it('resumeDwell on a non-paused id is a no-op', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.8 }));
|
||||
const t = get(toasts)[0];
|
||||
|
||||
// Without pausing — resume is a no-op and must not schedule
|
||||
// anything new. The original timer is still ticking.
|
||||
toasts.resumeDwell(t.id);
|
||||
|
||||
vi.advanceTimersByTime(4600);
|
||||
expect(get(toasts).length).toBe(0);
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------
|
||||
// Queue behavior
|
||||
// ---------------------------------------------------------------
|
||||
describe('queue behavior', () => {
|
||||
it('MAX_VISIBLE=4: creating a 5th toast evicts the oldest', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
// Use MemoryPromoted (not throttled) to stack 5 toasts.
|
||||
for (let i = 0; i < 5; i++) {
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.5 + i * 0.01 }));
|
||||
// Small advance so each event has a distinct identity and no batch-merge.
|
||||
vi.advanceTimersByTime(10);
|
||||
}
|
||||
|
||||
const list = get(toasts);
|
||||
expect(list.length).toBe(4);
|
||||
|
||||
// IDs are assigned 1..5 in event-processing order. Store prepends,
|
||||
// so the queue is [id=5, id=4, id=3, id=2]; id=1 was evicted.
|
||||
const ids = list.map((t) => t.id);
|
||||
expect(ids).not.toContain(1);
|
||||
expect(ids).toContain(5);
|
||||
});
|
||||
|
||||
it('clear() dismisses all toasts and cancels all timers', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit([
|
||||
makeEvent('MemoryPromoted', { new_retention: 0.8 }),
|
||||
makeEvent('MemoryDemoted', { new_retention: 0.4 }),
|
||||
]);
|
||||
expect(get(toasts).length).toBe(2);
|
||||
|
||||
toasts.clear();
|
||||
expect(get(toasts).length).toBe(0);
|
||||
|
||||
// Advancing past the dwell must not re-fire anything (no zombie timers).
|
||||
vi.advanceTimersByTime(10_000);
|
||||
expect(get(toasts).length).toBe(0);
|
||||
});
|
||||
|
||||
it('dismissing a specific id leaves the other toasts intact', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.8 }));
|
||||
vi.advanceTimersByTime(10);
|
||||
emit(makeEvent('MemoryDemoted', { new_retention: 0.4 }));
|
||||
|
||||
const list = get(toasts);
|
||||
expect(list.length).toBe(2);
|
||||
const firstId = list[list.length - 1].id; // oldest
|
||||
|
||||
toasts.dismiss(firstId);
|
||||
|
||||
const remaining = get(toasts);
|
||||
expect(remaining.length).toBe(1);
|
||||
expect(remaining[0].id).not.toBe(firstId);
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------
|
||||
// Demo sequence
|
||||
// ---------------------------------------------------------------
|
||||
describe('fireDemoSequence', () => {
|
||||
it('schedules 4 toasts staggered by 800ms', async () => {
|
||||
const { toasts, fireDemoSequence } = await loadToastStore();
|
||||
|
||||
fireDemoSequence();
|
||||
|
||||
// t=0: nothing yet (all are setTimeout, even the first at i=0 * 800 = 0ms).
|
||||
// setTimeout(_, 0) still goes to the next tick under fake timers.
|
||||
expect(get(toasts).length).toBe(0);
|
||||
|
||||
// Flush i=0.
|
||||
vi.advanceTimersByTime(1);
|
||||
expect(get(toasts).length).toBe(1);
|
||||
expect(get(toasts)[0].type).toBe('DreamCompleted');
|
||||
|
||||
// i=1 at 800ms.
|
||||
vi.advanceTimersByTime(800);
|
||||
expect(get(toasts).length).toBe(2);
|
||||
expect(get(toasts)[0].type).toBe('ConnectionDiscovered');
|
||||
|
||||
// i=2 at 1600ms.
|
||||
vi.advanceTimersByTime(800);
|
||||
expect(get(toasts).length).toBe(3);
|
||||
expect(get(toasts)[0].type).toBe('MemorySuppressed');
|
||||
|
||||
// i=3 at 2400ms.
|
||||
vi.advanceTimersByTime(800);
|
||||
expect(get(toasts).length).toBe(4);
|
||||
expect(get(toasts)[0].type).toBe('ConsolidationCompleted');
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------
|
||||
// Toast shape sanity
|
||||
// ---------------------------------------------------------------
|
||||
describe('toast shape', () => {
|
||||
it('each toast has id, createdAt, color, dwellMs fields populated', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.9 }));
|
||||
|
||||
const t = get(toasts)[0];
|
||||
expect(t.id).toBeTypeOf('number');
|
||||
expect(t.createdAt).toBeTypeOf('number');
|
||||
expect(t.color).toBeTypeOf('string');
|
||||
expect(t.dwellMs).toBeTypeOf('number');
|
||||
expect(t.color.length).toBeGreaterThan(0);
|
||||
});
|
||||
|
||||
it('ids are strictly increasing across successive toasts', async () => {
|
||||
const { toasts } = await loadToastStore();
|
||||
|
||||
emit(makeEvent('MemoryPromoted', { new_retention: 0.9 }));
|
||||
vi.advanceTimersByTime(10);
|
||||
emit(makeEvent('MemoryDemoted', { new_retention: 0.3 }));
|
||||
|
||||
const list = get(toasts);
|
||||
expect(list.length).toBe(2);
|
||||
// Store prepends, so list[0] is newer = higher id.
|
||||
expect(list[0].id).toBeGreaterThan(list[1].id);
|
||||
});
|
||||
});
|
||||
});
|
||||
341
apps/dashboard/src/lib/stores/__tests__/websocket.test.ts
Normal file
341
apps/dashboard/src/lib/stores/__tests__/websocket.test.ts
Normal file
|
|
@ -0,0 +1,341 @@
|
|||
/**
|
||||
* Unit tests for the websocket store.
|
||||
*
|
||||
* Scope: pure-store methods and derived-store behavior that can be tested
|
||||
* without a real WebSocket connection. Connection lifecycle, reconnect
|
||||
* backoff, and live handler wiring are out of scope — those are integration
|
||||
* concerns.
|
||||
*/
|
||||
import { describe, it, expect, beforeEach, vi } from 'vitest';
|
||||
import { get } from 'svelte/store';
|
||||
|
||||
// Stub the global WebSocket BEFORE importing the store, so any accidental
|
||||
// `connect()` path does not attempt a real network call and does not throw.
|
||||
// The FakeWS also captures the most-recently-constructed instance so tests
|
||||
// that want to drive `onmessage` (to exercise the Heartbeat branch of the
|
||||
// internal update handler) can do so.
|
||||
class FakeWS {
|
||||
static last: FakeWS | null = null;
|
||||
static OPEN = 1;
|
||||
readyState = 0;
|
||||
onopen: ((ev?: unknown) => void) | null = null;
|
||||
onclose: ((ev?: unknown) => void) | null = null;
|
||||
onmessage: ((ev: { data: string }) => void) | null = null;
|
||||
onerror: ((ev?: unknown) => void) | null = null;
|
||||
constructor(public url: string) {
|
||||
FakeWS.last = this;
|
||||
}
|
||||
close() {
|
||||
/* no-op */
|
||||
}
|
||||
addEventListener() {
|
||||
/* no-op */
|
||||
}
|
||||
removeEventListener() {
|
||||
/* no-op */
|
||||
}
|
||||
}
|
||||
vi.stubGlobal('WebSocket', FakeWS);
|
||||
|
||||
import {
|
||||
websocket,
|
||||
eventFeed,
|
||||
isConnected,
|
||||
memoryCount,
|
||||
avgRetention,
|
||||
suppressedCount,
|
||||
uptimeSeconds,
|
||||
heartbeat,
|
||||
formatUptime,
|
||||
} from '../websocket';
|
||||
import type { VestigeEvent } from '$types';
|
||||
|
||||
const MAX_EVENTS = 200;
|
||||
|
||||
function makeEvent(
|
||||
type: VestigeEvent['type'] = 'MemoryCreated',
|
||||
data: Record<string, unknown> = {}
|
||||
): VestigeEvent {
|
||||
return { type, data };
|
||||
}
|
||||
|
||||
function makeHeartbeat(data: Record<string, unknown> = {}): VestigeEvent {
|
||||
return {
|
||||
type: 'Heartbeat',
|
||||
data: {
|
||||
memory_count: 0,
|
||||
avg_retention: 0,
|
||||
suppressed_count: 0,
|
||||
uptime_secs: 0,
|
||||
...data,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Helper: drive a heartbeat into the store via the internal onmessage path.
|
||||
* We cannot reach `update()` directly, and `injectEvent()` deliberately
|
||||
* does NOT treat heartbeats specially, so the only way to populate
|
||||
* `lastHeartbeat` is to route through the WebSocket handler.
|
||||
*/
|
||||
function deliverHeartbeat(hb: VestigeEvent) {
|
||||
// Disconnect to reset any prior state, then connect to install handlers
|
||||
// on a fresh FakeWS instance whose onmessage we can invoke.
|
||||
websocket.disconnect();
|
||||
websocket.connect('ws://test.invalid/ws');
|
||||
const ws = FakeWS.last;
|
||||
if (!ws || !ws.onmessage) throw new Error('FakeWS onmessage not wired');
|
||||
ws.onmessage({ data: JSON.stringify(hb) });
|
||||
}
|
||||
|
||||
beforeEach(() => {
|
||||
// Reset the events array between tests; lastHeartbeat is explicitly left
|
||||
// alone here because `clearEvents()` preserves it (that is itself tested
|
||||
// below). For the derived-store defaults tests we call disconnect() to
|
||||
// fully reset the store.
|
||||
websocket.clearEvents();
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// injectEvent
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
describe('injectEvent', () => {
|
||||
it('adds a single event at index 0', () => {
|
||||
const evt = makeEvent('MemoryCreated', { id: 'a' });
|
||||
websocket.injectEvent(evt);
|
||||
const feed = get(eventFeed);
|
||||
expect(feed.length).toBe(1);
|
||||
expect(feed[0]).toEqual(evt);
|
||||
});
|
||||
|
||||
it('prepends: newest injected ends up at index 0', () => {
|
||||
const first = makeEvent('MemoryCreated', { id: 'first' });
|
||||
const second = makeEvent('MemoryUpdated', { id: 'second' });
|
||||
const third = makeEvent('MemoryDeleted', { id: 'third' });
|
||||
websocket.injectEvent(first);
|
||||
websocket.injectEvent(second);
|
||||
websocket.injectEvent(third);
|
||||
const feed = get(eventFeed);
|
||||
expect(feed.length).toBe(3);
|
||||
expect(feed[0]).toEqual(third);
|
||||
expect(feed[1]).toEqual(second);
|
||||
expect(feed[2]).toEqual(first);
|
||||
});
|
||||
|
||||
it('caps the events array at MAX_EVENTS (200)', () => {
|
||||
for (let i = 0; i < MAX_EVENTS + 50; i++) {
|
||||
websocket.injectEvent(makeEvent('MemoryCreated', { seq: i }));
|
||||
}
|
||||
const feed = get(eventFeed);
|
||||
expect(feed.length).toBe(MAX_EVENTS);
|
||||
});
|
||||
|
||||
it('evicts the oldest entry when at capacity (FIFO drop)', () => {
|
||||
// Fill to exactly capacity, then push one more: seq=0 should be gone.
|
||||
for (let i = 0; i < MAX_EVENTS; i++) {
|
||||
websocket.injectEvent(makeEvent('MemoryCreated', { seq: i }));
|
||||
}
|
||||
websocket.injectEvent(makeEvent('MemoryCreated', { seq: 999 }));
|
||||
const feed = get(eventFeed);
|
||||
expect(feed.length).toBe(MAX_EVENTS);
|
||||
expect(feed[0].data.seq).toBe(999);
|
||||
// Oldest (seq=0) evicted; tail is now the prior second-oldest (seq=1).
|
||||
expect(feed[feed.length - 1].data.seq).toBe(1);
|
||||
expect(feed.some((e) => e.data.seq === 0)).toBe(false);
|
||||
});
|
||||
|
||||
it('triggers the eventFeed derived store to emit on each injection', () => {
|
||||
const observed: number[] = [];
|
||||
const unsub = eventFeed.subscribe((events) => {
|
||||
observed.push(events.length);
|
||||
});
|
||||
// Initial subscription fires once with current length (0 after beforeEach).
|
||||
const initialEmitCount = observed.length;
|
||||
websocket.injectEvent(makeEvent());
|
||||
websocket.injectEvent(makeEvent());
|
||||
unsub();
|
||||
// Two injections should produce two additional emits beyond the initial one.
|
||||
expect(observed.length).toBe(initialEmitCount + 2);
|
||||
expect(observed[observed.length - 1]).toBe(2);
|
||||
});
|
||||
|
||||
it('does NOT treat Heartbeat-typed events specially when injected', () => {
|
||||
// Documented behavior: injectEvent is a raw prepend. Only the real
|
||||
// onmessage handler branches on type === 'Heartbeat'. If a caller
|
||||
// injects a Heartbeat, it lands in the events array, and lastHeartbeat
|
||||
// is untouched. Callers who want a heartbeat-like derived-store update
|
||||
// must route through the WebSocket handler instead.
|
||||
websocket.disconnect(); // reset lastHeartbeat to null
|
||||
const hb = makeHeartbeat({ memory_count: 42 });
|
||||
websocket.injectEvent(hb);
|
||||
const feed = get(eventFeed);
|
||||
expect(feed.length).toBe(1);
|
||||
expect(feed[0]).toEqual(hb);
|
||||
// memoryCount still 0 because lastHeartbeat was never written.
|
||||
expect(get(memoryCount)).toBe(0);
|
||||
expect(get(heartbeat)).toBeNull();
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// Derived store defaults (no heartbeat yet)
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
describe('derived stores — defaults with no heartbeat', () => {
|
||||
beforeEach(() => {
|
||||
// Full reset so lastHeartbeat is null.
|
||||
websocket.disconnect();
|
||||
});
|
||||
|
||||
it('isConnected is false after disconnect', () => {
|
||||
expect(get(isConnected)).toBe(false);
|
||||
});
|
||||
|
||||
it('heartbeat is null when no heartbeat has arrived', () => {
|
||||
expect(get(heartbeat)).toBeNull();
|
||||
});
|
||||
|
||||
it('memoryCount returns 0 when no heartbeat has arrived', () => {
|
||||
expect(get(memoryCount)).toBe(0);
|
||||
});
|
||||
|
||||
it('avgRetention returns 0 when no heartbeat has arrived', () => {
|
||||
expect(get(avgRetention)).toBe(0);
|
||||
});
|
||||
|
||||
it('suppressedCount returns 0 when no heartbeat has arrived', () => {
|
||||
expect(get(suppressedCount)).toBe(0);
|
||||
});
|
||||
|
||||
it('uptimeSeconds returns 0 when no heartbeat has arrived', () => {
|
||||
expect(get(uptimeSeconds)).toBe(0);
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// Derived stores after heartbeat
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
describe('derived stores — after heartbeat delivery', () => {
|
||||
it('memoryCount, avgRetention, suppressedCount, uptimeSeconds all update', () => {
|
||||
deliverHeartbeat(
|
||||
makeHeartbeat({
|
||||
memory_count: 123,
|
||||
avg_retention: 0.74,
|
||||
suppressed_count: 5,
|
||||
uptime_secs: 3661,
|
||||
})
|
||||
);
|
||||
expect(get(memoryCount)).toBe(123);
|
||||
expect(get(avgRetention)).toBeCloseTo(0.74);
|
||||
expect(get(suppressedCount)).toBe(5);
|
||||
expect(get(uptimeSeconds)).toBe(3661);
|
||||
const hb = get(heartbeat);
|
||||
expect(hb).not.toBeNull();
|
||||
expect(hb?.type).toBe('Heartbeat');
|
||||
});
|
||||
|
||||
it('heartbeat events do NOT enter the events array (handled by onmessage)', () => {
|
||||
websocket.disconnect();
|
||||
deliverHeartbeat(makeHeartbeat({ memory_count: 1 }));
|
||||
expect(get(eventFeed).length).toBe(0);
|
||||
});
|
||||
|
||||
it('non-heartbeat events delivered via onmessage enter the events array', () => {
|
||||
websocket.disconnect();
|
||||
websocket.connect('ws://test.invalid/ws');
|
||||
const ws = FakeWS.last!;
|
||||
ws.onmessage!({ data: JSON.stringify(makeEvent('MemoryCreated', { id: 'x' })) });
|
||||
const feed = get(eventFeed);
|
||||
expect(feed.length).toBe(1);
|
||||
expect(feed[0].type).toBe('MemoryCreated');
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// clearEvents
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
describe('clearEvents', () => {
|
||||
it('empties the events array', () => {
|
||||
websocket.injectEvent(makeEvent());
|
||||
websocket.injectEvent(makeEvent());
|
||||
expect(get(eventFeed).length).toBe(2);
|
||||
websocket.clearEvents();
|
||||
expect(get(eventFeed).length).toBe(0);
|
||||
});
|
||||
|
||||
it('preserves lastHeartbeat (does NOT clear it)', () => {
|
||||
deliverHeartbeat(makeHeartbeat({ memory_count: 77 }));
|
||||
expect(get(memoryCount)).toBe(77);
|
||||
websocket.injectEvent(makeEvent('MemoryCreated'));
|
||||
websocket.clearEvents();
|
||||
expect(get(eventFeed).length).toBe(0);
|
||||
// lastHeartbeat untouched, so memoryCount still reflects the heartbeat.
|
||||
expect(get(memoryCount)).toBe(77);
|
||||
expect(get(heartbeat)).not.toBeNull();
|
||||
});
|
||||
});
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// formatUptime
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
describe('formatUptime', () => {
|
||||
it("returns '—' for negative input", () => {
|
||||
expect(formatUptime(-1)).toBe('—');
|
||||
});
|
||||
|
||||
it("returns '—' for non-finite input (NaN, Infinity)", () => {
|
||||
expect(formatUptime(NaN)).toBe('—');
|
||||
expect(formatUptime(Infinity)).toBe('—');
|
||||
expect(formatUptime(-Infinity)).toBe('—');
|
||||
});
|
||||
|
||||
it("returns '0s' for 0 (boundary: non-negative, all units zero)", () => {
|
||||
// secs=0 is NOT < 0, so it falls through to the '${s}s' branch.
|
||||
expect(formatUptime(0)).toBe('0s');
|
||||
});
|
||||
|
||||
it('seconds-only branch: 47s', () => {
|
||||
expect(formatUptime(47)).toBe('47s');
|
||||
});
|
||||
|
||||
it('seconds boundary: 59s → "59s"', () => {
|
||||
expect(formatUptime(59)).toBe('59s');
|
||||
});
|
||||
|
||||
it('minute boundary: 60s → "1m" (no trailing 0s)', () => {
|
||||
expect(formatUptime(60)).toBe('1m');
|
||||
});
|
||||
|
||||
it('minutes + seconds: 190s → "3m 10s"', () => {
|
||||
expect(formatUptime(190)).toBe('3m 10s');
|
||||
});
|
||||
|
||||
it('hour boundary minus one: 3599s → "59m 59s"', () => {
|
||||
expect(formatUptime(3599)).toBe('59m 59s');
|
||||
});
|
||||
|
||||
it('hour boundary: 3600s → "1h" (no trailing 0m)', () => {
|
||||
expect(formatUptime(3600)).toBe('1h');
|
||||
});
|
||||
|
||||
it('hours + minutes: 11520s (3h 12m) → "3h 12m"', () => {
|
||||
expect(formatUptime(3 * 3600 + 12 * 60)).toBe('3h 12m');
|
||||
});
|
||||
|
||||
it('day boundary minus one: 86399s → "23h 59m"', () => {
|
||||
// Two-most-significant-units rule: hours + minutes, seconds dropped.
|
||||
expect(formatUptime(86399)).toBe('23h 59m');
|
||||
});
|
||||
|
||||
it('day boundary: 86400s → "1d" (no trailing 0h)', () => {
|
||||
expect(formatUptime(86400)).toBe('1d');
|
||||
});
|
||||
|
||||
it('days + hours: 4d 2h → "4d 2h" (minutes dropped)', () => {
|
||||
expect(formatUptime(4 * 86400 + 2 * 3600 + 37 * 60)).toBe('4d 2h');
|
||||
});
|
||||
});
|
||||
Loading…
Add table
Add a link
Reference in a new issue