mirror of https://github.com/rowboatlabs/rowboat.git
synced 2026-05-03 12:22:38 +02:00

add posthog analytics for llm usage and auth events

Captures per-LLM-call token usage tagged by feature (copilot chat, track block, meeting note, knowledge sync), plus sign-in / sign-out and identity. Renderer and main share one PostHog identity so events from either process resolve to the same user. See apps/x/ANALYTICS.md for the event catalog, person properties, use-case taxonomy, and how to add new events.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

This commit is contained in:
  parent d42fb26bcc
  commit 43c1ba719f

31 changed files with 625 additions and 36 deletions
@@ -109,6 +109,7 @@ Long-form docs for specific features. Read the relevant file before making changes
 | Feature | Doc |
 |---------|-----|
 | Track Blocks — auto-updating note content (scheduled / event-driven / manual), Copilot skill, prompts catalog | `apps/x/TRACKS.md` |
+| Analytics — PostHog event catalog, person properties, use-case taxonomy, how to add a new event | `apps/x/ANALYTICS.md` |

 ## Common Tasks

apps/x/ANALYTICS.md (new file, 145 lines)

@@ -0,0 +1,145 @@
# Analytics

> PostHog instrumentation for `apps/x`. We capture LLM token usage (broken down by feature) and identity/auth events. Renderer (`posthog-js`) and main (`posthog-node`) share one stable distinct_id and one identified user, so events from either process resolve to the same person.

## Identity model

- **Anonymous distinct_id** = `installationId` from `~/.rowboat/config/installation.json` (auto-generated on first run; see `packages/core/src/analytics/installation.ts`).
- Renderer fetches it from main on startup via the `analytics:bootstrap` IPC channel and passes it as PostHog's `bootstrap.distinctID`. Main uses it directly in `posthog-node`.
- **On rowboat sign-in**: `posthog.identify(rowboatUserId)` runs in **both** processes.
  - Main does it from `apps/main/src/oauth-handler.ts:285` (after `getBillingInfo()` resolves) — this is the load-bearing call, since main always runs.
  - Renderer mirrors it via `apps/renderer/src/hooks/useAnalyticsIdentity.ts`, listening on the `oauth:didConnect` IPC event.
  - Main also calls `alias()` so events emitted under the anonymous installation_id are linked to the identified user retroactively.
- **On rowboat sign-out**: `posthog.reset()` in both processes; future events resolve to the installation_id again.
- **`email`** is set on `identify` from main only (sourced from `/v1/me`). Person properties are server-side, so the renderer's events resolve to the same record without redundantly setting it.
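The `alias()` step is what keeps pre-sign-in history attached to the user. A minimal model of that resolution, with a stub client standing in for PostHog (the real calls go through `posthog-node`; this is illustrative only):

```typescript
// Stub PostHog client modeling how alias() links the anonymous
// installation_id to the identified user. Not the posthog-node API surface.
type Captured = { distinctId: string; event: string };

class StubPostHog {
  events: Captured[] = [];
  private aliases = new Map<string, string>(); // alias -> canonical id

  capture(e: Captured) { this.events.push(e); }
  alias(opts: { distinctId: string; alias: string }) {
    this.aliases.set(opts.alias, opts.distinctId);
  }
  // Resolve any distinct_id to the person it now belongs to.
  resolve(id: string): string { return this.aliases.get(id) ?? id; }
}

const ph = new StubPostHog();
const installationId = "install-7f3a"; // hypothetical anonymous distinct_id

// Pre-sign-in: events land under the installation id.
ph.capture({ distinctId: installationId, event: "llm_usage" });

// Sign-in: main aliases the installation id to the rowboat user id, so the
// earlier llm_usage event retroactively resolves to the same person.
ph.alias({ distinctId: "rowboat-user-123", alias: installationId });

console.log(ph.resolve(installationId)); // "rowboat-user-123"
```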
## Event catalog

### `llm_usage`

Emitted whenever ai-sdk returns token usage (one event per LLM call, not per run).

| Property | Type | Notes |
|---|---|---|
| `use_case` | enum | `copilot_chat` / `track_block` / `meeting_note` / `knowledge_sync` |
| `sub_use_case` | string? | Refines `use_case` — see taxonomy table below |
| `agent_name` | string? | Present when the call goes through an agent run (`createRun`); omitted for direct `generateText`/`generateObject` |
| `model` | string | e.g. `claude-sonnet-4-6` |
| `provider` | string | `rowboat` = cloud LLM gateway; otherwise the BYOK provider (`openai`, `anthropic`, `ollama`, etc.) |
| `input_tokens` | number | |
| `output_tokens` | number | |
| `total_tokens` | number | |
| `cached_input_tokens` | number? | When the provider reports it |
| `reasoning_tokens` | number? | When the provider reports it |
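The table above can be summarized as a type. This is an illustrative shape only; the real payload is assembled in `packages/core/src/analytics/usage.ts`, whose exact definition is not shown here:

```typescript
// Illustrative shape of an `llm_usage` event's properties, mirroring the
// table above. Fields the table marks string?/number? carry `?` here.
type LlmUsageProperties = {
  use_case: "copilot_chat" | "track_block" | "meeting_note" | "knowledge_sync";
  sub_use_case?: string;
  agent_name?: string;
  model: string;
  provider: string; // "rowboat" for the cloud gateway, else the BYOK provider
  input_tokens: number;
  output_tokens: number;
  total_tokens: number;
  cached_input_tokens?: number;
  reasoning_tokens?: number;
};

// Example event as it might appear in PostHog (values are made up):
const example: LlmUsageProperties = {
  use_case: "track_block",
  sub_use_case: "run",
  agent_name: "daily-digest", // hypothetical agent name
  model: "claude-sonnet-4-6",
  provider: "rowboat",
  input_tokens: 1200,
  output_tokens: 300,
  total_tokens: 1500,
};

console.log(example.use_case); // "track_block"
```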
#### Use-case taxonomy

Every `llm_usage` emit point in the codebase:

| `use_case` | `sub_use_case` | `agent_name`? | Where | File:line |
|---|---|---|---|---|
| `copilot_chat` | (none) | yes | User chat in renderer (default for any `createRun` without `useCase`) | `packages/core/src/agents/runtime.ts:1313` (finish-step in `streamLlm`) |
| `copilot_chat` | `scheduled` | yes | Background scheduled agent runner | `packages/core/src/agent-schedule/runner.ts:167` |
| `copilot_chat` | `file_parse` | inherits | `parseFile` builtin tool inside any chat | `packages/core/src/application/lib/builtin-tools.ts:770` |
| `track_block` | `routing` | no | Pass 1 routing classifier (`generateObject`) | `packages/core/src/knowledge/track/routing.ts:104` |
| `track_block` | `run` | yes | Pass 2 track block execution | `packages/core/src/knowledge/track/runner.ts:109` (createRun) |
| `meeting_note` | (none) | no | Meeting transcript summarizer (`generateText`) | `packages/core/src/knowledge/summarize_meeting.ts:161` |
| `knowledge_sync` | `agent_notes` | yes | Agent notes learning service | `packages/core/src/knowledge/agent_notes.ts:309` (createRun) |
| `knowledge_sync` | `tag_notes` | yes | Note tagging | `packages/core/src/knowledge/tag_notes.ts:86` (createRun) |
| `knowledge_sync` | `build_graph` | yes | Knowledge graph note creation | `packages/core/src/knowledge/build_graph.ts:253` (createRun) |
| `knowledge_sync` | `label_emails` | yes | Email labeling | `packages/core/src/knowledge/label_emails.ts:73` (createRun) |
| `knowledge_sync` | `inline_task_run` | yes | Inline `@rowboat` task execution (two call sites) | `packages/core/src/knowledge/inline_tasks.ts:471, 552` (createRun) |
| `knowledge_sync` | `inline_task_classify` | no | Inline task scheduling classifier (`generateText`) | `packages/core/src/knowledge/inline_tasks.ts:673` |
| `knowledge_sync` | `pre_built` | yes | Pre-built scheduled agents | `packages/core/src/pre_built/runner.ts:43` (createRun) |

`testModelConnection` in `packages/core/src/models/models.ts` is **not** instrumented (diagnostic only — it would skew per-model counts).
### `user_signed_in`

Emitted when rowboat OAuth completes. Properties: `plan`, `status` (subscription state from `/v1/me`).

Emitted from **both** processes:

- Main (`apps/main/src/oauth-handler.ts:290`) — always fires; load-bearing.
- Renderer (`apps/renderer/src/hooks/useAnalyticsIdentity.ts:75`) — fires only when the renderer is open. Same distinct_id, so dedup is automatic in PostHog dashboards.

### `user_signed_out`

Emitted on rowboat disconnect. No properties. Followed immediately by `posthog.reset()`.

Emit points: `apps/main/src/oauth-handler.ts:369` and `apps/renderer/src/hooks/useAnalyticsIdentity.ts:82`.

### Other events (pre-existing, not added by the LLM-usage work)

All in `apps/renderer/src/lib/analytics.ts`:

- `chat_session_created` — `{ run_id }`
- `chat_message_sent` — `{ voice_input, voice_output, search_enabled }`
- `oauth_connected` / `oauth_disconnected` — `{ provider }`
- `voice_input_started` — no properties
- `search_executed` — `{ types: string[] }`
- `note_exported` — `{ format }`
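These renderer events all go through small typed wrappers. The pattern can be sketched like this (illustrative: the capture function is injected so the snippet is self-contained, the real wrappers call `posthog.capture` from `posthog-js`, and the wrapper names here are assumptions):

```typescript
// Typed event wrappers in the style of apps/renderer/src/lib/analytics.ts.
// Only the event names and property shapes come from the catalog above.
type Capture = (event: string, props?: Record<string, unknown>) => void;

function makeAnalytics(capture: Capture) {
  return {
    chatSessionCreated: (runId: string) =>
      capture("chat_session_created", { run_id: runId }),
    searchExecuted: (types: string[]) =>
      capture("search_executed", { types }),
    noteExported: (format: string) =>
      capture("note_exported", { format }),
  };
}

// Demonstration with a recording capture function:
const recorded: Array<[string, unknown]> = [];
const analytics = makeAnalytics((e, p) => recorded.push([e, p]));
analytics.noteExported("md");
console.log(recorded[0][0]); // "note_exported"
```

Centralizing the wrappers keeps event names and property keys consistent, which is exactly the drift this document's "update this file in the same PR" rule guards against.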
## Person properties

Persistent across sessions for the same user. Set via `posthog.people.set` or as the `properties` arg to `identify`.

| Property | Set by | Notes |
|---|---|---|
| `email` | main on identify | From `/v1/me`; powers PostHog cohort match + integrations |
| `plan`, `status` | main on identify | Subscription state |
| `api_url` | both processes (init + identify) | Distinguishes prod / staging / custom — assign meaning in PostHog dashboard. `https://api.x.rowboatlabs.com` = production |
| `signed_in` | renderer | `true` while rowboat OAuth is connected |
| `{provider}_connected` | renderer | One of `gmail`, `calendar`, `slack`, `rowboat` |
| `total_notes` | renderer (init) | Workspace size signal |
| `has_used_search`, `has_used_voice` | renderer | One-shot first-use flags |
## How to add a new event

1. **Naming**: `snake_case`, `[object]_[verb]` shape (e.g. `note_exported`, not `exportedNote`). Matches PostHog convention.
2. **Pick the right helper**:
   - LLM token usage → `captureLlmUsage()` from `@x/core/dist/analytics/usage.js`. Always include `useCase`; add `subUseCase` if it refines an existing top-level case.
   - Anything else from main → `capture()` from `@x/core/dist/analytics/posthog.js`.
   - Anything else from renderer → add a typed wrapper to `apps/renderer/src/lib/analytics.ts` and call it from the UI code (don't call `posthog.capture()` directly from components).
3. **If it's a new LLM call site**:
   - Goes through `createRun`? Pass `useCase` (and optionally `subUseCase`) to the create call. The runtime auto-emits at every `finish-step` — no further code needed.
   - Direct `generateText` / `generateObject`? Call `captureLlmUsage` after the call with `model`, `provider`, and `usage` from the result.
   - Inside a builtin tool? Call `getCurrentUseCase()` from `analytics/use_case.ts` first — the parent run's tag is propagated via `AsyncLocalStorage`. Use `ctx?.useCase ?? 'copilot_chat'` as the fallback.
4. **Update this file in the same PR.** That's the contract — without it, dashboards and downstream consumers drift.
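For the direct-call case in step 3, the shape is roughly as follows. Both helpers are stubbed so the example runs standalone; the real `generateText` is async and the real `captureLlmUsage` lives in `@x/core/dist/analytics/usage.js`, whose exact signature is an assumption here:

```typescript
// Sketch: tagging a direct (non-createRun) LLM call with captureLlmUsage.
type Usage = { inputTokens: number; outputTokens: number; totalTokens: number };

const captured: Array<Record<string, unknown>> = [];
function captureLlmUsage(e: Record<string, unknown>): void { captured.push(e); }

// Synchronous stand-in for the ai-sdk call; the real one is awaited and
// returns token usage on the result in the same way.
function generateText(_opts: { model: string; prompt: string }): { text: string; usage: Usage } {
  return { text: "summary…", usage: { inputTokens: 90, outputTokens: 10, totalTokens: 100 } };
}

function summarizeMeeting(transcript: string): string {
  const model = "claude-sonnet-4-6";
  const result = generateText({ model, prompt: transcript });
  // One llm_usage event per LLM call, tagged with the use case.
  captureLlmUsage({ useCase: "meeting_note", model, provider: "rowboat", usage: result.usage });
  return result.text;
}

summarizeMeeting("transcript…");
console.log(captured.length); // 1
```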
## How to add a new use-case sub-case

- **New `sub_use_case` under an existing top-level case**: just pick a string and add a row to the taxonomy table above. No code changes beyond the call site.
- **New top-level `use_case`**: edit the `UseCase` enum in `packages/shared/src/runs.ts` and the matching `UseCase` type in `packages/core/src/analytics/use_case.ts`. Then update this doc.
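Because `UseCase` is duplicated across `packages/shared` and `packages/core`, an exhaustive `switch` is a cheap way to make the compiler flag a missed update. A sketch (the union below mirrors the documented cases; it is not the actual source definition):

```typescript
type UseCase = "copilot_chat" | "track_block" | "meeting_note" | "knowledge_sync";

// If a new top-level case is added to the union but not handled here,
// the `never` assignment in the default branch becomes a compile error.
function label(uc: UseCase): string {
  switch (uc) {
    case "copilot_chat": return "Copilot chat";
    case "track_block": return "Track block";
    case "meeting_note": return "Meeting note";
    case "knowledge_sync": return "Knowledge sync";
    default: {
      const unreachable: never = uc;
      return unreachable;
    }
  }
}

console.log(label("track_block")); // "Track block"
```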
## Configuration

PostHog credentials live in two env vars (also baked into the binary at packaging time — never set at runtime in distributed builds):

- `VITE_PUBLIC_POSTHOG_KEY` — project API key (e.g. `phc_xxx`). Public-facing — safe to commit if you'd rather hardcode it.
- `VITE_PUBLIC_POSTHOG_HOST` — e.g. `https://us.i.posthog.com`. Defaults to US cloud if unset.

Where they're consumed:

- **Renderer** (Vite): `import.meta.env.VITE_PUBLIC_POSTHOG_*` — inlined at build time.
- **Main** (esbuild via `apps/main/bundle.mjs`): inlined into `main.cjs` at packaging time using esbuild `define`. In dev (`npm run dev`), main reads them from `process.env` at runtime.

For GitHub Actions / packaged builds: set both as workflow env vars (from secrets) on the step that runs `npm run package` or `npm run make`. They'll be baked in.

If unset, analytics no-ops silently — you'll see `[Analytics] POSTHOG_KEY not set; analytics disabled` in the main-process logs.

`installationId`: stored in `~/.rowboat/config/installation.json`, generated on first run.
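The silent no-op can be modeled as a factory guard. This is illustrative only; the real client setup is in `packages/core/src/analytics/posthog.ts`, and its internals are assumptions here:

```typescript
// If no key is configured, hand back a no-op capture function instead of a
// real client, so call sites never need to check whether analytics is enabled.
type CaptureFn = (event: string, props?: Record<string, unknown>) => void;

function initAnalytics(key: string | undefined): { capture: CaptureFn; enabled: boolean } {
  if (!key) {
    console.log("[Analytics] POSTHOG_KEY not set; analytics disabled");
    return { capture: () => {}, enabled: false };
  }
  // A real implementation would construct a posthog-node client here.
  return { capture: () => { /* would forward to PostHog */ }, enabled: true };
}

const analytics = initAnalytics(undefined); // logs the "disabled" message
analytics.capture("llm_usage"); // safe no-op
console.log(analytics.enabled); // false
```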
## File map

| File | Purpose |
|---|---|
| `packages/core/src/analytics/installation.ts` | Stable per-install distinct_id |
| `packages/core/src/analytics/posthog.ts` | Main-process client (`capture`, `identify`, `reset`, `shutdown`) |
| `packages/core/src/analytics/usage.ts` | `captureLlmUsage()` helper |
| `packages/core/src/analytics/use_case.ts` | `AsyncLocalStorage` for tool-internal LLM call inheritance |
| `apps/renderer/src/lib/analytics.ts` | Renderer event wrappers |
| `apps/renderer/src/hooks/useAnalyticsIdentity.ts` | Renderer identify/reset on OAuth events |
| `apps/main/src/oauth-handler.ts` | Main-side identify/reset/sign-in/sign-out events |
| `apps/main/src/main.ts` | `before-quit` hook flushes queued events |
| `packages/shared/src/ipc.ts` | `analytics:bootstrap` IPC channel definition |
| `apps/main/src/ipc.ts` | `analytics:bootstrap` handler + forwards `userId` on `oauth:didConnect` |
| `apps/main/bundle.mjs` | Bakes `POSTHOG_KEY`/`POSTHOG_HOST` into packaged `main.cjs` |
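The `AsyncLocalStorage` mechanism behind `use_case.ts` can be sketched with Node's `async_hooks`. Illustrative only: the real `enterUseCase`/`getCurrentUseCase` signatures are assumptions based on how the calls appear in this commit:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Sketch of use-case propagation: a run installs its tag, and any LLM call
// made inside a builtin tool on the same async path can read it back.
type UseCaseCtx = { useCase: string; subUseCase?: string; agentName?: string };

const storage = new AsyncLocalStorage<UseCaseCtx>();

function enterUseCase(ctx: UseCaseCtx): void {
  // enterWith tags the current async context and everything spawned from it.
  storage.enterWith(ctx);
}

function getCurrentUseCase(): UseCaseCtx | undefined {
  return storage.getStore();
}

// A track-block run installs its tag…
enterUseCase({ useCase: "track_block", subUseCase: "run" });

// …and a tool-internal LLM call later on the same path inherits it,
// with copilot_chat as the documented fallback.
const ctx = getCurrentUseCase();
console.log(ctx?.useCase ?? "copilot_chat"); // "track_block"
```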
@@ -31,6 +31,11 @@ await esbuild.build({
   // Replace import.meta.url directly with our polyfill variable
   define: {
     'import.meta.url': '__import_meta_url',
+    // Inject PostHog credentials at build time. Reuse the renderer's
+    // VITE_PUBLIC_* envs so packaging only needs one set of values.
+    // Empty strings disable analytics gracefully.
+    'process.env.POSTHOG_KEY': JSON.stringify(process.env.VITE_PUBLIC_POSTHOG_KEY ?? ''),
+    'process.env.POSTHOG_HOST': JSON.stringify(process.env.VITE_PUBLIC_POSTHOG_HOST ?? 'https://us.i.posthog.com'),
   },
 });
@@ -46,6 +46,8 @@ import { getAccessToken } from '@x/core/dist/auth/tokens.js';
 import { getRowboatConfig } from '@x/core/dist/config/rowboat.js';
 import { triggerTrackUpdate } from '@x/core/dist/knowledge/track/runner.js';
 import { trackBus } from '@x/core/dist/knowledge/track/bus.js';
+import { getInstallationId } from '@x/core/dist/analytics/installation.js';
+import { API_URL } from '@x/core/dist/config/env.js';
 import {
   fetchYaml,
   updateTrackBlock,
@@ -342,7 +344,7 @@ function emitServiceEvent(event: z.infer<typeof ServiceEvent>): void {
   }
 }

-export function emitOAuthEvent(event: { provider: string; success: boolean; error?: string }): void {
+export function emitOAuthEvent(event: { provider: string; success: boolean; error?: string; userId?: string }): void {
   const windows = BrowserWindow.getAllWindows();
   for (const win of windows) {
     if (!win.isDestroyed() && win.webContents) {
@@ -415,6 +417,12 @@ export function setupIpcHandlers() {
     // args is null for this channel (no request payload)
     return getVersions();
   },
+  'analytics:bootstrap': async () => {
+    return {
+      installationId: getInstallationId(),
+      apiUrl: API_URL,
+    };
+  },
   'workspace:getRoot': async () => {
     return workspace.getRoot();
   },
@@ -26,6 +26,7 @@ import { init as initAgentNotes } from "@x/core/dist/knowledge/agent_notes.js";
 import { init as initTrackScheduler } from "@x/core/dist/knowledge/track/scheduler.js";
 import { init as initTrackEventProcessor } from "@x/core/dist/knowledge/track/events.js";
 import { init as initLocalSites, shutdown as shutdownLocalSites } from "@x/core/dist/local-sites/server.js";
+import { shutdown as shutdownAnalytics } from "@x/core/dist/analytics/posthog.js";
 import { initConfigs } from "@x/core/dist/config/initConfigs.js";
 import started from "electron-squirrel-startup";
@@ -318,4 +319,7 @@ app.on("before-quit", () => {
   shutdownLocalSites().catch((error) => {
     console.error('[LocalSites] Failed to shut down cleanly:', error);
   });
+  shutdownAnalytics().catch((error) => {
+    console.error('[Analytics] Failed to flush on quit:', error);
+  });
 });
@@ -12,6 +12,7 @@ import { triggerSync as triggerCalendarSync } from '@x/core/dist/knowledge/sync_
 import { triggerSync as triggerFirefliesSync } from '@x/core/dist/knowledge/sync_fireflies.js';
 import { emitOAuthEvent } from './ipc.js';
 import { getBillingInfo } from '@x/core/dist/billing/billing.js';
+import { capture as analyticsCapture, identify as analyticsIdentify, reset as analyticsReset } from '@x/core/dist/analytics/posthog.js';

 const REDIRECT_URI = 'http://localhost:8080/oauth/callback';
@@ -275,16 +276,33 @@ export async function connectProvider(provider: string, credentials?: { clientId
     // For Rowboat sign-in, ensure user + Stripe customer exist before
     // notifying the renderer. Without this, parallel API calls from
     // multiple renderer hooks race to create the user, causing duplicates.
+    let signedInUserId: string | undefined;
     if (provider === 'rowboat') {
       try {
-        await getBillingInfo();
+        const billing = await getBillingInfo();
+        if (billing.userId) {
+          signedInUserId = billing.userId;
+          analyticsIdentify(billing.userId, {
+            ...(billing.userEmail ? { email: billing.userEmail } : {}),
+            plan: billing.subscriptionPlan,
+            status: billing.subscriptionStatus,
+          });
+          analyticsCapture('user_signed_in', {
+            plan: billing.subscriptionPlan,
+            status: billing.subscriptionStatus,
+          });
+        }
       } catch (meError) {
         console.error('[OAuth] Failed to initialize user via /v1/me:', meError);
       }
     }

     // Emit success event to renderer
-    emitOAuthEvent({ provider, success: true });
+    emitOAuthEvent({
+      provider,
+      success: true,
+      ...(signedInUserId ? { userId: signedInUserId } : {}),
+    });
   } catch (error) {
     console.error('OAuth token exchange failed:', error);
     // Log cause chain for debugging (e.g. OAUTH_INVALID_RESPONSE -> OperationProcessingError)
@@ -347,6 +365,10 @@ export async function disconnectProvider(provider: string): Promise<{ success: b
   try {
     const oauthRepo = getOAuthRepo();
     await oauthRepo.delete(provider);
+    if (provider === 'rowboat') {
+      analyticsCapture('user_signed_out');
+      analyticsReset();
+    }
     // Notify renderer so sidebar, voice, and billing re-check state
     emitOAuthEvent({ provider, success: false });
     return { success: true };
@@ -58,15 +58,29 @@ export function useAnalyticsIdentity() {
   // Listen for OAuth connect/disconnect events to update identity
   useEffect(() => {
     const cleanup = window.ipc.on('oauth:didConnect', (event) => {
-      if (!event.success) return
-
-      // If Rowboat provider connected, identify user
-      if (event.provider === 'rowboat' && event.userId) {
-        posthog.identify(event.userId)
-        posthog.people.set({ signed_in: true })
-      }
-
-      posthog.people.set({ [`${event.provider}_connected`]: true })
+      if (event.provider !== 'rowboat') {
+        // Other providers: just toggle the connection flag
+        if (event.success) {
+          posthog.people.set({ [`${event.provider}_connected`]: true })
+        }
+        return
+      }
+
+      // Rowboat sign-in
+      if (event.success) {
+        if (event.userId) {
+          posthog.identify(event.userId)
+        }
+        posthog.people.set({ signed_in: true, rowboat_connected: true })
+        posthog.capture('user_signed_in')
+        return
+      }
+
+      // Rowboat sign-out — flip flags, capture, and reset distinct_id so
+      // future events on this device don't get attributed to the prior user.
+      posthog.people.set({ signed_in: false, rowboat_connected: false })
+      posthog.capture('user_signed_out')
+      posthog.reset()
     })

     return cleanup
@@ -2,15 +2,31 @@ import { StrictMode } from 'react'
 import { createRoot } from 'react-dom/client'
 import './index.css'
 import App from './App.tsx'
+import posthog from 'posthog-js'
 import { PostHogProvider } from 'posthog-js/react'
 import { ThemeProvider } from '@/contexts/theme-context'

-const options = {
+// Fetch the stable installation ID from main so renderer + main share one
+// PostHog distinct_id. Falls back to PostHog's auto-generated anonymous ID
+// if the IPC call fails (rare — main is always up before renderer).
+async function bootstrap() {
+  let installationId: string | undefined
+  let apiUrl: string | undefined
+  try {
+    const result = await window.ipc.invoke('analytics:bootstrap', null)
+    installationId = result.installationId
+    apiUrl = result.apiUrl
+  } catch (err) {
+    console.error('[Analytics] Failed to bootstrap from main:', err)
+  }
+
+  const options = {
     api_host: import.meta.env.VITE_PUBLIC_POSTHOG_HOST,
     defaults: '2025-11-30',
-} as const
+    ...(installationId ? { bootstrap: { distinctID: installationId } } : {}),
+  } as const

   createRoot(document.getElementById('root')!).render(
     <StrictMode>
       <PostHogProvider apiKey={import.meta.env.VITE_PUBLIC_POSTHOG_KEY} options={options}>
         <ThemeProvider defaultTheme="system">
@@ -18,4 +34,13 @@ createRoot(document.getElementById('root')!).render(
         </ThemeProvider>
       </PostHogProvider>
     </StrictMode>,
   )
+
+  // Tag the active person record with api_url so anonymous users are also
+  // segmentable by environment.
+  if (apiUrl) {
+    posthog.people.set({ api_url: apiUrl })
+  }
+}
+
+bootstrap()
@@ -37,6 +37,7 @@
     "openid-client": "^6.8.1",
     "papaparse": "^5.5.3",
     "pdf-parse": "^2.4.5",
+    "posthog-node": "^4.18.0",
     "react": "^19.2.3",
     "xlsx": "^0.18.5",
     "yaml": "^2.8.2",
@@ -164,7 +164,11 @@ async function runAgent(

   try {
     // Create a new run via core (resolves agent + default model+provider).
-    const run = await createRun({ agentId: agentName });
+    const run = await createRun({
+      agentId: agentName,
+      useCase: 'copilot_chat',
+      subUseCase: 'scheduled',
+    });
     console.log(`[AgentRunner] Created run ${run.id} for agent ${agentName}`);

     // Add the starting message as a user message
@ -26,6 +26,8 @@ import { IRunsLock } from "../runs/lock.js";
|
||||||
import { IAbortRegistry } from "../runs/abort-registry.js";
|
import { IAbortRegistry } from "../runs/abort-registry.js";
|
||||||
import { PrefixLogger } from "@x/shared";
|
import { PrefixLogger } from "@x/shared";
|
||||||
import { parse } from "yaml";
|
import { parse } from "yaml";
|
||||||
|
import { captureLlmUsage } from "../analytics/usage.js";
|
||||||
|
import { enterUseCase, type UseCase } from "../analytics/use_case.js";
|
||||||
import { getRaw as getNoteCreationRaw } from "../knowledge/note_creation.js";
|
import { getRaw as getNoteCreationRaw } from "../knowledge/note_creation.js";
|
||||||
import { getRaw as getLabelingAgentRaw } from "../knowledge/labeling_agent.js";
|
import { getRaw as getLabelingAgentRaw } from "../knowledge/labeling_agent.js";
|
||||||
import { getRaw as getNoteTaggingAgentRaw } from "../knowledge/note_tagging_agent.js";
|
import { getRaw as getNoteTaggingAgentRaw } from "../knowledge/note_tagging_agent.js";
|
||||||
|
|
@ -650,6 +652,8 @@ export class AgentState {
|
||||||
agentName: string | null = null;
|
agentName: string | null = null;
|
||||||
runModel: string | null = null;
|
runModel: string | null = null;
|
||||||
runProvider: string | null = null;
|
runProvider: string | null = null;
|
||||||
|
runUseCase: UseCase | null = null;
|
||||||
|
runSubUseCase: string | null = null;
|
||||||
messages: z.infer<typeof MessageList> = [];
|
messages: z.infer<typeof MessageList> = [];
|
||||||
lastAssistantMsg: z.infer<typeof AssistantMessage> | null = null;
|
lastAssistantMsg: z.infer<typeof AssistantMessage> | null = null;
|
||||||
subflowStates: Record<string, AgentState> = {};
|
subflowStates: Record<string, AgentState> = {};
|
||||||
|
|
@ -765,6 +769,8 @@ export class AgentState {
|
||||||
this.agentName = event.agentName;
|
this.agentName = event.agentName;
|
||||||
this.runModel = event.model;
|
this.runModel = event.model;
|
||||||
this.runProvider = event.provider;
|
this.runProvider = event.provider;
|
||||||
|
this.runUseCase = event.useCase ?? null;
|
||||||
|
this.runSubUseCase = event.subUseCase ?? null;
|
||||||
break;
|
break;
|
||||||
case "spawn-subflow":
|
case "spawn-subflow":
|
||||||
// Seed the subflow state with its agent so downstream loadAgent works.
|
// Seed the subflow state with its agent so downstream loadAgent works.
|
||||||
|
|
@ -775,6 +781,8 @@ export class AgentState {
|
||||||
this.subflowStates[event.toolCallId].agentName = event.agentName;
|
this.subflowStates[event.toolCallId].agentName = event.agentName;
|
||||||
this.subflowStates[event.toolCallId].runModel = this.runModel;
|
this.subflowStates[event.toolCallId].runModel = this.runModel;
|
||||||
this.subflowStates[event.toolCallId].runProvider = this.runProvider;
|
this.subflowStates[event.toolCallId].runProvider = this.runProvider;
|
||||||
|
this.subflowStates[event.toolCallId].runUseCase = this.runUseCase;
|
||||||
|
this.subflowStates[event.toolCallId].runSubUseCase = this.runSubUseCase;
|
||||||
break;
|
break;
|
||||||
case "message":
|
case "message":
|
||||||
this.messages.push(event.message);
|
this.messages.push(event.message);
|
||||||
|
|
@ -881,6 +889,14 @@ export async function* streamAgent({
|
||||||
const model = provider.languageModel(modelId);
|
const model = provider.languageModel(modelId);
|
||||||
logger.log(`using model: ${modelId} (provider: ${state.runProvider})`);
|
logger.log(`using model: ${modelId} (provider: ${state.runProvider})`);
|
||||||
|
|
||||||
|
// Install use-case context for tool-internal LLM calls (e.g. parseFile)
|
||||||
|
// so they can tag their `llm_usage` events with the parent run's category.
|
||||||
|
enterUseCase({
|
||||||
|
useCase: state.runUseCase ?? "copilot_chat",
|
||||||
|
...(state.runSubUseCase ? { subUseCase: state.runSubUseCase } : {}),
|
||||||
|
...(state.agentName ? { agentName: state.agentName } : {}),
|
||||||
|
});
|
||||||
|
|
||||||
let loopCounter = 0;
|
let loopCounter = 0;
|
||||||
let voiceInput = false;
|
let voiceInput = false;
|
||||||
let voiceOutput: 'summary' | 'full' | null = null;
|
let voiceOutput: 'summary' | 'full' | null = null;
|
||||||
|
|
@ -1114,6 +1130,13 @@ export async function* streamAgent({
|
||||||
instructionsWithDateTime,
|
instructionsWithDateTime,
|
||||||
tools,
|
tools,
|
||||||
signal,
|
signal,
|
||||||
|
{
|
||||||
|
useCase: state.runUseCase ?? "copilot_chat",
|
||||||
|
...(state.runSubUseCase ? { subUseCase: state.runSubUseCase } : {}),
|
||||||
|
agentName: state.agentName ?? undefined,
|
||||||
|
modelId,
|
||||||
|
providerName: state.runProvider!,
|
||||||
|
},
|
||||||
)) {
|
)) {
|
||||||
messageBuilder.ingest(event);
|
messageBuilder.ingest(event);
|
||||||
yield* processEvent({
|
yield* processEvent({
|
||||||
|
|
@@ -1201,12 +1224,21 @@ export async function* streamAgent({
     }
 }

+interface StreamLlmAnalytics {
+    useCase: UseCase;
+    subUseCase?: string;
+    agentName?: string;
+    modelId: string;
+    providerName: string;
+}
+
 async function* streamLlm(
     model: LanguageModel,
     messages: z.infer<typeof MessageList>,
     instructions: string,
     tools: ToolSet,
     signal?: AbortSignal,
+    analytics?: StreamLlmAnalytics,
 ): AsyncGenerator<z.infer<typeof LlmStepStreamEvent>, void, unknown> {
     const converted = convertFromMessages(messages);
     console.log(`! SENDING payload to model: `, JSON.stringify(converted))
@@ -1277,6 +1309,16 @@ async function* streamLlm(
             };
             break;
         case "finish-step":
+            if (analytics) {
+                captureLlmUsage({
+                    useCase: analytics.useCase,
+                    ...(analytics.subUseCase ? { subUseCase: analytics.subUseCase } : {}),
+                    ...(analytics.agentName ? { agentName: analytics.agentName } : {}),
+                    model: analytics.modelId,
+                    provider: analytics.providerName,
+                    usage: event.usage,
+                });
+            }
             yield {
                 type: "finish-step",
                 usage: event.usage,
apps/x/packages/core/src/analytics/installation.ts  (new file, 37 lines)
@@ -0,0 +1,37 @@
+import fs from 'node:fs';
+import path from 'node:path';
+import { randomUUID } from 'node:crypto';
+import { WorkDir } from '../config/config.js';
+
+const INSTALLATION_PATH = path.join(WorkDir, 'config', 'installation.json');
+
+let cached: string | null = null;
+
+export function getInstallationId(): string {
+    if (cached) return cached;
+    try {
+        if (fs.existsSync(INSTALLATION_PATH)) {
+            const raw = fs.readFileSync(INSTALLATION_PATH, 'utf-8');
+            const parsed = JSON.parse(raw) as { installationId?: string };
+            if (parsed.installationId && typeof parsed.installationId === 'string') {
+                cached = parsed.installationId;
+                return cached;
+            }
+        }
+    } catch (err) {
+        console.error('[Analytics] Failed to read installation.json:', err);
+    }
+
+    const id = randomUUID();
+    try {
+        const dir = path.dirname(INSTALLATION_PATH);
+        if (!fs.existsSync(dir)) {
+            fs.mkdirSync(dir, { recursive: true });
+        }
+        fs.writeFileSync(INSTALLATION_PATH, JSON.stringify({ installationId: id }, null, 2));
+    } catch (err) {
+        console.error('[Analytics] Failed to write installation.json:', err);
+    }
+    cached = id;
+    return id;
+}
apps/x/packages/core/src/analytics/posthog.ts  (new file, 90 lines)
@@ -0,0 +1,90 @@
+import { PostHog } from 'posthog-node';
+import { getInstallationId } from './installation.js';
+import { API_URL } from '../config/env.js';
+
+// Build-time injected via esbuild `define` (apps/main/bundle.mjs).
+// In dev/tsc, fall back to process.env so local runs work too.
+const POSTHOG_KEY = process.env.POSTHOG_KEY ?? process.env.VITE_PUBLIC_POSTHOG_KEY ?? '';
+const POSTHOG_HOST = process.env.POSTHOG_HOST ?? process.env.VITE_PUBLIC_POSTHOG_HOST ?? 'https://us.i.posthog.com';
+
+let client: PostHog | null = null;
+let initAttempted = false;
+let identifiedUserId: string | null = null;
+
+function getClient(): PostHog | null {
+    if (initAttempted) return client;
+    initAttempted = true;
+    if (!POSTHOG_KEY) {
+        console.log('[Analytics] POSTHOG_KEY not set; analytics disabled');
+        return null;
+    }
+    try {
+        client = new PostHog(POSTHOG_KEY, {
+            host: POSTHOG_HOST,
+            flushAt: 20,
+            flushInterval: 10_000,
+        });
+        // Tag the install with api_url as a person property up-front,
+        // so anonymous users are also segmentable by environment (api_url
+        // distinguishes prod / staging / custom — meaning is assigned in PostHog).
+        client.identify({
+            distinctId: getInstallationId(),
+            properties: { api_url: API_URL },
+        });
+    } catch (err) {
+        console.error('[Analytics] Failed to init PostHog:', err);
+        client = null;
+    }
+    return client;
+}
+
+function activeDistinctId(): string {
+    return identifiedUserId ?? getInstallationId();
+}
+
+export function capture(event: string, properties?: Record<string, unknown>): void {
+    const ph = getClient();
+    if (!ph) return;
+    try {
+        ph.capture({
+            distinctId: activeDistinctId(),
+            event,
+            properties,
+        });
+    } catch (err) {
+        console.error('[Analytics] capture failed:', err);
+    }
+}
+
+export function identify(userId: string, properties?: Record<string, unknown>): void {
+    const ph = getClient();
+    if (!ph) return;
+    try {
+        // Alias the anonymous installation ID to the rowboat user ID so historical
+        // anonymous events are linked to the identified user.
+        ph.alias({ distinctId: userId, alias: getInstallationId() });
+        ph.identify({
+            distinctId: userId,
+            properties: {
+                ...properties,
+                api_url: API_URL,
+            },
+        });
+        identifiedUserId = userId;
+    } catch (err) {
+        console.error('[Analytics] identify failed:', err);
+    }
+}
+
+export function reset(): void {
+    identifiedUserId = null;
+}
+
+export async function shutdown(): Promise<void> {
+    if (!client) return;
+    try {
+        await client.shutdown();
+    } catch (err) {
+        console.error('[Analytics] shutdown failed:', err);
+    }
+}
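The distinct_id fallback above is small enough to sketch in isolation. The following is a minimal, network-free re-implementation for illustration only: `capture()` uses the signed-in user ID once `identify()` has run, and falls back to the anonymous installation ID otherwise. The `installationId` constant is a hypothetical stand-in for `getInstallationId()`, not the real generated UUID.

```typescript
// Hypothetical stand-alone sketch of the identity fallback (no PostHog, no I/O).
let identifiedUserId: string | null = null;
const installationId = 'install-0000'; // stand-in for getInstallationId()

function activeDistinctId(): string {
    // Identified user wins; anonymous installation ID is the fallback.
    return identifiedUserId ?? installationId;
}

function identify(userId: string): void {
    identifiedUserId = userId;
}

function reset(): void {
    identifiedUserId = null;
}

console.log(activeDistinctId()); // anonymous before sign-in: install-0000
identify('user-42');
console.log(activeDistinctId()); // resolves to the identified user: user-42
reset();
console.log(activeDistinctId()); // back to anonymous after sign-out: install-0000
```

This mirrors why sign-out calls `reset()`: subsequent events fall back to the installation ID instead of being attributed to the previous user.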
apps/x/packages/core/src/analytics/usage.ts  (new file, 38 lines)
@@ -0,0 +1,38 @@
+import { capture } from './posthog.js';
+import type { UseCase } from './use_case.js';
+
+// Shape compatible with ai-sdk v5 `LanguageModelUsage`.
+// All fields are optional because providers report subsets.
+export interface LlmUsageInput {
+    inputTokens?: number;
+    outputTokens?: number;
+    totalTokens?: number;
+    reasoningTokens?: number;
+    cachedInputTokens?: number;
+}
+
+export interface CaptureLlmUsageArgs {
+    useCase: UseCase;
+    subUseCase?: string;
+    agentName?: string;
+    model: string;
+    provider: string;
+    usage: LlmUsageInput | undefined;
+}
+
+export function captureLlmUsage(args: CaptureLlmUsageArgs): void {
+    const usage = args.usage ?? {};
+    const properties: Record<string, unknown> = {
+        use_case: args.useCase,
+        model: args.model,
+        provider: args.provider,
+        input_tokens: usage.inputTokens ?? 0,
+        output_tokens: usage.outputTokens ?? 0,
+        total_tokens: usage.totalTokens ?? (usage.inputTokens ?? 0) + (usage.outputTokens ?? 0),
+    };
+    if (args.subUseCase) properties.sub_use_case = args.subUseCase;
+    if (args.agentName) properties.agent_name = args.agentName;
+    if (usage.cachedInputTokens != null) properties.cached_input_tokens = usage.cachedInputTokens;
+    if (usage.reasoningTokens != null) properties.reasoning_tokens = usage.reasoningTokens;
+    capture('llm_usage', properties);
+}
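The `total_tokens` fallback in `captureLlmUsage` is worth pinning down: when a provider omits `totalTokens`, it is derived as input plus output, with missing parts counted as 0. A stand-alone re-implementation of just that expression, for illustration:

```typescript
// Sketch of the total_tokens fallback (re-implemented here, not imported).
interface LlmUsageInput {
    inputTokens?: number;
    outputTokens?: number;
    totalTokens?: number;
}

function deriveTotalTokens(usage: LlmUsageInput): number {
    // Provider-reported total wins; otherwise sum the parts, treating
    // absent fields as zero.
    return usage.totalTokens ?? (usage.inputTokens ?? 0) + (usage.outputTokens ?? 0);
}

console.log(deriveTotalTokens({ inputTokens: 120, outputTokens: 30 })); // 150
console.log(deriveTotalTokens({ totalTokens: 99 }));                    // 99 (reported total wins)
console.log(deriveTotalTokens({}));                                     // 0 (provider reported nothing)
```

Note that `??` (not `||`) matters here: a provider-reported `totalTokens: 0` is kept rather than overwritten by the derived sum.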
apps/x/packages/core/src/analytics/use_case.ts  (new file, 28 lines)
@@ -0,0 +1,28 @@
+import { AsyncLocalStorage } from 'node:async_hooks';
+
+export type UseCase = 'copilot_chat' | 'track_block' | 'meeting_note' | 'knowledge_sync';
+
+export interface UseCaseContext {
+    useCase: UseCase;
+    subUseCase?: string;
+    agentName?: string;
+}
+
+const storage = new AsyncLocalStorage<UseCaseContext>();
+
+export function withUseCase<T>(ctx: UseCaseContext, fn: () => T): T {
+    return storage.run(ctx, fn);
+}
+
+/**
+ * Permanently install a use-case context for the current async chain.
+ * Use inside generator functions where wrapping with `withUseCase()` doesn't
+ * compose. Child async work (e.g. tool execution) will inherit it.
+ */
+export function enterUseCase(ctx: UseCaseContext): void {
+    storage.enterWith(ctx);
+}
+
+export function getCurrentUseCase(): UseCaseContext | undefined {
+    return storage.getStore();
+}
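The module above leans on a propagation guarantee of `AsyncLocalStorage`: a context set with `run()` is visible to async child work (awaits, timers, tool execution) started inside the callback, and absent outside it. A stand-alone sketch under that assumption, with a hypothetical `childLlmCall` standing in for a tool-internal call like `parseFile`:

```typescript
// Sketch of the AsyncLocalStorage propagation that use_case.ts relies on.
import { AsyncLocalStorage } from 'node:async_hooks';

type Ctx = { useCase: string };
const als = new AsyncLocalStorage<Ctx>();

async function childLlmCall(): Promise<string> {
    // Simulates a tool-internal call reading the parent run's tag,
    // defaulting to 'copilot_chat' when no context is installed.
    await new Promise((resolve) => setTimeout(resolve, 1));
    return als.getStore()?.useCase ?? 'copilot_chat';
}

async function demo(): Promise<void> {
    const tagged = await als.run({ useCase: 'knowledge_sync' }, () => childLlmCall());
    console.log(tagged);         // 'knowledge_sync' — inherited across the await
    console.log(als.getStore()); // undefined — context does not leak outside run()
}

demo();
```

`enterUseCase()` exists for the generator case where wrapping the whole body in `run()` does not compose; `enterWith()` installs the context for the remainder of the current async chain instead.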
@@ -22,6 +22,8 @@ import type { ToolContext } from "./exec-tool.js";
 import { generateText } from "ai";
 import { createProvider } from "../../models/models.js";
 import { getDefaultModelAndProvider, resolveProviderConfig } from "../../models/defaults.js";
+import { captureLlmUsage } from "../../analytics/usage.js";
+import { getCurrentUseCase } from "../../analytics/use_case.js";
 import { isSignedIn } from "../../account/account.js";
 import { getAccessToken } from "../../auth/tokens.js";
 import { API_URL } from "../../config/env.js";

@@ -764,6 +766,16 @@ export const BuiltinTools: z.infer<typeof BuiltinToolsSchema> = {
         ],
     });

+    const ctx = getCurrentUseCase();
+    captureLlmUsage({
+        useCase: ctx?.useCase ?? 'copilot_chat',
+        subUseCase: 'file_parse',
+        ...(ctx?.agentName ? { agentName: ctx.agentName } : {}),
+        model: modelId,
+        provider: providerName,
+        usage: response.usage,
+    });
+
     return {
         success: true,
         fileName,
@@ -306,7 +306,12 @@ async function processAgentNotes(): Promise<void> {
     const timestamp = new Date().toISOString();
     const message = `Current timestamp: ${timestamp}\n\nProcess the following source material and update the Agent Notes folder accordingly.\n\n${messageParts.join('\n\n')}`;

-    const agentRun = await createRun({ agentId: AGENT_ID, model: await getKgModel() });
+    const agentRun = await createRun({
+        agentId: AGENT_ID,
+        model: await getKgModel(),
+        useCase: 'knowledge_sync',
+        subUseCase: 'agent_notes',
+    });
     await createMessage(agentRun.id, message);
     await waitForRunCompletion(agentRun.id);
@@ -252,6 +252,8 @@ async function createNotesFromBatch(
     // Create a run for the note creation agent
     const run = await createRun({
         agentId: NOTE_CREATION_AGENT,
+        useCase: 'knowledge_sync',
+        subUseCase: 'build_graph',
     });
     const suggestedTopicsContent = readSuggestedTopicsFile();
@@ -10,6 +10,7 @@ import type { IModelConfigRepo } from '../models/repo.js';
 import { createProvider } from '../models/models.js';
 import { inlineTask } from '@x/shared';
 import { extractAgentResponse, waitForRunCompletion } from '../agents/utils.js';
+import { captureLlmUsage } from '../analytics/usage.js';

 const SYNC_INTERVAL_MS = 15 * 1000; // 15 seconds
 const INLINE_TASK_AGENT = 'inline_task_agent';

@@ -468,7 +469,12 @@ async function processInlineTasks(): Promise<void> {
     console.log(`[InlineTasks] Running task: "${task.instruction.slice(0, 80)}..."`);

     try {
-        const run = await createRun({ agentId: INLINE_TASK_AGENT, model: await getKgModel() });
+        const run = await createRun({
+            agentId: INLINE_TASK_AGENT,
+            model: await getKgModel(),
+            useCase: 'knowledge_sync',
+            subUseCase: 'inline_task_run',
+        });

         const message = [
             `Execute the following instruction from the note "${relativePath}":`,

@@ -548,7 +554,12 @@ export async function processRowboatInstruction(
     scheduleLabel: string | null;
     response: string | null;
 }> {
-    const run = await createRun({ agentId: INLINE_TASK_AGENT, model: await getKgModel() });
+    const run = await createRun({
+        agentId: INLINE_TASK_AGENT,
+        model: await getKgModel(),
+        useCase: 'knowledge_sync',
+        subUseCase: 'inline_task_run',
+    });

     const message = [
         `Process the following @rowboat instruction from the note "${notePath}":`,

@@ -659,6 +670,14 @@ Respond with ONLY valid JSON: either a schedule object or null. No other text.`;
         prompt: instruction,
     });
+
+    captureLlmUsage({
+        useCase: 'knowledge_sync',
+        subUseCase: 'inline_task_classify',
+        model: config.model,
+        provider: config.provider.flavor,
+        usage: result.usage,
+    });
+
     let text = result.text.trim();
     console.log('[classifySchedule] LLM response:', text);
     // Strip markdown code fences if the LLM wraps the JSON
@@ -73,6 +73,8 @@ async function labelEmailBatch(
     const run = await createRun({
         agentId: LABELING_AGENT,
         model: await getKgModel(),
+        useCase: 'knowledge_sync',
+        subUseCase: 'label_emails',
     });

     let message = `Label the following ${files.length} email files by prepending YAML frontmatter.\n\n`;
@@ -4,6 +4,7 @@ import { generateText } from 'ai';
 import { createProvider } from '../models/models.js';
 import { getDefaultModelAndProvider, getMeetingNotesModel, resolveProviderConfig } from '../models/defaults.js';
 import { WorkDir } from '../config/config.js';
+import { captureLlmUsage } from '../analytics/usage.js';

 const CALENDAR_SYNC_DIR = path.join(WorkDir, 'calendar_sync');

@@ -157,5 +158,12 @@ export async function summarizeMeeting(transcript: string, meetingStartTime?: st
         prompt,
     });
+
+    captureLlmUsage({
+        useCase: 'meeting_note',
+        model: modelId,
+        provider: providerName,
+        usage: result.usage,
+    });
+
     return result.text.trim();
 }
@@ -86,6 +86,8 @@ async function tagNoteBatch(
     const run = await createRun({
         agentId: NOTE_TAGGING_AGENT,
         model: await getKgModel(),
+        useCase: 'knowledge_sync',
+        subUseCase: 'tag_notes',
     });

     let message = `Tag the following ${files.length} knowledge notes by prepending YAML frontmatter with appropriate tags.\n\n`;
@@ -3,6 +3,7 @@ import { trackBlock, PrefixLogger } from '@x/shared';
 import type { KnowledgeEvent } from '@x/shared/dist/track-block.js';
 import { createProvider } from '../../models/models.js';
 import { getDefaultModelAndProvider, getTrackBlockModel, resolveProviderConfig } from '../../models/defaults.js';
+import { captureLlmUsage } from '../../analytics/usage.js';

 const log = new PrefixLogger('TrackRouting');

@@ -34,10 +35,14 @@ Rules:
 - For each candidate, return BOTH trackId and filePath exactly as given. trackIds are not globally unique.`;

 async function resolveModel() {
-    const model = await getTrackBlockModel();
+    const modelId = await getTrackBlockModel();
     const { provider } = await getDefaultModelAndProvider();
     const config = await resolveProviderConfig(provider);
-    return createProvider(config).languageModel(model);
+    return {
+        model: createProvider(config).languageModel(modelId),
+        modelId,
+        providerName: provider,
+    };
 }

 function buildRoutingPrompt(event: KnowledgeEvent, batch: ParsedTrack[]): string {

@@ -84,19 +89,26 @@ export async function findCandidates(

     log.log(`Routing event ${event.id} against ${filtered.length} track(s)`);

-    const model = await resolveModel();
+    const { model, modelId, providerName } = await resolveModel();
     const candidateKeys = new Set<string>();

     for (let i = 0; i < filtered.length; i += BATCH_SIZE) {
         const batch = filtered.slice(i, i + BATCH_SIZE);
         try {
-            const { object } = await generateObject({
+            const result = await generateObject({
                 model,
                 system: ROUTING_SYSTEM_PROMPT,
                 prompt: buildRoutingPrompt(event, batch),
                 schema: trackBlock.Pass1OutputSchema,
             });
-            for (const c of object.candidates) {
+            captureLlmUsage({
+                useCase: 'track_block',
+                subUseCase: 'routing',
+                model: modelId,
+                provider: providerName,
+                usage: result.usage,
+            });
+            for (const c of result.object.candidates) {
                 candidateKeys.add(trackKey(c.trackId, c.filePath));
             }
         } catch (err) {
@@ -110,6 +110,8 @@ export async function triggerTrackUpdate(
         agentId: 'track-run',
         model,
         ...(track.track.provider ? { provider: track.track.provider } : {}),
+        useCase: 'track_block',
+        subUseCase: 'run',
     });

     // Set lastRunAt and lastRunId immediately (before agent executes) so
@@ -43,6 +43,8 @@ async function runAgent(agentName: string): Promise<void> {
     const run = await createRun({
         agentId: agentName,
         model: await getKgModel(),
+        useCase: 'knowledge_sync',
+        subUseCase: 'pre_built',
     });

     // Build trigger message with user context
@@ -5,7 +5,7 @@ import path from "path";
 import fsp from "fs/promises";
 import fs from "fs";
 import readline from "readline";
-import { Run, RunEvent, StartEvent, CreateRunOptions, ListRunsResponse, MessageEvent } from "@x/shared/dist/runs.js";
+import { Run, RunEvent, StartEvent, ListRunsResponse, MessageEvent, UseCase } from "@x/shared/dist/runs.js";
 import { getDefaultModelAndProvider } from "../models/defaults.js";

 /**

@@ -24,7 +24,13 @@ const LegacyStartEvent = StartEvent.extend({
 });
 const ReadRunEvent = RunEvent.or(LegacyStartEvent);

-export type CreateRunRepoOptions = Required<z.infer<typeof CreateRunOptions>>;
+export type CreateRunRepoOptions = {
+    agentId: string;
+    model: string;
+    provider: string;
+    useCase: z.infer<typeof UseCase>;
+    subUseCase?: string;
+};

 export interface IRunsRepo {
     create(options: CreateRunRepoOptions): Promise<z.infer<typeof Run>>;

@@ -187,6 +193,8 @@ export class FSRunsRepo implements IRunsRepo {
             agentName: options.agentId,
             model: options.model,
             provider: options.provider,
+            useCase: options.useCase,
+            ...(options.subUseCase ? { subUseCase: options.subUseCase } : {}),
             subflow: [],
             ts,
         };

@@ -197,6 +205,8 @@ export class FSRunsRepo implements IRunsRepo {
             agentId: options.agentId,
             model: options.model,
             provider: options.provider,
+            useCase: options.useCase,
+            ...(options.subUseCase ? { subUseCase: options.subUseCase } : {}),
             log: [start],
         };
     }

@@ -230,6 +240,8 @@ export class FSRunsRepo implements IRunsRepo {
             agentId: start.agentName,
             model: start.model,
             provider: start.provider,
+            ...(start.useCase ? { useCase: start.useCase } : {}),
+            ...(start.subUseCase ? { subUseCase: start.subUseCase } : {}),
             log: events,
         };
     }
@@ -23,8 +23,15 @@ export async function createRun(opts: z.infer<typeof CreateRunOptions>): Promise
     const defaults = await getDefaultModelAndProvider();
     const model = opts.model ?? agent.model ?? defaults.model;
     const provider = opts.provider ?? agent.provider ?? defaults.provider;
+    const useCase = opts.useCase ?? "copilot_chat";

-    const run = await repo.create({ agentId: opts.agentId, model, provider });
+    const run = await repo.create({
+        agentId: opts.agentId,
+        model,
+        provider,
+        useCase,
+        ...(opts.subUseCase ? { subUseCase: opts.subUseCase } : {}),
+    });
     await bus.publish(run.log[0]);
     return run;
 }
@@ -25,6 +25,13 @@ const ipcSchemas = {
             electron: z.string(),
         }),
     },
+    'analytics:bootstrap': {
+        req: z.null(),
+        res: z.object({
+            installationId: z.string(),
+            apiUrl: z.string(),
+        }),
+    },
     'workspace:getRoot': {
         req: z.null(),
         res: z.object({
@@ -21,6 +21,15 @@ export const StartEvent = BaseRunEvent.extend({
     agentName: z.string(),
     model: z.string(),
     provider: z.string(),
+    // useCase/subUseCase tag the run for analytics. Optional on read so legacy
+    // run files written before these fields existed still parse cleanly.
+    useCase: z.enum([
+        "copilot_chat",
+        "track_block",
+        "meeting_note",
+        "knowledge_sync",
+    ]).optional(),
+    subUseCase: z.string().optional(),
 });

 export const SpawnSubFlowEvent = BaseRunEvent.extend({

@@ -118,6 +127,13 @@ export const AskHumanResponsePayload = AskHumanResponseEvent.pick({
     response: true,
 });

+export const UseCase = z.enum([
+    "copilot_chat",
+    "track_block",
+    "meeting_note",
+    "knowledge_sync",
+]);
+
 export const Run = z.object({
     id: z.string(),
     title: z.string().optional(),

@@ -125,6 +141,8 @@ export const Run = z.object({
     agentId: z.string(),
     model: z.string(),
     provider: z.string(),
+    useCase: UseCase.optional(),
+    subUseCase: z.string().optional(),
     log: z.array(RunEvent),
 });

@@ -142,4 +160,6 @@ export const CreateRunOptions = z.object({
     agentId: z.string(),
     model: z.string().optional(),
     provider: z.string().optional(),
+    useCase: UseCase.optional(),
+    subUseCase: z.string().optional(),
 });
apps/x/pnpm-lock.yaml  (generated, 13 lines)
@@ -404,6 +404,9 @@ importers:
       pdf-parse:
         specifier: ^2.4.5
         version: 2.4.5
+      posthog-node:
+        specifier: ^4.18.0
+        version: 4.18.0
       react:
         specifier: ^19.2.3
         version: 19.2.3

@@ -6471,6 +6474,10 @@ packages:
   posthog-js@1.332.0:
     resolution: {integrity: sha512-w3+sL+IFK4mpfFmgTW7On8cR+z34pre+SOewx+eHZQSYF9RYqXsLIhrxagWbQKkowPd4tCwUHrkS1+VHsjnPqA==}

+  posthog-node@4.18.0:
+    resolution: {integrity: sha512-XROs1h+DNatgKh/AlIlCtDxWzwrKdYDb2mOs58n4yN8BkGN9ewqeQwG5ApS4/IzwCb7HPttUkOVulkYatd2PIw==}
+    engines: {node: '>=15.0.0'}
+
   postject@1.0.0-alpha.6:
     resolution: {integrity: sha512-b9Eb8h2eVqNE8edvKdwqkrY6O7kAwmI8kcnBv1NScolYJbo59XUF0noFq+lxbC1yN20bmC0WBEbDC5H/7ASb0A==}
     engines: {node: '>=14.0.0'}

@@ -15203,6 +15210,12 @@ snapshots:
       query-selector-shadow-dom: 1.0.1
       web-vitals: 4.2.4

+  posthog-node@4.18.0:
+    dependencies:
+      axios: 1.13.2
+    transitivePeerDependencies:
+      - debug
+
   postject@1.0.0-alpha.6:
     dependencies:
       commander: 9.5.0