mirror of
https://github.com/trustgraph-ai/trustgraph.git
synced 2026-04-26 08:56:21 +02:00
Introduces `workspace` as the isolation boundary for config, flows,
library, and knowledge data. Removes `user` as a schema-level field
throughout the code, API specs, and tests; workspace provides the
same separation more cleanly at the trusted flow.workspace layer
rather than through client-supplied message fields.
Design
------
- IAM tech spec (docs/tech-specs/iam.md) documents current state,
proposed auth/access model, and migration direction.
- Data ownership model (docs/tech-specs/data-ownership-model.md)
captures the workspace/collection/flow hierarchy.
Schema + messaging
------------------
- Drop `user` field from AgentRequest/Step, GraphRagQuery,
DocumentRagQuery, Triples/Graph/Document/Row EmbeddingsRequest,
Sparql/Rows/Structured QueryRequest, ToolServiceRequest.
- Keep collection/workspace routing via flow.workspace at the
service layer.
- Translators updated to not serialise/deserialise user.
API specs
---------
- OpenAPI schemas and path examples cleaned of user fields.
- Websocket async-api messages updated.
- Removed the unused parameters/User.yaml.
Services + base
---------------
- Librarian, collection manager, knowledge, config: all operations
scoped by workspace. Config client API takes workspace as first
positional arg.
- `flow.workspace` set at flow start time by the infrastructure;
no longer pass-through from clients.
- Tool service drops user-personalisation passthrough.
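A minimal sketch of the workspace-first calling convention described above. `ConfigClient`, `put`, and `get` here are illustrative stand-ins, not the actual trustgraph SDK names; the point is only the shape — workspace as the first positional argument, so entries in different workspaces can never collide.

```python
class ConfigClient:
    """Hypothetical stand-in showing workspace as the first positional arg."""

    def __init__(self):
        # keyed by (workspace, key) so workspaces are isolated by construction
        self.store = {}

    def put(self, workspace, key, value):
        self.store[(workspace, key)] = value

    def get(self, workspace, key):
        return self.store[(workspace, key)]

client = ConfigClient()
client.put("team-a", "llm-model", "model-x")
client.put("team-b", "llm-model", "model-y")
assert client.get("team-a", "llm-model") == "model-x"
```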
CLI + SDK
---------
- tg-init-workspace and workspace-aware import/export.
- All tg-* commands drop user args; accept --workspace.
- Python API/SDK (flow, socket_client, async_*, explainability,
library) drop user kwargs from every method signature.
MCP server
----------
- All tool endpoints drop user parameters; socket_manager no longer
keyed per user.
Flow service
------------
- Closure-based topic cleanup on flow stop: only delete topics
whose blueprint template was parameterised AND no remaining
live flow (across all workspaces) still resolves to that topic.
  Four scopes fall out naturally from template analysis:
* {id} -> per-flow, deleted on stop
* {blueprint} -> per-blueprint, kept while any flow of the
same blueprint exists
* {workspace} -> per-workspace, kept while any flow in the
workspace exists
* literal -> global, never deleted (e.g. tg.request.librarian)
Fixes a bug where stopping a flow silently destroyed the global
librarian exchange, wedging all library operations until manual
restart.
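The scope classification above can be sketched as follows. The function names, the `Flow` record, and `resolve` are assumptions for illustration; the placeholder names (`{id}`, `{blueprint}`, `{workspace}`) and the deletion rule — delete only when no live flow in any workspace still resolves to the same concrete topic — mirror the commit text.

```python
from dataclasses import dataclass

@dataclass
class Flow:
    id: str
    blueprint: str
    workspace: str
    templates: list

def resolve(flow, template):
    """Expand a blueprint topic template for a concrete flow."""
    return template.format(id=flow.id, blueprint=flow.blueprint,
                           workspace=flow.workspace)

def classify_scope(template):
    """Map a blueprint topic template to its cleanup scope."""
    if "{id}" in template:
        return "flow"        # per-flow: candidate for deletion on stop
    if "{blueprint}" in template:
        return "blueprint"   # kept while any flow of the blueprint runs
    if "{workspace}" in template:
        return "workspace"   # kept while any flow in the workspace runs
    return "global"          # literal topic: never deleted

def topics_to_delete(stopped_flow, live_flows, resolve):
    """Topics of a stopped flow that no remaining live flow still uses."""
    still_used = {resolve(f, t) for f in live_flows for t in f.templates}
    return [
        resolve(stopped_flow, t)
        for t in stopped_flow.templates
        if classify_scope(t) != "global"
        and resolve(stopped_flow, t) not in still_used
    ]
```

With this rule, a literal topic such as `tg.request.librarian` is classified global and never deleted, which is exactly the bug fix: stopping one flow can no longer tear down the shared librarian exchange.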
RabbitMQ backend
----------------
- heartbeat=60, blocked_connection_timeout=300. Catches silently
dead connections (broker restart, orphaned channels, network
partitions) within ~2 heartbeat windows, so the consumer
reconnects and re-binds its queue rather than sitting forever
on a zombie connection.
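A sketch of the tuned connection setup, assuming the pika client; the broker host is a placeholder.

```python
import pika

params = pika.ConnectionParameters(
    host="rabbitmq",
    # Client and broker exchange heartbeat frames every 60s; a peer that
    # misses ~2 heartbeat windows has its connection torn down instead of
    # lingering as a zombie after a broker restart or network partition.
    heartbeat=60,
    # Give up after 300s if the broker blocks the connection (flow control).
    blocked_connection_timeout=300,
)

# On pika.exceptions.AMQPConnectionError the consumer should reconnect via
# pika.BlockingConnection(params) and re-declare/re-bind its queue, since
# bindings on non-durable resources do not survive a broker restart.
```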
Tests
-----
- Full test refresh: unit, integration, contract, provenance.
- Dropped user-field assertions and constructor kwargs across
~100 test files.
- Renamed user-collection isolation tests to workspace-collection.
143 lines
4.9 KiB
YAML
post:
  tags:
    - Flow Services
  summary: Agent service - conversational AI with reasoning
  description: |
    AI agent that can understand questions, reason about them, and take actions.

    ## Agent Overview

    The agent service provides a conversational AI that:
    - Understands natural language questions
    - Reasons about problems using thoughts
    - Takes actions to gather information
    - Provides coherent answers

    ## Request Format

    Send a question with optional:
    - **state**: Continue from previous conversation
    - **history**: Previous agent steps for context
    - **group**: Collaborative agent identifiers
    - **streaming**: Enable streaming responses

    ## Response Modes

    ### Streaming Mode (streaming: true)

    Responses arrive as chunks with `chunk-type`:
    - `thought`: Agent's reasoning process
    - `action`: Action being taken
    - `observation`: Result from action
    - `answer`: Final response to user
    - `explain`: Provenance event with inline triples (`explain_triples`)
    - `error`: Error occurred

    Each chunk may have multiple messages. Check flags:
    - `end-of-message`: Current chunk type complete
    - `end-of-dialog`: Entire conversation complete

    ### Legacy Mode (streaming: false)

    Single response with:
    - `answer`: Complete answer
    - `thought`: Reasoning (if any)
    - `observation`: Observations (if any)

    ## Multi-turn Conversations

    Include `history` array with previous steps to maintain context.
    Each step has: thought, action, arguments, observation.
  operationId: agentService
  security:
    - bearerAuth: []
  parameters:
    - name: flow
      in: path
      required: true
      schema:
        type: string
      description: Flow instance ID
      example: my-flow
  requestBody:
    required: true
    content:
      application/json:
        schema:
          $ref: '../../components/schemas/agent/AgentRequest.yaml'
        examples:
          simpleQuestion:
            summary: Simple question
            value:
              question: What is the capital of France?
          streamingQuestion:
            summary: Question with streaming enabled
            value:
              question: Explain quantum computing
              streaming: true
          conversationWithHistory:
            summary: Multi-turn conversation
            value:
              question: And what about its population?
              history:
                - thought: User is asking about the capital of France
                  action: search
                  arguments:
                    query: "capital of France"
                  observation: "Paris is the capital of France"
  responses:
    '200':
      description: Successful response
      content:
        application/json:
          schema:
            $ref: '../../components/schemas/agent/AgentResponse.yaml'
          examples:
            streamingThought:
              summary: Streaming thought chunk
              value:
                chunk-type: thought
                content: I need to search for information about quantum computing
                end-of-message: false
                end-of-dialog: false
            streamingAnswer:
              summary: Streaming answer chunk
              value:
                chunk-type: answer
                content: Quantum computing uses quantum mechanics principles...
                end-of-message: false
                end-of-dialog: false
            streamingComplete:
              summary: Streaming complete marker
              value:
                chunk-type: answer
                content: ""
                end-of-message: true
                end-of-dialog: true
            explainEvent:
              summary: Explain event with inline provenance triples
              value:
                chunk-type: explain
                content: ""
                explain_id: urn:trustgraph:agent:abc123
                explain_graph: urn:graph:retrieval
                explain_triples:
                  - s: {t: i, i: "urn:trustgraph:agent:abc123"}
                    p: {t: i, i: "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"}
                    o: {t: i, i: "https://trustgraph.ai/ns/AgentSession"}
                  - s: {t: i, i: "urn:trustgraph:agent:abc123"}
                    p: {t: i, i: "https://trustgraph.ai/ns/query"}
                    o: {t: l, v: "Explain quantum computing"}
                end-of-message: true
                end-of-dialog: false
            legacyResponse:
              summary: Legacy non-streaming response
              value:
                answer: Paris is the capital of France.
                thought: User is asking about the capital of France
                observation: ""
                end-of-message: false
                end-of-dialog: false
    '401':
      $ref: '../../components/responses/Unauthorized.yaml'
    '500':
      $ref: '../../components/responses/Error.yaml'
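A sketch of a client-side loop consuming the streaming chunks described in this spec. The chunk fields (`chunk-type`, `content`, `end-of-dialog`) mirror the examples above; the `consume` helper and the in-memory `stream` list are illustrative — in practice the chunks would arrive over the websocket or HTTP connection.

```python
def consume(chunks):
    """Collect streamed content per chunk-type until end-of-dialog."""
    collected = {}
    for chunk in chunks:
        kind = chunk["chunk-type"]
        if kind == "error":
            raise RuntimeError(chunk.get("content", "agent error"))
        collected.setdefault(kind, []).append(chunk.get("content", ""))
        if chunk.get("end-of-dialog"):
            break
    return {k: "".join(v) for k, v in collected.items()}

# Simulated stream matching the spec's example chunks.
stream = [
    {"chunk-type": "thought", "content": "Looking this up. ",
     "end-of-message": False, "end-of-dialog": False},
    {"chunk-type": "answer", "content": "Paris is the capital of France.",
     "end-of-message": False, "end-of-dialog": False},
    {"chunk-type": "answer", "content": "",
     "end-of-message": True, "end-of-dialog": True},
]
result = consume(stream)
assert result["answer"] == "Paris is the capital of France."
```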