
[**PyPI**](https://pypi.org/project/trustgraph/) | [**Discord**](https://discord.gg/sQMwkRz5GX) | [**DeepWiki**](https://deepwiki.com/trustgraph-ai/trustgraph)
[**Website**](https://trustgraph.ai) | [**Docs**](https://docs.trustgraph.ai) | [**YouTube**](https://www.youtube.com/@TrustGraphAI?sub_confirmation=1) | [**Configuration Builder**](https://config-ui.demo.trustgraph.ai/) | [**Discord**](https://discord.gg/sQMwkRz5GX) | [**Blog**](https://blog.trustgraph.ai/subscribe)

# The context backend for AI agents
Durable agent memory you can trust. Build, version, and retrieve grounded context from a context graph.
- Give agents **memory** that persists across sessions and deployments.
- Reduce hallucinations with **grounded context retrieval**.
- Ship reusable, portable [Context Cores](#context-cores): packaged context you can move between projects and environments.
The context backend:
- [x] Multi-model and multimodal database system
  - [x] Tabular/relational, key-value
  - [x] Document, graph, and vectors
  - [x] Images, video, and audio
- [x] Automated data ingest and loading
  - [x] Quick ingest with semantic similarity retrieval
  - [x] Ontology structuring for precision retrieval
- [x] Out-of-the-box RAG pipelines
  - [x] DocumentRAG
  - [x] GraphRAG
  - [x] OntologyRAG
- [x] 3D GraphViz for exploring context
- [x] Fully agentic system
  - [x] Single agent
  - [x] Multi agent
  - [x] MCP integration
- [x] Run anywhere
  - [x] Deploy locally with Docker
  - [x] Deploy in cloud with Kubernetes
- [x] Support for all major LLMs
  - [x] API support for Anthropic, Cohere, Gemini, Mistral, OpenAI, and others
  - [x] Model inferencing with vLLM, Ollama, TGI, LM Studio, and Llamafiles
- [x] Developer friendly
  - [x] REST API [Docs](https://docs.trustgraph.ai/reference/apis/rest.html)
  - [x] Websocket API [Docs](https://docs.trustgraph.ai/reference/apis/websocket.html)
  - [x] Python API [Docs](https://docs.trustgraph.ai/reference/apis/python)
  - [x] CLI [Docs](https://docs.trustgraph.ai/reference/cli/)
## Quickstart
```shell
npx @trustgraph/config
```
TrustGraph is distributed as a set of Docker containers and can run locally with Docker, Podman, or Minikube. The config tool generates:
- `deploy.zip` containing either a `docker-compose.yaml` file for a Docker/Podman deployment or `resources.yaml` for Kubernetes
- Deployment instructions as `INSTALLATION.md`
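A typical Docker Compose flow after running the config tool might look like the sketch below. The directory name is an arbitrary choice for illustration; the generated `INSTALLATION.md` is the authoritative set of steps for your chosen deployment.

```shell
# Unpack the bundle produced by the config tool
unzip deploy.zip -d trustgraph-deploy
cd trustgraph-deploy

# Docker/Podman deployment (the bundle contains docker-compose.yaml).
# For a Kubernetes deployment, apply the generated manifest instead:
#   kubectl apply -f resources.yaml
docker compose up -d

# Verify that the containers came up
docker compose ps
```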