Merge branch 'release/v0.11' into release/v0.12

This commit is contained in:
Cyber MacGeddon 2024-10-04 14:26:44 +01:00
commit 090f09fa38
5 changed files with 128 additions and 41 deletions

DEVELOPER_GUIDE.md Normal file

@@ -0,0 +1,78 @@
# Developer's guide
## Release management
To do a public release you need to...
- Get the git directory ready for the release
- Tag the repo, e.g.
```
git tag -a v1.2.3 -m ''
git push --tags
```
- Generate the deploy templates; don't add the `v` prefix to the version number.
```
templates/generate-all deploy.zip 1.2.3
```
- Release
  - Go to GitHub; on the Code tab, select Tags and find the right version
  - Select 'Create release'
  - Select the right previous version and generate release notes
  - At the bottom of the form, find the upload area, click it, and attach
    the `deploy.zip` created earlier
- Create Python packages. You need a PyPI token with access to our repos
  - `make packages`
  - `make pypi-upload`
- Create containers. You need a Docker Hub token with access to our repos
  - `make`
  - `make push`
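Taken together, the packaging and container steps above boil down to a short command sequence. This is a sketch only; it assumes the Makefile targets named above, and that PyPI and Docker Hub credentials are already configured in your environment:

```shell
# Build and upload the Python packages (needs a PyPI token)
make packages
make pypi-upload

# Build and push the containers (needs a Docker Hub token)
make
make push
```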
## Local build
To do a local build, you need to...
- Decide what version you want to build, and change this in the Makefile.
  It doesn't really matter so long as there isn't a clash with what's in
  the public repos. You could stick with the version that's there, or
  change it to 0.0.0 if you're paranoid about pushing something accidentally.
- If you changed the version, or changed the deployment templates, you
  need to regenerate the launch assets into a `deploy.zip` file:
```
templates/generate-all deploy.zip V.V.V
```
- Build containers
```
make
```
- If you changed anything which affects the command-line tools (which can
  happen if you change schemas), then
```
make packages
```
That puts Python packages in `dist/`. You then need to install some or
all of those packages. Typically you only need `-base` and `-cli`, installed
into an appropriate environment, e.g. use Python `venv` to create a virtual
environment and install them there.
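For example, installing the freshly built packages into a throwaway environment might look like this (a sketch; the wheel filenames are illustrative and depend on the version set in the Makefile):

```shell
# Create and activate a clean virtual environment
python3 -m venv env
. env/bin/activate

# Install only the -base and -cli packages from the local build
pip3 install dist/trustgraph_base-*.whl
pip3 install dist/trustgraph_cli-*.whl
```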

Makefile

@@ -1,6 +1,6 @@
# VERSION=$(shell git describe | sed 's/^v//')
-VERSION=0.11.17
+VERSION=0.11.19
DOCKER=podman

README.md

@@ -1,7 +1,7 @@
# TrustGraph
-![TrustGraph banner](TG_Banner_readme.png)
+![TrustGraph banner](TG_readme.png)
🚀 [Full Documentation](https://trustgraph.ai/docs/getstarted)
💬 [Join the Discord](https://discord.gg/AXpxVjwzAw)
@@ -16,54 +16,58 @@ The pipeline processing components are interconnected with a pub/sub engine to m
The processing showcases the reliability and efficiencies of GraphRAG algorithms which can capture contextual language cues that are missed in conventional RAG approaches. Graph querying algorithms enable retrieving not just relevant knowledge but language cues essential to understanding semantic uses unique to a text corpus.
## Deploy in Minutes
TrustGraph is designed to deploy all the services and stores needed for a scalable GraphRAG infrastructure as quickly and simply as possible.
-### Install Requirements
+## Install the TrustGraph CLI
```
python3 -m venv env
. env/bin/activate
-pip3 install pulsar-client
-pip3 install cassandra-driver
-export PYTHON_PATH=.
+pip3 install trustgraph-cli
```
-### Download TrustGraph
+## Download TrustGraph
```
git clone https://github.com/trustgraph-ai/trustgraph trustgraph
cd trustgraph
```
-TrustGraph releases are available [here](https://github.com/trustgraph-ai/trustgraph/releases). Download `deploy.zip` for the desired release version.
-TrustGraph is fully containerized and is launched with a Docker Compose `YAML` file. These files are prebuilt and included in the download main directory. Simply select the file that matches your desired model deployment and graph store configuration.
+| Release Type | Release Version |
+| ------------ | --------------- |
+| Latest | [0.11.19](https://github.com/trustgraph-ai/trustgraph/releases/download/v0.11.19/deploy.zip) |
+| Stable | [0.11.19](https://github.com/trustgraph-ai/trustgraph/releases/download/v0.11.19/deploy.zip) |
+TrustGraph is fully containerized and is launched with a `YAML` configuration file. Unzipping the `deploy.zip` will add the `deploy` directory with the following subdirectories:
- `docker-compose`
- `minikube-k8s`
- `gcp-k8s`
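For example, assuming `deploy.zip` has been downloaded into the current directory:

```shell
unzip deploy.zip
ls deploy
# expect the subdirectories: docker-compose, minikube-k8s, gcp-k8s
```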
Each directory contains the pre-built `YAML` configuration files needed to launch TrustGraph:
| Model Deployment | Graph Store | Launch File |
| ---------------- | ------------ | ----------- |
-| AWS Bedrock | Cassandra | `tg-launch-bedrock-cassandra.yaml` |
-| AWS Bedrock | Neo4j | `tg-launch-bedrock-neo4j.yaml` |
-| AzureAI Serverless Endpoint | Cassandra | `tg-launch-azure-cassandra.yaml` |
-| AzureAI Serverless Endpoint | Neo4j | `tg-launch-azure-neo4j.yaml` |
-| Anthropic API | Cassandra | `tg-launch-claude-cassandra.yaml` |
-| Anthropic API | Neo4j | `tg-launch-claude-neo4j.yaml` |
-| Cohere API | Cassandra | `tg-launch-cohere-cassandra.yaml` |
-| Cohere API | Neo4j | `tg-launch-cohere-neo4j.yaml` |
-| Llamafile | Cassandra | `tg-launch-llamafile-cassandra.yaml` |
-| Llamafile | Neo4j | `tg-launch-llamafile-neo4j.yaml` |
-| Mixed Depoloyment | Cassandra | `tg-launch-mix-cassandra.yaml` |
-| Mixed Depoloyment | Neo4j | `tg-launch-mix-neo4j.yaml` |
-| Ollama | Cassandra | `tg-launch-ollama-cassandra.yaml` |
-| Ollama | Neo4j | `tg-launch-ollama-neo4j.yaml` |
-| OpenAI | Cassandra | `tg-launch-openai-cassandra.yaml` |
-| OpenAI | Neo4j | `tg-launch-openai-neo4j.yaml` |
-| VertexAI | Cassandra | `tg-launch-vertexai-cassandra.yaml` |
-| VertexAI | Neo4j | `tg-launch-vertexai-neo4j.yaml` |
+| AWS Bedrock API | Cassandra | `tg-bedrock-cassandra.yaml` |
+| AWS Bedrock API | Neo4j | `tg-bedrock-neo4j.yaml` |
+| AzureAI Serverless API | Cassandra | `tg-azure-cassandra.yaml` |
+| AzureAI Serverless API | Neo4j | `tg-azure-neo4j.yaml` |
+| Anthropic API | Cassandra | `tg-claude-cassandra.yaml` |
+| Anthropic API | Neo4j | `tg-claude-neo4j.yaml` |
+| Cohere API | Cassandra | `tg-cohere-cassandra.yaml` |
+| Cohere API | Neo4j | `tg-cohere-neo4j.yaml` |
+| Llamafile API | Cassandra | `tg-llamafile-cassandra.yaml` |
+| Llamafile API | Neo4j | `tg-llamafile-neo4j.yaml` |
+| Ollama API | Cassandra | `tg-ollama-cassandra.yaml` |
+| Ollama API | Neo4j | `tg-ollama-neo4j.yaml` |
+| OpenAI API | Cassandra | `tg-openai-cassandra.yaml` |
+| OpenAI API | Neo4j | `tg-openai-neo4j.yaml` |
+| VertexAI API | Cassandra | `tg-vertexai-cassandra.yaml` |
+| VertexAI API | Neo4j | `tg-vertexai-neo4j.yaml` |
Launching TrustGraph is as simple as running one line:
**Docker**:
```
-docker compose -f <launch-file> up -d
+docker compose -f <launch-file.yaml> up -d
```
**Kubernetes**:
```
kubectl apply -f <launch-file.yaml>
```
## Core TrustGraph Features
@@ -80,7 +84,7 @@ docker compose -f <launch-file> up -d
- GraphRAG query service
- [Grafana](https://github.com/grafana/) telemetry dashboard
- Module integration with [Apache Pulsar](https://github.com/apache/pulsar/)
-- Container orchestration with `Docker` or [Podman](http://podman.io/)
+- Container orchestration with `Docker`, `Podman`, or `Minikube`
## Architecture
@@ -105,12 +109,12 @@ The agent prompts are built through templates, enabling customized extraction ag
PDF file:
```
-scripts/load-pdf -f sample-text-corpus.pdf
+tg-load-pdf sample-text-corpus.pdf
```
Text file:
```
-scripts/load-text -f sample-text-corpus.txt
+tg-load-text sample-text-corpus.txt
```
## GraphRAG Queries
@@ -118,7 +122,7 @@ scripts/load-text -f sample-text-corpus.txt
Once the knowledge graph has been built or a knowledge core has been loaded, GraphRAG queries are launched with a single line:
```
-scripts/query-graph-rag -q "Write a blog post about the 5 key takeaways from SB1047 and how they will impact AI development."
+tg-query-graph-rag -q "Write a blog post about the 5 key takeaways from SB1047 and how they will impact AI development."
```
## Deploy and Manage TrustGraph

Binary file not shown.

Before: 159 KiB | After: 158 KiB


@@ -87,8 +87,13 @@ local url = import "values/url.jsonnet";
"kg-extract-topics", [ container ]
);
local service =
engine.internalService(containerSet)
.with_port(8000, 8000, "metrics");
+engine.resources([
+    containerSet,
+    service,
+])
},