Update doc (#178)

* Update doc

* Update links
Shuguang Chen 2024-10-10 22:30:54 -07:00 committed by GitHub
parent 7b51cce2f7
commit 11fba23f1f
GPG key ID: B5690EEEBB952194
14 changed files with 82 additions and 118 deletions


@ -5,14 +5,14 @@ Agentic Workflow
Arch helps you easily personalize your applications by calling application-specific (API) functions
via user prompts. This involves any predefined functions or APIs you want to expose to users to perform tasks,
gather information, or manipulate data. This capability is generally referred to as **function calling**, where
gather information, or manipulate data. This capability is generally referred to as :ref:`function calling <function_calling>`, where
you have the flexibility to support “agentic” apps tailored to specific use cases - from updating insurance
claims to creating ad campaigns - via prompts.
Arch analyzes prompts, extracts critical information from them, engages in lightweight conversation with
the user to gather any missing parameters, and makes API calls so that you can focus on writing business logic.
Arch does this via its purpose-built :ref:`Arch-Function <function_calling>` - the fastest (200ms p90 - 10x faser than GPT-4o)
and cheapest (100x than GPT-40) function-calling LLM that matches performance with frontier models.
Arch does this via its purpose-built `Arch-Function <https://huggingface.co/collections/katanemo/arch-function-66f209a693ea8df14317ad68>`_ - the fastest (200ms p90, 10x faster than GPT-4o)
and cheapest (100x cheaper than GPT-4o) function calling LLM that matches performance with frontier models.
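
To make the flow concrete, here is a minimal sketch of the kind of business-logic function you might expose behind a prompt target. The function name, parameters, and payload below are hypothetical, not part of Arch's API; the point is that Arch extracts structured arguments from the prompt and delivers them, so your handler never parses natural language itself:

```python
# Hypothetical business-logic function exposed behind a prompt target.
# Arch extracts structured parameters from the user's prompt and calls
# your API with them; the handler only ever sees typed arguments.
def update_insurance_claim(claim_id: str, status: str, note: str = "") -> dict:
    """Apply a status change to a claim and return the updated record."""
    allowed = {"open", "under_review", "approved", "denied"}
    if status not in allowed:
        raise ValueError(f"unknown status: {status!r}")
    return {"claim_id": claim_id, "status": status, "note": note}

# Simulated payload, as Arch might deliver it after parameter extraction.
params = {"claim_id": "CLM-1042", "status": "approved", "note": "paid in full"}
result = update_insurance_claim(**params)
```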
.. image:: includes/agent/function-calling-flow.jpg
:width: 100%
@ -31,7 +31,7 @@ Step 1: Define Prompt Targets
.. literalinclude:: includes/agent/function-calling-agent.yaml
:language: yaml
:linenos:
:emphasize-lines: 21-34
:emphasize-lines: 19-49
:caption: Prompt Target Example Configuration
Step 2: Process Request Parameters
@ -66,5 +66,5 @@ Example of Multiple Prompt Targets in YAML:
.. literalinclude:: includes/agent/function-calling-agent.yaml
:language: yaml
:linenos:
:emphasize-lines: 21-34
:emphasize-lines: 19-49
:caption: Prompt Target Example Configuration


@ -46,7 +46,7 @@ prompt_targets:
- name: time_range
type: int
description: Time range in days for which to gather device statistics. Defaults to 7.
default: "7"
default: 7
# Arch creates a round-robin load balancing between different endpoints, managed via the cluster subsystem.
endpoints:
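
As a sketch of the receiving side (a hypothetical handler, not Arch code), the endpoint can mirror the parameter schema above and apply the same default when ``time_range`` is omitted:

```python
def get_device_stats(params: dict) -> dict:
    # Mirror the prompt-target schema: time_range is an int that
    # defaults to 7 days when the user does not specify one.
    time_range = int(params.get("time_range", 7))
    if time_range <= 0:
        raise ValueError("time_range must be a positive number of days")
    return {"time_range_days": time_range}
```

Calling ``get_device_stats({})`` applies the default of 7 days, while ``get_device_stats({"time_range": 30})`` honors a value extracted from the prompt.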


@ -9,7 +9,7 @@ Retrieval-Augmented Generation (RAG) applications.
Parameter Extraction for RAG
----------------------------
To build RAG (Retrieval-Augmented Generation) applications, you can configure prompt targets with parameters,
To build RAG (Retrieval Augmented Generation) applications, you can configure prompt targets with parameters,
enabling Arch to retrieve critical information in a structured way for processing. This approach improves the
retrieval quality and speed of your application. By extracting parameters from the conversation, you can pull
the appropriate chunks from a vector database or SQL-like data store to enhance accuracy. With Arch, you can
@ -37,12 +37,12 @@ Once the prompt targets are configured as above, handling those parameters is
-----------------------------------------------------------------------------------------------------------------------------------------
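
Parameter-driven retrieval can be sketched as follows (a toy in-memory list stands in for a vector database or SQL-like store, and the field names are illustrative): extracted parameters become structured filters, so only relevant chunks reach the ranking step:

```python
# Toy document store; in practice this is a vector DB or SQL-like store.
DOCS = [
    {"region": "us-east", "year": 2023, "text": "Q4 outage report"},
    {"region": "us-east", "year": 2024, "text": "capacity planning notes"},
    {"region": "eu-west", "year": 2024, "text": "GDPR audit summary"},
]

def retrieve(extracted: dict) -> list[str]:
    """Use Arch-extracted parameters as structured filters before ranking."""
    return [
        d["text"]
        for d in DOCS
        if d["region"] == extracted.get("region")
        and d["year"] == extracted.get("year")
    ]

chunks = retrieve({"region": "us-east", "year": 2024})
# -> ["capacity planning notes"]
```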
Developers struggle to efficiently handle ``follow-up`` or ``clarification`` questions. Specifically, when users ask for
changes or additions to previous responses, their AI applications often generate entirely new responses instead of adjusting
previous ones.Arch offers **intent** tracking as a feature so that developers can know when the user has shifted away from a
previous ones. Arch offers ``intent tracking`` as a feature so that developers can know when the user has shifted away from a
previous intent so that they can dramatically improve retrieval accuracy, lower overall token cost and improve the speed of
their responses back to users.
Arch uses its built-in lightweight NLI and embedding models to know if the user has steered away from an active intent.
Arch's intent-drift detection mechanism is based on its' :ref:`prompt_targets <prompt_target>` primtive. Arch tries to match an incoming
Arch's intent-drift detection mechanism is based on its :ref:`prompt target <prompt_target>` primitive. Arch tries to match an incoming
prompt to one of the prompt_targets configured in the gateway. Once it detects that the user has moved away from an
active intent, Arch adds the ``x-arch-intent-marker`` header to the request before sending it to your application servers.
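
A minimal sketch of consuming that marker on the application side (stdlib only; the header name comes from the text above, but the truthy string value checked here is an assumption):

```python
def build_context(headers: dict, history: list[dict], prompt: str) -> list[dict]:
    # Assumption: Arch sets the header to a truthy string on intent drift.
    drifted = headers.get("x-arch-intent-marker", "").lower() in {"true", "1", "yes"}
    if drifted:
        # New intent: retrieve and prompt against the fresh request only.
        return [{"role": "user", "content": prompt}]
    # Same intent: keep the running history for retrieval and prompting.
    return history + [{"role": "user", "content": prompt}]
```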
@ -50,15 +50,15 @@ active intent, Arch adds the ``x-arch-intent-marker`` headers to the request bef
:language: python
:linenos:
:lines: 101-157
:emphasize-lines: 14-24
:emphasize-lines: 14-25
:caption: Intent Detection Example
.. Note::
Arch is (mostly) stateless so that it can scale in an embarrassingly parallel fashion. So, while Arch offers
intent-drift detetction, you still have to maintain converational state with intent drift as meta-data. The
following code snippets show how easily you can build and enrich conversational history with Langchain (in python),
intent-drift detection, you still have to maintain conversational state with intent drift as metadata. The
following code snippets show how easily you can build and enrich conversational history with LangChain (in Python),
so that you can use the most relevant prompts for your retrieval and for prompting upstream LLMs.
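
The same idea can be shown without the LangChain dependency as a plain-Python sketch (the class and field names are hypothetical): keep the full history, tag each turn with the drift marker, and scope retrieval to the turns from the active intent:

```python
class ConversationStore:
    """Keep full history, but tag turns so retrieval can be scoped
    to the active intent (marked by Arch's intent-drift signal)."""

    def __init__(self) -> None:
        self.turns: list[dict] = []

    def add(self, role: str, content: str, intent_drift: bool = False) -> None:
        self.turns.append(
            {"role": role, "content": content, "intent_drift": intent_drift}
        )

    def active_intent_turns(self) -> list[dict]:
        # Walk backwards to the most recent drift boundary.
        for i in range(len(self.turns) - 1, -1, -1):
            if self.turns[i]["intent_drift"]:
                return self.turns[i:]
        return self.turns

store = ConversationStore()
store.add("user", "Summarize my insurance claim history")
store.add("assistant", "Here is a summary ...")
store.add("user", "Now draft an ad campaign instead", intent_drift=True)
relevant = store.active_intent_turns()  # only the new-intent turn
```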