diff --git a/docs/source/guides/observability/tracing.rst b/docs/source/guides/observability/tracing.rst
index 23bf80d2..8dda14fd 100644
--- a/docs/source/guides/observability/tracing.rst
+++ b/docs/source/guides/observability/tracing.rst
@@ -289,6 +289,53 @@ To send tracing data to `Datadog `_:
+
+1. **Configure Arch**: Make sure Arch is installed and set up correctly. For more information, refer to the `installation guide `_.
+
+2. **Install Langtrace**: Install the Langtrace SDK:
+
+   .. code-block:: console
+
+      $ pip install langtrace-python-sdk
+
+3. **Set Environment Variables**: Provide your Langtrace API key.
+
+   .. code-block:: console
+
+      $ export LANGTRACE_API_KEY=
+
+4. **Trace Requests**: Once you have Langtrace set up, you can start tracing requests.
+
+   Here's an example of how to trace a request using the Langtrace Python SDK:
+
+   .. code-block:: python
+
+      import os
+      from langtrace_python_sdk import langtrace  # Must precede any llm module imports
+      from openai import OpenAI
+
+      langtrace.init(api_key=os.environ['LANGTRACE_API_KEY'])
+
+      client = OpenAI(api_key=os.environ['OPENAI_API_KEY'], base_url="http://localhost:12000/v1")
+
+      response = client.chat.completions.create(
+          model="gpt-4o-mini",
+          messages=[
+              {"role": "system", "content": "You are a helpful assistant"},
+              {"role": "user", "content": "Hello"},
+          ]
+      )
+
+      print(response.choices[0].message.content)
+
+5. **Verify Traces**: Access the Langtrace dashboard to view your traces.
 
 Best Practices
 --------------
@@ -312,6 +359,7 @@ Additional Resources
 - `W3C Trace Context Specification `_
 - `AWS X-Ray Exporter `_
 - `Datadog Exporter `_
+- `Langtrace Documentation `_
 
 .. Note:: Replace placeholders such as ```` and ```` with your actual configurations.