Integrating Prompt Templates into Spans

By instrumenting the prompt template, users can take full advantage of Future AGI's prompt playground. There is no need to deploy a new template version to test whether changes to the prompt text or variables achieve the desired effect; instead, you can experiment with these modifications directly in the playground UI.

Implementation Details

We provide a using_prompt_template context manager to add a prompt template into the current OpenTelemetry Context. FI auto-instrumentors will read this Context and pass the prompt template fields as span attributes, adhering to the traceAI semantic conventions.

Required Parameters

| Parameter | Type | Description | Example |
| --- | --- | --- | --- |
| template | str | The string for the prompt template | "Please describe the weather forecast for {city} on {date}" |
| version | str | Identifier for the template version | "v1.0" |
| variables | Dict[str, str] | Dictionary containing variables to fill the template | {"city": "San Francisco", "date": "March 27"} |
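
As a sketch, these parameters map directly onto the context manager (assuming using_prompt_template is imported from fi_instrumentation, as in the sample below):

from fi_instrumentation import using_prompt_template

with using_prompt_template(
    template="Please describe the weather forecast for {city} on {date}",
    version="v1.0",
    variables={"city": "San Francisco", "date": "March 27"},
):
    # Any auto-instrumented LLM call made here inherits these
    # prompt template fields as span attributes
    ...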

Sample Implementation

Begin by installing the necessary dependencies:
pip install fi-instrumentation-otel traceai_openai traceai_langchain openai langchain_openai
Below is a comprehensive example demonstrating how to implement prompt template tracing:
import os

from fi_instrumentation import register, Transport, using_prompt_template
from fi_instrumentation.fi_types import ProjectType
from traceai_openai import OpenAIInstrumentor
from traceai_langchain import LangChainInstrumentor
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

# Credentials for OpenAI and Future AGI
os.environ["OPENAI_API_KEY"] = "your_openai_api_key"
os.environ["FI_API_KEY"] = "your_futureagi_api_key"
os.environ["FI_SECRET_KEY"] = "your_futureagi_secret_key"

# Setup OTel via our register function
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="<project_name>",      # Your project name
    transport=Transport.HTTP,           # Transport mechanism for your traces
)

# Instrument the OpenAI client and LangChain with the trace provider
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)
LangChainInstrumentor().instrument(tracer_provider=trace_provider)

# Any LLM call made inside this context manager is tagged with the
# prompt template, version, and variables as span attributes
with using_prompt_template(
    template="{x} {y} {z}?",
    version="v1.0",
    variables={"x": "why is", "y": "sky", "z": "blue"},
):
    prompt = ChatPromptTemplate.from_template("{x} {y} {z}?").partial(x="why is", z="blue")
    chain = prompt | ChatOpenAI(model="gpt-3.5-turbo")

    result = chain.invoke({"y": "sky"})

    print(f"Response: {result}")