
Setup OpenTelemetry POC

Source: https://github.com/vllm-project/vllm/tree/main/examples/online_serving/opentelemetry

  1. Install the OpenTelemetry packages:

    pip install \
      'opentelemetry-sdk>=1.26.0,<1.27.0' \
      'opentelemetry-api>=1.26.0,<1.27.0' \
      'opentelemetry-exporter-otlp>=1.26.0,<1.27.0' \
      'opentelemetry-semantic-conventions-ai>=0.4.1,<0.5.0'
    
  2. Start Jaeger in a Docker container:

    # From: https://www.jaegertracing.io/docs/1.57/getting-started/
    docker run --rm --name jaeger \
        -e COLLECTOR_ZIPKIN_HOST_PORT=:9411 \
        -p 6831:6831/udp \
        -p 6832:6832/udp \
        -p 5778:5778 \
        -p 16686:16686 \
        -p 4317:4317 \
        -p 4318:4318 \
        -p 14250:14250 \
        -p 14268:14268 \
        -p 14269:14269 \
        -p 9411:9411 \
        jaegertracing/all-in-one:1.57
    
  3. In a new shell, export the Jaeger IP:

    export JAEGER_IP=$(docker inspect   --format '{{ .NetworkSettings.IPAddress }}' jaeger)
    export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=grpc://$JAEGER_IP:4317
    

    Then set vLLM's service name for OpenTelemetry, enable an insecure connection to Jaeger, and run vLLM:

    export OTEL_SERVICE_NAME="vllm-server"
    export OTEL_EXPORTER_OTLP_TRACES_INSECURE=true
    vllm serve facebook/opt-125m --otlp-traces-endpoint="$OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"
    
  4. In a new shell, send requests with trace context from the dummy client:

    export JAEGER_IP=$(docker inspect --format '{{ .NetworkSettings.IPAddress }}' jaeger)
    export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=grpc://$JAEGER_IP:4317
    export OTEL_EXPORTER_OTLP_TRACES_INSECURE=true
    export OTEL_SERVICE_NAME="client-service"
    python dummy_client.py
    
  5. Open the Jaeger web UI: http://localhost:16686/

    In the search pane, select the vllm-server service and hit Find Traces. You should get a list of traces, one per request.

    Traces

  6. Clicking on a trace shows its spans and their tags. In this demo, each trace has two spans: one from the dummy client containing the prompt text, and one from vLLM containing metadata about the request. The sketch below shows how to pull the same spans out of Jaeger programmatically.

    Spans details
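
    If you prefer a script over the web UI, the same data can be pulled from Jaeger's HTTP query API, which is what the UI on port 16686 calls. A minimal sketch, assuming the port mapping from step 2 and Jaeger's current JSON layout (field names such as data, spans, operationName, and tags belong to an internal API and may change between Jaeger versions):

        import requests

        JAEGER_URL = "http://localhost:16686"  # query/UI port published in step 2

        resp = requests.get(
            f"{JAEGER_URL}/api/traces",
            params={"service": "vllm-server", "limit": 20},
        )
        resp.raise_for_status()

        for trace in resp.json().get("data", []):
            print(f"trace {trace['traceID']}:")
            for span in trace["spans"]:
                # In this demo each trace should contain two spans: the client
                # span (with the prompt tag) and the vLLM span (request metadata).
                print(f"  span: {span['operationName']}")
                for tag in span.get("tags", []):
                    print(f"    {tag['key']} = {tag['value']}")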

Exporter Protocol

OpenTelemetry supports either grpc or http/protobuf as the transport protocol for trace data in the exporter. By default, grpc is used. To use http/protobuf instead, configure the OTEL_EXPORTER_OTLP_TRACES_PROTOCOL environment variable as follows:

export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL=http/protobuf
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://$JAEGER_IP:4318/v1/traces
vllm serve facebook/opt-125m --otlp-traces-endpoint="$OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"
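
Since dummy_client.py constructs its exporter explicitly in code, switching the client to http/protobuf is a matter of swapping the gRPC exporter class for the HTTP one. A minimal sketch, assuming Jaeger's OTLP/HTTP port 4318 published by the docker command in step 2:

    from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor

    trace_provider = TracerProvider()
    trace_provider.add_span_processor(
        BatchSpanProcessor(
            # OTLP over HTTP goes to port 4318 and requires the /v1/traces path.
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    )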

Instrumentation of FastAPI

OpenTelemetry allows automatic instrumentation of FastAPI.

  1. Install the instrumentation library:

    pip install opentelemetry-instrumentation-fastapi
    
  2. Run vLLM with opentelemetry-instrument (a sketch of the equivalent manual instrumentation follows this list):

    opentelemetry-instrument vllm serve facebook/opt-125m
    
  3. Send a request to vLLM and find its trace in Jaeger. It should contain spans from FastAPI.

FastAPI Spans
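
For reference, the opentelemetry-instrument wrapper configures the tracer provider and exporter from the OTEL_* environment variables; the FastAPI-specific part roughly corresponds to calling the instrumentor yourself. A minimal sketch on a toy app (not vLLM's actual server code, which is launched via vllm serve):

    from fastapi import FastAPI
    from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor

    app = FastAPI()

    @app.get("/ping")
    def ping() -> dict[str, str]:
        return {"status": "ok"}

    # Adds ASGI middleware that opens a server span for each incoming HTTP
    # request and records route and status-code attributes on it.
    FastAPIInstrumentor.instrument_app(app)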

Example materials

dummy_client.py
# SPDX-License-Identifier: Apache-2.0
# SPDX-FileCopyrightText: Copyright contributors to the vLLM project

import requests
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter
from opentelemetry.trace import SpanKind, set_tracer_provider
from opentelemetry.trace.propagation.tracecontext import TraceContextTextMapPropagator

trace_provider = TracerProvider()
set_tracer_provider(trace_provider)

# Export spans both to Jaeger (via OTLP) and to the console for local debugging.
trace_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace_provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))

tracer = trace_provider.get_tracer("dummy-client")

url = "http://localhost:8000/v1/completions"
with tracer.start_as_current_span("client-span", kind=SpanKind.CLIENT) as span:
    prompt = "San Francisco is a"
    span.set_attribute("prompt", prompt)
    # Inject the current span's context (W3C traceparent header) so vLLM can
    # link its server-side span to this client span.
    headers = {}
    TraceContextTextMapPropagator().inject(headers)
    payload = {
        "model": "facebook/opt-125m",
        "prompt": prompt,
        "max_tokens": 10,
        "n": 3,
        "use_beam_search": "true",
        "temperature": 0.0,
        # "stream": True,
    }
    response = requests.post(url, headers=headers, json=payload)
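
The TraceContextTextMapPropagator().inject(headers) call above writes a W3C traceparent header into the request. A minimal sketch of the receiving side shows why the client span and the vLLM span end up in the same trace; this is illustrative only, not vLLM's actual handler code, and handle_request is a made-up name:

    from opentelemetry import trace
    from opentelemetry.trace.propagation.tracecontext import TraceContextTextMapPropagator

    def handle_request(headers: dict[str, str]) -> None:
        # Rebuild the caller's trace context from the traceparent header.
        ctx = TraceContextTextMapPropagator().extract(carrier=headers)
        tracer = trace.get_tracer("server")
        # Starting a span with that context makes it a child of "client-span",
        # so both spans appear under a single trace in Jaeger.
        with tracer.start_as_current_span("llm_request", context=ctx):
            ...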