Implement distributed tracing
Sometimes you need to trace a request across multiple services.
LangSmith supports distributed tracing out of the box, linking runs within a trace across services using context-propagation headers (langsmith-trace, and optionally baggage for metadata/tags).
Example client-server setup:
- The trace starts on the client
- It continues on the server
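The mechanism can be pictured as follows: the client serializes its current trace context into request headers, and the server parses those headers so its own runs are attached under the client's trace. Below is a minimal stdlib-only sketch of that idea; the header names mirror LangSmith's, but the value format and helper functions are purely illustrative, not the library's actual wire format.

```python
# Conceptual sketch of header-based trace propagation. The header
# names mirror LangSmith's ("langsmith-trace", "baggage"), but the
# value format here is illustrative only.
import uuid

def build_propagation_headers(trace_id: str, parent_run_id: str) -> dict:
    # Client side: serialize the current trace context into headers.
    return {
        "langsmith-trace": f"{trace_id}.{parent_run_id}",
        "baggage": "langsmith-metadata=env%3Dstaging",
    }

def parse_propagation_headers(headers: dict) -> tuple[str, str]:
    # Server side: recover the trace id and parent run id so new
    # runs can be parented under the client's trace.
    trace_id, parent_run_id = headers["langsmith-trace"].split(".", 1)
    return trace_id, parent_run_id

trace_id = str(uuid.uuid4())
run_id = str(uuid.uuid4())
headers = build_propagation_headers(trace_id, run_id)
assert parse_propagation_headers(headers) == (trace_id, run_id)
```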
Distributed tracing in Python
```python
# client.py
from langsmith.run_helpers import get_current_run_tree, traceable
import httpx

@traceable
async def my_client_function():
    headers = {}
    async with httpx.AsyncClient(base_url="...") as client:
        if run_tree := get_current_run_tree():
            # add the langsmith-trace (and baggage) headers
            headers.update(run_tree.to_headers())
        return await client.post("/my-route", headers=headers)
```
The server (or another service) can then continue the trace by handling these request headers appropriately. If you are using a Starlette or FastAPI ASGI app, you can use LangSmith's TracingMiddleware to hook up distributed tracing.
Info
The TracingMiddleware class was added in langsmith==0.1.133.
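Because the middleware only exists from langsmith==0.1.133 onward, you may want to guard against older installs before relying on it. A hedged sketch follows; the helper names are our own, not part of the library, and the naive version parsing is for illustration (a real check would use a proper version parser).

```python
# Check whether the installed langsmith is new enough for
# TracingMiddleware (added in 0.1.133). Helper names are illustrative;
# the naive int parsing ignores pre-release suffixes.
import importlib.metadata

def version_at_least(version: str, minimum: tuple[int, int, int]) -> bool:
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= minimum

def tracing_middleware_available() -> bool:
    try:
        installed = importlib.metadata.version("langsmith")
    except importlib.metadata.PackageNotFoundError:
        return False
    return version_at_least(installed, (0, 1, 133))
```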
Example with FastAPI:
```python
from langsmith import traceable
from langsmith.middleware import TracingMiddleware
from fastapi import FastAPI, Request

app = FastAPI()  # Or Starlette, or another ASGI framework
app.add_middleware(TracingMiddleware)

@traceable
async def some_function():
    ...

@app.post("/my-route")
async def fake_route(request: Request):
    return await some_function()
```
Or in Starlette:
```python
from starlette.applications import Starlette
from starlette.middleware import Middleware
from langsmith.middleware import TracingMiddleware

routes = ...

middleware = [
    Middleware(TracingMiddleware),
]

app = Starlette(..., middleware=middleware)
```
If you are using another server framework, you can always "receive" the distributed trace by handling the incoming headers yourself:
```python
# server.py
from langsmith import traceable
from langsmith.run_helpers import tracing_context
from fastapi import FastAPI, Request

@traceable
async def my_application():
    ...

app = FastAPI()  # Or Flask, Django, or any other framework

@app.post("/my-route")
async def fake_route(request: Request):
    # request.headers: {"langsmith-trace": "..."}
    # as well as optional metadata/tags in `baggage`
    with tracing_context(parent=request.headers):
        return await my_application()
```
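Conceptually, tracing_context behaves like a context-local variable that holds the parent trace info for the duration of the with block, so nested traced calls can find their parent. The stdlib-only stand-in below uses contextvars to illustrate that pattern; it is not LangSmith's implementation, and the names are our own.

```python
# Illustrative stand-in for a tracing context manager: a ContextVar
# holds the parent headers while the `with` block is active, so
# nested traced calls can look up their parent. Not LangSmith's code.
import contextvars
from contextlib import contextmanager

_parent_ctx: contextvars.ContextVar = contextvars.ContextVar(
    "parent_ctx", default=None
)

@contextmanager
def demo_tracing_context(parent):
    token = _parent_ctx.set(parent)
    try:
        yield
    finally:
        _parent_ctx.reset(token)

def current_parent():
    # What a nested traced call would consult to find its parent.
    return _parent_ctx.get()

with demo_tracing_context({"langsmith-trace": "abc.def"}):
    assert current_parent() == {"langsmith-trace": "abc.def"}
assert current_parent() is None
```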
The example above uses the tracing_context context manager. You can also pass the parent run context directly through the langsmith_extra argument of a method wrapped with @traceable.
```python
from langsmith.run_helpers import traceable

# ... same as above

@app.post("/my-route")
async def fake_route(request: Request):
    # request.headers: {"langsmith-trace": "..."}
    return await my_application(langsmith_extra={"parent": request.headers})
```
Distributed tracing in TypeScript
Note
Distributed tracing in TypeScript requires langsmith version >=0.1.31.
First, on the client we grab the current run tree and convert it into langsmith-trace and baggage header values that we can pass to the server:
```typescript
// client.mts
import { getCurrentRunTree, traceable } from "langsmith/traceable";

const client = traceable(
  async () => {
    const runTree = getCurrentRunTree();
    return await fetch("...", {
      method: "POST",
      headers: runTree.toHeaders(),
    }).then((a) => a.text());
  },
  { name: "client" }
);

await client();
```
The server then converts the request headers back into a run tree and uses it to continue the trace.
To pass the newly created run tree to traceable functions, we can use the withRunTree helper, which ensures the run tree is propagated within traceable invocations.
- Express.JS
- Hono
```typescript
// server.mts
import { RunTree } from "langsmith";
import { traceable, withRunTree } from "langsmith/traceable";
import express from "express";
import bodyParser from "body-parser";

const server = traceable(
  (text: string) => `Hello from the server! Received "${text}"`,
  { name: "server" }
);

const app = express();
app.use(bodyParser.text());

app.post("/", async (req, res) => {
  const runTree = RunTree.fromHeaders(req.headers);
  const result = await withRunTree(runTree, () => server(req.body));
  res.send(result);
});
```
```typescript
// server.mts
import { RunTree } from "langsmith";
import { traceable, withRunTree } from "langsmith/traceable";
import { Hono } from "hono";

const server = traceable(
  (text: string) => `Hello from the server! Received "${text}"`,
  { name: "server" }
);

const app = new Hono();

app.post("/", async (c) => {
  const body = await c.req.text();
  const runTree = RunTree.fromHeaders(c.req.raw.headers);
  const result = await withRunTree(runTree, () => server(body));
  return c.body(result);
});
```