Wrappers

langsmith.wrappers

This module provides convenient tracing wrappers for popular libraries.

Function Description
wrap_anthropic

Patch the Anthropic client to make it traceable.

wrap_openai

Patch the OpenAI client to make it traceable.

wrap_anthropic

wrap_anthropic(client: C, *, tracing_extra: TracingExtra | None = None) -> C

Patch the Anthropic client to make it traceable.

Parameter Description
client

The client to patch.

TYPE: Anthropic | AsyncAnthropic

tracing_extra

Extra tracing information. Defaults to None.

TYPE: TracingExtra | None DEFAULT: None

Returns Description
C

Union[Anthropic, AsyncAnthropic]: The patched client.

Examples

.. code-block:: python

    import anthropic
    from langsmith import wrappers

    client = wrappers.wrap_anthropic(anthropic.Anthropic())

    # Use Anthropic client same as you normally would:
    system = "You are a helpful assistant."
    messages = [
        {
            "role": "user",
            "content": "What physics breakthroughs do you predict will happen by 2300?",
        }
    ]
    completion = client.messages.create(
        model="claude-3-5-sonnet-latest",
        messages=messages,
        max_tokens=1000,
        system=system,
    )
    print(completion.content)

    # You can also use the streaming context manager:
    with client.messages.stream(
        model="claude-3-5-sonnet-latest",
        messages=messages,
        max_tokens=1000,
        system=system,
    ) as stream:
        for text in stream.text_stream:
            print(text, end="", flush=True)
        message = stream.get_final_message()
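The mechanics behind "patching a client" can be sketched in a few lines: each traced method is replaced on the client object with a proxy that delegates to the original and records inputs and outputs. The following is a self-contained, illustrative sketch of that pattern only — `trace_method` and `ToyClient` are hypothetical names, not langsmith's actual implementation, which also handles token streaming, async clients, and run-tree bookkeeping:

```python
import functools


def trace_method(obj, method_name, log):
    """Replace obj.<method_name> with a proxy that records each call in log."""
    original = getattr(obj, method_name)

    @functools.wraps(original)
    def traced(*args, **kwargs):
        result = original(*args, **kwargs)
        # Record the call after it succeeds, then hand the result back unchanged.
        log.append({"method": method_name, "kwargs": kwargs, "result": result})
        return result

    setattr(obj, method_name, traced)
    return obj  # the same client object, now patched


class ToyClient:
    """Stand-in for an SDK client; its create() just echoes the prompt."""

    def create(self, *, prompt):
        return f"echo: {prompt}"


log = []
client = trace_method(ToyClient(), "create", log)
print(client.create(prompt="hi"))  # → echo: hi
print(log[0]["method"])            # → create
```

Because the proxy returns the original result untouched, calling code is unaffected — which is why the wrapped client can be "used same as you normally would."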

wrap_openai

wrap_openai(
    client: C,
    *,
    tracing_extra: TracingExtra | None = None,
    chat_name: str = "ChatOpenAI",
    completions_name: str = "OpenAI",
) -> C

Patch the OpenAI client to make it traceable.

Supports:
  • Chat and Responses APIs
  • Synchronous and asynchronous OpenAI clients
  • create() and parse() methods
  • With and without streaming
Parameter Description
client

The client to patch.

TYPE: OpenAI | AsyncOpenAI

tracing_extra

Extra tracing information. Defaults to None.

TYPE: TracingExtra | None DEFAULT: None

chat_name

The run name for the chat completions endpoint. Defaults to "ChatOpenAI".

TYPE: str DEFAULT: 'ChatOpenAI'

completions_name

The run name for the completions endpoint. Defaults to "OpenAI".

TYPE: str DEFAULT: 'OpenAI'

Returns Description
C

Union[OpenAI, AsyncOpenAI]: The patched client.

Examples

.. code-block:: python

    import openai
    from langsmith import wrappers

    # Use OpenAI client same as you normally would.
    client = wrappers.wrap_openai(openai.OpenAI())

    # Chat API:
    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {
            "role": "user",
            "content": "What physics breakthroughs do you predict will happen by 2300?",
        },
    ]
    completion = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages
    )
    print(completion.choices[0].message.content)

    # Responses API:
    response = client.responses.create(
        model="gpt-4o-mini",
        input=messages,
    )
    print(response.output_text)
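Streaming is the interesting case for a tracing wrapper: there is no final result to log until the caller has exhausted the stream, so the proxy must pass chunks through while accumulating them, and emit the trace record only at the end. A minimal, self-contained sketch of that idea — `trace_streaming_method` and `ToyStreamingClient` are hypothetical names for illustration, not the library's internals:

```python
import functools


def trace_streaming_method(obj, method_name, log):
    """Wrap a method that returns an iterator of text chunks so that the
    concatenated output is logged once the stream is fully consumed."""
    original = getattr(obj, method_name)

    @functools.wraps(original)
    def traced(*args, **kwargs):
        chunks = []
        for chunk in original(*args, **kwargs):
            chunks.append(chunk)
            yield chunk  # pass each chunk through to the caller unchanged
        # Only now is the complete output known and safe to record.
        log.append({"method": method_name, "output": "".join(chunks)})

    setattr(obj, method_name, traced)
    return obj


class ToyStreamingClient:
    """Stand-in for an SDK client whose stream() yields text chunks."""

    def stream(self, *, prompt):
        for piece in ("echo:", " ", prompt):
            yield piece


log = []
client = trace_streaming_method(ToyStreamingClient(), "stream", log)
text = "".join(client.stream(prompt="hi"))
print(text)                  # → echo: hi
print(log[0]["output"])      # → echo: hi
```

Note that if the caller abandons the stream early, the generator never reaches the logging line — real tracing wrappers have to handle that cleanup case as well.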

.. versionchanged:: 0.3.16

    Support for Responses API added.