
How to chain runnables

Prerequisites

One point about the LangChain Expression Language is that any two runnables can be "chained" together into sequences. The output of the previous runnable's .invoke() call is passed as input to the next runnable. This can be done using the pipe operator (|), or the more explicit .pipe() method, which does the same thing.

The resulting RunnableSequence is itself a runnable, which means it can be invoked, streamed, or chained further just like any other runnable. Advantages of chaining runnables this way are efficient streaming (the sequence streams output as soon as it becomes available), as well as debugging and tracing with tools like LangSmith.
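To make the chaining mechanics concrete, here is a minimal stdlib-only sketch of the idea (a toy `Runnable` class, not LangChain's actual implementation): the pipe operator works through Python's `__or__` method, and the composed object exposes the same `invoke` interface as its parts.

```python
class Runnable:
    """Toy stand-in for a LangChain runnable: wraps a function."""

    def __init__(self, func):
        self.func = func

    def invoke(self, value):
        return self.func(value)

    def __or__(self, other):
        # a | b -> a new runnable that feeds a's output into b
        return Runnable(lambda value: other.invoke(self.invoke(value)))


add_one = Runnable(lambda x: x + 1)
double = Runnable(lambda x: x * 2)

sequence = add_one | double  # plays the role of a RunnableSequence
print(sequence.invoke(3))  # 8

# The result is itself a runnable, so it can be chained further
longer = sequence | Runnable(lambda x: x - 5)
print(longer.invoke(3))  # 3
```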

The pipe operator: |

To show off how it works, let's go through an example. We'll walk through a common pattern in LangChain: using a prompt template to format input into a chat model, and finally converting the chat message output into a string with an output parser.

pip install -qU "langchain[openai]"
import getpass
import os

if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter API key for OpenAI: ")

from langchain.chat_models import init_chat_model

model = init_chat_model("gpt-4o-mini", model_provider="openai")
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")

chain = prompt | model | StrOutputParser()

Prompts and models are both runnable, and the output type from a prompt invocation is the same as the input type of a chat model, so we can chain them together. We can then invoke the resulting sequence like any other runnable:

chain.invoke({"topic": "bears"})
"Here's a bear joke for you:\n\nWhy did the bear dissolve in water?\nBecause it was a polar bear!"

Coercion

We can even combine this chain with more runnables to create another chain. This may involve some input/output formatting using other types of runnables, depending on the required inputs and outputs of the chain components.

For example, let's say we wanted to compose the joke-generating chain with another chain that evaluates whether or not the generated joke was funny.

We would need to be careful with how we format the input into the next chain. In the example below, the dict in the chain is automatically parsed and converted into a RunnableParallel, which runs all of its values in parallel and returns a dict with the results.

This happens to be the same format the next prompt template expects. Here it is in action:

from langchain_core.output_parsers import StrOutputParser

analysis_prompt = ChatPromptTemplate.from_template("is this a funny joke? {joke}")

composed_chain = {"joke": chain} | analysis_prompt | model | StrOutputParser()

composed_chain.invoke({"topic": "bears"})
API Reference: StrOutputParser
'Haha, that\'s a clever play on words! Using "polar" to imply the bear dissolved or became polar/polarized when put in water. Not the most hilarious joke ever, but it has a cute, groan-worthy pun that makes it mildly amusing. I appreciate a good pun or wordplay joke.'
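The dict-to-RunnableParallel coercion can be sketched in the same toy style, purely to illustrate the shape of the behavior (this is not LangChain's real class): each value in the dict is invoked with the same input, and the results are collected into a new dict under the same keys.

```python
class Runnable:
    """Toy runnable: wraps a function behind an invoke() method."""

    def __init__(self, func):
        self.func = func

    def invoke(self, value):
        return self.func(value)


def coerce(step):
    # A dict is treated like a RunnableParallel: every value is
    # invoked with the same input, and the results form a new dict.
    if isinstance(step, dict):
        return Runnable(
            lambda value: {key: inner.invoke(value) for key, inner in step.items()}
        )
    return step


joke = Runnable(lambda topic: f"a joke about {topic}")
composed = coerce({"joke": joke})
print(composed.invoke("bears"))  # {'joke': 'a joke about bears'}
```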

Functions will also be coerced into runnables, so you can add custom logic to your chains too. The chain below has the same logical flow as before:

composed_chain_with_lambda = (
    chain
    | (lambda input: {"joke": input})
    | analysis_prompt
    | model
    | StrOutputParser()
)

composed_chain_with_lambda.invoke({"topic": "beets"})
"Haha, that's a cute and punny joke! I like how it plays on the idea of beets blushing or turning red like someone blushing. Food puns can be quite amusing. While not a total knee-slapper, it's a light-hearted, groan-worthy dad joke that would make me chuckle and shake my head. Simple vegetable humor!"

However, keep in mind that using functions like this may interfere with operations like streaming. See this section for more information.
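The automatic wrapping of plain functions can be sketched the same way (again a toy illustration, not the real library code): when a bare callable appears on the right of the pipe operator, it is coerced into a runnable before being composed.

```python
class Runnable:
    """Toy runnable that coerces bare functions on the right of |."""

    def __init__(self, func):
        self.func = func

    def invoke(self, value):
        return self.func(value)

    def __or__(self, other):
        if callable(other) and not isinstance(other, Runnable):
            other = Runnable(other)  # coerce a plain function
        return Runnable(lambda value: other.invoke(self.invoke(value)))


chain = Runnable(lambda topic: f"joke about {topic}") | (lambda joke: {"joke": joke})
print(chain.invoke("beets"))  # {'joke': 'joke about beets'}
```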

The .pipe() method

We could also compose the same sequence using the .pipe() method. Here's what that looks like:

from langchain_core.runnables import RunnableParallel

composed_chain_with_pipe = (
    RunnableParallel({"joke": chain})
    .pipe(analysis_prompt)
    .pipe(model)
    .pipe(StrOutputParser())
)

composed_chain_with_pipe.invoke({"topic": "battlestar galactica"})
API Reference: RunnableParallel
"I cannot reproduce any copyrighted material verbatim, but I can try to analyze the humor in the joke you provided without quoting it directly.\n\nThe joke plays on the idea that the Cylon raiders, who are the antagonists in the Battlestar Galactica universe, failed to locate the human survivors after attacking their home planets (the Twelve Colonies) due to using an outdated and poorly performing operating system (Windows Vista) for their targeting systems.\n\nThe humor stems from the juxtaposition of a futuristic science fiction setting with a relatable real-world frustration – the use of buggy, slow, or unreliable software or technology. It pokes fun at the perceived inadequacies of Windows Vista, which was widely criticized for its performance issues and other problems when it was released.\n\nBy attributing the Cylons' failure to locate the humans to their use of Vista, the joke creates an amusing and unexpected connection between a fictional advanced race of robots and a familiar technological annoyance experienced by many people in the real world.\n\nOverall, the joke relies on incongruity and relatability to generate humor, but without reproducing any copyrighted material directly."

Or the abbreviated:

composed_chain_with_pipe = RunnableParallel({"joke": chain}).pipe(
    analysis_prompt, model, StrOutputParser()
)
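In the same toy style, `.pipe()` can be pictured as a method that accepts one or more steps and composes them in order (a sketch of the idea only; LangChain's real `.pipe()` also handles coercion, config, and streaming):

```python
class Runnable:
    """Toy runnable with a .pipe() method for chaining."""

    def __init__(self, func):
        self.func = func

    def invoke(self, value):
        return self.func(value)

    def pipe(self, *steps):
        # Run self first, then each step in order
        def run(value):
            value = self.invoke(value)
            for step in steps:
                value = step.invoke(value)
            return value

        return Runnable(run)


inc = Runnable(lambda x: x + 1)
dbl = Runnable(lambda x: x * 2)
neg = Runnable(lambda x: -x)

print(inc.pipe(dbl, neg).invoke(3))  # -8
```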
  • Streaming: Check out the streaming guide to understand the streaming behavior of a chain