
Prolog

LangChain tools that use Prolog rules to generate answers.

Overview

The PrologTool class allows the creation of LangChain tools that use Prolog rules to generate answers.

Setup

Let's use the following Prolog rules in the file family.pl:

parent(john, bianca, mary).
parent(john, bianca, michael).
parent(peter, patricia, jennifer).
partner(X, Y) :- parent(X, Y, _).
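As a plain illustration of what these facts encode (this sketch does not use Prolog or langchain-prolog at all), a query such as parent(john, X, Y) should unify with the first two facts, binding the unspecified arguments. A minimal Python stand-in for that lookup:

```python
# Illustrative stand-in for the parent/3 facts in family.pl.
FACTS = [
    ("john", "bianca", "mary"),
    ("john", "bianca", "michael"),
    ("peter", "patricia", "jennifer"),
]

def query_parent(men=None, women=None, child=None):
    """Return every fact matching the bound arguments; None acts as an unbound variable."""
    return [
        {"men": m, "women": w, "child": c}
        for (m, w, c) in FACTS
        if (men is None or men == m)
        and (women is None or women == w)
        and (child is None or child == c)
    ]

print(query_parent(men="john"))
# [{'men': 'john', 'women': 'bianca', 'child': 'mary'},
#  {'men': 'john', 'women': 'bianca', 'child': 'michael'}]
```

Real Prolog resolution is more general than this filter (it also handles rules like partner/2), but the solution sets for ground fact queries are the same.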

#!pip install langchain-prolog

from langchain_prolog import PrologConfig, PrologRunnable, PrologTool

TEST_SCRIPT = "family.pl"

Instantiation

First create the Prolog tool:

schema = PrologRunnable.create_schema("parent", ["men", "women", "child"])
config = PrologConfig(
rules_file=TEST_SCRIPT,
query_schema=schema,
)
prolog_tool = PrologTool(
prolog_config=config,
name="family_query",
description="""
Query family relationships using Prolog.
parent(X, Y, Z) implies only that Z is a child of X and Y.
Input can be a query string like 'parent(john, X, Y)' or 'john, X, Y'.
You have to specify 3 parameters: men, women, child. Do not use quotes.
""",
)

Invocation

Using the Prolog tool with an LLM and function calling

#!pip install python-dotenv

from dotenv import find_dotenv, load_dotenv

load_dotenv(find_dotenv(), override=True)

#!pip install langchain-openai

from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

To use the tool, bind it to the LLM model:

llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools([prolog_tool])

Then query the model:

query = "Who are John's children?"
messages = [HumanMessage(query)]
response = llm_with_tools.invoke(messages)

The LLM will respond with a tool call request:

messages.append(response)
response.tool_calls[0]
{'name': 'family_query',
'args': {'men': 'john', 'women': None, 'child': None},
'id': 'call_gH8rWamYXITrkfvRP2s5pkbF',
'type': 'tool_call'}
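The None values play the role of unbound Prolog variables. A hypothetical sketch (not the library's actual code) of how such an args dict maps onto a Prolog query string:

```python
def args_to_query(predicate, args):
    """Build a Prolog query: bound values become atoms, None becomes a variable.

    Prolog variables must start with an uppercase letter, so unbound
    parameters are rendered as the capitalized parameter name.
    """
    terms = [v if v is not None else k.capitalize() for k, v in args.items()]
    return f"{predicate}({', '.join(terms)})"

call_args = {"men": "john", "women": None, "child": None}
print(args_to_query("parent", call_args))  # parent(john, Women, Child)
```

Note that the variable names Women and Child match the keys that appear in the tool's solution bindings below.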

The tool takes this request and queries the Prolog database:

tool_msg = prolog_tool.invoke(response.tool_calls[0])

The tool returns a list with all the solutions for the query:

messages.append(tool_msg)
tool_msg
ToolMessage(content='[{"Women": "bianca", "Child": "mary"}, {"Women": "bianca", "Child": "michael"}]', name='family_query', tool_call_id='call_gH8rWamYXITrkfvRP2s5pkbF')
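The ToolMessage content shown above is a JSON-encoded list of solution bindings (one dict per solution, keyed by variable name), so it can be decoded with the standard library if you want to post-process it yourself:

```python
import json

# The content string from the ToolMessage above.
content = '[{"Women": "bianca", "Child": "mary"}, {"Women": "bianca", "Child": "michael"}]'
solutions = json.loads(content)
children = [s["Child"] for s in solutions]
print(children)  # ['mary', 'michael']
```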

We then pass this to the LLM, which uses the tool response to answer the original query:

answer = llm_with_tools.invoke(messages)
print(answer.content)
John has two children: Mary and Michael, with Bianca as their mother.

Chaining

Using the Prolog tool with an agent

To use the Prolog tool with an agent, pass it to the agent's constructor:

#!pip install langgraph

from langgraph.prebuilt import create_react_agent

agent_executor = create_react_agent(llm, [prolog_tool])

The agent takes the query and uses the Prolog tool when needed:

messages = agent_executor.invoke({"messages": [("human", query)]})

The agent then receives the tool response and generates an answer:

messages["messages"][-1].pretty_print()
================================== Ai Message ==================================

John has two children: Mary and Michael, with Bianca as their mother.

API reference

See https://langchain-prolog.readthedocs.io/en/latest/modules.html for detailed documentation.