
ChatMistralAI

This will help you get started with Mistral chat models. For detailed documentation of all ChatMistralAI features and configurations, head to the API reference. The ChatMistralAI class is built on top of the Mistral API. For a list of all models supported by Mistral, check out this page.

Overview

Integration details

Class | Package | Local | Serializable | JS support | Package downloads | Package latest
ChatMistralAI | langchain_mistralai | ❌ | beta | ✅ | PyPI - Downloads | PyPI - Version

Model features

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs

Setup

To access ChatMistralAI models, you'll need to create a Mistral account, get an API key, and install the langchain_mistralai integration package.

Credentials

A valid API key is required to communicate with the API. Once you have one, set the MISTRAL_API_KEY environment variable:

import getpass
import os

if "MISTRAL_API_KEY" not in os.environ:
os.environ["MISTRAL_API_KEY"] = getpass.getpass("Enter your Mistral API key: ")

To enable automated tracing of your model calls, set your LangSmith API key:

# os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")
# os.environ["LANGSMITH_TRACING"] = "true"

Installation

The LangChain Mistral integration lives in the langchain_mistralai package:

%pip install -qU langchain_mistralai

Instantiation

Now we can instantiate our model object and generate chat completions:

from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI(
model="mistral-large-latest",
temperature=0,
max_retries=2,
# other params...
)
API Reference: ChatMistralAI

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
ai_msg
AIMessage(content='Sure, I\'d be happy to help you translate that sentence into French! The English sentence "I love programming" translates to "J\'aime programmer" in French. Let me know if you have any other questions or need further assistance!', response_metadata={'token_usage': {'prompt_tokens': 32, 'total_tokens': 84, 'completion_tokens': 52}, 'model': 'mistral-small', 'finish_reason': 'stop'}, id='run-64bac156-7160-4b68-b67e-4161f63e021f-0', usage_metadata={'input_tokens': 32, 'output_tokens': 52, 'total_tokens': 84})
print(ai_msg.content)
Sure, I'd be happy to help you translate that sentence into French! The English sentence "I love programming" translates to "J'aime programmer" in French. Let me know if you have any other questions or need further assistance!
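
The response above also carries token accounting in usage_metadata and response_metadata, and the feature table lists token-level streaming and native async support. The snippet below is a minimal sketch reusing the llm and messages objects defined above; the methods shown are the standard LangChain chat-model API rather than anything Mistral-specific:

# Token usage is attached to the response (see the AIMessage above).
print(ai_msg.usage_metadata)
# {'input_tokens': 32, 'output_tokens': 52, 'total_tokens': 84}

# Token-level streaming: chunks are printed as they arrive.
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)

# Native async: await llm.ainvoke(messages) inside an async context.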

Chaining

We can chain our model with a prompt template like so:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
API Reference: ChatPromptTemplate
AIMessage(content='Ich liebe Programmierung. (German translation)', response_metadata={'token_usage': {'prompt_tokens': 26, 'total_tokens': 38, 'completion_tokens': 12}, 'model': 'mistral-small', 'finish_reason': 'stop'}, id='run-dfd4094f-e347-47b0-9056-8ebd7ea35fe7-0', usage_metadata={'input_tokens': 26, 'output_tokens': 12, 'total_tokens': 38})
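
The feature table above also lists tool calling and structured output. Both go through the standard LangChain chat-model methods bind_tools and with_structured_output; the sketch below is illustrative only, and the Translation schema and get_word_count tool are hypothetical examples rather than part of this page:

from pydantic import BaseModel, Field

from langchain_core.tools import tool


class Translation(BaseModel):
    """Structured translation result (hypothetical schema for illustration)."""

    text: str = Field(description="The translated sentence")
    language: str = Field(description="The target language")


# Structured output: the model returns a Translation instance instead of raw text.
structured_llm = llm.with_structured_output(Translation)
structured_llm.invoke("Translate 'I love programming.' into German.")


@tool
def get_word_count(text: str) -> int:
    """Count the words in a piece of text (hypothetical tool for illustration)."""
    return len(text.split())


# Tool calling: bind the tool, then inspect the tool calls on the response.
llm_with_tools = llm.bind_tools([get_word_count])
msg = llm_with_tools.invoke("How many words are in 'I love programming.'?")
msg.tool_calls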

API reference

Head to the API reference for detailed documentation of all attributes and methods.