
ChatGoodfire

This will help you get started with Goodfire chat models. For detailed documentation of all ChatGoodfire features and configurations, head to the PyPI project page, or go directly to the Goodfire SDK docs. All Goodfire-specific functionality (e.g. SAE features, variants, etc.) is available via the main goodfire package. This integration is a wrapper around the Goodfire SDK.

Overview

Integration details

| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| :--- | :--- | :---: | :---: | :---: | :---: | :---: |
| ChatGoodfire | langchain-goodfire | ❌ | ❌ | ❌ | PyPI - Downloads | PyPI - Version |

Model features

| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ | ✅ | ❌ |

Setup

To access Goodfire models you'll need to create a Goodfire account, get an API key, and install the langchain-goodfire integration package.

Credentials

Head to Goodfire Settings to sign up for Goodfire and generate an API key. Once you've done this, set the GOODFIRE_API_KEY environment variable.

import getpass
import os

if not os.getenv("GOODFIRE_API_KEY"):
    os.environ["GOODFIRE_API_KEY"] = getpass.getpass("Enter your Goodfire API key: ")

To enable automated tracing of your model calls, set your LangSmith API key:

# os.environ["LANGSMITH_TRACING"] = "true"
# os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")

Installation

The LangChain Goodfire integration lives in the langchain-goodfire package:

%pip install -qU langchain-goodfire
Note: you may need to restart the kernel to use updated packages.

Instantiation

Now we can instantiate our model object and generate chat completions:

import goodfire
from langchain_goodfire import ChatGoodfire

base_variant = goodfire.Variant("meta-llama/Llama-3.3-70B-Instruct")

llm = ChatGoodfire(
    model=base_variant,
    temperature=0,
    max_completion_tokens=1000,
    seed=42,
)
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = await llm.ainvoke(messages)
ai_msg
AIMessage(content="J'adore la programmation.", additional_kwargs={}, response_metadata={}, id='run-8d43cf35-bce8-4827-8935-c64f8fb78cd0-0', usage_metadata={'input_tokens': 51, 'output_tokens': 39, 'total_tokens': 90})
print(ai_msg.content)
J'adore la programmation.
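Because ChatGoodfire is a standard LangChain chat model, the base astream method is also available. The sketch below assumes the llm and messages objects defined above; note that if an integration does not implement token-level streaming natively, LangChain falls back to yielding the full response as a single chunk.

```python
# Sketch of async streaming via the standard LangChain `astream` API.
# If the integration lacks native token streaming, LangChain's fallback
# yields the entire reply as a single chunk.
async def stream_response(llm, messages):
    parts = []
    async for chunk in llm.astream(messages):
        parts.append(chunk.content)
        print(chunk.content, end="", flush=True)
    return "".join(parts)

# Usage in a notebook cell:
# await stream_response(llm, messages)
```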

Chaining

We can chain our model with a prompt template like so:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
await chain.ainvoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
AIMessage(content='Ich liebe das Programmieren. How can I help you with programming today?', additional_kwargs={}, response_metadata={}, id='run-03d1a585-8234-46f1-a8df-bf9143fe3309-0', usage_metadata={'input_tokens': 46, 'output_tokens': 46, 'total_tokens': 92})

Goodfire-specific features

To use Goodfire-specific features, such as SAE features and variants, you can use the goodfire package directly.

client = goodfire.Client(api_key=os.environ["GOODFIRE_API_KEY"])

pirate_features = client.features.search(
    "assistant should roleplay as a pirate", base_variant
)
pirate_features
FeatureGroup([
   0: "The assistant should adopt the persona of a pirate",
   1: "The assistant should roleplay as a pirate",
   2: "The assistant should engage with pirate-themed content or roleplay as a pirate",
   3: "The assistant should roleplay as a character",
   4: "The assistant should roleplay as a specific character",
   5: "The assistant should roleplay as a game character or NPC",
   6: "The assistant should roleplay as a human character",
   7: "Requests for the assistant to roleplay or pretend to be something else",
   8: "Requests for the assistant to roleplay or pretend to be something",
   9: "The assistant is being assigned a role or persona to roleplay"
])
pirate_variant = goodfire.Variant("meta-llama/Llama-3.3-70B-Instruct")

pirate_variant.set(pirate_features[0], 0.4)
pirate_variant.set(pirate_features[1], 0.3)

await llm.ainvoke("Tell me a joke", model=pirate_variant)
AIMessage(content='Why did the scarecrow win an award? Because he was outstanding in his field! Arrr! Hope that made ye laugh, matey!', additional_kwargs={}, response_metadata={}, id='run-7d8bd30f-7f80-41cb-bdb6-25c29c22a7ce-0', usage_metadata={'input_tokens': 35, 'output_tokens': 60, 'total_tokens': 95})
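The steps above (search for features, steer a variant, invoke with a per-call model override) can be wrapped in a small helper. This is an illustrative sketch, not part of langchain-goodfire; ask_with_features is a hypothetical name, and it reuses the llm, client, and variant objects created in the cells above.

```python
# Hypothetical convenience helper (name and signature are our own, not from
# the Goodfire docs): finds features matching `query`, steers the variant by
# `strength` on the top match, and invokes the model with that variant.
async def ask_with_features(llm, client, variant, query, prompt, strength=0.4):
    features = client.features.search(query, variant)  # ranked FeatureGroup
    variant.set(features[0], strength)  # steer the top-ranked feature
    return await llm.ainvoke(prompt, model=variant)

# Usage in a notebook cell:
# await ask_with_features(
#     llm, client, pirate_variant,
#     "assistant should roleplay as a pirate", "Tell me a joke",
# )
```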

API reference

For detailed documentation of all ChatGoodfire features and configurations, head to the API reference.