
ChatSeekrFlow

Seekr provides AI-powered solutions for structured, explainable, and transparent AI interactions.

This notebook provides a quick overview for getting started with Seekr chat models. For detailed documentation of all ChatSeekrFlow features and configurations, head to the API reference.

Overview

The ChatSeekrFlow class wraps a chat model endpoint hosted on SeekrFlow, enabling seamless integration with LangChain applications.

Integration details

Class | Package | Local | Serializable | Package downloads | Package latest
ChatSeekrFlow | seekrai | ❌ | beta | PyPI - Downloads | PyPI - Version

Model features

Tool calling · Structured output · JSON mode · Image input · Audio input · Video input · Token-level streaming · Native async · Token usage · Logprobs

Supported methods

ChatSeekrFlow supports all ChatModel methods, except the async APIs.
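As a quick illustration, the standard synchronous Runnable methods behave as usual. This is a minimal sketch, assuming `llm` is a ChatSeekrFlow instance like the one created in the Instantiation section below:

from langchain.schema import HumanMessage

# invoke(): one request, one response
reply = llm.invoke([HumanMessage(content="Hi!")])

# batch(): several independent message lists in one call
replies = llm.batch(
    [[HumanMessage(content="One")], [HumanMessage(content="Two")]]
)

# stream(): yields response chunks as they arrive
for chunk in llm.stream([HumanMessage(content="Count to three.")]):
    print(chunk.content, end="", flush=True)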

Endpoint requirements

The serving endpoint that ChatSeekrFlow wraps must have an OpenAI-compatible chat input/output format (a rough sketch of that shape follows the list below). It can be used for:

  1. Fine-tuned Seekr models
  2. Custom SeekrFlow models
  3. RAG-enabled models using Seekr's retrieval system
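For reference, an OpenAI-compatible chat endpoint exchanges JSON shaped roughly like the following. This is an illustrative sketch; fields beyond these are model-dependent:

# Illustrative request body sent to an OpenAI-compatible chat endpoint:
request_body = {
    "model": "meta-llama/Meta-Llama-3-8B-Instruct",
    "messages": [{"role": "user", "content": "Hello, Seekr!"}],
}

# Illustrative response body; the assistant's reply lives at
# choices[0]["message"]["content"] (the same shape the mock client
# in the Error Handling section below returns):
response_body = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ]
}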

For async usage, please refer to AsyncChatSeekrFlow (coming soon).
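Until AsyncChatSeekrFlow ships, one stopgap (a sketch, not part of the package) is to push the blocking call onto a worker thread from async code:

import asyncio

from langchain.schema import HumanMessage

async def ainvoke_in_thread(llm, text: str):
    # Run the blocking invoke() on a worker thread so the event loop
    # stays free; `llm` is assumed to be a ChatSeekrFlow instance.
    return await asyncio.to_thread(llm.invoke, [HumanMessage(content=text)])

# Usage (inside an event loop): await ainvoke_in_thread(llm, "Hello, Seekr!")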

Getting started with ChatSeekrFlow in LangChain

This notebook covers how to use SeekrFlow as a chat model in LangChain.

Setup

Ensure you have the necessary dependencies installed:

pip install seekrai langchain langchain-community langchain-seekrflow

You will also need a Seekr API key to authenticate your requests.

# Standard library
import getpass
import os

# Third-party
from langchain.prompts import ChatPromptTemplate
from langchain.schema import HumanMessage
from langchain_core.runnables import RunnableSequence

# OSS SeekrFlow integration
from langchain_seekrflow import ChatSeekrFlow
from seekrai import SeekrFlow

API key setup

You'll need to set your API key as an environment variable to authenticate requests.

Run the cell below to be prompted for your key, or assign it manually before running queries:

# Prompt for the API key securely (recommended):
os.environ["SEEKR_API_KEY"] = getpass.getpass("Enter your Seekr API key:")

# Or hardcode it for quick local testing:
# os.environ["SEEKR_API_KEY"] = "your-api-key-here"

Instantiation

SEEKR_API_KEY = os.environ["SEEKR_API_KEY"]
seekr_client = SeekrFlow(api_key=SEEKR_API_KEY)

llm = ChatSeekrFlow(
    client=seekr_client, model_name="meta-llama/Meta-Llama-3-8B-Instruct"
)

Invocation

response = llm.invoke([HumanMessage(content="Hello, Seekr!")])
print(response.content)
Hello there! I'm Seekr, nice to meet you! What brings you here today? Do you have a question, or are you looking for some help with something? I'm all ears (or rather, all text)!
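invoke() also accepts a full message history, so you can pass prior turns or a system prompt. A short sketch, assuming the deployed model honors system messages:

from langchain.schema import SystemMessage

messages = [
    SystemMessage(content="You are a concise assistant."),
    HumanMessage(content="What does SeekrFlow host?"),
    # Prior AIMessage/HumanMessage turns can be appended here as well.
]
print(llm.invoke(messages).content)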

Chaining

prompt = ChatPromptTemplate.from_template("Translate to French: {text}")

chain: RunnableSequence = prompt | llm
result = chain.invoke({"text": "Good morning"})
print(result)
content='The translation of "Good morning" in French is:\n\n"Bonne journée"' additional_kwargs={} response_metadata={}
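The chain above returns an AIMessage; appending LangChain's StrOutputParser yields a plain string instead:

from langchain_core.output_parsers import StrOutputParser

str_chain = prompt | llm | StrOutputParser()
print(str_chain.invoke({"text": "Good morning"}))  # plain str, no metadata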
Streaming

def test_stream():
    """Test synchronous invocation in streaming mode."""
    print("\n🔹 Testing Sync `stream()` (Streaming)...")

    for chunk in llm.stream([HumanMessage(content="Write me a haiku.")]):
        print(chunk.content, end="", flush=True)


# ✅ Ensure streaming is enabled
llm = ChatSeekrFlow(
    client=seekr_client,
    model_name="meta-llama/Meta-Llama-3-8B-Instruct",
    streaming=True,  # ✅ Enable streaming
)

# ✅ Run sync streaming test
test_stream()

🔹 Testing Sync `stream()` (Streaming)...
Here is a haiku:

Golden sunset fades
Ripples on the quiet lake
Peaceful evening sky
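Streaming composes with chains too; a minimal sketch reusing the translation prompt from the Chaining section:

# `llm` was constructed with streaming=True above.
streaming_chain = prompt | llm
for chunk in streaming_chain.stream({"text": "Good evening"}):
    print(chunk.content, end="", flush=True)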

Error handling and debugging

# Define a minimal mock SeekrFlow client
class MockSeekrClient:
    """Mock SeekrFlow API client that mimics the real API structure."""

    class MockChat:
        """Mock Chat object with a completions method."""

        class MockCompletions:
            """Mock Completions object with a create method."""

            def create(self, *args, **kwargs):
                return {
                    "choices": [{"message": {"content": "Mock response"}}]
                }  # Mimic API response

        completions = MockCompletions()

    chat = MockChat()


def test_initialization_errors():
    """Test that invalid ChatSeekrFlow initializations raise expected errors."""

    test_cases = [
        {
            "name": "Missing Client",
            "args": {"client": None, "model_name": "seekrflow-model"},
            "expected_error": "SeekrFlow client cannot be None.",
        },
        {
            "name": "Missing Model Name",
            "args": {"client": MockSeekrClient(), "model_name": ""},
            "expected_error": "A valid model name must be provided.",
        },
    ]

    for test in test_cases:
        try:
            print(f"Running test: {test['name']}")
            faulty_llm = ChatSeekrFlow(**test["args"])

            # If no error is raised, fail the test
            print(f"❌ Test '{test['name']}' failed: No error was raised!")
        except Exception as e:
            error_msg = str(e)
            assert test["expected_error"] in error_msg, f"Unexpected error: {error_msg}"
            print(f"✅ Expected Error: {error_msg}")


# Run test
test_initialization_errors()
Running test: Missing Client
✅ Expected Error: SeekrFlow client cannot be None.
Running test: Missing Model Name
✅ Expected Error: A valid model name must be provided.
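The checks above cover construction-time validation; live calls can fail too (network issues, bad requests), so you may also want a guard around invoke(). A generic sketch: no specific seekrai exception types are assumed here, so it catches broadly and retries.

def safe_invoke(llm, text: str, retries: int = 2):
    """Invoke with a simple retry loop. Catches Exception broadly
    because the seekrai error hierarchy is not assumed here."""
    for attempt in range(retries + 1):
        try:
            return llm.invoke([HumanMessage(content=text)])
        except Exception as e:  # narrow this if you know the client's errors
            if attempt == retries:
                raise
            print(f"Attempt {attempt + 1} failed: {e}; retrying...")

# Example: safe_invoke(llm, "Hello, Seekr!")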

API reference

For detailed documentation of all ChatSeekrFlow features and configurations, head to the API reference.