DeepInfra
DeepInfra is a serverless inference-as-a-service that provides access to a variety of large language models (LLMs) and embedding models. This notebook goes over how to use DeepInfra's language models with LangChain.
Set the Environment API Key
Make sure to get your API key from DeepInfra. You have to log in and get a new token.
You are given 1 hour of free serverless GPU compute to test different models. (see here)
You can print your token with deepctl auth token
# get a new token: https://deepinfra.com/login?from=%2Fdash
from getpass import getpass
DEEPINFRA_API_TOKEN = getpass()
import os
os.environ["DEEPINFRA_API_TOKEN"] = DEEPINFRA_API_TOKEN
Create the DeepInfra instance
You can also use our open-source deepctl tool to manage your model deployments. You can view a list of available parameters here.
from langchain_community.llms import DeepInfra
llm = DeepInfra(model_id="meta-llama/Llama-2-70b-chat-hf")
llm.model_kwargs = {
"temperature": 0.7,
"repetition_penalty": 1.2,
"max_new_tokens": 250,
"top_p": 0.9,
}
API Reference: DeepInfra
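For reference, the sampling parameters set above have the following general meanings. The annotations describe standard generation-sampling semantics, not DeepInfra-specific guarantees:

```python
# Annotated copy of the sampling parameters set above
model_kwargs = {
    "temperature": 0.7,         # higher values -> more random sampling
    "repetition_penalty": 1.2,  # values > 1 penalize repeating tokens
    "max_new_tokens": 250,      # upper bound on the number of generated tokens
    "top_p": 0.9,               # nucleus sampling: keep the top 90% probability mass
}
```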
# run inferences directly via wrapper
llm.invoke("Who let the dogs out?")
'This is a question that has puzzled many people'
# run streaming inference
for chunk in llm.stream("Who let the dogs out?"):
    print(chunk)
Will
Smith
.
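The chunks yielded by the stream are plain strings, so joining them reproduces the full completion. A minimal offline sketch, with a hypothetical `fake_stream` generator standing in for `llm.stream(...)` (no network call is made):

```python
from typing import Iterator

def fake_stream() -> Iterator[str]:
    # Stand-in for llm.stream(...): yields the completion piece by piece
    for chunk in ["Will", " Smith", "."]:
        yield chunk

pieces = []
for chunk in fake_stream():
    pieces.append(chunk)

full = "".join(pieces)
print(full)  # the assembled completion: "Will Smith."
```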
Create a Prompt Template
We will create a prompt template for Question and Answer.
from langchain_core.prompts import PromptTemplate
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)
API Reference: PromptTemplate
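Under the hood, filling the template substitutes the `{question}` placeholder, much like `str.format`. A quick sketch of the substitution without LangChain (an approximation of what `prompt.format(question=...)` produces):

```python
template = """Question: {question}

Answer: Let's think step by step."""

# Approximates PromptTemplate's f-string-style substitution
filled = template.format(question="Can penguins reach the North pole?")
print(filled)
```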
Initiate the LLMChain
from langchain.chains import LLMChain
llm_chain = LLMChain(prompt=prompt, llm=llm)
API Reference: LLMChain
Run the LLMChain
Provide a question and run the LLMChain.
question = "Can penguins reach the North pole?"
llm_chain.run(question)
"Penguins are found in Antarctica and the surrounding islands, which are located at the southernmost tip of the planet. The North Pole is located at the northernmost tip of the planet, and it would be a long journey for penguins to get there. In fact, penguins don't have the ability to fly or migrate over such long distances. So, no, penguins cannot reach the North Pole. "
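Conceptually, `LLMChain.run` does two things: it fills the prompt template with the question, then forwards the rendered prompt to the LLM. A sketch of that flow with a hypothetical `stub_llm` standing in for the DeepInfra call (no API access needed):

```python
template = """Question: {question}

Answer: Let's think step by step."""

def stub_llm(prompt: str) -> str:
    # Stand-in for the DeepInfra call; a real chain would hit the API here
    return f"(model completion for {len(prompt)}-char prompt)"

def run_chain(question: str) -> str:
    # Roughly what LLMChain.run does: render the template, then invoke the LLM
    return stub_llm(template.format(question=question))

print(run_chain("Can penguins reach the North pole?"))
```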