LangServe exposes a LangChain chain as an HTTP service, so that other clients can call it.
一、Installing LangServe
pip install langserve
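Note that the bare `langserve` package covers only the client side; the server example below also needs FastAPI, uvicorn, and the Ollama integration. A sketch of a fuller install (the `[all]` extra and the exact package set are assumptions about your setup):

```shell
# "[all]" pulls in both server and client dependencies (fastapi, sse-starlette, ...)
pip install "langserve[all]" uvicorn langchain-ollama
```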
二、Code example
OLLAMA_HOST = "http://xxxxx:11434"
MODEL_NAME = "qwen:32b"

from fastapi import FastAPI
from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langserve import add_routes

llm = OllamaLLM(base_url=OLLAMA_HOST, model=MODEL_NAME)

# System instruction first, then the user message to translate
temp = ChatPromptTemplate.from_messages([
    ("system", "帮我翻译为{language}"),
    ("user", "我是一个{city}人,我爱吃{food}"),
])

chain = temp | llm | StrOutputParser()

app = FastAPI(title="我的LangChain服务", version="V1.0", description="guzp")
add_routes(
    app,
    chain,
    path="/chatdemo",
)

if __name__ == '__main__':
    import uvicorn
    uvicorn.run(app, port=8000)
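Once the server is running, LangServe registers several routes under the given path: /chatdemo/invoke, /chatdemo/batch, /chatdemo/stream, plus an interactive page at /chatdemo/playground. The invoke endpoint expects a JSON body that wraps the chain's input under an "input" key. A minimal sketch of building that body:

```python
import json

# Body for POST http://127.0.0.1:8000/chatdemo/invoke:
# the chain's input dict goes under "input", and its keys must
# match the prompt template variables (city, food, language).
payload = {"input": {"city": "南通", "food": "西瓜", "language": "法语"}}
body = json.dumps(payload, ensure_ascii=False)
print(body)
```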
三、Summary
add_routes takes three arguments: the first is the FastAPI app, the second is the generated chain, and path is the route prefix. With the server started, you can call it in two ways:
1. Submit from Postman: POST to the invoke endpoint under the path set in add_routes (here /chatdemo/invoke). The JSON body must contain an "input" key, whose fields match the template's variables.
2. From Python code:
from langserve import RemoteRunnable

if __name__ == "__main__":
    remoteClient = RemoteRunnable("http://127.0.0.1:8000/chatdemo")
    print(remoteClient.invoke({"city": "南通", "food": "西瓜", "language": "法语"}))
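RemoteRunnable is just a convenience wrapper; any HTTP client can hit the same endpoint. A sketch using only the standard library (the actual request is commented out since it needs the server running; "output" is the key under which LangServe returns the chain's result):

```python
import json
import urllib.request

# Endpoint = the add_routes path plus /invoke.
url = "http://127.0.0.1:8000/chatdemo/invoke"
payload = {"input": {"city": "南通", "food": "西瓜", "language": "法语"}}
req = urllib.request.Request(
    url,
    data=json.dumps(payload, ensure_ascii=False).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# with urllib.request.urlopen(req) as resp:      # requires the server to be up
#     print(json.loads(resp.read())["output"])   # the chain's result
```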