Comparing the response latency of LCEL and LLMChain in LLM application development
Pipe composition (LCEL)
import time

from langchain_community.chat_models import ChatOpenAI
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate

t1 = time.time()
llm = ChatOpenAI(
)
# Load the prompt from a file; the result is a PromptTemplate.
resp_prompt_path = 'response_prompt.md'
prompt = PromptTemplate.from_file(resp_prompt_path, encoding='utf-8')
# Pre-fill the template variables with partial().
prompt = prompt.partial(
    query="现在客运量是多少?",
    result="### rt_schema:['客运量'], rt_result:[888461]",
    reply_nodata="昨日数据未完成结算, 请12点以后查看。",
    today="2024-11-27")
# LCEL: compose prompt -> model -> parser with the pipe operator.
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"query": "现在客运量是多少?",
                    "result": "### rt_schema:['客运量'], rt_result:[888461]",
                    "today": "2024-11-27"}))
print(time.time() - t1)
Note that the prompt here must be a PromptTemplate.
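If the template lives in code rather than a file, PromptTemplate.from_template() produces the same type and pipes identically. A minimal sketch; the template text below is a made-up stand-in for the contents of response_prompt.md:

from langchain_core.prompts import PromptTemplate

# Hypothetical inline template standing in for response_prompt.md.
inline_prompt = PromptTemplate.from_template(
    "今天是{today}。用户问题: {query}\n查询数据: {result}"
)
# from_template() also returns a PromptTemplate, so the pipe composition is unchanged:
# chain = inline_prompt | llm | StrOutputParser()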
LLMChain
import time

from langchain.chains.llm import LLMChain
from langchain.memory import ConversationBufferMemory
from langchain_community.chat_models import ChatOpenAI
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate, HumanMessagePromptTemplate, ChatPromptTemplate, \
    SystemMessagePromptTemplate, MessagesPlaceholder

# Load and pre-fill the same PromptTemplate as in the LCEL version.
resp_prompt_path = 'response_prompt.md'
prompt = PromptTemplate.from_file(resp_prompt_path, encoding='utf-8')
prompt = prompt.partial(
    query="现在客运量是多少?",
    result="### rt_schema:['客运量'], rt_result:[888461]",
    reply_nodata="昨日数据未完成结算, 请12点以后查看。",
    today="2024-11-27")
# from_template() below needs a str, so render the template to a string first.
prompt_ = prompt.format()
t2 = time.time()
llm = ChatOpenAI(
)
# LLMChain expects a ChatPromptTemplate, so wrap the rendered string in one.
prompt = ChatPromptTemplate(
    messages=[
        SystemMessagePromptTemplate.from_template(
            "You are a nice chatbot having a conversation with a human."
        ),
        # Conversation history is injected here by the memory object below.
        MessagesPlaceholder(variable_name="chat_history"),
        HumanMessagePromptTemplate.from_template(prompt_)
    ]
)
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
conversation = LLMChain(
    llm=llm,
    prompt=prompt,
    verbose=True,
    memory=memory
)
# The chain takes a single input key, so multiple parameters are packed into one string.
resp = conversation.invoke({"data": str({"query": "现在客运量是多少?",
                                         "result": "### rt_schema:['客运量'], rt_result:[888461]",
                                         "today": "2024-11-27"})})
# Instantiate the parser instead of calling the unbound method with a dummy self.
resp_str = StrOutputParser().parse(text=resp.get('text'))
print(resp_str)
print(time.time() - t2)
Here, HumanMessagePromptTemplate.from_template() expects a str, so the PromptTemplate has to be rendered to a string first via prompt.format(), while LLMChain itself requires its prompt to be a ChatPromptTemplate. In addition, the chain only accepts a single input parameter; when there are multiple parameters, they have to be packed into one value (above, via str() on a dict).
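A slightly more robust way to do that packing is json.dumps, which emits valid JSON rather than a Python repr with single quotes. A minimal sketch, assuming conversation is the LLMChain built above; ensure_ascii=False keeps the Chinese text readable:

import json

params = {
    "query": "现在客运量是多少?",
    "result": "### rt_schema:['客运量'], rt_result:[888461]",
    "today": "2024-11-27",
}
# Pack all parameters into the single "data" input key as a JSON string.
resp = conversation.invoke({"data": json.dumps(params, ensure_ascii=False)})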
In our current LLM application work we ran into a response-latency problem. We had hoped to speed things up by dismantling the pipeline, but in the end the improvement was not noticeable, so we are keeping LangChain's LCEL style.
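Single invoke() timings like the ones above are noisy, since network latency to the model dominates, so it helps to average over several runs before drawing conclusions. A minimal sketch, assuming chain and conversation are the objects built above and that the input dicts match the earlier examples:

import time

def avg_latency(fn, n=5):
    """Call fn() n times and return the mean wall-clock latency in seconds."""
    total = 0.0
    for _ in range(n):
        start = time.perf_counter()
        fn()
        total += time.perf_counter() - start
    return total / n

inputs = {"query": "现在客运量是多少?",
          "result": "### rt_schema:['客运量'], rt_result:[888461]",
          "today": "2024-11-27"}
print("LCEL:", avg_latency(lambda: chain.invoke(inputs)))
print("LLMChain:", avg_latency(lambda: conversation.invoke({"data": str(inputs)})))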