Some scenarios require ChatGPT to apply the same instruction to every request. You can either prepend the instruction to each question, or replace LangChain's default Prompt with your own.
Let's look at a before-and-after comparison.
LangChain's default Prompt:
template="Use the following pieces of context to answer the users question. \nIf you don't know the answer, just say that you don't know, don't try to make up an answer.\n----------------\n{context}"
After replacing it with my own Prompt:
template="Use the following pieces of context to answer the users question in Chinese.\n If you don't know the answer, just say that you don't know, don't try to make up an answer.\n There are multiple answers, provide each answer in Chinese, specify the source file for each answer.\n \n\n{context}"
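The substitution a PromptTemplate performs can be sanity-checked before wiring anything into a chain. A dependency-free sketch (the filler values below are made up for illustration): a template with `input_variables=["context", "question"]` behaves like named string substitution on the template text.

```python
# Dependency-free sketch of what PromptTemplate does with the template above:
# named substitution of {context} and {question} into the template text.
prompt_template = (
    "Use the following pieces of context to answer the users question in Chinese.\n"
    "If you don't know the answer, just say that you don't know, "
    "don't try to make up an answer.\n\n"
    "{context}\n\nQuestion: {question}\n\nAnswer in Chinese:"
)

# Hypothetical filler values, standing in for retrieved chunks and a user query.
rendered = prompt_template.format(
    context="(retrieved document chunks go here)",
    question="What is LangChain?",
)
print(rendered)
```

The rendered string is exactly what the LLM receives, which is why the "answer in Chinese" and "specify the source file" instructions take effect on every query.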
The source code is as follows:
from langchain.prompts import PromptTemplate  # needed for the custom template below

def getMyPrompt():
    '''Build the custom Prompt template.'''
    prompt_template = """Use the following pieces of context to answer the users question in Chinese.
If you don't know the answer, just say that you don't know, don't try to make up an answer.
There are multiple answers, provide each answer in Chinese, specify the source file for each answer.
\n\n{context}\n\nQuestion: {question}\n\nAnswer in Chinese:"""
    MyPrompt = PromptTemplate(
        template=prompt_template,
        input_variables=["context", "question"]  # the template must declare both context and question
    )
    return MyPrompt
db_RTCS = Chroma(persist_directory="./RCTS/", embedding_function=embeddings)
print('----------------')
chain_type_kwargs = {"prompt": getMyPrompt()}  # replace LangChain's default Prompt with our own
qa_RTCS = RetrievalQA.from_chain_type(llm=openAiLLm, chain_type="stuff",
                                      retriever=db_RTCS.as_retriever(),
                                      chain_type_kwargs=chain_type_kwargs)
print(qa_RTCS)  # inspect the chain structure to confirm the custom Prompt took effect
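Once the chain is built, it can be queried like any other RetrievalQA chain. A minimal usage sketch, under the assumptions of the setup above (the `openAiLLm` LLM, the `embeddings` object, and the Chroma store persisted under `./RCTS/` all come from earlier code; the question text is a made-up example, and running it requires a valid OpenAI API key):

```python
# Assumes qa_RTCS was constructed as in the snippet above.
# Requires an OpenAI API key and the persisted Chroma store under ./RCTS/.
question = "What document loaders does LangChain support?"  # hypothetical example question
result = qa_RTCS.run(question)
print(result)  # per the custom prompt, the answer should come back in Chinese with source files
```

Because the custom template was injected via `chain_type_kwargs`, every query through this chain carries the extra instructions without the caller having to repeat them in the question.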
This concludes the article "ChatGPT | Replacing LangChain's default Prompt with your own".