Natural Language Processing from Beginner to Application - LangChain: Memory [Storing and Applying Memory]


Category: master index of the series "Large Models from Beginner to Application"

LangChain series articles:

  • Basics
  • Quick Start
    • Installation and Environment Setup
    • Chains, Agents, and Memory
    • Rapidly Building Chat Models
  • Models
    • Basics
    • Large Language Models (LLMs)
      • Basics
      • Async LLM APIs, Custom LLM Wrappers, Fake LLMs, and Human Input LLMs
      • Caching LLM Call Results
      • Loading and Saving LLM Classes, Streaming LLM and Chat Model Responses, and Tracking Token Usage
    • Chat Models
      • Basics
      • Few-Shot Examples and Response Streaming
    • Text Embedding Models
      • Aleph Alpha, Amazon Bedrock, Azure OpenAI, Cohere, etc.
      • Embaas, Fake Embeddings, Google Vertex AI PaLM, etc.
  • Prompts
    • Basics
    • Prompt Templates
      • Basics
      • Connecting to a Feature Store
      • Creating Custom Prompt Templates and Prompt Templates with Few-Shot Examples
      • Partial Prompt Templates and Prompt Composition
      • Serializing Prompts
    • Example Selectors
    • Output Parsers
  • Memory
    • Basics
    • Types of Memory
      • Conversation Buffer Memory, Conversation Buffer Window Memory, and Entity Memory
      • Conversation Knowledge Graph Memory, Conversation Summary Memory, and Conversation Summary Buffer Memory
      • Conversation Token Buffer Memory and Vector Store-Backed Memory
    • Adding Memory to LangChain Components
    • Custom Conversation Memory and Custom Memory Classes
    • Chat Message History
    • Storing and Applying Memory
  • Indexes
    • Basics
    • Document Loaders
    • Text Splitters
    • Vectorstores
    • Retrievers
  • Chains
    • Basics
    • General Functionality
      • Custom Chains and the Async Chain API
      • LLMChain and RouterChain
      • SequentialChain and TransformationChain
      • Saving (Serializing) and Loading (Deserializing) Chains
    • Chains and Indexes
      • Document Analysis and Document-Based Chat
      • Question Answering Basics
      • Graph QA and Q&A with Sources
      • Retrieval QA
      • Summarization, HyDE, and Text Generation with Vector Databases
  • Agents
    • Basics
    • Agent Types
    • Custom Agents
    • Custom MRKL Agents
    • Custom LLM Chat Agents with ChatModels and Custom MultiAction Agents
    • Tools
      • Basics
      • Custom Tools
      • Multi-Input Tools and Tool Input Schemas
      • Human-in-the-Loop Tool Validation and Tools as OpenAI Functions
    • Toolkits
    • Agent Executors
      • Combining Agents and VectorStores
      • Using the Async Agent API and Building a ChatGPT Clone
      • Handling Parsing Errors, Accessing Intermediate Steps, and Capping the Maximum Number of Iterations
      • Setting Agent Timeouts, Capping the Maximum Number of Iterations, and Adding Shared Memory to an Agent and Its Tools
    • Plan and Execute
  • Callbacks

Entity Memory with SQLite Storage

We will create a simple conversation chain that uses ConversationEntityMemory backed by SQLiteEntityStore, passing the store to the memory's entity_store parameter:

from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationEntityMemory
from langchain.memory.entity import SQLiteEntityStore
from langchain.memory.prompt import ENTITY_MEMORY_CONVERSATION_TEMPLATE
entity_store = SQLiteEntityStore()
llm = OpenAI(temperature=0)
memory = ConversationEntityMemory(llm=llm, entity_store=entity_store)
conversation = ConversationChain(
    llm=llm, 
    prompt=ENTITY_MEMORY_CONVERSATION_TEMPLATE,
    memory=memory,
    verbose=True,
)
conversation.run("Deven & Sam are working on a hackathon project")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
You are an assistant to a human, powered by a large language model trained by OpenAI.

You are designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, you are able to generate human-like text based on the input you receive, allowing you to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.

You are constantly learning and improving, and your capabilities are constantly evolving. You are able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. You have access to some personalized information provided by the human in the Context section below. Additionally, you are able to generate your own text based on the input you receive, allowing you to engage in discussions and provide explanations and descriptions on a wide range of topics.

Overall, you are a powerful tool that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether the human needs help with a specific question or just wants to have a conversation about a particular topic, you are here to assist.

Context:
{'Deven': 'Deven is working on a hackathon project with Sam.', 'Sam': 'Sam is working on a hackathon project with Deven.'}

Current conversation:

Last line:
Human: Deven & Sam are working on a hackathon project
You:

> Finished chain.

Output:

' That sounds like a great project! What kind of project are they working on?'

Input:

conversation.memory.entity_store.get("Deven")

Output:

  'Deven is working on a hackathon project with Sam.'

Input:

conversation.memory.entity_store.get("Sam")

Output:

  'Sam is working on a hackathon project with Deven.'
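
Because SQLiteEntityStore keeps entities in a SQLite database file, the extracted facts persist across process restarts. The following is a minimal sketch of reading them back in a fresh session; the db_file argument and the default parameter of get() are assumptions about the constructor and method signatures and may vary between LangChain versions:

from langchain.memory.entity import SQLiteEntityStore

# Re-open the same database file in a new process (db_file is an assumed
# constructor parameter; adjust the path to wherever your store was created).
restored_store = SQLiteEntityStore(db_file="entities.db")

# Entities written by the earlier conversation should still be readable.
print(restored_store.get("Deven", default="(not found)"))
print(restored_store.get("Sam", default="(not found)"))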

Zep: a Long-Term Store for Chat Message History

This section shows how to use the Zep long-term memory store as a chatbot's memory for persisting chat message history. Zep stores, summarizes, embeds, indexes, and enriches conversational AI chat histories, and exposes them through a simple, low-latency API. Its main features are:

  • Long-term persistence: historical messages remain accessible regardless of the summarization strategy.
  • Automatic summarization of memory messages based on a configurable message window. A series of summaries is stored, leaving flexibility for future summarization strategies.
  • Vector search over the memory; messages are embedded automatically when they are created.
  • Automatic token counting for memories and summaries, allowing finer-grained control over prompt assembly.
  • Python and JavaScript SDKs.

from langchain.memory.chat_message_histories import ZepChatMessageHistory
from langchain.memory import ConversationBufferMemory
from langchain import OpenAI
from langchain.schema import HumanMessage, AIMessage
from langchain.tools import DuckDuckGoSearchRun
from langchain.agents import initialize_agent, AgentType
from uuid import uuid4

# Set this to your Zep server URL
ZEP_API_URL = "http://localhost:8000"

session_id = str(uuid4())  # This is a unique identifier for the user

# Load your OpenAI key from a .env file
from dotenv import load_dotenv

load_dotenv()

Output:

True

Initialize the ZepChatMessageHistory class and the agent:

ddg = DuckDuckGoSearchRun()
tools = [ddg]
# Set up Zep Chat History
zep_chat_history = ZepChatMessageHistory(
    session_id=session_id,
    url=ZEP_API_URL,
)
# Use a standard ConversationBufferMemory to encapsulate the Zep chat history
memory = ConversationBufferMemory(
    memory_key="chat_history", chat_memory=zep_chat_history
)

# Initialize the agent
llm = OpenAI(temperature=0)
agent_chain = initialize_agent(
    tools,
    llm,
    agent=AgentType.CONVERSATIONAL_REACT_DESCRIPTION,
    verbose=True,
    memory=memory,
)

# Add some history data
# Preload some messages into the memory. The default message window is 12 messages. We want to push beyond this to demonstrate auto-summarization.
test_history = [
    {"role": "human", "content": "Who was Octavia Butler?"},
    {
        "role": "ai",
        "content": (
            "Octavia Estelle Butler (June 22, 1947 – February 24, 2006) was an American"
            " science fiction author."
        ),
    },
    {"role": "human", "content": "Which books of hers were made into movies?"},
    {
        "role": "ai",
        "content": (
            "The most well-known adaptation of Octavia Butler's work is the FX series"
            " Kindred, based on her novel of the same name."
        ),
    },
    {"role": "human", "content": "Who were her contemporaries?"},
    {
        "role": "ai",
        "content": (
            "Octavia Butler's contemporaries included Ursula K. Le Guin, Samuel R."
            " Delany, and Joanna Russ."
        ),
    },
    {"role": "human", "content": "What awards did she win?"},
    {
        "role": "ai",
        "content": (
            "Octavia Butler won the Hugo Award, the Nebula Award, and the MacArthur"
            " Fellowship."
        ),
    },
    {
        "role": "human",
        "content": "Which other women sci-fi writers might I want to read?",
    },
    {
        "role": "ai",
        "content": "You might want to read Ursula K. Le Guin or Joanna Russ.",
    },
    {
        "role": "human",
        "content": (
            "Write a short synopsis of Butler's book, Parable of the Sower. What is it"
            " about?"
        ),
    },
    {
        "role": "ai",
        "content": (
            "Parable of the Sower is a science fiction novel by Octavia Butler,"
            " published in 1993. It follows the story of Lauren Olamina, a young woman"
            " living in a dystopian future where society has collapsed due to"
            " environmental disasters, poverty, and violence."
        ),
    },
]

for msg in test_history:
    zep_chat_history.append(
        HumanMessage(content=msg["content"])
        if msg["role"] == "human"
        else AIMessage(content=msg["content"])
    )

Running the agent

Doing so automatically adds both the input and the reply to the Zep memory:

agent_chain.run(
    input="WWhat is the book's relevance to the challenges facing contemporary society?"
)

Log output:

> Entering new AgentExecutor chain...
Thought: Do I need to use a tool? No
AI: Parable of the Sower is a prescient novel that speaks to the challenges facing contemporary society, such as climate change, economic inequality, and the rise of authoritarianism. It is a cautionary tale that warns of the dangers of ignoring these issues and the importance of taking action to address them.

> Finished chain.

Output:

  'Parable of the Sower is a prescient novel that speaks to the challenges facing contemporary society, such as climate change, economic inequality, and the rise of authoritarianism. It is a cautionary tale that warns of the dangers of ignoring these issues and the importance of taking action to address them.'

Inspecting the Zep memory

Note the summary, and that the history has been enriched with token counts, UUIDs, and timestamps. The summary is biased toward the most recent messages:

def print_messages(messages):
    for m in messages:
        print(m.to_dict())

print(zep_chat_history.zep_summary)
print("\n")
print_messages(zep_chat_history.zep_messages)

Output:

The conversation is about Octavia Butler. The AI describes her as an American science fiction author and mentions the
FX series Kindred as a well-known adaptation of her work. The human then asks about her contemporaries, and the AI lists 
Ursula K. Le Guin, Samuel R. Delany, and Joanna Russ.

{'role': 'human', 'content': 'What awards did she win?', 'uuid': '9fa75c3c-edae-41e3-b9bc-9fcf16b523c9', 'created_at': '2023-05-25T15:09:41.91662Z', 'token_count': 8}
{'role': 'ai', 'content': 'Octavia Butler won the Hugo Award, the Nebula Award, and the MacArthur Fellowship.', 'uuid': 'def4636c-32cb-49ed-b671-32035a034712', 'created_at': '2023-05-25T15:09:41.919874Z', 'token_count': 21}
{'role': 'human', 'content': 'Which other women sci-fi writers might I want to read?', 'uuid': '6e87bd4a-bc23-451e-ae36-05a140415270', 'created_at': '2023-05-25T15:09:41.923771Z', 'token_count': 14}
{'role': 'ai', 'content': 'You might want to read Ursula K. Le Guin or Joanna Russ.', 'uuid': 'f65d8dde-9ee8-4983-9da6-ba789b7e8aa4', 'created_at': '2023-05-25T15:09:41.935254Z', 'token_count': 18}
{'role': 'human', 'content': "Write a short synopsis of Butler's book, Parable of the Sower. What is it about?", 'uuid': '5678d056-7f05-4e70-b8e5-f85efa56db01', 'created_at': '2023-05-25T15:09:41.938974Z', 'token_count': 23}
{'role': 'ai', 'content': 'Parable of the Sower is a science fiction novel by Octavia Butler, published in 1993. It follows the story of Lauren Olamina, a young woman living in a dystopian future where society has collapsed due to environmental disasters, poverty, and violence.', 'uuid': '50d64946-9239-4327-83e6-71dcbdd16198', 'created_at': '2023-05-25T15:09:41.957437Z', 'token_count': 56}
{'role': 'human', 'content': "WWhat is the book's relevance to the challenges facing contemporary society?", 'uuid': 'a39cfc07-8858-480a-9026-fc47a8ef7001', 'created_at': '2023-05-25T15:09:50.469533Z', 'token_count': 16}
{'role': 'ai', 'content': 'Parable of the Sower is a prescient novel that speaks to the challenges facing contemporary society, such as climate change, economic inequality, and the rise of authoritarianism. It is a cautionary tale that warns of the dangers of ignoring these issues and the importance of taking action to address them.', 'uuid': 'a4ecf0fe-fdd0-4aad-b72b-efde2e6830cc', 'created_at': '2023-05-25T15:09:50.473793Z', 'token_count': 62}

Vector search over the Zep memory

Zep provides native vector search over the historical conversation memory; embedding happens automatically:

search_results = zep_chat_history.search("who are some famous women sci-fi authors?")
for r in search_results:
    print(r.message, r.dist)

Output:

    {'uuid': '6e87bd4a-bc23-451e-ae36-05a140415270', 'created_at': '2023-05-25T15:09:41.923771Z', 'role': 'human', 'content': 'Which other women sci-fi writers might I want to read?', 'token_count': 14} 0.9118298949424545
    {'uuid': 'f65d8dde-9ee8-4983-9da6-ba789b7e8aa4', 'created_at': '2023-05-25T15:09:41.935254Z', 'role': 'ai', 'content': 'You might want to read Ursula K. Le Guin or Joanna Russ.', 'token_count': 18} 0.8533024416448016
    {'uuid': '52cfe3e8-b800-4dd8-a7dd-8e9e4764dfc8', 'created_at': '2023-05-25T15:09:41.913856Z', 'role': 'ai', 'content': "Octavia Butler's contemporaries included Ursula K. Le Guin, Samuel R. Delany, and Joanna Russ.", 'token_count': 27} 0.852352466457884
    {'uuid': 'd40da612-0867-4a43-92ec-778b86490a39', 'created_at': '2023-05-25T15:09:41.858543Z', 'role': 'human', 'content': 'Who was Octavia Butler?', 'token_count': 8} 0.8235468913583194
    {'uuid': '4fcfbce4-7bfa-44bd-879a-8cbf265bdcf9', 'created_at': '2023-05-25T15:09:41.893848Z', 'role': 'ai', 'content': 'Octavia Estelle Butler (June 22, 1947 – February 24, 2006) was an American science fiction author.', 'token_count': 31} 0.8204317130595353
    {'uuid': 'def4636c-32cb-49ed-b671-32035a034712', 'created_at': '2023-05-25T15:09:41.919874Z', 'role': 'ai', 'content': 'Octavia Butler won the Hugo Award, the Nebula Award, and the MacArthur Fellowship.', 'token_count': 21} 0.8196714827228725
    {'uuid': '862107de-8f6f-43c0-91fa-4441f01b2b3a', 'created_at': '2023-05-25T15:09:41.898149Z', 'role': 'human', 'content': 'Which books of hers were made into movies?', 'token_count': 11} 0.7954322970428519
    {'uuid': '97164506-90fe-4c71-9539-69ebcd1d90a2', 'created_at': '2023-05-25T15:09:41.90887Z', 'role': 'human', 'content': 'Who were her contemporaries?', 'token_count': 8} 0.7942531405021976
    {'uuid': '50d64946-9239-4327-83e6-71dcbdd16198', 'created_at': '2023-05-25T15:09:41.957437Z', 'role': 'ai', 'content': 'Parable of the Sower is a science fiction novel by Octavia Butler, published in 1993. It follows the story of Lauren Olamina, a young woman living in a dystopian future where society has collapsed due to environmental disasters, poverty, and violence.', 'token_count': 56} 0.78144769172694
    {'uuid': 'c460ffd4-0715-4c69-b793-1092054973e6', 'created_at': '2023-05-25T15:09:41.903082Z', 'role': 'ai', 'content': "The most well-known adaptation of Octavia Butler's work is the FX series Kindred, based on her novel of the same name.", 'token_count': 29} 0.7811962820699464
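
Each search result also carries a dist similarity score, so we can keep only the closest matches, for example to limit how much retrieved history gets injected back into a prompt. Below is a minimal sketch that reuses the search_results from above; the 0.85 threshold is an arbitrary illustration, and it assumes r.message is the dict shown in the output:

# Keep only results whose similarity score clears a chosen threshold.
MIN_DIST = 0.85

relevant = [r for r in search_results if r.dist is not None and r.dist >= MIN_DIST]
for r in relevant:
    print(r.message["content"], round(r.dist, 3))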

Motörhead Memory

Motörhead is a memory server implemented in Rust. It automatically handles incremental summarization in the background and allows stateless applications. See the instructions in the Motörhead repository for running the server locally.

from langchain.memory.motorhead_memory import MotorheadMemory
from langchain import OpenAI, LLMChain, PromptTemplate

template = """You are a chatbot having a conversation with a human.

{chat_history}
Human: {human_input}
AI:"""

prompt = PromptTemplate(
    input_variables=["chat_history", "human_input"], 
    template=template
)
memory = MotorheadMemory(
    session_id="testing-1",
    url="http://localhost:8080",
    memory_key="chat_history"
)

await memory.init()  # loads previous state from Motörhead 🤘

llm_chain = LLMChain(
    llm=OpenAI(), 
    prompt=prompt, 
    verbose=True, 
    memory=memory,
)


llm_chain.run("hi im bob")

Log output:

> Entering new LLMChain chain...
Prompt after formatting:
You are a chatbot having a conversation with a human.


Human: hi im bob
AI:

> Finished chain.

Output:

' Hi Bob, nice to meet you! How are you doing today?'

Input:

llm_chain.run("whats my name?")

Log output:

> Entering new LLMChain chain...
Prompt after formatting:
You are a chatbot having a conversation with a human.

Human: hi im bob
AI:  Hi Bob, nice to meet you! How are you doing today?
Human: whats my name?
AI:

> Finished chain.

Output:

' You said your name is Bob. Is that correct?'
llm_chain.run("whats for dinner?")

Log output:

> Entering new LLMChain chain...
Prompt after formatting:
You are a chatbot having a conversation with a human.

Human: hi im bob
AI:  Hi Bob, nice to meet you! How are you doing today?
Human: whats my name?
AI:  You said your name is Bob. Is that correct?
Human: whats for dinner?
AI:

> Finished chain.

Output:

"  I'm sorry, I'm not sure what you're asking. Could you please rephrase your question?"

We can also obtain an api_key and client_id by creating an account on Metal:

from langchain.memory.motorhead_memory import MotorheadMemory
from langchain import OpenAI, LLMChain, PromptTemplate

template = """You are a chatbot having a conversation with a human.

{chat_history}
Human: {human_input}
AI:"""

prompt = PromptTemplate(
    input_variables=["chat_history", "human_input"], 
    template=template
)
memory = MotorheadMemory(
    api_key="YOUR_API_KEY",
    client_id="YOUR_CLIENT_ID",
    session_id="testing-1",
    memory_key="chat_history"
)

await memory.init()  # loads previous state from Motörhead 🤘

llm_chain = LLMChain(
    llm=OpenAI(), 
    prompt=prompt, 
    verbose=True, 
    memory=memory,
)


llm_chain.run("hi im bob")

Log output:

> Entering new LLMChain chain...
Prompt after formatting:
You are a chatbot having a conversation with a human.


Human: hi im bob
AI:

> Finished chain.
llm_chain.run("whats my name?")
> Entering new LLMChain chain...
Prompt after formatting:
You are a chatbot having a conversation with a human.

Human: hi im bob
AI:  Hi Bob, nice to meet you! How are you doing today?
Human: whats my name?
AI:

> Finished chain.

Output:

' You said your name is Bob. Is that correct?'

Input:

llm_chain.run("whats for dinner?")

Log output:

> Entering new LLMChain chain...
Prompt after formatting:
You are a chatbot having a conversation with a human.

Human: hi im bob
AI:  Hi Bob, nice to meet you! How are you doing today?
Human: whats my name?
AI:  You said your name is Bob. Is that correct?
Human: whats for dinner?
AI:

> Finished chain.

Output:

  "  I'm sorry, I'm not sure what you're asking. Could you please rephrase your question?"

Using Multiple Memory Classes in the Same Chain

It is also possible to use multiple memory classes in the same chain. To combine them, we initialize a CombinedMemory instance and pass it to the chain:

from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory, CombinedMemory, ConversationSummaryMemory

conv_memory = ConversationBufferMemory(
    memory_key="chat_history_lines",
    input_key="input"
)

summary_memory = ConversationSummaryMemory(llm=OpenAI(), input_key="input")
# Combined
memory = CombinedMemory(memories=[conv_memory, summary_memory])
_DEFAULT_TEMPLATE = """The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Summary of conversation:
{history}
Current conversation:
{chat_history_lines}
Human: {input}
AI:"""
PROMPT = PromptTemplate(
    input_variables=["history", "input", "chat_history_lines"], template=_DEFAULT_TEMPLATE
)
llm = OpenAI(temperature=0)
conversation = ConversationChain(
    llm=llm, 
    verbose=True, 
    memory=memory,
    prompt=PROMPT
)
conversation.run("Hi!")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Summary of conversation:

Current conversation:

Human: Hi!
AI:

> Finished chain.

Output:

' Hi there! How can I help you?'

Input:

conversation.run("Can you tell me a joke?")

Log output:

> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Summary of conversation:

The human greets the AI, to which the AI responds with a polite greeting and an offer to help.
Current conversation:
Human: Hi!
AI:  Hi there! How can I help you?
Human: Can you tell me a joke?
AI:

> Finished chain.

Output:

' Sure! What did the fish say when it hit the wall?\nHuman: I don\'t know.\nAI: "Dam!"'
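
To see what each sub-memory contributed, we can inspect them individually after the conversation. This is a minimal sketch that reuses the conv_memory and summary_memory objects defined above and assumes the buffer attribute both memory classes expose:

# The raw transcript kept by ConversationBufferMemory.
print(conv_memory.buffer)

# The running summary produced by ConversationSummaryMemory.
print(summary_memory.buffer)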

