
LLM RAG with Agent

2024/10/24 4:46:30  Source: https://blog.csdn.net/suiusoar/article/details/140015007

Topic: LLM retrieval-augmented generation (RAG) with an agent

Background:

I was trying the application code in the link.

I am using the following LangChain versions:

langchain            0.0.327
langchain-community  0.0.2
langchain-core       0.1.0

I am getting the following error:

> Entering new AgentExecutor chain...
Traceback (most recent call last):
  File "RAGWithAgent.py", line 54, in <module>
    result = agent_executor({"input": "hi, im bob"})
  File "\lib\site-packages\langchain\chains\base.py", line 310, in __call__
    raise e
  File "\lib\site-packages\langchain\chains\base.py", line 304, in __call__
    self._call(inputs, run_manager=run_manager)
  File "\lib\site-packages\langchain\agents\agent.py", line 1146, in _call
    next_step_output = self._take_next_step(
  File "\lib\site-packages\langchain\agents\agent.py", line 933, in _take_next_step
    output = self.agent.plan(
  File "\lib\site-packages\langchain\agents\openai_functions_agent\base.py", line 104, in plan
    predicted_message = self.llm.predict_messages(
  File "\lib\site-packages\langchain\chat_models\base.py", line 650, in predict_messages
    return self(messages, stop=_stop, **kwargs)
  File "\lib\site-packages\langchain\chat_models\base.py", line 600, in __call__
    generation = self.generate(
  File "\lib\site-packages\langchain\chat_models\base.py", line 349, in generate
    raise e
  File "\lib\site-packages\langchain\chat_models\base.py", line 339, in generate
    self._generate_with_cache(
  File "\lib\site-packages\langchain\chat_models\base.py", line 492, in _generate_with_cache
    return self._generate(
  File "\lib\site-packages\langchain\chat_models\openai.py", line 357, in _generate
    return _generate_from_stream(stream_iter)
  File "\lib\site-packages\langchain\chat_models\base.py", line 57, in _generate_from_stream
    for chunk in stream:
  File "\lib\site-packages\langchain\chat_models\openai.py", line 326, in _stream
    for chunk in self.completion_with_retry(
  File "\lib\site-packages\langchain\chat_models\openai.py", line 299, in completion_with_retry
    return _completion_with_retry(**kwargs)
  File "\lib\site-packages\tenacity\__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "\lib\site-packages\tenacity\__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "\lib\site-packages\tenacity\__init__.py", line 314, in iter
    return fut.result()
  File "D:\Program Files\Python38\lib\concurrent\futures\_base.py", line 432, in result
    return self.__get_result()
  File "D:\Program Files\Python38\lib\concurrent\futures\_base.py", line 388, in __get_result
    raise self._exception
  File "\lib\site-packages\tenacity\__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "\lib\site-packages\langchain\chat_models\openai.py", line 297, in _completion_with_retry
    return self.client.create(**kwargs)
  File "\lib\site-packages\openai\api_resources\chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "\lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "\lib\site-packages\openai\api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "\lib\site-packages\openai\api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "\lib\site-packages\openai\api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: Unrecognized request argument supplied: functions

Process finished with exit code 1
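A likely reading of this error: the OpenAI-functions agent sends a `functions` argument with every chat request, and Azure OpenAI only accepts that argument on API versions that support function calling, while the question's code pins `openai_api_version='2023-03-15-preview'`, which predates it. The exact minimum version here is an assumption on my part (roughly `2023-07-01-preview` or later). A minimal sketch of the changed constructor arguments, reusing the question's own deployment and model names:

```python
# Sketch: bump the Azure API version so the "functions" request argument
# is accepted. The minimum version is an assumption, not from the question.
azure_llm_kwargs = {
    "deployment_name": "gtp35turbo",
    "model_name": "gpt-35-turbo",
    "openai_api_version": "2023-07-01-preview",  # assumed function-calling-capable version
    "openai_api_type": "azure",
    "streaming": True,
}
# llm = AzureChatOpenAI(openai_api_key=AZURE_OPENAI_API_KEY,
#                       openai_api_base='https://azureft.openai.azure.com/',
#                       **azure_llm_kwargs)
```

If the deployed model itself does not support function calling, changing the API version alone would not be enough; a non-function-calling agent type (as in the accepted workaround below the question) avoids the argument entirely.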

I used an Azure LLM instead of OpenAI. FAISS was not working for me, so I used the Chroma vector store.

Following is my code:

from langchain.text_splitter import CharacterTextSplitter
from langchain.document_loaders import TextLoader
from langchain.agents.agent_toolkits import create_retriever_tool
from langchain.agents.agent_toolkits import create_conversational_retrieval_agent
from langchain.chat_models import AzureChatOpenAI
from langchain.vectorstores import Chroma
from langchain.embeddings.sentence_transformer import SentenceTransformerEmbeddings
import os

AZURE_OPENAI_API_KEY = ""
os.environ["OPENAI_API_KEY"] = AZURE_OPENAI_API_KEY

loader = TextLoader(r"Toward a Knowledge Graph of Cybersecurity Countermeasures.txt")
documents = loader.load()
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
chunks = text_splitter.split_documents(documents)

# create the open-source embedding function
embedding_function = SentenceTransformerEmbeddings(model_name="all-mpnet-base-v2")
current_directory = os.path.dirname("__file__")

# load it into Chroma and save it to disk
db = Chroma.from_documents(
    chunks,
    embedding_function,
    collection_name="groups_collection",
    persist_directory=r"\rag_with_agent_chroma_db",
)
retriever = db.as_retriever(search_kwargs={"k": 5})

tool = create_retriever_tool(
    retriever,
    "search_state_of_union",
    "Searches and returns documents regarding the state-of-the-union.",
)
tools = [tool]

llm = AzureChatOpenAI(
    deployment_name='gtp35turbo',
    model_name='gpt-35-turbo',
    openai_api_key=AZURE_OPENAI_API_KEY,
    openai_api_version='2023-03-15-preview',
    openai_api_base='https://azureft.openai.azure.com/',
    openai_api_type='azure',
    streaming=True,
    verbose=True,
)

agent_executor = create_conversational_retrieval_agent(
    llm,
    tools,
    verbose=True,
    remember_intermediate_steps=True,
    memory_key="chat_history",
)
result = agent_executor({"input": "hi, im bob"})
print(result["output"])
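The pipeline above splits the loaded document with `CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)` before embedding. To make that step concrete, fixed-size chunking with overlap can be sketched in plain Python. This is a simplified stand-in, not the library's actual algorithm (which splits on separators rather than at exact character offsets):

```python
def split_text(text, chunk_size=1000, chunk_overlap=0):
    """Greedy fixed-size splitter: emit windows of chunk_size characters,
    stepping forward chunk_size - chunk_overlap characters each time so
    consecutive chunks share chunk_overlap characters."""
    step = max(chunk_size - chunk_overlap, 1)
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = split_text("a" * 2500, chunk_size=1000, chunk_overlap=0)
# -> 3 chunks of 1000, 1000, and 500 characters
```

With a nonzero overlap (e.g. `chunk_overlap=200`), each chunk repeats the tail of the previous one, which helps the retriever when a relevant sentence straddles a chunk boundary.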

Solution:

I tried multiple approaches, including the answers mentioned above, but none worked. I even tried downgrading and upgrading the LLM version.

Finally, the following code to initialize the agent worked for me with the current version of the LLM:

conversational_agent = initialize_agent(
    agent=AgentType.CHAT_ZERO_SHOT_REACT_DESCRIPTION,
    tools=tools,
    llm=llm,
    max_iterations=10,
    handle_parsing_errors=True,
    early_stopping_method="generate",
    memory=memory,
    verbose=True,
)
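This workaround plausibly sidesteps the original error because a ReAct-style agent (`CHAT_ZERO_SHOT_REACT_DESCRIPTION`) drives tools through plain prompting, so no `functions` request argument is sent to the older Azure API. The snippet also references names it never defines: `initialize_agent` and `AgentType` come from `langchain.agents`, and `memory` must be created separately (a `ConversationBufferMemory` is my assumption, not stated in the answer). A sketch of the missing pieces, with the agent options collected as keyword arguments:

```python
# Agent options from the answer, annotated. The surrounding imports and
# memory object are my assumptions about what the answer omitted.
agent_kwargs = {
    "max_iterations": 10,                  # cap tool-use loops
    "handle_parsing_errors": True,         # recover from malformed ReAct output
    "early_stopping_method": "generate",   # produce a final answer when stopped early
    "verbose": True,
}
# from langchain.agents import initialize_agent, AgentType
# from langchain.memory import ConversationBufferMemory
# memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
# conversational_agent = initialize_agent(
#     agent=AgentType.CHAT_ZERO_SHOT_REACT_DESCRIPTION,
#     tools=tools, llm=llm, memory=memory, **agent_kwargs)
```

The trade-off is that a prompt-driven ReAct agent is typically less reliable at tool selection than a function-calling agent, which is why `handle_parsing_errors=True` matters here.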
