What causes this Attribute Error encountered when implementing LangChain's OpenAI LLM wrapper?

Question

This is my first post here. I'm building a Python window application with PyQt5 that implements interactions with the OpenAI completions endpoint. So far, any code that I've written myself has performed fine, and I was reaching the point where I wanted to start implementing long-term memory for conversational interactions. I started by just running my own chain of prompts for categorizing and writing topical subjects and summaries to text files, but I decided it best to try exploring open source options to see how the programming community is managing things. This led me to LangChain, which seems to have some popular support behind it and already implements many of the features I intend to use.

However, I have not had even the tiniest bit of success with it yet. Even the simplest examples don't run; regardless of the context I'm implementing them in (within a class, outside a class, in an asynchronous loop, to the console, to my text browsers within the main window, whatever), I always get the same error message.

The simplest possible example:

import os
from langchain.llms import OpenAI
from local import constants  # For API key
os.environ["OPENAI_API_KEY"] = constants.OPENAI_API_KEY
davinci = OpenAI(model_name='text-davinci-003', verbose=True, temperature=0.6)
text = "Write me a story about a guy who is frustrated with Python."
print("Prompt: " + text)
print(davinci(text))

It capably instantiates the wrapper and prints the prompt to the console, but as soon as a command is sent through the wrapper's functions to receive generated text, it encounters this AttributeError.

Here is the traceback:

Traceback (most recent call last):
  File "D:\Dropbox\Pycharm Projects\workspace\main.py", line 16, in <module>
    print(davinci(text))
  File "D:\Dropbox\Pycharm Projects\workspace\venv\lib\site-packages\langchain\llms\base.py", line 255, in __call__
    return self.generate([prompt], stop=stop).generations[0][0].text
  File "D:\Dropbox\Pycharm Projects\workspace\venv\lib\site-packages\langchain\llms\base.py", line 128, in generate
    raise e
  File "D:\Dropbox\Pycharm Projects\workspace\venv\lib\site-packages\langchain\llms\base.py", line 125, in generate
    output = self._generate(prompts, stop=stop)
  File "D:\Dropbox\Pycharm Projects\workspace\venv\lib\site-packages\langchain\llms\openai.py", line 259, in _generate
    response = self.completion_with_retry(prompt=_prompts, **params)
  File "D:\Dropbox\Pycharm Projects\workspace\venv\lib\site-packages\langchain\llms\openai.py", line 200, in completion_with_retry
    retry_decorator = self._create_retry_decorator()
  File "D:\Dropbox\Pycharm Projects\workspace\venv\lib\site-packages\langchain\llms\openai.py", line 189, in _create_retry_decorator
    retry_if_exception_type(openai.error.Timeout)
AttributeError: module 'openai.error' has no attribute 'Timeout'
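
For what it's worth, the failing line is just an attribute lookup on the openai.error module, so it can be checked outside LangChain entirely. A quick diagnostic sketch, run in the same venv (importlib.metadata is only used here to read the installed package versions; it isn't part of either library):

import importlib.metadata
import openai.error

# Report the versions actually installed in this environment.
print("openai version:   ", importlib.metadata.version("openai"))
print("langchain version:", importlib.metadata.version("langchain"))

# Check for the exact attribute that LangChain's retry decorator looks up.
print("openai.error has Timeout:", hasattr(openai.error, "Timeout"))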

I don't expect that there is a fault in the LangChain library itself, because it seems like nobody else has experienced this problem. I imagine I may have some kind of dependency issue? I also notice that others using the LangChain library are doing so in a notebook development environment, and my lack of familiarity in that regard may be making me overlook some fundamental expectation of how the library is meant to be used.
Any advice is welcome! Thanks!

What I tried: I initially just replaced my own function for managing calls to the completions endpoint with one that issued the calls through LangChain's LLM wrapper. I expected it to work as easily as my own code had, but I received that error. I then stripped everything apart layer by layer, attempting to instantiate the wrapper at every scope of the program, and then tried making the calls in an asynchronous function through a loop that waited for completion, but no matter what, I always got that same error message.

Answer 1

Score: 0

I think it might be something about your currently installed versions of Python, OpenAI, and/or LangChain. Maybe try using a newer version of Python and of the OpenAI package. I'm new to Python and these things, but hopefully this helps.
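
As a rough way to confirm whether an upgrade actually resolves the mismatch, here is a small sketch based on the example from the question (not a verified fix; the OPENAI_API_KEY environment variable still has to be set before running it):

import importlib.metadata
from langchain.llms import OpenAI

# Show which versions the venv resolves after upgrading openai/langchain.
print("openai:", importlib.metadata.version("openai"))
print("langchain:", importlib.metadata.version("langchain"))

davinci = OpenAI(model_name='text-davinci-003', temperature=0.6)
try:
    print(davinci("Write one sentence about Python."))
except AttributeError as err:
    # If this still fires, the environment is still resolving an openai
    # install that predates the attributes LangChain's retry logic expects.
    print("Still mismatched:", err)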
