tenacity.RetryError: RetryError[&lt;Future at 0x7f89bc35eb90 state=finished raised AuthenticationError&gt;]

# Question

I am trying to deploy an app made with streamlit (also using streamlit_chat and streamlit_authenticator). The app uses llama-index to create a query engine that incorporates the ChatGPT API. When I run `streamlit run app.py` on my computer, everything works fine, but when I deploy it, the following error is raised:

```
2023-06-07 16:45:28.682 Uncaught app exception
Traceback (most recent call last):
  File "/home/appuser/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/openai.py", line 106, in get_embedding
    return openai.Embedding.create(input=[text], engine=engine)["data"][0]["embedding"]
  File "/home/appuser/venv/lib/python3.10/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "/home/appuser/venv/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 149, in create
    ) = cls.__prepare_create_request(
  File "/home/appuser/venv/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 106, in __prepare_create_request
    requestor = api_requestor.APIRequestor(
  File "/home/appuser/venv/lib/python3.10/site-packages/openai/api_requestor.py", line 138, in __init__
    self.api_key = key or util.default_api_key()
  File "/home/appuser/venv/lib/python3.10/site-packages/openai/util.py", line 186, in default_api_key
    raise openai.error.AuthenticationError(
openai.error.AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = <API-KEY>', or you can set the environment variable OPENAI_API_KEY=<API-KEY>). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = <PATH>'. You can generate API keys in the OpenAI web interface. See https://platform.openai.com/account/api-keys for details.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 561, in _run_script
    self._session_state.on_script_will_rerun(rerun_data.widget_states)
  File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/state/safe_session_state.py", line 68, in on_script_will_rerun
    self._state.on_script_will_rerun(latest_widget_states)
  File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 482, in on_script_will_rerun
    self._call_callbacks()
  File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 495, in _call_callbacks
    self._new_widget_state.call_callback(wid)
  File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 247, in call_callback
    callback(*args, **kwargs)
  File "/app/bajoquetumgpt/docsv2.py", line 35, in generate_answer
    response = query_engine.query(user_msg)
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/indices/query/base.py", line 20, in query
    return self._query(str_or_query_bundle)
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/query_engine/retriever_query_engine.py", line 139, in _query
    nodes = self._retriever.retrieve(query_bundle)
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/indices/base_retriever.py", line 21, in retrieve
    return self._retrieve(str_or_query_bundle)
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/token_counter/token_counter.py", line 78, in wrapped_llm_predict
    f_return_val = f(_self, *args, **kwargs)
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/indices/vector_store/retrievers.py", line 62, in _retrieve
    self._service_context.embed_model.get_agg_embedding_from_queries(
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/base.py", line 83, in get_agg_embedding_from_queries
    query_embeddings = [self.get_query_embedding(query) for query in queries]
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/base.py", line 83, in <listcomp>
    query_embeddings = [self.get_query_embedding(query) for query in queries]
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/base.py", line 72, in get_query_embedding
    query_embedding = self._get_query_embedding(query)
  File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/openai.py", line 223, in _get_query_embedding
    return get_embedding(query, engine=engine)
  File "/home/appuser/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/home/appuser/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/home/appuser/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 326, in iter
    raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7f89bc35eb90 state=finished raised AuthenticationError>]
```

The code is private, but I can show the part where the authenticator and the query engine are used:

```python
import yaml
from yaml.loader import SafeLoader

with open('./config.yaml') as file:
    config = yaml.load(file, Loader=SafeLoader)

authenticator = stauth.Authenticate(
    config['credentials'],
    config['cookie']['name'],
    config['cookie']['key'],
    config['cookie']['expiry_days'],
    config['preauthorized']
)
name, authentication_status, username = authenticator.login('Login', 'main')
print(username, name, authentication_status)
if authentication_status:
    authenticator.logout('Logout', 'main')
    st.write(f'Welcome *{name}*')
elif authentication_status == False:
    st.error('Username/password is incorrect')
elif authentication_status == None:
    st.warning('Please enter your username and password')

FIRST_OUTPUT = """Hello!"""
if authentication_status:
    API_KEY = config['credentials']['usernames'][username].get('openaiapi', "")
    st.text("""First text""")
    text_input_container = st.empty()
    if API_KEY == "":
        API_KEY = text_input_container.text_input(label='Introduce your OpenAI API Key:', label_visibility='hidden', placeholder='Introduce your OpenAI API Key:')
    if API_KEY != "":
        text_input_container.empty()
        os.environ['OPENAI_API_KEY'] = API_KEY
```

And it continues as:

```python
if API_KEY != '':
    # Load the index from your saved index.json file
    storage_context = StorageContext.from_defaults(persist_dir='./storage')
    # load index
    index = load_index_from_storage(storage_context)
    query_engine = index.as_query_engine()
    if "history" not in st.session_state:
        st.session_state.history = initial_history
    c = st.expander("Open to see the previous messages!")
    for i, chat in enumerate(st.session_state.history):
        if i < len(st.session_state.history) - 6:
            with c:
                message(**chat, key=str(i))  # unpacking
        else:
            message(**chat, key=str(i))  # unpacking
    st.text_input("You: ", "", key="input_text", on_change=generate_answer)
```

The function generate_answer is:

```python
def generate_answer():
    user_msg = st.session_state.input_text
    st.session_state.input_text = ""
    response = query_engine.query(user_msg)
    st.session_state.history.append(
        {"message": user_msg, "is_user": True,
         "avatar_style": "fun-emoji",
         "seed": 4}
    )
    st.session_state.history.append(
        {"message": str(response).strip(), "is_user": False,
         "avatar_style": "bottts-neutral",
         "seed": 36}
    )
```

I would love to receive any help with it.
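(Editor's note: `tenacity.RetryError` only reports `RetryError[<Future ... raised AuthenticationError>]`; the real exception is stored on the failed attempt and can be pulled out via its `last_attempt` attribute, whose object follows the `concurrent.futures.Future` API. A minimal stdlib-only sketch of that unwrapping, using a simplified stand-in class for `tenacity.RetryError` and a `PermissionError` standing in for the `AuthenticationError` above:)

```python
from concurrent.futures import Future

class RetryError(Exception):
    """Simplified stand-in for tenacity.RetryError: the real class also
    keeps the failed attempt on a `last_attempt` attribute."""
    def __init__(self, last_attempt):
        super().__init__(last_attempt)
        self.last_attempt = last_attempt

# Simulate an attempt that finished by raising.
attempt = Future()
attempt.set_exception(PermissionError("No API key provided."))

err = RetryError(attempt)
root_cause = err.last_attempt.exception()  # returns the exception without re-raising
print(type(root_cause).__name__)  # → PermissionError
print(root_cause)                 # → No API key provided.
```

With the real library, `err.last_attempt.exception()` on the caught `RetryError` would have surfaced the `AuthenticationError` directly instead of the opaque future repr.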
# Answer 1

**Score**: 9
You are probably setting the API key like this:

```python
os.environ["OPENAI_API_KEY"] = 'YOUR_KEY'
```

The problem is that this only fills the variable in your local environment. To hand it to the OpenAI client you also have to do something like this:

```python
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]
```

Here is a piece of my code so you can see it in context:

```python
from llama_index import SimpleDirectoryReader, GPTVectorStoreIndex, LLMPredictor, ServiceContext, StorageContext, load_index_from_storage
from langchain import OpenAI
import gradio as gr
import os
import openai

# Here I fill my LOCAL environment variable
os.environ["OPENAI_API_KEY"] = 'YOUR_KEY'

def construct_index(directory_path):
    # And here I hand the key to the openai module
    openai.api_key = os.environ["OPENAI_API_KEY"]
    num_outputs = 512
    llm_predictor = LLMPredictor(llm=OpenAI(temperature=0.7, model_name="text-davinci-003", max_tokens=num_outputs))
    service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)
    docs = SimpleDirectoryReader(directory_path).load_data()
    index = GPTVectorStoreIndex.from_documents(docs, service_context=service_context)
    index.storage_context.persist()
    return index
```

That worked for me.
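(Editor's note: the likely reason setting `os.environ` alone is not enough is timing: the pre-1.0 `openai` client reads `OPENAI_API_KEY` once, at import time, so a value exported afterwards is never seen unless `openai.api_key` is assigned explicitly. A stdlib-only sketch of that timing problem, where `FakeOpenAI` is a hypothetical, much-simplified stand-in for the real module:)

```python
import os

class FakeOpenAI:
    """Hypothetical stand-in for the pre-1.0 openai module: it snapshots
    the environment variable at 'import' time and never looks again
    unless api_key is assigned explicitly."""
    def __init__(self):
        self.api_key = os.environ.get("OPENAI_API_KEY")  # snapshot at "import"

    def default_api_key(self):
        if self.api_key is None:
            raise RuntimeError("No API key provided.")
        return self.api_key

os.environ.pop("OPENAI_API_KEY", None)   # key not yet set at import time
openai = FakeOpenAI()                    # "import openai"

os.environ["OPENAI_API_KEY"] = "sk-demo" # too late: the snapshot is already None
try:
    openai.default_api_key()
except RuntimeError as e:
    print(e)                             # → No API key provided.

openai.api_key = os.environ["OPENAI_API_KEY"]  # the fix from this answer
print(openai.default_api_key())                # → sk-demo
```

This mirrors what happens in the deployed app: the `os.environ['OPENAI_API_KEY'] = API_KEY` line runs long after `openai` was imported, so only the explicit `openai.api_key` assignment reaches the client.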


# Answer 2

**Score**: 3

Change this:

```python
os.environ["OPENAI_API_KEY"] = 'YOUR_KEY'
```

to this:

```python
openai.api_key = "key"
```

huangapple
  • Published on 2023-06-08 00:56:33
  • Please keep this link when reposting: https://go.coder-hub.com/76425556.html