

tenacity.RetryError: RetryError[<Future at 0x7f89bc35eb90 state=finished raised AuthenticationError>]

Question

I am trying to deploy an app made with Streamlit (also using streamlit_chat and streamlit_authenticator). The app uses llama-index to build a query engine on top of the ChatGPT API. When I run "streamlit run app.py" on my computer everything works fine, but when I deploy it the following error is raised:

    2023-06-07 16:45:28.682 Uncaught app exception

    Traceback (most recent call last):
      File "/home/appuser/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
        result = fn(*args, **kwargs)
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/openai.py", line 106, in get_embedding
        return openai.Embedding.create(input=[text], engine=engine)["data"][0]["embedding"]
      File "/home/appuser/venv/lib/python3.10/site-packages/openai/api_resources/embedding.py", line 33, in create
        response = super().create(*args, **kwargs)
      File "/home/appuser/venv/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 149, in create
        ) = cls.__prepare_create_request(
      File "/home/appuser/venv/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 106, in __prepare_create_request
        requestor = api_requestor.APIRequestor(
      File "/home/appuser/venv/lib/python3.10/site-packages/openai/api_requestor.py", line 138, in __init__
        self.api_key = key or util.default_api_key()
      File "/home/appuser/venv/lib/python3.10/site-packages/openai/util.py", line 186, in default_api_key
        raise openai.error.AuthenticationError(
    openai.error.AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = <API-KEY>', or you can set the environment variable OPENAI_API_KEY=<API-KEY>). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = <PATH>'. You can generate API keys in the OpenAI web interface. See https://platform.openai.com/account/api-keys for details.

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 561, in _run_script
        self._session_state.on_script_will_rerun(rerun_data.widget_states)
      File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/state/safe_session_state.py", line 68, in on_script_will_rerun
        self._state.on_script_will_rerun(latest_widget_states)
      File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 482, in on_script_will_rerun
        self._call_callbacks()
      File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 495, in _call_callbacks
        self._new_widget_state.call_callback(wid)
      File "/home/appuser/venv/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 247, in call_callback
        callback(*args, **kwargs)
      File "/app/bajoquetumgpt/docsv2.py", line 35, in generate_answer
        response = query_engine.query(user_msg)
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/indices/query/base.py", line 20, in query
        return self._query(str_or_query_bundle)
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/query_engine/retriever_query_engine.py", line 139, in _query
        nodes = self._retriever.retrieve(query_bundle)
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/indices/base_retriever.py", line 21, in retrieve
        return self._retrieve(str_or_query_bundle)
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/token_counter/token_counter.py", line 78, in wrapped_llm_predict
        f_return_val = f(_self, *args, **kwargs)
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/indices/vector_store/retrievers.py", line 62, in _retrieve
        self._service_context.embed_model.get_agg_embedding_from_queries(
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/base.py", line 83, in get_agg_embedding_from_queries
        query_embeddings = [self.get_query_embedding(query) for query in queries]
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/base.py", line 83, in <listcomp>
        query_embeddings = [self.get_query_embedding(query) for query in queries]
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/base.py", line 72, in get_query_embedding
        query_embedding = self._get_query_embedding(query)
      File "/home/appuser/venv/lib/python3.10/site-packages/llama_index/embeddings/openai.py", line 223, in _get_query_embedding
        return get_embedding(query, engine=engine)
      File "/home/appuser/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 289, in wrapped_f
        return self(f, *args, **kw)
      File "/home/appuser/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 379, in __call__
        do = self.iter(retry_state=retry_state)
      File "/home/appuser/venv/lib/python3.10/site-packages/tenacity/__init__.py", line 326, in iter
        raise retry_exc from fut.exception()
    tenacity.RetryError: RetryError[<Future at 0x7f89bc35eb90 state=finished raised AuthenticationError>]
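The `tenacity.RetryError` at the bottom is only a wrapper: tenacity exhausted its retries and re-raised, chaining the real `AuthenticationError` as the cause, which is why the future is shown as `state=finished raised AuthenticationError`. A simplified stand-in for that behavior (not tenacity's actual implementation) looks like this:

```python
# Simplified stand-in for what tenacity does in the traceback above:
# retry a callable, and when attempts are exhausted, wrap the failure
# in a RetryError-like exception chained from the original one.
class RetryError(Exception):
    pass

def call_with_retries(fn, attempts=3):
    last_exc = None
    for _ in range(attempts):
        try:
            return fn()
        except Exception as exc:
            last_exc = exc
    raise RetryError(f"RetryError[{last_exc!r}]") from last_exc

def failing():
    # Stand-in for the openai call that raises AuthenticationError.
    raise PermissionError("No API key provided.")

try:
    call_with_retries(failing)
except RetryError as exc:
    # The original exception survives as __cause__, matching the
    # "direct cause of the following exception" chain in the log.
    print(type(exc.__cause__).__name__)  # → PermissionError
```

So the error to fix is the underlying `AuthenticationError`, not the retry wrapper.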

The code is private, but I can show the part where the authenticator and the query engine are used:

    import yaml
    from yaml.loader import SafeLoader
    with open('./config.yaml') as file:
        config = yaml.load(file, Loader=SafeLoader)

    authenticator = stauth.Authenticate(
        config['credentials'],
        config['cookie']['name'],
        config['cookie']['key'],
        config['cookie']['expiry_days'],
        config['preauthorized']
    )
    name, authentication_status, username = authenticator.login('Login', 'main')
    print(username, name, authentication_status)
    if authentication_status:
        authenticator.logout('Logout', 'main')
        st.write(f'Welcome *{name}*')
    elif authentication_status == False:
        st.error('Username/password is incorrect')
    elif authentication_status == None:
        st.warning('Please enter your username and password')
    FIRST_OUTPUT="""Hello!"""

    if authentication_status:
        API_KEY = config['credentials']['usernames'][username].get('openaiapi', "")
        st.text("""First text""")
        text_input_container = st.empty()
        if API_KEY=="":
            API_KEY = text_input_container.text_input(label='Introduce your OpenAI API Key:', label_visibility = 'hidden', placeholder='Introduce your OpenAI API Key:')
        if API_KEY != "":
            text_input_container.empty()
        os.environ['OPENAI_API_KEY'] = API_KEY

And it continues as:

    if API_KEY != '':
        # Load the index from your saved index.json file
        storage_context = StorageContext.from_defaults(persist_dir='./storage')
        # load index
        index = load_index_from_storage(storage_context)

        query_engine = index.as_query_engine()

        if "history" not in st.session_state:
            st.session_state.history = initial_history
        c = st.expander("Open to see the previous messages!")
        for i, chat in enumerate(st.session_state.history):
            if i < len(st.session_state.history)-6:
                with c:
                    message(**chat, key=str(i)) #unpacking
            else:
                message(**chat, key=str(i)) #unpacking
        st.text_input("You: ", "", key="input_text", on_change = generate_answer)

The function generate_answer is:

    def generate_answer():
        user_msg = st.session_state.input_text
        st.session_state.input_text = ""
        response = query_engine.query(user_msg)
        st.session_state.history.append(
            {"message": user_msg, "is_user":True,
            "avatar_style": "fun-emoji",
            "seed": 4}
        )
        st.session_state.history.append(
            {"message": str(response).strip(), "is_user":False,
            "avatar_style": "bottts-neutral",
            "seed": 36}
        )

I would love to receive any help with it.
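For context, the real `config.yaml` is private, so the structure below is a hypothetical example mirroring only the keys the code above actually reads (`credentials.usernames.<user>.openaiapi`, `cookie.*`, `preauthorized`). Note that `yaml.safe_load` is equivalent to `yaml.load(..., Loader=SafeLoader)`:

```python
import yaml  # PyYAML, same dependency the question's code uses

# Hypothetical config mirroring the keys the code accesses; the user
# "alice" and the placeholder key are made up for illustration.
raw = """
credentials:
  usernames:
    alice:
      openaiapi: sk-placeholder
cookie:
  name: app_cookie
  key: some_signature_key
  expiry_days: 30
preauthorized:
  emails: []
"""

config = yaml.safe_load(raw)
# Same lookup as in the app: stored key if present, else empty string.
api_key = config['credentials']['usernames']['alice'].get('openaiapi', "")
print(api_key)
```

With this layout, a user without an `openaiapi` entry gets `""` back, which is what triggers the text-input fallback in the app.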


# Answer 1
**Score:** 9

You are probably setting the API key like this:
```python
os.environ["OPENAI_API_KEY"] = 'YOUR_KEY'
```

The problem is that this only fills the variable in your local environment. To pass it on to OpenAI you also have to do something like this:

```python
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]
```

I'll leave a piece of my code here so you can understand better:

```python
from llama_index import SimpleDirectoryReader, GPTVectorStoreIndex, LLMPredictor, ServiceContext, StorageContext, load_index_from_storage
from langchain import OpenAI
import gradio as gr
import os
import openai

# Here I fill my LOCAL environment variable
os.environ["OPENAI_API_KEY"] = 'YOUR_KEY'

def construct_index(directory_path):
    # And here I pass the key to OpenAI
    openai.api_key = os.environ["OPENAI_API_KEY"]
    num_outputs = 512

    llm_predictor = LLMPredictor(llm=OpenAI(temperature=0.7, model_name="text-davinci-003", max_tokens=num_outputs))

    service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)

    docs = SimpleDirectoryReader(directory_path).load_data()

    index = GPTVectorStoreIndex.from_documents(docs, service_context=service_context)

    index.storage_context.persist()
    return index
```

That worked for me.
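The traceback's `self.api_key = key or util.default_api_key()` line shows the lookup order this answer relies on: an explicit key wins, otherwise the environment variable is consulted, and if neither is set the `AuthenticationError` is raised. A simplified stand-in for that fallback (not the openai library's actual code) can be sketched as:

```python
import os

def default_api_key():
    # Simplified stand-in for openai.util.default_api_key: consult the
    # environment variable and fail loudly when it is missing.
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("No API key provided.")
    return key

def resolve_key(key=None):
    # Mirrors the traceback's `key or util.default_api_key()` pattern.
    return key or default_api_key()

os.environ.pop("OPENAI_API_KEY", None)
try:
    resolve_key()
except RuntimeError as e:
    print("raised:", e)          # no explicit key, no env var → error

os.environ["OPENAI_API_KEY"] = "sk-placeholder"  # hypothetical key
print(resolve_key())             # env var found
print(resolve_key("sk-explicit"))  # explicit key takes precedence
```

This is why setting `openai.api_key` explicitly (or making sure `OPENAI_API_KEY` is actually present in the deployed process's environment) resolves the error.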


# Answer 2
**Score:** 3

Change this:

```python
os.environ["OPENAI_API_KEY"] = 'YOUR_KEY'
```

to this:

```python
openai.api_key = "key"
```

huangapple
  • Published on 2023-06-08 00:56:33
  • Please keep this link when reposting: https://go.coder-hub.com/76425556.html