How to fix `transformers` package not found error in a Python project with `py-langchain`, `llama-index`, and `gradio`?
Question
I get the error ModuleNotFoundError: No module named 'transformers' even though I ran the pip install transformers command. Could you kindly help me? Thank you
My code:
import os
import sys
from dotenv import load_dotenv
import gradio as gr
from langchain import OpenAI
from llama_index import SimpleDirectoryReader, GPTListIndex, GPTVectorStoreIndex, LLMPredictor, PromptHelper
##from transformers import pipeline

load_dotenv()
os.environ['OPENAI_API_KEY'] = os.getenv('OPENAI_API_KEY')

def construct_index(directory_path):
    max_input_size = 4096
    num_outputs = 512
    max_chunk_overlap = 20
    chunk_size_limit = 600

    prompt_helper = PromptHelper(max_input_size, num_outputs, max_chunk_overlap, chunk_size_limit=chunk_size_limit)
    llm_predictor = LLMPredictor(llm=OpenAI(temperature=0.7, model_name="text-davinci-003", max_tokens=num_outputs))

    documents = SimpleDirectoryReader(directory_path).load_data()
    index = GPTVectorStoreIndex(documents, llm_predictor=llm_predictor, prompt_helper=prompt_helper)
    index.save_to_disk('index.json')
    return index

def qabot(input_text):
    index = GPTVectorStoreIndex.load_from_disk('index.json')
    response = index.query(input_text, response_mode="compact")
    return response.response

iface = gr.Interface(fn=qabot, inputs=gr.inputs.Textbox(lines=7, label='Enter you query'), outputs="text", title="Custom-trained QA Application")

index = construct_index("docs")
iface.launch(share=True)
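A common cause of this kind of error, worth ruling out, is that pip installed transformers for a different interpreter or virtual environment than the one that runs the script. The snippet below is only a minimal diagnostic sketch (not part of the original post): it prints which interpreter is active and whether transformers is importable from it.

import importlib.util
import sys

# Show which Python interpreter is actually executing the script.
print("Interpreter:", sys.executable)

# find_spec returns None when the module is not visible to this interpreter.
if importlib.util.find_spec("transformers") is None:
    print("transformers is NOT installed for this interpreter;")
    print("try: python -m pip install transformers  (run with the same interpreter)")
else:
    print("transformers is importable from this interpreter")

If the check fails, re-installing with python -m pip install transformers, using the same interpreter that launches the script, avoids the mismatch.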
Answer 1
Score: 1
Maybe you can uncomment
##from transformers import pipeline
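For illustration, the top of the script from the question would look roughly like this with the suggested line uncommented (the rest of the script is unchanged). This assumes transformers is actually importable in the environment that runs the script; otherwise the uncommented import raises the same ModuleNotFoundError.

import os
import sys
from dotenv import load_dotenv
import gradio as gr
from langchain import OpenAI
from llama_index import SimpleDirectoryReader, GPTListIndex, GPTVectorStoreIndex, LLMPredictor, PromptHelper
# Previously commented out as: ##from transformers import pipeline
from transformers import pipeline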