unable to pass prompt template to RetrievalQA in langchain

Question

I am new to Langchain and followed this Retrieval QA - Langchain guide. I have a custom prompt, but when I try to pass it via `chain_type_kwargs` it throws a `pydantic` validation error in `StuffDocumentsChain`; when I remove `chain_type_kwargs`, it just works.

How can I pass the prompt?

Error

```cmd
File /usr/local/lib/python3.11/site-packages/pydantic/main.py:341, in pydantic.main.BaseModel.__init__()
ValidationError: 1 validation error for StuffDocumentsChain
__root__
document_variable_name context was not found in llm_chain input_variables: ['question'] (type=value_error)
```

Code

```python
import json, os
from langchain.chains import RetrievalQA
from langchain.llms import OpenAI
from langchain.document_loaders import JSONLoader
from langchain.text_splitter import CharacterTextSplitter
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.chat_models import ChatOpenAI
from langchain import PromptTemplate
from pathlib import Path
from pprint import pprint

os.environ["OPENAI_API_KEY"] = "my-key"

def metadata_func(record: dict, metadata: dict) -> dict:
    metadata["drug_name"] = record["drug_name"]
    return metadata

loader = JSONLoader(
    file_path='./drugs_data_v2.json',
    jq_schema='.drugs[]',
    content_key="data",
    metadata_func=metadata_func)

docs = loader.load()
text_splitter = CharacterTextSplitter(chunk_size=5000, chunk_overlap=200)
texts = text_splitter.split_documents(docs)
embeddings = OpenAIEmbeddings()
docsearch = Chroma.from_documents(texts, embeddings)

template = """/
example custom prompt
Question: {question}
Answer:
"""
PROMPT = PromptTemplate(template=template, input_variables=['question'])

qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(
        model_name='gpt-3.5-turbo-16k'
    ),
    chain_type="stuff",
    chain_type_kwargs={"prompt": PROMPT},
    retriever=docsearch.as_retriever(),
)
query = "What did the president say about Ketanji Brown Jackson"
qa.run(query)
```

Answer 1

Score: 1

`{context}` is missing from the template. The error message says the `stuff` chain's `document_variable_name` is `context`, so the prompt must declare a `{context}` variable for the retrieved documents to be inserted into.
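A minimal sketch of the corrected template, under the question's setup. The placeholder substitution is shown here with plain `str.format` as a stand-in; in the actual code this string would be passed to `PromptTemplate(template=..., input_variables=['context', 'question'])`, which uses the same f-string-style placeholders. The wording of the template text is illustrative, not from the original post:

```python
# Corrected template: {context} added alongside {question}, so the
# "stuff" chain has a slot to insert the retrieved documents into.
template = """Use the following pieces of context to answer the question.
Context: {context}
Question: {question}
Answer:"""

# Stand-in render to show both variables are filled in.
rendered = template.format(context="retrieved documents go here",
                           question="What did the president say?")
print(rendered)
```

With `{context}` declared, the `StuffDocumentsChain` validation in the question's traceback no longer fails, because `context` now appears in the chain's `input_variables`.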

Answer 2

Score: 0

Try formatting the PROMPT after you define it:

```python
PROMPT.format(question="your question")
```

huangapple
  • Published on June 26, 2023 at 15:25:21
  • Please retain this link when reposting: https://go.coder-hub.com/76554411.html