How to send an OpenAI stream response from a Next.js API route to the client

Question

I tried openai-streams + nextjs-openai; they only work on Node 18+ and fail on Node 17 and lower. I'm restricted to Node 17 and lower because DigitalOcean's App Platform does not currently support Node 18.

I also tried this method, which works well on the client side, but it exposes the API key. I want to implement it within a Next.js API route, but I'm unable to pass the streaming response through to the client.

With the code below, I can only get the first chunk of the response from the API route, and cannot get a streaming response to produce the ChatGPT typing effect. Please kindly help.
// /api/prompt.js
import { Configuration, OpenAIApi } from "openai";
import { Readable } from "readable-stream";

const configuration = new Configuration({
  apiKey: process.env.NEXT_PUBLIC_OPENAI_API_KEY,
});
const openai = new OpenAIApi(configuration);

export default async function handler(req, res) {
  const completion = await openai.createCompletion(
    {
      model: "text-davinci-003",
      prompt: "tell me a story",
      max_tokens: 500,
      stream: true,
    },
    { responseType: "stream" }
  );

  completion.data.on("data", async (data) => {
    const lines = data
      .toString()
      .split("\n")
      .filter((line) => line.trim() !== "");
    for (const line of lines) {
      const message = line.replace(/^data: /, "");
      if (message === "[DONE]") {
        return;
      }
      try {
        const parsed = JSON.parse(message);
        const string = parsed.choices[0].text;
        Readable.from(string).pipe(res);
      } catch (error) {
        console.error("Could not JSON parse stream message", message, error);
      }
    }
  });
}
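One likely culprit in the handler above, sketched here with a hypothetical mock `res` (the helper names are not from the original post): `Readable.from(string).pipe(res)` ends the response as soon as its one-token source finishes, so only the first chunk ever reaches the client. Writing each token with `res.write()` and calling `res.end()` only on the `[DONE]` sentinel keeps the connection open:

```javascript
// Hedged sketch, not the poster's code: same SSE parsing as the handler above,
// but each token is written to the response instead of piped, so the response
// stays open until [DONE]. A mock `res` stands in for the Next.js response.
function handleSseChunk(data, res) {
  const lines = data
    .toString()
    .split("\n")
    .filter((line) => line.trim() !== "");
  for (const line of lines) {
    const message = line.replace(/^data: /, "");
    if (message === "[DONE]") {
      res.end(); // close the response only after the final sentinel
      return;
    }
    try {
      res.write(JSON.parse(message).choices[0].text);
    } catch (error) {
      console.error("Could not JSON parse stream message", message, error);
    }
  }
}

// Mock response object standing in for the API route's `res`
const chunks = [];
const mockRes = {
  write: (s) => chunks.push(s),
  end: () => chunks.push("<END>"),
};

handleSseChunk('data: {"choices":[{"text":"Once"}]}\n', mockRes);
handleSseChunk('data: {"choices":[{"text":" upon"}]}\ndata: [DONE]\n', mockRes);
```

In the real handler, `res.write(...)` and `res.end()` replace the `Readable.from(string).pipe(res)` line; everything else stays the same.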
// /components/Completion.js
import { useState } from "react";

export default function Completion() {
  const [text, setText] = useState("");

  const generate = async () => {
    const response = await fetch("/api/prompt");
    console.log("response: ", response);
    const text = await response.text();
    console.log("text: ", text);
    setText((state) => state + text);
  };

  // ... rest
}
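On the client side, `response.text()` resolves only after the entire body has arrived, so even a correctly streamed response would render all at once. A minimal sketch of incremental reading, with a hypothetical `fakeReader` standing in for `response.body.getReader()` so the loop can be exercised without a server:

```javascript
// Hedged sketch: read a fetch body chunk-by-chunk instead of buffering it.
// In the component, `reader` would be `response.body.getReader()` and
// `onChunk` would be something like (piece) => setText((s) => s + piece).
async function streamToText(reader, onChunk) {
  const decoder = new TextDecoder();
  let full = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    const piece = decoder.decode(value, { stream: true });
    full += piece;
    onChunk(piece); // update the UI as each chunk lands
  }
  return full;
}

// Hypothetical stand-in for response.body.getReader(): yields each string
// as an encoded chunk, then signals done.
function fakeReader(strings) {
  const enc = new TextEncoder();
  const queue = strings.map((s) => ({ value: enc.encode(s), done: false }));
  queue.push({ value: undefined, done: true });
  return { read: async () => queue.shift() };
}
```

Usage: `await streamToText(fakeReader(["Once", " upon"]), console.log)` invokes the callback once per chunk and resolves to the concatenated text.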
Answer 1

Score: 1

You can use StreamingTextResponse from the Vercel AI SDK.

Here is some example code from their docs:
import { Configuration, OpenAIApi } from 'openai-edge'
import { OpenAIStream, StreamingTextResponse } from 'ai'

const config = new Configuration({
  apiKey: process.env.OPENAI_API_KEY
})
const openai = new OpenAIApi(config)

export const runtime = 'edge'

export async function POST(req) {
  const { messages } = await req.json()
  const response = await openai.createChatCompletion({
    model: 'gpt-4',
    stream: true,
    messages
  })
  const stream = OpenAIStream(response)
  return new StreamingTextResponse(stream)
}
Notes:

- This requires the Edge Runtime.
- The openai-edge library requires Node 18+. However, the official openai library is expected to gain streaming support soon, at which point openai-edge may be deprecated.
Answer 2

Score: 0

You can use the openai-streams/node entrypoint on Node <18, which returns a Node.js Readable instead of a WHATWG ReadableStream. I'll update the docs to be clearer soon.

Node: Consuming streams in a Next.js API route

If you cannot use an Edge runtime, or want to consume Node.js streams for another reason, use openai-streams/node:
import type { NextApiRequest, NextApiResponse } from "next";
import { OpenAI } from "openai-streams/node";

export default async function test(_: NextApiRequest, res: NextApiResponse) {
  const stream = await OpenAI("completions", {
    model: "text-davinci-003",
    prompt: "Write a happy sentence.\n\n",
    max_tokens: 25,
  });
  stream.pipe(res);
}