I have a question about overlaying the drawing process on a Canvas onto a Video and recording it as a video. Can this be achieved using WebCodecs?
Question
I want to composite the content drawn on a Canvas (input) on top of a video stream and record the result as a movie.
Is there a way to do this with the WebCodecs API?
My current approach is to use createImageBitmap() to capture one frame each from the Video (input) and the Canvas (input) and pass them to a Web Worker.
In the Web Worker, the Video and Canvas frames are drawn onto another Canvas (output). After drawing, the next pair of frames is passed in, and this cycle repeats.
Then I call captureStream() on the Canvas (output).
Finally, I encode this Canvas (output) stream and write it to a FileWritable.
However, I found that this is very likely to cause congestion in my program, even though I use createImageBitmap to create the ImageBitmaps and pass them to the Web Worker as transferable objects.
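Roughly, the per-frame capture and transfer looks like this (a simplified sketch; the names videoElement, overlayCanvas and worker are just placeholders, not my actual code):

```js
// Each frame: grab an ImageBitmap from the <video> and from the overlay canvas,
// then transfer both to the worker (moved via the transfer list, not copied).
const [videoBitmap, canvasBitmap] = await Promise.all([
  createImageBitmap(videoElement),
  createImageBitmap(overlayCanvas),
]);
worker.postMessage({ videoBitmap, canvasBitmap }, [videoBitmap, canvasBitmap]);
```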
I wonder if there is a better way to overlay canvas content on top of a video and output a movie.
Sorry, English is not my first language.
Answer 1
Score: 1
Yes, WebCodecs can output a new video from your canvas animation. The basic steps are as follows:
- Create and configure a VideoEncoder instance. Check the documentation to learn how (https://developer.mozilla.org/en-US/docs/Web/API/VideoEncoder).
- Create and configure a muxer. There are many different muxers available as libraries. I am using this one to make WebM files because it is simple to use: https://github.com/Vanilagy/webm-muxer.
- Create the VideoFrames using the canvas as the argument, adding the respective video timestamp: new VideoFrame(canvas, { timestamp }).
- Feed each VideoFrame to the encoder, which will output an EncodedVideoChunk.
- Feed the EncodedVideoChunk to the muxer.
This example uses the webcam stream; you can grab the code there and just change the source of the video frames to the canvas: https://vanilagy.github.io/webm-muxer/demo-streaming/
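A minimal sketch of these steps (not the linked demo): it assumes a playing videoElement, an overlayCanvas being drawn on, VP9 encoding, fixed dimensions, and the Muxer / ArrayBufferTarget API described in the webm-muxer README.

```js
import { Muxer, ArrayBufferTarget } from 'webm-muxer';

const width = 1280, height = 720, fps = 30;

// Output canvas used to composite the video and the overlay.
const output = document.createElement('canvas');
output.width = width;
output.height = height;
const ctx = output.getContext('2d');

// Muxer that collects encoded chunks into a WebM file in memory.
const muxer = new Muxer({
  target: new ArrayBufferTarget(),
  video: { codec: 'V_VP9', width, height, frameRate: fps },
});

// Video encoder; its output callback hands each EncodedVideoChunk to the muxer.
const encoder = new VideoEncoder({
  output: (chunk, meta) => muxer.addVideoChunk(chunk, meta),
  error: (e) => console.error(e),
});
encoder.configure({
  codec: 'vp09.00.10.08',
  width,
  height,
  bitrate: 2_000_000,
  framerate: fps,
});

let frameIndex = 0;

function renderFrame() {
  // Composite: current video frame first, then the overlay canvas on top.
  ctx.drawImage(videoElement, 0, 0, width, height);
  ctx.drawImage(overlayCanvas, 0, 0, width, height);

  // Wrap the composited canvas in a VideoFrame (timestamp is in microseconds).
  const timestamp = Math.round(frameIndex * 1e6 / fps);
  const frame = new VideoFrame(output, { timestamp });

  // Feed it to the encoder, then release the frame.
  encoder.encode(frame, { keyFrame: frameIndex % (fps * 2) === 0 });
  frame.close();
  frameIndex++;
}

async function finish() {
  await encoder.flush();  // wait for any pending chunks
  muxer.finalize();       // finish writing the WebM container
  const blob = new Blob([muxer.target.buffer], { type: 'video/webm' });
  // ...download the blob, or write it with a FileSystemWritableFileStream
}
```

Call renderFrame() once per composited frame (for example from requestAnimationFrame, or a setInterval at the target frame rate), and call finish() when recording stops.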
Comments