Timeline editing with ffmpeg-python
Question
I'm trying to draw a box on one part of the screen at a specific instant in time. I've tried looking through the documentation for ffmpeg-python, but am unable to locate any mention of support for the enable option.
I tried using:
(
    ffmpeg
    .input(video)
    .drawbox(x, y, w, h, enable=f"between(t,{timestamp},{total_time})")
    .output("out.mp4")
    .run()
)
hoping that it would be passed to ffmpeg as a kwarg, but this has no effect on the video and the box never gets drawn.
Answer 1
Score: 2
According to the ffmpeg-python drawbox documentation, the syntax is:
ffmpeg.drawbox(stream, x, y, width, height, color, thickness=None, **kwargs)
The first argument is a stream - it should be a video stream.
We may use the drawbox filter as follows:
filtered_input_video = ffmpeg.drawbox(ffmpeg.input("in.mp4").video, x, y, w, h, color="black", enable=f"between(t, {start_time}, {end_time})")
Note that the arguments of between are the start time and the end time.
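As a sanity check on those arguments: FFmpeg's between(x, min, max) expression evaluates to 1 when x lies inside the inclusive interval and 0 otherwise, so with enable the box is drawn only on frames whose timestamp t falls in [start_time, end_time]. A minimal Python model of that semantics:

```python
def between(t, start, end):
    """Model of ffmpeg's between(x, min, max) expression:
    returns 1 if start <= t <= end, else 0 (the interval is inclusive)."""
    return 1 if start <= t <= end else 0

# With start_time=3 and end_time=7, the box is enabled on these frames:
print(between(2.9, 3, 7))  # 0 - just before the window opens
print(between(3.0, 3, 7))  # 1 - window opens at start_time
print(between(7.0, 3, 7))  # 1 - end_time is still inside the window
print(between(7.1, 3, 7))  # 0 - box disappears again
```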
filtered_input_video is passed as the first argument of ffmpeg.output:
ffmpeg.output(filtered_input_video, "out.mp4").run()
Updated code sample (using a test pattern as input):
import ffmpeg

x, y, w, h = 50, 5, 100, 100
start_time = 3
end_time = 7

input_video = ffmpeg.input("testsrc=size=192x108:rate=1:duration=10", f="lavfi").video  # Replace this with: ffmpeg.input(video).video
# filtered_input_video = ffmpeg.filter(input_video, "drawbox", x, y, w, h, enable=f"between(t, {start_time}, {end_time})")  # Alternative syntax
filtered_input_video = ffmpeg.drawbox(input_video, x, y, w, h, color="black", enable=f"between(t, {start_time}, {end_time})")

(
    ffmpeg
    .output(filtered_input_video, "out.mp4", vcodec="libx264", pix_fmt="yuv420p")
    .overwrite_output()  # Overwrite the output file if it already exists (optional)
    .run()
)
Sample output (the box is shown from the 3rd second to the 7th):
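To see why the enable kwarg "just works", it helps to look at the filter string ffmpeg ultimately receives: ffmpeg-python serializes unknown keyword arguments such as enable as extra key=value pairs in the filter spec. The snippet below assembles a drawbox spec by hand in plain Python (no ffmpeg required) to illustrate the shape of that string; the exact option order and quoting in ffmpeg-python's real output may differ, so treat this as an approximation, not the library's verbatim output:

```python
x, y, w, h = 50, 5, 100, 100
start_time, end_time = 3, 7

# Roughly how the drawbox filter spec is serialized: named options
# joined with ':', with kwargs like enable appended as key=value pairs.
opts = {
    "x": x, "y": y, "w": w, "h": h,
    "color": "black",
    "enable": f"between(t,{start_time},{end_time})",
}
filter_spec = "drawbox=" + ":".join(f"{k}={v}" for k, v in opts.items())
print(filter_spec)
# drawbox=x=50:y=5:w=100:h=100:color=black:enable=between(t,3,7)
```

This is also the string you would pass to the raw ffmpeg CLI via -vf, which can be handy for debugging when the Python wrapper appears to have no effect.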