How to push items into an array at the rate at which the screen is being captured


Question

I am using the getDisplayMedia API to record the user's current tab. Now I want to capture the user's mouse positions at the same rate the screen is being captured, so that each frame can have one object of x and y coordinates.

const constraints = {
  audio: false,
  video: {
    frameRate: { min: 60, max: 90 },
    width: { max: 1920 },
    height: { max: 1080 }
  }
};

navigator.mediaDevices.getDisplayMedia(constraints)
  .then(stream => {
    const mediaStream = stream; // 30 frames per second
    const mediaRecorder = new MediaRecorder(mediaStream, {
      mimeType: 'video/webm',
      videoBitsPerSecond: 3000000
    });
    // ...
  });

In short: I record the screen and send the recorded video to the backend, where the frames are decoded. For each frame there should be the corresponding mouse x and y coordinates, just as it was in real time; I would then stitch the frames back together into a video. I want to do some editing on the recording.

I don't want to draw the cursor onto the video in the frontend JS; instead I want to save the mouse coordinates and the recorded video separately and send both to the backend.
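
A minimal sketch of what I have in mind (the /upload endpoint and the samples array are placeholders, and it assumes access to the mediaRecorder from the snippet above):

// Sketch only: keep the recording and the coordinate samples separate,
// then send both to the backend in one multipart request.
const chunks = [];
const samples = []; // to be filled with { t, x, y } mouse samples while recording

mediaRecorder.ondataavailable = (e) => chunks.push(e.data);
mediaRecorder.onstop = async () => {
  const videoBlob = new Blob(chunks, { type: 'video/webm' });
  const form = new FormData();
  form.append('video', videoBlob, 'recording.webm');
  form.append('mousePositions', JSON.stringify(samples));
  await fetch('/upload', { method: 'POST', body: form }); // placeholder endpoint
};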

I tried using requestAnimationFrame, but the number of callbacks is not equal to the number of frames in the video: in my test the recorded video had about 570 frames while the array contained only 194 items.

const testf = [];

function tests() {
  testf.push('test'); // one entry per animation frame
  window.requestAnimationFrame(tests);
}
window.requestAnimationFrame(tests);
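
A throwaway counter (diagnostic only, not part of the code above) shows how often requestAnimationFrame actually fires compared to the recorder:

let rafCalls = 0;
function countRafCalls() {
  rafCalls++;
  requestAnimationFrame(countRafCalls);
}
requestAnimationFrame(countRafCalls);

// Log the callback rate once per second to compare with the recorder's frame rate.
setInterval(() => {
  console.log('rAF callbacks in the last second: ' + rafCalls);
  rafCalls = 0;
}, 1000);
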
Thank you so much for reading. Any advice would be greatly appreciated.

Answer 1

Score: 2


Unfortunately, it is technically impossible to capture the mouse position on each recorded frame, because the API doesn't provide an onFrame event or anything similar.

Additionally, because frame rates can differ between devices, I'm not sure it is essential to stick to the recorded frames.

What I would recommend instead is to stick to timing. What we need to know is that the mouse was at a specific position at the time a certain frame was recorded. In other words, the mouse and the picture in the frame should be in sync.

This can be achieved by starting a timer at the moment the video recording starts and then using requestAnimationFrame() to schedule a function that records { timestamp_msec, mouseX, mouseY, mouseButtons }, where timestamp_msec is the time in milliseconds since the video recording started.

On the receiving side we get the video and an array of the above objects. In the video you can always obtain the current playing position; having that position in milliseconds, you can always find the mouse-position object whose timestamp is less than or equal to the current video position.
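
For illustration, here is a minimal lookup sketch (the function name sampleAt and the array name samples are mine, not part of the demo below): with the array sorted by ascending timestamp, a binary search returns the latest sample at or before a given playback position in milliseconds.

// Return the latest sample whose timestamp is <= positionMs.
// Assumes samples is sorted by ascending timestamp (in milliseconds).
function sampleAt(samples, positionMs) {
  let lo = 0, hi = samples.length - 1, best = null;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (samples[mid].timestamp <= positionMs) {
      best = samples[mid]; // candidate; a later one may still qualify
      lo = mid + 1;
    } else {
      hi = mid - 1;
    }
  }
  return best;
}

// During playback, video.currentTime is in seconds:
// const cursor = sampleAt(samples, video.currentTime * 1000);

The same lookup works per frame on the backend if you convert a frame index to milliseconds, e.g. frameIndex * 1000 / fps.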

Another thing is that requestAnimationFrame() lets you precisely match the moment you record the mouse position with the screen update. Even if you record the video at a higher frame rate than requestAnimationFrame() can provide, the extra frames will just be duplicates and have little value.

I've built a few remote-desktop-like systems, and the approach described above works like a charm.

Here is a quick demo with requestAnimationFrame:


let recordingStart = Date.now(); // set this when you start recording
let lastKnownMousePosition = {};

window.addEventListener('mousemove', (event) => {
  lastKnownMousePosition = {
    mouseX: event.clientX,
    mouseY: event.clientY,
    mouseButtons: event.buttons,
  };
});

const frameHandler = () => {
  const mousePosition = {
    timestamp: Date.now() - recordingStart,
    ...lastKnownMousePosition,
  };
  // send mousePosition to server here
  console.log(mousePosition);
  requestAnimationFrame(frameHandler);
};
requestAnimationFrame(frameHandler);


Please let me know if this helps.
