How do we mix a canvas stream with an audio stream using MediaRecorder

Date: 2023-06-21

This article describes how we mixed a canvas stream with an audio stream using MediaRecorder; the approach should be a useful reference for anyone facing the same problem.

Problem description

I have a canvas stream created with canvas.captureStream(). I have another video stream from a WebRTC video call. Now I want to mix the canvas stream with the audio tracks of the video stream. How can I do that?

Recommended answer

Use the MediaStream constructor, available in Firefox and in Chrome 56 and later, to combine tracks into a new stream:

let stream = new MediaStream([videoTrack, audioTrack]);

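For the exact scenario in the question, a minimal sketch could look like the following; canvas stands for the canvas being captured and remoteStream for the MediaStream received from the WebRTC call (both names are placeholders, not part of the original answer):

// Capture the canvas as a video stream (30 fps is an arbitrary choice), then build
// a new stream from its video track plus the audio tracks of the WebRTC call.
// "canvas" and "remoteStream" are assumed to exist in the surrounding code.
const canvasStream = canvas.captureStream(30);
const mixedStream = new MediaStream([
  canvasStream.getVideoTracks()[0],   // video drawn on the canvas
  ...remoteStream.getAudioTracks()    // audio from the video call
]);

The resulting mixedStream can then be handed to a MediaRecorder or attached to a video element via srcObject, like any other MediaStream.
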
The following works for me in Firefox (use an https fiddle in Chrome, though it errors on recording):

// "canvas", "video", "link", and "div" below refer to the elements with those ids
// in the HTML snippet further down (elements with ids are available as globals in the browser).

// Grab a microphone audio track, combine it with the canvas video track,
// record the mixed stream for 5 seconds, then offer the result for playback and download.
navigator.mediaDevices.getUserMedia({audio: true})
  .then(stream => record(new MediaStream([stream.getTracks()[0],
                                          whiteNoise().getTracks()[0]]), 5000)
    .then(recording => {
      stop(stream);
      video.src = link.href = URL.createObjectURL(new Blob(recording));
      link.download = "recording.webm";
      link.innerHTML = "Download recording";
      log("Playing " + recording[0].type + " recording:");
    })
    .catch(log))
  .catch(log);

// Draw animated white noise on the hidden canvas and capture it as a 60 fps video stream.
var whiteNoise = () => {
  let ctx = canvas.getContext('2d');
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  let p = ctx.getImageData(0, 0, canvas.width, canvas.height);
  requestAnimationFrame(function draw() {
    for (var i = 0; i < p.data.length; i++) {
      p.data[i++] = p.data[i++] = p.data[i++] = Math.random() * 255;
    }
    ctx.putImageData(p, 0, 0);
    requestAnimationFrame(draw);
  });
  return canvas.captureStream(60);
};

// Record a stream with MediaRecorder for ms milliseconds and resolve with the recorded chunks.
var record = (stream, ms) => {
  var rec = new MediaRecorder(stream), data = [];
  rec.ondataavailable = e => data.push(e.data);
  rec.start();
  log(rec.state + " for " + (ms / 1000) + " seconds...");
  var stopped = new Promise((y, n) =>
      (rec.onstop = y, rec.onerror = e => n(e.error || e.name)));
  return Promise.all([stopped, wait(ms).then(_ => rec.stop())]).then(_ => data);
};

var stop = stream => stream.getTracks().forEach(track => track.stop());
var wait = ms => new Promise(resolve => setTimeout(resolve, ms));
var log = msg => div.innerHTML += "<br>" + msg;

<div id="div"></div><br>
<canvas id="canvas" width="160" height="120" hidden></canvas>
<video id="video" width="160" height="120" autoplay></video>
<a id="link"></a>
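
Adapted to the question, recording the mixed canvas-plus-audio stream follows the same pattern as the record() helper above; a rough sketch, assuming the mixedStream built earlier and a webm container (the mimeType and the 5-second duration are assumptions, not part of the original answer):

// Record mixedStream for a few seconds, then play the result back in the video element above.
const options = MediaRecorder.isTypeSupported("video/webm") ? { mimeType: "video/webm" } : undefined;
const recorder = new MediaRecorder(mixedStream, options);
const chunks = [];
recorder.ondataavailable = e => chunks.push(e.data);
recorder.onstop = () => {
  const blob = new Blob(chunks, { type: recorder.mimeType });
  video.src = URL.createObjectURL(blob);   // or offer the blob as a download link
};
recorder.start();
setTimeout(() => recorder.stop(), 5000);   // stop after 5 seconds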

That wraps up this article on mixing a canvas stream with an audio stream using MediaRecorder; hopefully the recommended answer above is helpful.
