WebRTC video with transparent background


Shai Amir

Jan 30, 2021, 4:00:50 PM
to discuss-webrtc
Hi,

I'm trying to transmit WebRTC video with a transparent background (I do not want to manipulate it on the receiving side using a canvas).
I'm using TensorFlow BodyPix to segment the person, and indeed my local video is drawn with a transparent background on my local canvas.
I'm using captureStream() to send my transparent video out to the other WebRTC client.
However, the client seems to receive my video with a black background instead of a transparent one.
My assumption was that WebM/VP8 would handle the transparency.
Am I missing something?
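For reference, the sending pipeline being described looks roughly like this (a minimal sketch; the names are illustrative and segmentAndDraw() stands in for the BodyPix masking step):

```javascript
// Minimal sketch of the sending side described above. Names are
// illustrative; segmentAndDraw() stands in for the BodyPix masking step.
function startTransparentCapture(pc, sourceVideo, canvas, segmentAndDraw) {
  const ctx = canvas.getContext('2d');
  // Capture the canvas (which holds RGBA frames with alpha) as a stream.
  const stream = canvas.captureStream(30);
  for (const track of stream.getVideoTracks()) {
    pc.addTrack(track, stream);
  }
  const draw = () => {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    // Draw only the segmented person; background pixels stay transparent.
    segmentAndDraw(ctx, sourceVideo);
    requestAnimationFrame(draw);
  };
  requestAnimationFrame(draw);
  return stream;
}
```

Locally this canvas renders with alpha; the question is why the alpha does not survive the trip through the peer connection.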

Thanks in advance!

guest271314

Jan 31, 2021, 12:30:16 PM
to discuss-webrtc
Can you post the code here or create a plnkr https://plnkr.co with the relevant code included?

guest271314

Jan 31, 2021, 1:04:45 PM
to discuss-webrtc
Will a "white" background work?

const ctx = canvas.getContext('2d');
ctx.globalAlpha = 0.0;
ctx.globalCompositeOperation = 'source-in';
//...
ctx.fillStyle = '#ffffffff';
ctx.fillRect(0, 0, canvas.width, canvas.height);
//...
ctx.drawImage(capture, 0, 0, 50, 50);
canvasTrack.requestFrame();

Dhimant Bhayani

Jan 31, 2021, 1:20:55 PM
to discuss...@googlegroups.com
Are you using a TensorFlow BodyPix model in a native app or in the browser?

We tried it in the browser and the performance was not acceptable.

Any pointers are appreciated.

Thx.


guest271314

Jan 31, 2021, 5:23:29 PM
to discuss-webrtc
I tried with WebRTC on Chromium 90. The background is rendered black.

Locally, the MediaStream does render the transparent background at the HTMLVideoElement.

MediaRecorder captures the transparent background encoded in the MediaStreamTrack.

We can use RTCDataChannel to stream ArrayBuffers (Chromium does not currently support Blob over data channels https://bugs.chromium.org/p/webrtc/issues/detail?id=2276) from the MediaRecorder dataavailable event, then append the WebM buffer to a SourceBuffer: https://plnkr.co/edit/xWDJUlLuxxiZI76R?open=lib%2Fscript.js

onload = () => {
  const canvas = document.querySelector('canvas');
  const video = document.querySelector('video');
  const ctx = canvas.getContext('2d');

  video.onplay = async (e) => {
    video.onplay = null;
    console.log(e);
    try {
      const canvasStream = canvas.captureStream(0);
      const [canvasTrack] = canvasStream.getVideoTracks();
      canvasTrack.onmute = canvasTrack.onunmute = (e) => {
        console.log(e);
      };

      const renderStream = document.createElement('video');
      renderStream.autoplay = renderStream.controls = true;
      document.body.appendChild(renderStream);
      const ms = new MediaSource();
      const msRender = document.createElement('video');
      document.body.appendChild(msRender);
      msRender.autoplay = msRender.controls = true;
      let sourceBuffer;
      ms.onsourceopen = (e) => {
        sourceBuffer = ms.addSourceBuffer('video/webm;codecs=vp8');
      };
      msRender.src = URL.createObjectURL(ms);
      renderStream.srcObject = canvasStream;
      const recorder = new MediaRecorder(renderStream.srcObject);
      recorder.ondataavailable = async ({ data }) => {
        const buffer = await data.arrayBuffer();
        // send buffer to RTCDataChannel
        sourceBuffer.appendBuffer(buffer);
      };
      recorder.start(100);
      // setTimeout(() => recorder.stop(), 30000)
      const rs = new ReadableStream({
        async pull(controller) {
          if (!video.paused) {
            const frame = await createImageBitmap(video, {
              resizeWidth: 50,
              resizeHeight: 50,
            });
            ctx.drawImage(frame, 0, 0, 50, 50);
            canvasTrack.requestFrame();
            controller.enqueue(
              await new Promise((resolve) => setTimeout(resolve, 1000 / 60))
            );
          } else {
            controller.close();
          }
        },
      }).pipeTo(new WritableStream());
    } catch (err) {
      console.error(err);
    }
  };
};

Shai Amir

Feb 2, 2021, 6:47:09 AM
to discuss-webrtc
Uncaught (in promise) DOMException: Failed to execute 'appendBuffer' on 'SourceBuffer': This SourceBuffer has been removed from the parent media source.
    at MediaRecorder.recorder.ondataavailable

I didn't quite get what you're trying to do and would appreciate another explanation.
Where exactly do we lose the transparency?
Thanks.

guest271314

Feb 2, 2021, 9:45:58 AM
to discuss...@googlegroups.com
On Chromium, WebRTC does not preserve the transparency of frames; MediaRecorder does.

On Firefox, MediaRecorder does not preserve the transparency of frames https://bugzilla.mozilla.org/show_bug.cgi?id=1689927#c2 and can freeze the machine.

I was able to stream the recorded MediaStream using MediaSource.

The plnkr https://plnkr.co/edit/xWDJUlLuxxiZI76R?preview and the code in the Firefox bug can be run on Chromium without the DOMException you describe being thrown. The same code will throw on Firefox unless pipeTo() is replaced with a ReadableStreamDefaultReader read loop, because Firefox currently does not support pipeTo().
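As a sketch of that fallback (assuming nothing beyond the standard Streams API), the pipeTo() call can be replaced with an explicit reader loop:

```javascript
// Drain a ReadableStream with an explicit reader instead of pipeTo(),
// for engines (Firefox at the time) that lack pipeTo() support.
async function drain(readable, onChunk) {
  const reader = readable.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    await onChunk(value);
  }
}
```

Calling drain(rs, () => {}) then behaves like rs.pipeTo(new WritableStream()) in the code above.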


guest271314

Feb 2, 2021, 9:50:32 AM
to discuss-webrtc
Firefox also does not support

{
resizeWidth: 50,
resizeHeight: 50,
}

in createImageBitmap().
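One possible workaround (a sketch only; the OffscreenCanvas branch and the helper name are assumptions, not part of the plnkr code) is to do the resize with an intermediate canvas before calling createImageBitmap():

```javascript
// Fallback resize for engines without createImageBitmap() resize options:
// draw the source scaled onto an intermediate canvas, then snapshot it.
async function resizedBitmap(source, width, height) {
  const off = typeof OffscreenCanvas !== 'undefined'
    ? new OffscreenCanvas(width, height)
    : Object.assign(document.createElement('canvas'), { width, height });
  // drawImage() scales the source to the target dimensions.
  off.getContext('2d').drawImage(source, 0, 0, width, height);
  return createImageBitmap(off);
}
```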

I will run Chromium again to test the output. I tested on Chromium 90 before Firefox 84 froze X. Though I could not isolate the freeze to streaming transparent frames alone, I did file the Firefox bug for MediaRecorder not encoding transparent frames.

guest271314

Feb 2, 2021, 10:01:24 AM
to discuss-webrtc
I do not have the time available right now to create a minimal, complete, verifiable example of the concept using RTCDataChannel instead of RTCPeerConnection media tracks.

I will adjust this code https://plnkr.co/edit/XhRMpnw0lebPdA8J?preview to do that later within a few days to test the concept.

Using MediaSource is not necessarily ideal due to limitations on SourceBuffer internal size https://bugs.chromium.org/p/chromium/issues/detail?id=1162820 and audio synchronization on Chromium https://bugs.chromium.org/p/chromium/issues/detail?id=1006617; however, it could be a workaround in some cases.

guest271314

Feb 7, 2021, 1:43:09 AM
to discuss-webrtc
An example of streaming transparent frames using RTCDataChannel: plnkr https://plnkr.co/edit/2Y9B97LmGbl5XR0d. In pertinent part:

Offer

const channel = remote.createDataChannel('stream-transparent-frames', {
  negotiated: true,
  id: 0,
});
channel.binaryType = 'arraybuffer';
channel.onopen = async (e) => {
  console.log(e, remote, channel);
  const canvas = document.createElement('canvas');
  const ctx = canvas.getContext('2d');
  const canvasStream = canvas.captureStream(0);
  const [canvasTrack] = canvasStream.getVideoTracks();
  canvasTrack.onmute = canvasTrack.onunmute = (e) => {
    console.log(e);
  };
  const rs = new ReadableStream({
    async pull(controller) {
      if (!capture.paused) {
        const frame = await createImageBitmap(capture, {
          resizeWidth: 50,
          resizeHeight: 50,
        });
        ctx.drawImage(frame, 0, 0, 50, 50);
        canvasTrack.requestFrame();
        controller.enqueue(
          await new Promise((resolve) =>
            canvas.toBlob(resolve, 'image/webp')
          )
        );
        // await new Promise((resolve) => setTimeout(resolve, 1000 / 60));
      } else {
        controller.close();
      }
    },
  }).pipeTo(
    new WritableStream({
      async write(value, controller) {
        channel.send(await value.arrayBuffer());
        remote
          .getStats()
          .then(
            (stats) =>
              (data.textContent = [...stats].find(([a, b]) =>
                a.startsWith('RTCDataChannel')
              )[1].bytesSent),
            console.error
          );
      },
    })
  );
};

Answer 

const channel = local.createDataChannel('stream-transparent-frames', {
  negotiated: true,
  id: 0,
});
channel.binaryType = 'arraybuffer';
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');
const canvasStream = canvas.captureStream(0);
const [canvasTrack] = canvasStream.getVideoTracks();
canvasTrack.onmute = canvasTrack.onunmute = (e) => {
  console.log(e);
};
channel.onopen = async (e) => {
  console.log(e, local, channel);
  video.srcObject = canvasStream;
};
channel.onmessage = async (e) => {
  const frame = await createImageBitmap(
    new Blob([e.data], { type: 'image/webp' })
  );
  canvas.width = frame.width;
  canvas.height = frame.height;
  ctx.drawImage(frame, 0, 0);
  canvasTrack.requestFrame();
  local
    .getStats()
    .then(
      (stats) =>
        (data.textContent = [...stats].find(([a, b]) =>
          a.startsWith('RTCDataChannel')
        )[1].bytesReceived),
      console.error
    );
};

Shai Amir

Feb 14, 2021, 1:46:02 AM
to discuss-webrtc
Are you suggesting that the whole AV would go through the data channel (no peer connection media tracks)?

guest271314

Feb 14, 2021, 12:53:43 PM
to discuss-webrtc
RTCPeerConnection does not currently support retaining transparency in video frames sent to a remote peer.

Audio can be sent using RTCDataChannel as well.

Or, RTCPeerConnection can be used for the audio.

I focused on the use case of transparent video frames described in the OP. Audio was not mentioned as part of the requirement.
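A sketch of the audio-over-data-channel option (hedged: the mimeType support and timeslice value are assumptions, and channel is taken to be an already-open RTCDataChannel):

```javascript
// Ship audio over an RTCDataChannel by recording it in timeslices and
// sending each chunk as an ArrayBuffer, mirroring the video approach.
function sendAudioOverChannel(channel, audioStream, timesliceMs = 100) {
  const recorder = new MediaRecorder(audioStream, {
    mimeType: 'audio/webm;codecs=opus', // assumed supported by the browser
  });
  recorder.ondataavailable = async ({ data }) => {
    if (data.size > 0) channel.send(await data.arrayBuffer());
  };
  recorder.start(timesliceMs); // emit a chunk every timesliceMs
  return recorder;
}
```

On the receiving side, the chunks could be appended to a SourceBuffer, subject to the MediaSource caveats noted earlier in the thread.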