How to Desync Audio / Video


Matt Knowles

Sep 16, 2024, 12:29:55 PM
to discuss-webrtc
Background info:
We run a game streaming platform. To stream the games, we have code inside the Unity game that uses libwebrtc under the hood. The client side is a webpage with JavaScript running in standard browsers, not a custom client.

Use Case:
We are adding the option to run with audio/video synchronized or not synchronized. When the tracks are synchronized, I call RTCPeerConnection.addTrack on the Unity side with no stream IDs.

When the tracks are NOT synchronized, I call RTCPeerConnection.addTrack with different stream IDs for the audio and video tracks.
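
For illustration, a minimal sketch of the two addTrack variants in browser JavaScript (the Unity side goes through libwebrtc's equivalent API; the pc, audioTrack, and videoTrack names are assumed):

    // Synchronized: both tracks share one MediaStream, so the
    // receiver will try to lip-sync them.
    const shared = new MediaStream();
    pc.addTrack(audioTrack, shared);
    pc.addTrack(videoTrack, shared);

    // Not synchronized: give each track its own MediaStream instead.
    // (Use one variant or the other, not both, on a given connection.)
    pc.addTrack(audioTrack, new MediaStream());
    pc.addTrack(videoTrack, new MediaStream());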

The client-side code puts the tracks into the same MediaStream and passes it to a video element as its srcObject.
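
Roughly like this, assuming pc is the RTCPeerConnection and videoElement is the target element (names hypothetical):

    // Current client-side handling: collect incoming tracks into one
    // MediaStream and attach it to a single <video> element.
    const combined = new MediaStream();
    pc.ontrack = ({ track }) => {
      combined.addTrack(track);
      videoElement.srcObject = combined;
    };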

Question:
I am only putting these tracks on different streams on the game/server side. Does the client side also have to put them on different streams for them to be desynchronized, or is the change on the server side enough?

I am assuming that if I put the tracks in different streams on the client side, I need to create an audio element for the audio stream. Is that correct? Video elements only have the one srcObject.

Thanks in advance,
Matt




Muhammad Usman Bashir

Sep 16, 2024, 7:51:28 PM
to discuss-webrtc
In your case, client-side separation is needed in addition to the server-side change:
- Separate the audio and video tracks into different MediaStreams,
- Create separate <video> and <audio> elements as required,
- Assign the video track's stream to video.srcObject and the audio track's stream to audio.srcObject (see the sketch below).
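
A minimal sketch of that separated setup, assuming existing <video id="v"> and <audio id="a"> elements:

    pc.ontrack = ({ track }) => {
      // Wrap each incoming track in its own MediaStream and route it
      // to the matching element by track kind.
      const el = document.getElementById(track.kind === 'video' ? 'v' : 'a');
      el.srcObject = new MediaStream([track]);
      el.play().catch(() => { /* autoplay may need a user gesture */ });
    };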

You may also need to implement some custom timing logic for precise desync control, or adjust the NTP timestamps as well.
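
For example, one way to add a controlled audio offset on the client is to route the received audio through a Web Audio DelayNode instead of playing it in an <audio> element. A sketch, assuming remoteAudioStream holds the received audio track:

    const audioCtx = new AudioContext();
    // Feed the remote audio through a delay line; delayTime controls
    // how far the audio lags the video (here 250 ms, 5 s maximum).
    const source = audioCtx.createMediaStreamSource(remoteAudioStream);
    const delay = audioCtx.createDelay(5.0);
    delay.delayTime.value = 0.25;
    source.connect(delay);
    delay.connect(audioCtx.destination);
    // Note: some browsers only start a remote WebRTC track flowing
    // once it is also attached to a (muted) media element.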

Philipp Hancke

Sep 16, 2024, 11:04:34 PM
to discuss...@googlegroups.com
On Mon, Sep 16, 2024 at 09:29, Matt Knowles <matt.k...@level-ex.com> wrote:
> Background info:
> We run a game streaming platform. To stream the games, we have code inside the Unity game that uses libwebrtc under the hood. The client side is a webpage with JavaScript running in standard browsers, not a custom client.
>
> Use Case:
> We are adding the option to run with audio/video synchronized or not synchronized. When the tracks are synchronized, I call RTCPeerConnection.addTrack on the Unity side with no stream IDs.

That sounds wrong: with no stream IDs you do not put the tracks into the same MediaStream (in the SDP; the JS object does not matter much), and being in the same MediaStream is what makes them synchronized.
You can see minPlayoutDelay on the video receive stats in webrtc-internals (but not in getStats, since this is not a standardized stat IIRC).
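
You can also inspect the SDP directly; the stream membership is signaled per m-section via a=msid. A sketch, with pc being the RTCPeerConnection:

    const sdp = pc.remoteDescription.sdp;
    // One a=msid line per m-section: "a=msid:<stream id> <track id>".
    // A shared stream id across the audio and video sections enables
    // lip-sync; distinct stream ids disable it.
    console.log(sdp.split('\r\n').filter(l => l.startsWith('a=msid:')));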

> When the tracks are NOT synchronized, I call RTCPeerConnection.addTrack with different stream IDs for the audio and video tracks.
>
> The client-side code puts the tracks into the same MediaStream and passes it to a video element as its srcObject.
>
> Question:
> I am only putting these tracks on different streams on the game/server side. Does the client side also have to put them on different streams for them to be desynchronized, or is the change on the server side enough?
>
> I am assuming that if I put the tracks in different streams on the client side, I need to create an audio element for the audio stream. Is that correct? Video elements only have the one srcObject.
>
> Thanks in advance,
> Matt



