Background info:
We run a game streaming platform. To facilitate the game streaming, we have code inside the Unity game that uses libwebrtc under the hood. The client side is a web page with JavaScript running in standard browsers, not a custom client.

Use Case:
We are adding the option to run with audio / video synchronized or not synchronized. When the tracks are synchronized, I call RTCPeerConnection.addTrack with no stream ids on the Unity side.
When the tracks are NOT synchronized, I call RTCPeerConnection.addTrack with a different stream id for the audio and video tracks.
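Roughly, in browser JavaScript terms (our real sender goes through Unity and libwebrtc rather than getUserMedia, so the capture below is just a stand-in, and the `synchronized` flag is a placeholder I made up), the two modes look like this:

const pc = new RTCPeerConnection();

// Stand-in capture source; our actual tracks come from the Unity game.
const media = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
const [audioTrack] = media.getAudioTracks();
const [videoTrack] = media.getVideoTracks();

const synchronized = true; // toggle between the two modes

if (synchronized) {
  // Synchronized: addTrack with no stream ids.
  pc.addTrack(audioTrack);
  pc.addTrack(videoTrack);
} else {
  // NOT synchronized: a different stream per track, so each track is
  // announced under its own stream id.
  pc.addTrack(audioTrack, new MediaStream());
  pc.addTrack(videoTrack, new MediaStream());
}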
The client side code puts the tracks into the same stream and passes that stream to a video element as the 'srcObject', as sketched below.
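For reference, here is a sketch of the client-side handling in both forms (the element ids and the `splitStreams` toggle are placeholders I made up for illustration):

const pc = new RTCPeerConnection();
const splitStreams = false; // false = today's behavior, true = the variant I ask about below

pc.ontrack = (event) => {
  if (!splitStreams) {
    // Today: collect every incoming track into one MediaStream and play
    // it through a single <video> element.
    const video = document.getElementById('game-video');
    if (!video.srcObject) video.srcObject = new MediaStream();
    video.srcObject.addTrack(event.track);
  } else {
    // Variant: one stream per track; the audio stream would then get its
    // own <audio> element, since the <video> element only has the one
    // srcObject.
    const id = event.track.kind === 'video' ? 'game-video' : 'game-audio';
    document.getElementById(id).srcObject = new MediaStream([event.track]);
  }
};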
Question:
I am only putting these tracks on different streams on the game / server side. Does the client side also have to put them on different streams for them to be desynchronized, or is the change on the server side enough?

I am assuming that if I put the tracks in different streams on the client side, I need to create an audio element for the audio stream. Is that correct? Video elements only have the one srcObject.

Thanks in advance,
Matt