How to disable audio and video sync on the Chrome receive side


Jeffrey

Aug 4, 2021, 8:08:00 AM
to discuss-webrtc
Hi all, I want to disable audio and video sync on the Chrome receive side. Does anyone know how to do that?

Chema Gonzalez

Aug 7, 2021, 7:58:07 AM
to discuss-webrtc
This is an interesting question. I assume you mean using some JS API. I don't know how to do this, but I'd be interested in knowing too.

If you have access to the native code (and can build your own chromium version), then you can remove the a/v sync calls. More concretely, remove calls to:

```
  syncable_audio_->SetMinimumPlayoutDelay(target_audio_delay_ms);
  syncable_video_->SetMinimumPlayoutDelay(target_video_delay_ms);
```

if using `video/rtp_streams_synchronizer.cc`, or similarly if using `video/rtp_streams_synchronizer2.cc`. 

On the topic, does anybody know why the a/v sync native code is duplicated? 

Thanks,
-Chema

Tao Meme

Aug 8, 2021, 9:56:43 PM
to discuss...@googlegroups.com
It's easy to do this.

If you create both the audio and video tracks on the same stream, synchronization is inevitable.

But if you create two different streams, adding the video and audio tracks separately, there will be no synchronization. What a smart idea!
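The suggestion above can be sketched as follows. The `MediaStream` and `RTCPeerConnection` classes here are tiny stand-ins so the snippet runs outside a browser; in a real page they come from the platform, and `addTrack(track, ...streams)` has the same shape. The track objects are placeholders for what `getUserMedia()` would return.

```javascript
// Stand-ins for the browser APIs so this sketch is runnable anywhere;
// in a real page, use the platform's MediaStream and RTCPeerConnection.
class MediaStream {
  constructor(tracks = []) { this.tracks = tracks; }
}
class RTCPeerConnection {
  constructor() { this.senders = []; }
  addTrack(track, ...streams) {
    const sender = { track, streams };
    this.senders.push(sender);
    return sender;
  }
}

const audioTrack = { kind: 'audio' }; // would come from getUserMedia()
const videoTrack = { kind: 'video' };

const pc = new RTCPeerConnection();

// The key point: wrap each track in its OWN MediaStream instead of
// passing one shared stream to both addTrack calls. Distinct streams
// produce distinct msid values in the SDP, so the receiver has no
// grouping to synchronize.
pc.addTrack(audioTrack, new MediaStream([audioTrack]));
pc.addTrack(videoTrack, new MediaStream([videoTrack]));
```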

Chema Gonzalez <che...@gmail.com> wrote on Sat, Aug 7, 2021 at 7:58 PM:
--

---
You received this message because you are subscribed to the Google Groups "discuss-webrtc" group.
To unsubscribe from this group and stop receiving emails from it, send an email to discuss-webrt...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/discuss-webrtc/c9801831-5c67-417f-bd33-1de99f9f3591n%40googlegroups.com.

Jeffrey

Aug 8, 2021, 11:57:56 PM
to discuss-webrtc
Thanks, all.
I can't access Chrome's native code, so I'll try Tao Meme's solution.

Robert Ayrapetyan

Oct 16, 2023, 3:07:05 AM
to discuss-webrtc
Could you clarify:

> But if you create two different STREAM, adding video and audio track separately, there will no  Synchronization.

I have two streams (I believe they are different because they have different SSRC values). They are still being synchronized inside RtpStreamsSynchronizer::UpdateDelay() (that is, video frame rendering is delayed when the audio stream has playout/jitter-buffer delays, and vice versa).
Is your statement false, or do I misunderstand the meaning of "stream"?

Razvan Grigore

Mar 21, 2025, 1:38:20 PM
to discuss-webrtc
I also struggled with this and found out that if you change the `msid` of the tracks (inside the same SDP/RTCPeerConnection), then Chrome stops doing the bloody lip-synchronisation. Having separate `ssrc` is not enough.

I wasted a few days on this, full of frustration that JS hints like `jitterBufferDelayHint`, `jitterBufferTarget`, and `playoutDelayHint` set to 0 do not prevent it. Maybe I should file a Chromium bug?

Hope this will spare somebody's time some day.

Cheers,
R
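For anyone hitting the same wall: the msid change described above can be done with plain string munging on the SDP before it is applied. Below is a hypothetical helper (the function name and the `unsynced-` stream-id prefix are mine, for illustration), assuming `a=msid:<stream id> <track id>` lines per RFC 8830; it gives every m-section its own stream id so the receiver has nothing to group for lip sync.

```javascript
// Hypothetical helper: rewrite the a=msid line of each m-section so every
// track advertises its own (distinct) stream id. With no shared stream id,
// Chrome has no a/v pair to synchronize on the receive side.
function splitMsids(sdp) {
  let mIndex = -1; // index of the current m-section
  return sdp
    .split('\r\n')
    .map((line) => {
      if (line.startsWith('m=')) mIndex += 1;
      if (line.startsWith('a=msid:')) {
        // a=msid:<stream id> <track id>  -> keep the track id, swap the stream id
        const trackId = line.slice('a=msid:'.length).split(' ')[1] || '';
        return `a=msid:unsynced-${mIndex} ${trackId}`.trim();
      }
      return line;
    })
    .join('\r\n');
}
```

On the receiving page you would then apply it to the remote description before use, something like `pc.setRemoteDescription({ type: offer.type, sdp: splitMsids(offer.sdp) })`.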

Philipp Hancke

Mar 21, 2025, 1:39:57 PM
to discuss...@googlegroups.com
if you put tracks into the same stream they get synchronized, that is WAI (working as intended)

--
This list falls under the WebRTC Code of Conduct - https://webrtc.org/support/code-of-conduct.


Vitaly Ivanov

Mar 21, 2025, 9:53:26 PM
to discuss...@googlegroups.com
Imo it's not a bug but quite the opposite: the most intuitive way to do it. I was there too, and to me it was really obvious I had to untangle audio and video to get the lowest video latency at the expense of breaking lip sync.
As far as I understand it, an ssrc uniquely identifies an RTP stream, so audio and video will have their own ssrcs anyway; it's not to be confused with a JS MediaStream (which is a bundle of synced audio and video tracks).

Harald Alvestrand

Mar 22, 2025, 2:32:16 AM
to discuss...@googlegroups.com
If you control both sender side and receiver side, the easiest way to keep them separate is to use AddTrack with no "streams" parameter when you attach the track to the PeerConnection.
WebRTC will attempt to sync up things that are in the same stream (but cannot succeed in all cases), and MSID is the mechanism that WebRTC uses to tell the other side what streams the track is in.
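The approach above, sketched. The `RTCPeerConnection` class here is a minimal stand-in so the snippet runs outside a browser; the real `addTrack(track, ...streams)` has the same optional-streams shape.

```javascript
// Stand-in for the browser API so this sketch is runnable anywhere.
class RTCPeerConnection {
  constructor() { this.senders = []; }
  addTrack(track, ...streams) {
    const sender = { track, streams };
    this.senders.push(sender);
    return sender;
  }
}

const pc = new RTCPeerConnection();

// Omitting the streams argument means no MediaStream (and hence no shared
// msid) is associated with either track, so the receiver gets nothing to
// group and nothing to lip-sync.
pc.addTrack({ kind: 'audio' });
pc.addTrack({ kind: 'video' });
```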

