Sorry for the late response. Thank you for your clarification!
In our project, we built a WebRTC client with the native WebRTC SDK. It receives video and audio from an SDI card (1080p60), encodes the incoming video with our own hardware encoder and the audio with the built-in encoder, then sends the stream to an Ant Media Server for playback in the Chrome browser. I checked the timestamps for both video and audio, and they arrive from SDI at the same time. Hardware encoding takes about 10 ms per frame, so each frame should go out without added delay.
In our WebRTC client we set the ntp_time_ms and capture_time_ms of webrtc::VideoFrame to the NTP time from SDI, and timestamp_rtp to (int32_t)ntp_time * 90, but that doesn't seem to help. Video and audio stay synchronized for the first several minutes; we can see lip sync working at the beginning, since the video sometimes speeds up to catch up with the audio. After a while, though, they drift out of sync, and video always lags behind audio. If timestamps don't fix this, I don't know what else to try.
We are using Ant Media Server Enterprise Edition 2.1.0 20200720_1328 and are waiting for another team to upgrade it to 2.3.0. Do you think 2.3.0 could solve the issue? Any other ideas or suggestions?
Thanks a lot!
Julie