CSipSimple AV-Sync


james gordon

Feb 14, 2014, 9:13:29 AM
to csipsim...@googlegroups.com

I have tested video calling with CSipSimple on quite a few Android devices and other video capable SIP phones. The audio/video quality is excellent and the audio/video is perfectly synchronised. 

I was wondering whether any AV sync had been implemented, or whether this was working by pure luck?

The issue I am having is on a custom Android device: unfortunately, the video is around a second behind the audio when calling device to device. For example:

T = Samsung Tab 2
S = Samsung Galaxy S4
P = Polycom SIP Phone
C = Custom Android device

S/P/T <-> S/P/T is synced
C <-> S/P/T is synced
but
C <-> C is around a second behind?

My custom Android device has a similar specification to T, and is running Android 4.3 AOSP.

I have tried all audio/video codec combinations and both the Java and OpenSL audio backends, but it is miles out on them all.

I know video is experimental, but I would really appreciate it if you could let me know whether any synchronisation is implemented, and also if you have any ideas why it wouldn't synchronise in this particular case.

Thanks

Régis Montoya

Mar 1, 2014, 5:52:29 AM
to CSipSimple dev group
Hi James,

I don't know ;)
The feature might come from the pjsip library.
For now, as you noticed, it's experimental; the only thing done on the csipsimple side was to port the "devices" (renderers/capturers) to Android.




james gordon

Jun 4, 2014, 8:04:02 AM
to csipsim...@googlegroups.com

Hi r3gis,

I am still working on this video sync issue :(

So far I have found that A/V sync is not implemented in PJSIP; there is a ticket to add it in the future, though.

I have a few questions I was hoping you could answer:

Video capture is currently done using WebRTC, and the audio with either Java or OpenSL (correct?). Given the lack of synchronisation of the incoming packets within PJSIP, I guess it comes down to syncing the audio/video streams ourselves.
Is there any reason why WebRTC is not used for the audio? If it were, it might be possible to use VoEVideoSync from WebRTC; do you think this is possible?
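(Editor's aside: the standard basis for lip sync in RTP is the mapping RFC 3550 defines via RTCP Sender Reports, which is also what WebRTC's sync builds on. The sketch below uses hypothetical names — `SenderReport`, `RtpToWallclockMs`, `AvOffsetMs` are not pjsip or WebRTC APIs — to show how each stream's RTP timestamps can be mapped to a common wall clock and compared.)

```cpp
#include <cstdint>

// Hypothetical, simplified view of an RTCP Sender Report: a known
// (wall-clock, RTP timestamp) pair plus the stream's RTP clock rate.
struct SenderReport {
    int64_t ntp_ms;       // wall-clock time of the SR, in milliseconds
    uint32_t rtp_ts;      // RTP timestamp corresponding to ntp_ms
    uint32_t clock_rate;  // RTP clock rate (e.g. 8000 audio, 90000 video)
};

// Map an RTP timestamp to wall-clock milliseconds using the stream's
// latest Sender Report (the mechanism described in RFC 3550).
int64_t RtpToWallclockMs(const SenderReport& sr, uint32_t rtp_ts) {
    // Signed difference handles RTP timestamp wrap-around.
    int32_t diff = static_cast<int32_t>(rtp_ts - sr.rtp_ts);
    return sr.ntp_ms + (static_cast<int64_t>(diff) * 1000) / sr.clock_rate;
}

// Capture-time difference between a video frame and an audio sample;
// a player aligns the streams by rendering media with equal wall-clock
// capture times together.
int64_t AvOffsetMs(const SenderReport& audio_sr, uint32_t audio_ts,
                   const SenderReport& video_sr, uint32_t video_ts) {
    return RtpToWallclockMs(video_sr, video_ts) -
           RtpToWallclockMs(audio_sr, audio_ts);
}
```

Without this mapping each stream only has its own media clock, which is why sync cannot be recovered from RTP timestamps alone.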

Finally, I have noticed in webrtc_android_video_render_dev.cpp that 400 ms is added to the render time; is there a reason for this? See below:

    // TODO : shall we try to use frame timestamp?
    WebRtc_Word64 nowMs = TickTime::MillisecondTimestamp();
    stream->_videoFrame.SetRenderTime(nowMs + 400);
    stream->_videoFrame.SetTimeStamp(frame->timestamp.u64 /*nowMs*/);
    stream->_renderProvider->RenderFrame(0, stream->_videoFrame);


In webrtc_android_video_render_dev.cpp at line 558 the frame timestamp is set; if I subtract 400 ms from it, the video is pretty much in sync.
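(Editor's aside: one way to avoid a hardcoded constant, sketched below with hypothetical names — this is not the actual WebRTC renderer code — is to derive the render time from the frame's own timestamp plus a tunable playout delay, clamped so a frame is never scheduled in the past.)

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical replacement for the fixed `nowMs + 400`: schedule the
// frame relative to its capture time plus a configurable target delay,
// so a fast device renders closer to real time than a slow one.
int64_t ComputeRenderTimeMs(int64_t now_ms,
                            int64_t frame_capture_ms,
                            int64_t target_delay_ms) {
    int64_t render_ms = frame_capture_ms + target_delay_ms;
    // Never schedule in the past; render such a frame immediately.
    return std::max(render_ms, now_ms);
}
```

The target delay would then be the single knob to tune (or adapt) per device, instead of a magic number buried in the renderer.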

Any chance you could point me in the right direction?

Thanks

Régis Montoya

Jun 4, 2014, 8:37:16 AM
to CSipSimple dev group
Hi,

My replies inline :

2014-06-04 14:04 GMT+02:00 james gordon:
>
> Video capture is currently done using WebRTC and the audio with either Java
> or OpenSl (correct?).

Yep, that's correct.

> Is there any reason why WebRTC is not used for the audio?
The csipsimple audio code was written before the WebRTC code was
released. I also worked on a WebRTC project for a big telecom company,
and we used the audio stack from the WebRTC project. The results
(about one year ago) were less good than the csipsimple implementation
in terms of support for various devices, and totally similar in terms
of performance. There is no magic, the code is pretty similar ;). The
only big difference is that the csipsimple implementation directly
binds the pjsip callbacks and approach. Integrating WebRTC would mean
adding another abstraction layer on top of the existing pjsip
abstraction layer. So the current implementation (proven to work on
many devices) talks directly to the Java and OpenSL ES APIs... the
decision was taken not to integrate WebRTC for audio.
For video, things were very different: nothing was available in
csipsimple yet (except some proofs of concept not made with the OpenGL
APIs), and the WebRTC code had features to support more devices with
video cameras.

> As if it was it
> might be possible to use the VoEVideoSync from WebRTC, do you think this is
> possible?

I don't think so. WebRTC is roughly equivalent to "pjmedia" (the
module in pjsip responsible for managing AV media). The audio/video
sync has to be done at a central point that you won't be able to
override from outside pjmedia (unless you patch pjmedia... but then it
would be about as fast to implement the feature in pjmedia itself, I
think).

However, if you'd like to go the route of integrating WebRTC
completely, there is a solution offered by pjsip: add your own media
adapter based on WebRTC. This way you would completely replace pjmedia
with the WebRTC media layer and use pjsip only for negotiation.
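(Editor's aside: to illustrate the layering described above — purely hypothetical names, not real pjsip or WebRTC symbols — the idea is that pjsip keeps SIP/SDP negotiation and hands the negotiated addresses to a replaceable media engine that owns all media processing, and therefore AV sync.)

```cpp
#include <string>

// Simplified, invented result of SDP offer/answer negotiation done by
// the signalling layer (pjsip in this architecture).
struct NegotiatedMedia {
    std::string remote_ip;
    int audio_rtp_port;
    int video_rtp_port;
};

// The replaceable media layer: in the stock build this role is played
// by pjmedia; a "media adapter" would let WebRTC's media stack fill it.
class MediaEngine {
public:
    virtual ~MediaEngine() = default;
    virtual bool Start(const NegotiatedMedia& m) = 0;
};

// Illustrative engine backed (hypothetically) by WebRTC media: since it
// owns both audio and video paths, AV sync lives entirely inside it.
class WebRtcEngine : public MediaEngine {
public:
    bool Start(const NegotiatedMedia& m) override {
        started_ = m.audio_rtp_port > 0 && m.video_rtp_port > 0;
        return started_;
    }
    bool started() const { return started_; }
private:
    bool started_ = false;
};
```

The point of the split is that sync spans both streams, so it can only be implemented in whichever layer owns both of them.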

>
> Finally, I have noticed in the webrtc_android_video_render_dev.cpp that
> 400ms is added to the render time, is there a reason for this? see below;
>
> // TODO : shall we try to use frame timestamp?
> WebRtc_Word64 nowMs = TickTime::MillisecondTimestamp();
> stream->_videoFrame.SetRenderTime( nowMs + 400);
> stream->_videoFrame.SetTimeStamp(frame->timestamp.u64/*nowMs*/);
> stream->_renderProvider->RenderFrame(0, stream->_videoFrame);
>

It looks like a value I copy-pasted from some WebRTC sample; it's
necessary so that the Android rendering layer considers that this
frame should not be skipped (basically, when the rendering layer of
WebRTC gets a frame, it can decide to drop frames if it looks like it
is currently overrunning).
But if I added a TODO before it, it's because I was not sure about the
way it's done.
And indeed, 400 ms seems pretty big, and it should actually even vary
depending on the video frame size plus the CPU load of your device (if
your device is able to reach the rendering loop very quickly, it
should be able to process the frame sooner).
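(Editor's aside: the drop heuristic described here can be sketched as follows — hypothetical names, not the actual WebRTC renderer logic. A frame whose scheduled render time is already too far in the past is skipped rather than displayed late, which is why an inflated render time like `nowMs + 400` protects frames from being dropped at the cost of latency.)

```cpp
#include <cstdint>

// A frame is dropped when the renderer reaches it more than
// max_lateness_ms after its scheduled render time.
bool ShouldDropFrame(int64_t now_ms, int64_t render_time_ms,
                     int64_t max_lateness_ms) {
    return (now_ms - render_time_ms) > max_lateness_ms;
}
```

Padding the render time pushes every frame safely inside the lateness window, trading added delay for fewer drops on slow devices.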

>
> In webrtc_android_video_render_dev.cpp at line 558 the frame timestamp is
> set, if I -400ms from this the video is pretty much in sync.
>
Well, if you have a device with high CPU capabilities, I think it's
almost real-time rendering. However, if you plan to get things running
on other (older) devices, it's maybe risky.

james gordon

Jun 4, 2014, 8:43:43 AM
to csipsim...@googlegroups.com


Nice one, thanks for the quick reply, will do some more investigating!