rationale behind sync interval adjustments in mobwrite.computeSyncInterval_


Brian Slesinsky

Nov 14, 2009, 1:37:57 AM
to MobWrite
Why does this code reduce the sync interval gradually? It seems like
it would be better to set it to its minimum value on any sign of
activity, and then let it gradually creep up again if the activity
doesn't continue.

if (mobwrite.clientChange_) {
// Client-side activity.
// Cut the sync interval by 40% of the min-max range.
mobwrite.syncInterval -= range * 0.4;
}
if (mobwrite.serverChange_) {
// Server-side activity.
// Cut the sync interval by 20% of the min-max range.
mobwrite.syncInterval -= range * 0.2;
}

Neil Fraser

Nov 14, 2009, 1:49:47 AM
to mobw...@googlegroups.com
2009/11/13 Brian Slesinsky <bsles...@gmail.com>:

> Why does this code reduce the sync interval gradually? It seems like
> it would be better to set it to its minimum value on any sign of
> activity, and then let it gradually creep up again if the activity
> doesn't continue.

It's a good point.

My goal was to get the syncs to approximately match the input
frequency. Consider a case where there was a change on average every
four seconds. Under the existing system the syncs would self-adjust
so that they oscillate between three and five seconds. Implementing
the scheme you propose would mean that syncs would saw-tooth between
one and five seconds.
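The difference between the two behaviours can be sketched as a pair of step functions. This is a sketch only: the min/max values and the 10% idle growth step are assumptions for illustration, not MobWrite's actual defaults, and the real computeSyncInterval_ updates mobwrite.syncInterval in place rather than returning a value.

```javascript
const MIN = 1000;    // assumed minimum sync interval (ms)
const MAX = 10000;   // assumed maximum sync interval (ms)
const RANGE = MAX - MIN;

// Existing scheme: shave a fraction of the min-max range on activity,
// creep back up when idle.
function gradualStep(interval, clientChange, serverChange) {
  if (clientChange) interval -= RANGE * 0.4;  // client edit: cut 40%
  if (serverChange) interval -= RANGE * 0.2;  // server edit: cut 20%
  if (!clientChange && !serverChange) interval += RANGE * 0.1;  // idle
  return Math.min(MAX, Math.max(MIN, interval));
}

// Proposed sawtooth: snap to the minimum on any activity,
// creep back up when idle.
function sawtoothStep(interval, clientChange, serverChange) {
  if (clientChange || serverChange) return MIN;
  return Math.min(MAX, interval + RANGE * 0.1);
}
```

Under a steady input rate the gradual scheme settles near the input frequency, while the sawtooth repeatedly drops to MIN and climbs back up.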

What I've got is optimum for the case where input levels hold at a
reasonably fixed rate for a while. However I have not gathered data
on whether or not this is optimum for typing patterns. That would
definitely be something very interesting to look at. It would not
surprise me if your sawtooth scheme would be a better fit for
burstable typing patterns.

The computeSyncInterval_ function is isolated specifically so that it
is very easy to override.
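For example, an override implementing the sawtooth idea might look like this. It is a sketch against a stub mobwrite object: the property names follow the snippet quoted above, but the default values and the 10% idle growth step are assumptions.

```javascript
// Stub standing in for the real mobwrite client; values are assumed.
var mobwrite = {
  minSyncInterval: 1000,   // assumed (ms)
  maxSyncInterval: 10000,  // assumed (ms)
  syncInterval: 2000,
  clientChange_: false,
  serverChange_: false
};

// Hypothetical sawtooth replacement for computeSyncInterval_.
mobwrite.computeSyncInterval_ = function() {
  var range = mobwrite.maxSyncInterval - mobwrite.minSyncInterval;
  if (mobwrite.clientChange_ || mobwrite.serverChange_) {
    // Any activity: snap straight to the minimum interval.
    mobwrite.syncInterval = mobwrite.minSyncInterval;
  } else {
    // No activity: creep back up by 10% of the range (assumed step).
    mobwrite.syncInterval += range * 0.1;
  }
  // Clamp to the [min, max] window.
  mobwrite.syncInterval = Math.max(mobwrite.minSyncInterval,
      Math.min(mobwrite.maxSyncInterval, mobwrite.syncInterval));
};
```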

--
Neil Fraser, Programmer & Wizard
http://neil.fraser.name
