Idea: Always check for updates via coral cache?

Anthony Lieuallen

Feb 16, 2012, 4:24:53 PM
to greasemo...@googlegroups.com
Lead-in FYI: Coral Cache is a content distribution network that "just works" for any URL.  It's generally more available than all but the biggest sites out there.
http://www.coralcdn.org/

Some back-of-the-napkin math: According to Mozilla, Greasemonkey has ~3.5 million active users (https://addons.mozilla.org/en-US/statistics/addon/748).  If each user has _one_ script from userscripts.org installed (on average), turning auto-update checking on by default with a once-per-week policy means:

(users / days in week / hours in day / minutes in hour / seconds in minute)
3500000 / 7 / 24 / 60 / 60 = 5.79

Something like six requests per second.  If it's three per user on average, that's something like 20 requests per second.  We could totally crush us.o with automated traffic.
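
For reference, a quick script to reproduce those figures (the 3.5 million users and the scripts-per-user counts are the assumptions above, not measurements):

  // Back-of-the-napkin request rate for weekly update checks.
  const users = 3_500_000;                  // ~active users per the AMO stats above
  const secondsPerWeek = 7 * 24 * 60 * 60;  // 604,800
  const oneScriptEach = users / secondsPerWeek;           // ~5.79 requests/second
  const threeScriptsEach = (3 * users) / secondsPerWeek;  // ~17.4, i.e. "something like 20"
  console.log(oneScriptEach.toFixed(2), threeScriptsEach.toFixed(2));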

As written now, Greasemonkey's update checking involves one HTTP request just to check whether an update is available. Then, even if it's the same URL (i.e. no @updateURL is in the script), another separate HTTP request is made as part of the (updated) script's installation.  I was thinking this was a bug for a while.  But if we leave it as is, and always pass the first ("is there an update?") check through Coral Cache, then we alleviate a lot of load on us.o, and since we're not actually downloading the script to be installed through Coral, we don't open up extra security holes.
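
To make that concrete, here's a rough sketch (not Greasemonkey's actual code; the helper name and script URLs are made up) of what "Coralize only the check" could look like, using Coral's convention of appending ".nyud.net" to the hostname:

  // Hypothetical helper: route only the "is there an update?" request through Coral.
  function coralizeCheckUrl(updateUrl: string): string {
    const u = new URL(updateUrl);
    if (u.protocol !== "http:") return updateUrl;  // Coral only fronts plain http
    u.hostname += ".nyud.net";
    return u.toString();
  }

  // Illustrative userscripts.org-style URLs (the script id is invented):
  const checkUrl = coralizeCheckUrl("http://userscripts.org/scripts/source/1234.meta.js");
  // -> http://userscripts.org.nyud.net/scripts/source/1234.meta.js
  const installUrl = "http://userscripts.org/scripts/source/1234.user.js";  // downloaded directly, not via Coral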

Thoughts?

arantius

Feb 16, 2012, 4:39:21 PM
to greasemonkey-dev
So, one obvious problem is that this would mean that non-internet-facing update URLs (i.e. intranet, localhost) would never work.  Should we do this, but only for userscripts.org (which already has a bit of a special case for loading meta.js instead)?
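
Something along these lines, maybe (again just a sketch with invented names), so that intranet/localhost update URLs are left untouched:

  // Hypothetical: only userscripts.org checks go through Coral.
  function updateCheckUrl(updateUrl: string): string {
    const u = new URL(updateUrl);
    if (u.hostname !== "userscripts.org") return updateUrl;  // intranet/localhost etc.: unchanged
    // Reuse the existing us.o special case: check the small .meta.js, not the whole script.
    u.pathname = u.pathname.replace(/\.user\.js$/, ".meta.js");
    u.hostname += ".nyud.net";
    return u.toString();
  }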

Johan Sundström

Feb 17, 2012, 3:18:59 AM
to greasemo...@googlegroups.com
On Thu, Feb 16, 2012 at 22:24, Anthony Lieuallen <aran...@gmail.com> wrote:
As written now, Greasemonkey's update checking involves one HTTP request just to check whether an update is available. Then, even if it's the same URL (i.e. no @updateURL is in the script), another separate HTTP request is made as part of the (updated) script's installation.  I was thinking this was a bug for a while.  But if we leave it as is, and always pass the first ("is there an update?") check through Coral Cache, then we alleviate a lot of load on us.o, and since we're not actually downloading the script to be installed through Coral, we don't open up extra security holes.

Thoughts?

Barring requests to the contrary from us.o maintainers (unlikely), I think this is a good idea, as a special-case for us.o (so as not to exclude non-public-web-facing scripts, as you noted).

Adjacent topic:

For any https:// URLs (the case I most want auto-updates to come from, from a security standpoint) this will only add traffic, as nyud.net redirects back to the original site for those; e.g. a Coralized https:// update URL just ends up downloading the script from its original location.


--
 / Johan Sundström, http://ecmanaut.blogspot.com/

Anthony Lieuallen

Feb 17, 2012, 9:10:56 AM
to greasemo...@googlegroups.com
2012/2/17 Johan Sundström <oya...@gmail.com>

Barring requests to the contrary from us.o maintainers (unlikely), I think this is a good idea, as a special-case for us.o (so as not to exclude non-public-web-facing scripts, as you noted).

Yep.
 
Adjacent topic:

For any https:// URLs (the case I most want auto-updates to come from, from a security standpoint) this will only add traffic, as nyud.net redirects back to the original site for those...

Especially if it's only going to be for us.o, we'll just make sure that URL uses HTTP.  (Because, again, the check-for-update-availability request happens first, and often, and has no strong need to be secure.  The download-the-script connection can still be secure.)
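
Roughly (illustrative placeholder URLs, not the real code): the frequent check stays on plain http so Coral can cache it, while the install can still be https:

  const id = 1234;  // hypothetical us.o script id
  const checkUrl = `http://userscripts.org.nyud.net/scripts/source/${id}.meta.js`;  // frequent, cacheable
  const installUrl = `https://userscripts.org/scripts/source/${id}.user.js`;        // rare, can stay secure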