Intent to Implement and Ship: Gate JavaScript dialogs on Site Engagement


Avi Drissman

Aug 26, 2016, 5:37:52 PM
to blink-dev

Contact emails

a...@chromium.org


Spec

The relevant link is https://html.spec.whatwg.org/#dom-alert .


Summary

alert dialogs are a remnant of JavaScript 1.0. They let web pages inform the user and collect input, but they lock up the entire browser while doing so. Many web pages use them legitimately, so we don't want to break those, but many sketchy websites use the app-modality of dialogs to scare users into thinking their computers are infected and scam them. Services like Safe Browsing help a little here, but the turnover rate of domains used for the scamming is high, and Safe Browsing lags. In addition, there is a security issue involving dialogs that is actively being abused in the wild.


This Intent is to ignore requests to show dialogs that are made from pages that have low Site Engagement. This helps in both the scam-website case and with the security issue.


What is Site Engagement? Site Engagement is a measure of how much the user interacts with a site. The more that a user visits/uses a site, the higher the engagement score is. If you want to see engagement scores on your Chrome, check out chrome://site-engagement.
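Roughly, the proposed check would look something like this (an illustrative sketch only; the real logic would live in Chrome's dialog handling, and the names and cutoff value below are placeholders, not the actual implementation):

  // Illustrative sketch: placeholder names and threshold, not Chrome code.
  const ENGAGEMENT_CUTOFF = 1.0;  // the exact cutoff is still to be determined

  function shouldShowJavaScriptDialog(siteEngagementScore) {
    // Requests from low-engagement sites are silently ignored; per the spec,
    // alert()/confirm()/prompt() may simply return without showing any UI.
    return siteEngagementScore >= ENGAGEMENT_CUTOFF;
  }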


There is already a plan to gate use of the Vibration API on Site Engagement, and this would fit right in.


For those concerned that this would interfere with legitimate use of JavaScript dialogs: even moderate use of a website, 10 minutes at a sitting just once a week, would be enough to allow the use of dialogs.


Firefox already does something similar to this. As noted in their onbeforeunload documentation:


browsers may not display prompts created in beforeunload event handlers unless the page has been interacted with.

We would match them for onbeforeunload dialogs and extend the same treatment to other JavaScript dialogs.
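For reference, the standard page-side pattern is below; under this intervention (as in Firefox today), the browser may simply skip the resulting confirmation prompt when the site has not earned enough engagement:

  // Standard unload-confirmation pattern; nothing here is hypothetical.
  window.addEventListener('beforeunload', (event) => {
    event.preventDefault();
    // Setting returnValue is what requests the prompt; modern browsers ignore
    // custom message strings, and may decline to show the prompt at all.
    event.returnValue = '';
  });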


This does not violate the spec. Per https://html.spec.whatwg.org/#dom-alert, we already have the right to block dialogs at our discretion.


(There is an ongoing project (unrelated to this) that is working on breaking the app-modality. This Intent is a complement to that project, and can be done sooner.)


Motivation

Users are being harmed by bad actors. This mitigates much of the abuse of these dialogs by bad actors while preserving their legitimate use.


Interoperability and Compatibility Risk

Firefox already does this for onbeforeunload dialogs. We would be following them there, and going further.


Ongoing technical constraints

None.


Will this feature be supported on all six Blink platforms (Windows, Mac, Linux, Chrome OS, Android, and Android WebView)?

Yes.


OWP launch tracking bug

None yet.


Link to entry on the feature dashboard

This is a change to existing behavior that remains spec-compliant. I don't believe it requires a feature dashboard entry.


Requesting approval to ship?

Yes.

Avi Drissman

Aug 27, 2016, 1:46:40 AM
to blink-dev
To clarify the platform section, Site Engagement is a Chrome feature, so the intent is to ship this on the five Chrome platforms, but not Android WebView. Implementation would happen entirely in Chrome, not in Content or Blink.

PhistucK

Aug 29, 2016, 12:07:32 PM
to Avi Drissman, blink-dev
I personally think ten minutes is a very, very long time. Remember, those dialogs are also used for validation in forms (unfortunately). The user will not understand what happened, why the value is not displayed, or why the form cannot be submitted.
It makes sense to me that a better logic would be: dialogs from windows that are not focused should be gated on the site engagement level, and dialogs from focused windows should just pass through.
If a website shows a dialog (or maybe numerous dialogs) in a window that is not focused, its site engagement level will determine whether it can show alerts once it is focused.
Most of the annoying ads come from pop-unders (Chrome is working on preventing those, I know) or tab-unders (no way or intention to prevent those, I believe), so this should mitigate most of the annoyances.

If focused popups that show dialogs become a common thing, you can execute the original plan and gate on the site engagement level regardless of focus.
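Roughly, the logic I have in mind (an illustrative sketch only, not real Chrome code):

  // Sketch: focused windows pass through, unfocused ones are gated.
  function shouldShowDialog(windowIsFocused, engagementScore, cutoff) {
    if (windowIsFocused) {
      return true;                    // focused window: always allow
    }
    return engagementScore >= cutoff; // background window: gate on engagement
  }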

Also, keep in mind that some of those dialogs are shown in a loop. Does that mean that a tab will take a whole CPU core until the user closes it (since it keeps showing dialogs that always immediately return)? That has battery consequences and may have global performance consequences, or even make the current tab hang (in case they share the same process, which they might). A few of those lingering in the background and all of your CPU cores are completely hogged, causing global wreckage.
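Concretely, the abusive pattern I am worried about is something like:

  // If a suppressed alert() returns immediately instead of blocking on the
  // dialog, this degenerates into a busy loop that pins a CPU core until
  // the tab is closed.
  while (true) {
    alert('Your computer is infected! Call now!');
  }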


PhistucK


Dominick Ng

Aug 29, 2016, 8:44:48 PM
to PhistucK, Avi Drissman, blink-dev
The exact amount of site engagement used to gate the dialogs is under discussion - I agree that requiring ten minutes of upfront usage to allow alerts seems high. However, Avi's point is that just ten minutes of usage per week is sufficient to give a site a continual non-zero amount of engagement - and that usage can be spread out across shorter sessions as well.

For example, we can select a level where a user who is filling in a form on a website has through the very act of filling in the form earned sufficient engagement to allow an alert dialog to appear. The most extreme example is to simply require non-zero engagement, since a nice property is that sites which use rotating domains will always have no engagement upfront (helping to counter drive-by or clickjacking attacks which take users elsewhere), whilst sites which engender legitimate usage earn engagement very quickly.

Restricting the throttled dialogs to background tabs would still leave open the ability for sites in focused tabs to use the app modality of dialogs to scare users.

PhistucK

Aug 30, 2016, 2:35:34 AM
to Dominick Ng, Avi Drissman, blink-dev
I meant focused windows, so iframes that are not focused also count.

Anyway, form filling sounds like a good engagement indicator that should mitigate most of my concerns.

The CPU hogging issue, though, is important, I think.


PhistucK

Dimitri

Aug 30, 2016, 12:23:42 PM
to blink-dev
LGTM on the general approach. I think there's still lots to uncover through experimentation and defining site engagement params more clearly.

:DG<

Philip Jägenstedt

Sep 2, 2016, 7:27:19 AM
to Dimitri, blink-dev
It was discussed on intervention-dev whether we should change the return type of alert to allow sites to detect that the alert wasn't shown. Where did you land on that?
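(Purely as a hypothetical sketch of what such a change would enable; alert() returns undefined today, and showInPageBanner below is a made-up page-side fallback:)

  // Hypothetical: assumes alert() were changed to return whether it was shown.
  const shown = alert('Online banking is down for maintenance tonight.');
  if (!shown) {
    // Fall back to in-page UI when the browser suppressed the dialog.
    showInPageBanner('Online banking is down for maintenance tonight.');
  }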

Overall, my main concern is important sites like your bank, tax office, healthcare institution, or some government agency that you don't visit often. They might use alerts to communicate something important, like "down for maintenance" or "our UI is terrible and we don't know how to fix it, but click over there in the corner to get started". Most of these would be suppressed.

It's hopeless to prevent the bad usage without also preventing some of the good, but do we have any hunches or data about the proportions?

Avi Drissman

Sep 2, 2016, 10:01:36 AM
to Philip Jägenstedt, Dimitri, blink-dev
I'm going to land some UMA.

One set of data will be just site engagement at the time of alerting, so we can get a histogram. I'm thinking about a second set of data, something that tells us about alert quality vs engagement. Can we do two-way UMA, comparing length of dialog text (as a proxy for quality) vs engagement? I'm going to talk to the UMA people about that.

Avi

Rick Byers

Sep 2, 2016, 10:31:37 AM
to Avi Drissman, Philip Jägenstedt, Dimitri, blink-dev
That sounds really interesting, thanks Avi!  The simple thing to do for correlation studies is to just generate a histogram based on, e.g., the ratio of dialog text length to engagement score.  There's no direct way to correlate two different histograms.

Avi Drissman

Sep 2, 2016, 10:38:39 AM
to Rick Byers, Philip Jägenstedt, Dimitri, blink-dev
Maybe two histograms, then? The length of dialog text when the engagement is below the proposed cut-off, and the length when the engagement is above the cutoff?

We need better analysis tools :(

Avi

Dominic Mazzoni

Sep 2, 2016, 11:17:57 AM
to Avi Drissman, Rick Byers, Philip Jägenstedt, Dimitri, blink-dev
How about the length of time the user interacted with a site *after* the dialog was shown as a useful metric? The assumption is that a user is more likely to quickly close a site that abuses alert(). Also, subsequently clicking "Prevent this page from showing additional dialogs" is a strong signal that the site was abusing it.

Rick Byers

Sep 2, 2016, 11:19:34 AM
to Avi Drissman, Philip Jägenstedt, Dimitri, blink-dev
On Fri, Sep 2, 2016 at 10:38 AM, Avi Drissman <a...@chromium.org> wrote:
Maybe two histograms, then? The length of dialog text when the engagement is below the proposed cut-off, and the length when the engagement is above the cutoff?

Sure, that works, and you could vary the proposed cut-off as a Finch experiment.

We need better analysis tools :(

I think the fundamental issue here is one of data volume.  Every opted-in chrome user will be uploading this data.  100 words per day per user is probably OK, but 100*100 is not (especially once you multiply by the thousands of different things people want to measure).

Avi Drissman

Sep 2, 2016, 11:22:00 AM
to Rick Byers, Philip Jägenstedt, Dimitri, blink-dev
On Fri, Sep 2, 2016 at 11:19 AM, Rick Byers <rby...@chromium.org> wrote:
I think the fundamental issue here is one of data volume.  Every opted-in chrome user will be uploading this data.  100 words per day per user is probably OK, but 100*100 is not (especially once you multiply by the thousands of different things people want to measure).

I don't want to upload 100×100. I just want to be able to ask the UMA system to chart histogram A vs histogram B for the same users. Anyway, pipe dreams... 

Avi Drissman

Sep 2, 2016, 11:25:31 AM
to Dominic Mazzoni, Rick Byers, Philip Jägenstedt, Dimitri, blink-dev
Some really interesting ideas.

Re length of time, I'm not sure how to easily measure that in the code that exists today. Re the "prevent" button, that's easy.

So UMA-wise:
- Engagement
- Dialog length below the engagement cutoff
- Dialog length above the engagement cutoff
- Dialog length for dialogs that got the "prevent" button clicked on them
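Roughly how those metrics would partition each dialog event (an illustrative sketch only; the names and the histogram API below are made up, not real UMA code):

  // Sketch only: made-up names, not the actual UMA recording macros.
  function recordDialogMetrics(event, cutoff, histograms) {
    histograms.engagement.add(event.engagementScore);
    if (event.engagementScore < cutoff) {
      histograms.lengthBelowCutoff.add(event.messageLength);
    } else {
      histograms.lengthAboveCutoff.add(event.messageLength);
    }
    if (event.preventButtonClicked) {
      histograms.lengthAfterPreventClicked.add(event.messageLength);
    }
  }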

Avi

Rick Byers

Sep 2, 2016, 11:32:04 AM
to Avi Drissman, Philip Jägenstedt, Dimitri, blink-dev
But that won't tell you the correlation.  E.g., one user may have a whopping 5% of their page views include long alert texts, and 5% of DIFFERENT page views have high engagement scores.  But another user could have their 5% high-engagement-score sites also have long alert texts.  You couldn't tell these two users apart without storing and uploading the 100*100 matrix.

But reporting the ratio of the alert text length to engagement score should give you a pretty good idea - at least enough to form a hypothesis you could explicitly test (e.g. an engagement cutoff).