
Stuck with manual review forever


Emiliano Heyns

Jun 8, 2015, 6:04:26 PM
to mozilla-addons-...@lists.mozilla.org
Here are the things that mean I'll be stuck with manual review for my extension:

* Access to the `Function` global: I can work around this, but code will be more complicated (read: higher risk of breakage) and less performant. Working on that, grudgingly.

* Use of deprecated __exposedProps__ declaration: not my choice. Zotero uses this, I have to follow.

* Variable element type being created: are you kidding me? I can't dynamically create DOM elements from variables?

* Global variable overwrite: yes, in a *sandbox*. This is what sandboxes were made for.

* Mouse events may cause performance issues: comes from jsencrypt, a well-respected encryption library. I guess I'll have to hunt around for a simpler library. Who cares about good encryption, after all; it *may* cause performance issues.

* `setTimeout` called in potentially dangerous manner: again, jsencrypt. I will trust the jsencrypt authors here over a blanket statement of potential danger.

* Use of `document.write` strongly discouraged: again, jsencrypt.

* Markup should not be passed to `innerHTML` dynamically: again, jsencrypt.

* on* property being assigned a string: yes, I do have a simple JS object like `typeMap = {thesis: 'thesis', online: 'blogPost forumPost webpage'}`. Why on earth should every single property starting with 'on' ON POJOs be deemed "medium severity"?!
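As a sketch of that last false positive, here is a hypothetical lookup table of the kind described, with illustrative names; nothing in it is an event handler, yet a naive `on*` check would flag it:

```javascript
// A plain data object ("POJO") whose "online" key happens to start with "on".
// The values are just lookup strings, not handler code, yet a rule that flags
// any "on* property being assigned a string" fires on it anyway.
var typeMap = {
  thesis: 'thesis',
  online: 'blogPost forumPost webpage' // flagged merely for starting with "on"
};

// Hypothetical helper: resolve a key to its list of item types.
function zoteroTypesFor(key) {
  return (typeMap[key] || '').split(' ').filter(Boolean);
}
```

A check that only looked at assignments to properties of DOM elements, rather than any object literal, would not trip on data like this.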

Dan Stillman

Jun 8, 2015, 6:46:41 PM
to mozilla-addons-...@lists.mozilla.org
Given that we were assured repeatedly that this new system was for
malware and not to enforce ambiguous coding rules that depend highly on
the context in which they're used, surely none of these things, which
all show up just as warnings (albeit in some cases with "Signing
severity: high"), will prevent automatic signing... Jorge, could you
confirm that?

Jorge Villalobos

Jun 9, 2015, 3:58:47 PM
to mozilla-addons-...@lists.mozilla.org
Anything that can execute untrusted code needs to be flagged. Some
things, like 'eval', 'Function' and injecting <script> tags, need to have
a high severity, since they can be used to run remote code.
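The concern behind the high severity can be sketched concretely; the variable names below are illustrative, not from any particular add-on:

```javascript
// Why `Function` (like eval) draws a high severity: it turns an arbitrary
// string into executable code, so a string received from the network would
// become remote code execution inside the add-on's privileged context.
// `untrustedSource` stands in for data an add-on might fetch from outside.
var untrustedSource = 'return 6 * 7;';

// Compiles the string into a callable function at runtime.
var fn = new Function(untrustedSource);

// fn() now runs code that was never part of the reviewed add-on package.
```

A static validator cannot tell whether the string is a harmless constant or remote input, which is why every use of the constructor gets flagged.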

The idea for the unlisted validator is that non-high severity flags will
be ignored in subsequent submissions, while high severity flags will be
persistent. However, reviewers will have the option of marking down
those flags so they do not show up on following versions. This is
determined on a per-add-on basis. This hasn't been implemented yet, but
we plan to do so in the coming weeks.

Jorge

Dan Stillman

Jun 9, 2015, 6:43:48 PM
to mozilla-addons-...@lists.mozilla.org
On 6/9/15 3:58 PM, Jorge Villalobos wrote:
> Anything that can execute untrusted code needs to be flagged. Some
> things like 'eval', 'Function' and injecting <script> tags needs to have
> a high severity, since they can be used to run remote code.
>
> The idea for the unlisted validator is that non-high severity flags will
> be ignored in subsequent submissions, while high severity flags will be
> persistent. However, reviewers will have the option of marking down
> those flags so they do not show up on following versions. This is
> determined on a per-add-on basis. This hasn't been implemented yet, but
> we plan to do so in the coming weeks.

Do you mean that entire classes of warnings will be able to be disabled
for a given extension, or that specific instances in the code will be
whitelisted but that a change to a surrounding line or new usage
elsewhere would prevent a subsequent version of the same extension from
being automatically signed? Unless the severity of existing checks is
altered, the latter would be tantamount to requiring permanent AMO
review for larger extensions.

More generally, if the goal here is truly to prevent malware, I would
hope that the option will exist for established extensions to be
whitelisted permanently once it's clear that they're legitimate. Is that
something we can expect?

Emiliano Heyns

Jun 10, 2015, 5:45:33 AM
to mozilla-addons-...@lists.mozilla.org
On Tuesday, June 9, 2015 at 9:58:47 PM UTC+2, Jorge Villalobos wrote:

> Anything that can execute untrusted code needs to be flagged. Some
> things like 'eval', 'Function' and injecting <script> tags needs to have
> a high severity, since they can be used to run remote code.

Does this include mozIJSSubScriptLoader.loadSubScript?

Regards,
Emile

Jorge Villalobos

Jun 10, 2015, 5:14:10 PM
to mozilla-addons-...@lists.mozilla.org
I think it's less likely to be used for remote code with loadSubScript,
since you can only load local URLs. Of course it's possible to fetch a
remote script, save it as a local file and then run that, but it's a
more convoluted way of doing so.
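For context, a minimal sketch of the API under discussion; `loadLocalScript` is a hypothetical wrapper, the URL check is illustrative, and only the guard branch runs outside Firefox's privileged context:

```javascript
// Sketch: load a script bundled with the add-on via mozIJSSubScriptLoader,
// refusing anything that is not a chrome:// or resource:// URL (i.e. not
// local to the add-on package). Runs only in privileged Firefox code.
function loadLocalScript(url, scope) {
  // Guard: reject remote and protocol-relative URLs up front.
  if (!/^(chrome|resource):\/\//.test(url)) {
    throw new Error('refusing to load non-local script: ' + url);
  }
  Components.classes['@mozilla.org/moz/jssubscript-loader;1']
    .getService(Components.interfaces.mozIJSSubScriptLoader)
    .loadSubScript(url, scope);
}
```

The local-only restriction Jorge mentions is exactly what the guard enforces: a `https://` or `//`-prefixed URL never reaches the loader.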

Having said that, I'm not sure what level was given to that particular
warning.

Jorge

Jorge Villalobos

Jun 10, 2015, 5:19:18 PM
to mozilla-addons-...@lists.mozilla.org
On 6/9/15 4:43 PM, Dan Stillman wrote:
> On 6/9/15 3:58 PM, Jorge Villalobos wrote:
>> Anything that can execute untrusted code needs to be flagged. Some
>> things like 'eval', 'Function' and injecting <script> tags needs to have
>> a high severity, since they can be used to run remote code.
>>
>> The idea for the unlisted validator is that non-high severity flags will
>> be ignored in subsequent submissions, while high severity flags will be
>> persistent. However, reviewers will have the option of marking down
>> those flags so they do not show up on following versions. This is
>> determined on a per-add-on basis. This hasn't been implemented yet, but
>> we plan to do so in the coming weeks.
>
> Do you mean that entire classes of warnings will be able to be disabled
> for a given extension, or that specific instances in the code will be
> whitelisted but that a change to a surrounding line or new usage
> elsewhere would prevent a subsequent version of the same extension from
> being automatically signed? Unless the severity of existing checks is
> altered, the latter would be tantamount to requiring permanent AMO
> review for larger extensions.

My understanding is that it will be fine-grained for serious issues like
eval, and broader for other warnings. I'm not sure yet whether the
warnings will be disabled at the file level or by rough line proximity.

> More generally, if the goal here is truly to prevent malware, I would
> hope that the option will exist for established extensions to be
> whitelisted permanently once it's clear that they're legitimate. Is that
> something we can expect?

That's probably what will happen for many add-ons in the long term, yes.
We want to be cautious and not open the flood gates too much, though.

Jorge

Mindaugas J.

Jul 4, 2015, 2:31:55 PM
to mozilla-addons-...@lists.mozilla.org
Well, since I can see this for my add-on:
> createElement() used to create script tag
> Signing severity: medium
I'm pretty much guaranteed to have manual review forever? I'm injecting the code locally, btw (from my add-on package, via textContent). The code is reused across browsers. It seems I'll have to customize my build script for Firefox, since the validator obviously has no idea about control flow and I can't hide it behind if (notFirefox) { doSomethingElse(); }
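The local-injection pattern described above can be sketched as follows; `injectBundledScript` and `bundledSource` are hypothetical names, and the point is that the script body ships inside the add-on package rather than being fetched remotely:

```javascript
// Sketch: inject a script whose body is a string bundled with the add-on.
// No remote code is fetched (no src attribute at all), yet a static check
// still flags the createElement('script') call, since it cannot follow
// control flow to see where `bundledSource` came from.
function injectBundledScript(doc, bundledSource) {
  var el = doc.createElement('script');
  el.textContent = bundledSource; // inline body, shipped in the package
  doc.documentElement.appendChild(el);
  return el;
}
```

This is why the author would need a Firefox-specific build step: the flagged call itself, not its runtime inputs, is what the validator sees.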

My AMO listing has been in the queue for 11 weeks now, at position 66 of 300, tyvm. Sigh...
There is absolutely no way anyone can get ready for this to be enabled in FF41.

----

Is this manual review of unlisted add-ons at least using a separate expedited review queue?

Jorge Villalobos

Jul 6, 2015, 6:06:45 PM
to mozilla-addons-...@lists.mozilla.org
On 7/4/15 12:31 PM, Mindaugas J. wrote:
> On Thursday, June 11, 2015 at 12:14:10 AM UTC+3, Jorge Villalobos wrote:
>> On 6/10/15 3:45 AM, Emiliano Heyns wrote:
>>> On Tuesday, June 9, 2015 at 9:58:47 PM UTC+2, Jorge Villalobos wrote:
>>>
>>>> Anything that can execute untrusted code needs to be flagged. Some
>>>> things like 'eval', 'Function' and injecting <script> tags needs to have
>>>> a high severity, since they can be used to run remote code.
>>>
>>> Does this include mozIJSSubScriptLoader.loadSubScript?
>>>
>>> Regards,
>>> Emile
>>
>> I think it's less likely to do this with loadSubScript, since you can
>> only load local URLs. Of course it's possible to get a remote script,
>> save it as a local file and then run that, but it's a more convoluted
>> way of doing this.
>>
>> Having said that, I'm not sure what level was given to that particular
>> warning.
>>
>> Jorge
>
> Well since I can see this for my add-on
>> createElement() used to create script tag
>> Signing severity: medium
> I'm pretty much guaranteed to have manual review forever? I'm injecting code locally btw (from my addon package via textContent). The code is reused across browsers. It seems I'll have to customize my build script for Firefox since the validator obviously has no idea about control-flow and I can't hide it behind if (notFirefox) { doSomethingElse(); }

The idea is to implement a feature in the validator that will allow
reviewers to flag certain validations as "unimportant" so they don't
show up for updates of the same add-on. Minor validation issues will
also be non-recurrent (injecting scripts isn't considered minor,
however). These issues are currently blocking us from officially
announcing unlisted submissions.

> My AMO-listing is in the queue for 11 weeks now at 66 of 300, tyvm. Sigh...
> There is absolutely no way anyone can get read for this to be enabled in FF41.
>
> ----
>
> Is this manual review of unlisted add-ons at least using a separate expedited review queue?

Yes, unlisted add-ons have a separate queue that is being prioritized
over the others. At the moment those queues are a bit backed up since we
only started to review them very recently and all Mozilla employees were
away for the week of June 22nd.

Jorge

star...@gmail.com

Sep 3, 2015, 2:56:42 PM
to mozilla-addons-...@lists.mozilla.org
(My first reply here accidentally ended up in pre-moderation because I posted it before joining the group. I'll try again, and this time I'll make my question shorter and more specific.)

My add-on modifies a certain page on a certain site using "page-mod" from the Add-on SDK. Now I need to conditionally load a single JS file hosted on the site I'm modifying. The condition is determined by my content script.

If I try to add the required script with document.createElement('script'), auto-validation will reject my add-on from signing. Bundling the required file with the add-on is a bad idea because it may be changed on the site at any time (I do not control the site). And even if I bundle it, it seems the only way to add it is to load it unconditionally via page-mod, which is unacceptable.

So, am I stuck with manual review forever? Or are there any other options to conditionally load a script hosted on the same site I'm working with?
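The conditional, same-site load being asked about can be sketched like this; `loadSiteScript` and its guard are illustrative only, and (as the thread makes clear) the validator would still flag the `createElement('script')` call regardless:

```javascript
// Sketch: conditionally inject a script whose src is relative to the page,
// so it always resolves against the site's own origin. `shouldLoad` stands
// in for whatever condition the content script computes.
function loadSiteScript(doc, relativePath, shouldLoad) {
  if (!shouldLoad) return null;
  // Guard: reject absolute (scheme-prefixed) and protocol-relative URLs,
  // keeping the load same-site by construction.
  if (/^[a-z]+:|^\/\//i.test(relativePath)) {
    throw new Error('expected a site-relative path, got: ' + relativePath);
  }
  var el = doc.createElement('script');
  el.src = relativePath; // resolved against the page's own origin
  doc.documentElement.appendChild(el);
  return el;
}
```

Constraining the `src` to a relative path is the kind of "obvious from the code" same-domain restriction Jorge refers to later in the thread.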

Jorge Villalobos

Sep 3, 2015, 4:52:04 PM
to mozilla-addons-...@lists.mozilla.org
Dynamically loading scripts, especially remote ones, will always be
flagged by the validator. In the future, reviewers will be able to
choose which flags to ignore in followup versions of the add-on,
provided the code around the flag doesn't change. So, a reviewer could
accept your add-on and mark it so future versions aren't flagged for
that particular issue.

Jorge

star...@gmail.com

Sep 4, 2015, 12:23:12 PM
to mozilla-addons-...@lists.mozilla.org
Thank you for your reply, a few more questions then.

> Dynamically loading scripts, especially remote ones, will always be
> flagged by the validator.

When you say "especially remote ones", does it mean that there's some way to dynamically load at least local scripts?

The validator rejects any .createElement('script') regardless of where the src parameter points: remote URLs, local URLs, and even static code are all rejected.

Interestingly, if I try to put a script with a remote src into .insertAdjacentHTML, it is also rejected, but the validator's message is different: in this case it suggests trying a local resource:// URI. Done that way it is accepted, but this seems useless, as JS code added via .insertAdjacentHTML isn't evaluated anyway.

More generally, what's the problem with remote scripts located on the same domain as the site being modified by an add-on? I agree that loading unknown remote code from other domains is dangerous and should be prohibited (though probably with some sort of whitelisting for the official jQuery hosting, etc.), but what's wrong with scripts that are local to the current site? I mean scripts whose src is relative to the site and doesn't contain a domain part.

> In the future, reviewers will be able to
> choose which flags to ignore in followup versions of the add-on,
> provided the code around the flag doesn't change. So, a reviewer could
> accept your add-on and mark it so future versions aren't flagged for
> that particular issue.

Just in case: would only static script URLs be accepted this way, or will I be able to build site-relative srcs dynamically? Or is there currently no policy for such cases?

And, the last one: is there an approximate date when this validator functionality will go live?

Jorge Villalobos

Sep 4, 2015, 4:45:53 PM
to mozilla-addons-...@lists.mozilla.org
On 9/4/15 2:26 AM, star...@gmail.com wrote:
> Thank you for your reply, a few more questions then.
>
>> Dynamically loading scripts, especially remote ones, will always be
>> flagged by the validator.
>
> When you say "especially remote ones", does it mean that there's some way to dynamically load at least local scripts?
>
> Validator rejects any .createElement('script') regardless of where the src parameter is pointing - neither remote URLs, local URLs nor even static code is accepted.
>
> Interesting that if I try to put a script with a remote src into .insertAdjacentHTML - it will be also rejected, but the message from validator is different: for this case it suggests trying local resource:// URI. And this way it is accepted, but seems useless as the js code added via .insertAdjacentHTML isn't getting evaluated anyway.

All forms of script injection are flagged, since they are all
potentially dangerous. How reviewers respond to those flags depends on
the circumstances. This is why I highlighted remote scripts in this
case, since they are generally not allowed, while local scripts are.

> And... At all, what's the problem with the remote scripts located on the same domain as the site being modified by an add-on? I agree that loading unknown remote code from other domains is dangerous and should be prohibited (but, probably, with some sort of whitelisting for official jquery hosting etc.), but what's wrong with the scripts that are local for current site? I mean those scripts, src of which is relative to site and doesn't contain a domain part.

That's an exception we can make, if it's obvious from the code.

>> In the future, reviewers will be able to
>> choose which flags to ignore in followup versions of the add-on,
>> provided the code around the flag doesn't change. So, a reviewer could
>> accept your add-on and mark it so future versions aren't flagged for
>> that particular issue.
>
> Just in case, this way might only static script URLs be accepted, or will I be able to dynamically build relative to site src-s? Or there's currently no policy for such cases?

The exceptions would be added per validator flag, depending on context.
So, if we're okay with the way you inject scripts, we can add an
exception for that so they don't show up again during validation, unless
the surrounding code changes.

> And, the last one, is there any approximate date when such functionality for validator will go live?

It's all being worked on now, so a few weeks at most.

Jorge