
Re: Add-on File Registration PRD


Jorge Villalobos

unread,
Oct 30, 2013, 5:55:21 PM10/30/13
to
Cross posting to dev.planning, where I originally intended this to be.
Please follow up to dev.planning.

Jorge

On 10/30/13 3:42 PM, Jorge Villalobos wrote:
> Hello!
>
> As many of you know, the Add-ons Team, User Advocacy Team, Firefox Team
> and others have been collaborating for over a year in a project called
> Squeaky [1]. Our aim is to improve user experience for add-ons,
> particularly add-ons that we consider bad for various levels of "bad".
>
> Part of our work consists of pushing forward improvements in Firefox
> that we think will significantly achieve our goals, which is why I'm
> submitting this spec for discussion:
>
> https://docs.google.com/document/d/1SZx7NlaMeFxA55-u8blvgCsPIl041xaJO5YLdu6HyOk/edit?usp=sharing
>
> The Add-on File Registration System is intended to create an add-on file
> repository that all add-on developers need to submit their files to.
> This repository won't publish any of the files, and inclusion won't
> require more than passing a series of automatic malware checks. We will
> store the files and generated hashes for them.
>
> On the client side, Firefox will compute the hashes of add-on files
> being installed and query the API for it. If the file is registered, it
> can be installed, otherwise it can't (there is a planned transition period
> to ease adoption). There will also be periodic checks of installed
> add-ons to make sure they are registered. All AMO files would be
> registered automatically.
>
> This system will allow us to better keep track of add-on IDs, be able to
> easily find the files they correspond to, and have effective
> communication channels to their developers. It's not a silver bullet to
> solve add-on malware problems, but it raises the bar for malware developers.
>
> We believe this strikes the right balance between a completely closed
> system (where only AMO add-ons are allowed) and the completely open but
> risky system we currently have in place. Developers are still free to
> distribute add-ons as they please, while we get a much-needed set of
> tools to fight malware and keep it at bay.
>
> There are more details in the doc, so please give it a read and post
> your comments and questions on this thread.
>
> Jorge Villalobos
> Add-ons Developer Relations Lead
>
> [1] https://wiki.mozilla.org/AMO/Squeaky
>
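As a rough illustration of the client-side flow described above (hash the package, query a registration service, fail open on network errors), here is a minimal Python sketch; the endpoint URL, parameter names, and response fields are placeholders, not the API specified in the PRD.

# Illustrative only: the endpoint, parameters, and response format below
# are assumptions, not the actual registration API described in the PRD.
import hashlib
import json
import urllib.parse
import urllib.request

def xpi_sha256(path):
    # Hash the add-on package in chunks so large XPIs don't need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_registered(addon_id, file_hash):
    # Ask a (hypothetical) registration service whether this hash is known.
    # A network failure is treated as "allow", matching the statement in the
    # thread that API failures will not block installation.
    query = urllib.parse.urlencode({"id": addon_id, "hash": file_hash})
    url = "https://registration.example.invalid/api/v1/check?" + query
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return json.load(resp).get("registered", False)
    except OSError:
        return True  # fail open

if __name__ == "__main__":
    digest = xpi_sha256("myaddon.xpi")
    print("install allowed" if is_registered("myaddon@example.com", digest)
          else "blocked: not registered")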

David E. Ross

unread,
Oct 30, 2013, 8:09:06 PM10/30/13
to
So the plan is to stop me from installing extensions that I know to be
good but have not gone through addons.mozilla.org? You do not indicate
any option for the user to override this prohibition. I have several
such extensions, and I do not want to lose the ability to get updates to
them.

This appears to be a total reversal of past Mozilla philosophy, which
not only encouraged the development of extensions but also strongly
advocated the use of extensions where users wanted features that the
developers were not interested in providing.

Also cross-posted to mozilla.dev.extensions. However, I have set
Followup-To for mozilla.dev.planning.

--
David E. Ross
<http://www.rossde.com/>

Where does your elected official stand? Which
politicians refuse to tell us where they stand?
See the non-partisan Project Vote Smart at
<http://votesmart.org/>.

a.very.l...@gmail.com

unread,
Oct 30, 2013, 9:57:44 PM10/30/13
to
On Wednesday, October 30, 2013 9:55:21 PM UTC, jorgev wrote:
> Cross posting to dev.planning, where I originally intended this to be.
>

There are two large problems with this.
1) Since Firefox will not allow any unregistered add-on to be installed then creating a new extension or theme will become impossible unless you allow blank add-ons to be registered. If I create a new extension it cannot be installed and tested until it is registered.

2) The company I work for has a number of extensions that are only used on our own servers. They have no functionality outside of our servers. By requiring permission from Mozilla to run these extensions, we will need to make certain that items which could possibly give our competitors insight into our production practices are removed, and thus functionality would be lost.


Nicholas Nethercote

unread,
Oct 30, 2013, 10:22:57 PM10/30/13
to a.very.l...@gmail.com, dev. planning
On Wed, Oct 30, 2013 at 6:57 PM, <a.very.l...@gmail.com> wrote:
>
> There are two large problems with this.
> 1) Since Firefox will not allow any unregistered add-on to be installed then creating a new extension or theme will become impossible unless you allow blank add-ons to be registered. If I create a new extension it cannot be installed and tested until it is registered.

And would the edit/run cycle become edit/register/run?

> 2) The company I work for has a number of extensions that are only used on our own servers. They have no functionality outside of our servers. By requiring permission from Mozilla to run these extensions, we will need to make certain that items which could possibly give our competitors insight into our production practices are removed, and thus functionality would be lost.

Yeah.

I'm pretty sympathetic to the suggestions that crack down on add-ons,
but this seems unworkably strict.

Nick

Onno Ekker

unread,
Oct 31, 2013, 1:20:17 AM10/31/13
to Jorge Villalobos
Can everyone submit an add-on to Squeaky? Or only the add-on developer?
I ask this because to me it's not clear what happens to the metadata,
like in install.rdf. It's still necessary sometimes to edit this file,
especially for the targetApplication's maxVersion. There are also other
local edits (forks or branches) possible to existing add-ons, that are
not (yet) in the official version.

Onno

J. Ryan Stinnett

unread,
Oct 31, 2013, 10:49:01 AM10/31/13
to
As others have mentioned, I don't think this degree of centralization is wanted or needed at all.

Why should it be necessary for me to register on AMO if I don't want to use it to distribute an add-on? That seems very bad to me.

> This repository won't publish any of the files, and inclusion won't
> require more than passing a series of automatic malware checks. We will
> store the files and generated hashes for them.

This seems like it will just escalate an "arms race" with malware developers. It will simply take them a bit of time to find clever ways around whatever scanning is being done.

> This system will allow us to better keep track of add-on IDs, be able to
> easily find the files they correspond to, and have effective
> communication channels to their developers.

These seem like anti-goals to me. I don't want you to track anything about a non-AMO add-on. Also, if you expect to detect malware with the scheme, I don't think the guilty developers would really be open to communication.

> We believe this strikes the right balance between a completely closed
> system (where only AMO add-ons are allowed) and the completely open but
> risky system we currently have in place. Developers are still free to
> distribute add-ons as they please, while we get a much-needed set of
> tools to fight malware and keep it at bay.

I completely disagree with your summary here. This spec would add roadblocks to both the add-on development and distribution process, only to get what seems like questionable value from surveying all that data.

Non-AMO developers are already avoiding AMO because they find it too limiting for one reason or another. I would much rather allow the good bunch of non-AMO developers to continue innovating as they have today, instead of throwing a bunch of roadblocks at them, which could cause them to give up altogether.

- Ryan

Gregory Szorc

unread,
Oct 31, 2013, 12:17:46 PM10/31/13
to Nicholas Nethercote, a.very.l...@gmail.com, dev. planning
On 10/30/2013 7:22 PM, Nicholas Nethercote wrote:
> On Wed, Oct 30, 2013 at 6:57 PM, <a.very.l...@gmail.com> wrote:
>>
>> There are two large problems with this.
>> 1) Since Firefox will not allow any unregistered add-on to be installed then creating a new extension or theme will become impossible unless you allow blank add-ons to be registered. If I create a new extension it cannot be installed and tested until it is registered.
>
> And would the edit/run cycle become edit/register/run?

This doesn't seem to be a valid concern as the proposal addresses this:

> Add-on developers testing their add-ons locally shouldn’t need to register every build, so they need a way to override registration.
> One possible solution is to add a command line switch (-noaddonreg) which opens Firefox in a mode that ignores registration checks. To avoid malicious applications from launching Firefox in this mode, Firefox could prompt users at startup if this switch is on, pointing out that it is an insecure mode and giving them the default option of starting in normal mode. Developers would quickly learn to mechanically choose the non-default and this would just become a minor annoyance in their development process.
> Additionally (or alternatively), a locked pref could be used to whitelist add-on IDs.

This seems like a very minor inconvenience to developers and one that
could be made invisible through tooling (such as the Add-on SDK's cfx tool).


Jorge Villalobos

unread,
Oct 31, 2013, 12:50:48 PM10/31/13
to
File registration would be tied to an account, so the developer should
be the one submitting it. maxVersion changes are generally not necessary
anymore, so hopefully that won't be a major problem. If you needed to
modify the XPI for whatever reason, you should also change the add-on ID
and then submit it for registration.

For abandoned add-ons that are still widely used, we might do some
registration ourselves.

Jorge

Jorge Villalobos

unread,
Oct 31, 2013, 12:57:48 PM10/31/13
to
On 10/30/13 7:57 PM, a.very.l...@gmail.com wrote:
> On Wednesday, October 30, 2013 9:55:21 PM UTC, jorgev wrote:
>> Cross posting to dev.planning, where I originally intended this to be.
>>
>
> There are two large problems with this.
> 1) Since Firefox will not allow any unregistered add-on to be installed then creating a new extension or theme will become impossible unless you allow blank add-ons to be registered. If I create a new extension it cannot be installed and tested until it is registered.

As posted by someone else, the proposal addresses the developer flow.
Not only new add-ons but also existing add-ons under development need a
way to bypass registration to avoid having to register every single build.

> 2) The company I work for has a number of extensions that are only used on our own servers. They have no functionality outside of our servers. By requiring permission from Mozilla to run these extensions, we will need to make certain that items which could possibly give our competitors insight into our production practices are removed, and thus functionality would be lost.

Registered files are not exposed in any way to the public. The only way
to learn about them would be to send an API call using the add-on ID and
a file hash, which will tell you if the hash is registered or not. I
don't see how that would give your competitors any insights (unless
you're saying Mozilla is your competitor).

We understand that because of various reasons, like business policies of
not sharing certain information under any circumstances, registering
add-ons for internal use will not be possible. In those cases you will
have the option of just blocking the API in your internal network. API
failures will allow the add-on to be installed. We're also looking into
locked preferences to whitelist a set of IDs.

Jorge

Jorge Villalobos

unread,
Oct 31, 2013, 1:06:27 PM10/31/13
to
The proposal mentions a way to bypass checks for add-ons that are under
development, and possibly an add-on ID whitelist. It's also possible to
just block the API calls, since network errors won't prevent add-ons
from being installed.

> This appears to be a total reversal of past Mozilla philosophy, which
> not only encouraged the development of extensions but also strongly
> advocated the use of extension where users wanted features that the
> developers were not interested in providing.

We're adding a malware detection step in the development process. I
don't think that's a complete reversal. What would be a complete
reversal in our philosophy would be to only allow AMO add-ons, and
that's something I've been fighting for years so it doesn't happen.
However, there are real problems and risks in the way we currently deal with
add-ons, which is why we're putting this proposal forward. This is a
middle ground where we can have some control over malware distribution
while still allowing developers to create whatever they want (that isn't
malicious) outside of AMO.

Jorge

Avi Hal

unread,
Oct 31, 2013, 1:06:43 PM10/31/13
to
I'm copying my post from dev.platform:

On Thursday, October 31, 2013 2:09:06 AM UTC+2, David E. Ross wrote:
> This appears to be a total reversal of past Mozilla philosophy, ...

Agreed. Central repo and mandatory approval is not what Mozilla is IMO. While there are some gains from such a move, I think it hurts freedom and openness more.

Essentially the browser has become an operating system, where apps/addons can be installed, including malware. However, consider what would happen if Microsoft or Apple did not let any app run unless it was approved on their main desktop OS. Look at the open community response to iOS's closed garden.

What about companies who have private addons for their own employees which they don't want to share with Mozilla? What about addons which are against some US regulations? Can we stop the government from preventing Mozilla from approving those? What about addons which are against Mozilla's philosophy? Do we wanna stop those? Would we?

This really is a line Mozilla should not cross IMO.

Mozilla may provide signing for those who request it, or even maintain a repository of malware hashes (sort of antivirus), but making approval of addons mandatory is not something I'd like to see.

Also, is there a discussion going on? or is this a probe to see how much resistance there is for it? What would stop this suggested change from taking place? How many people were involved in creating this suggestion?

- avih

Mike Connor

unread,
Oct 31, 2013, 1:16:25 PM10/31/13
to Jorge Villalobos, dev-planning@lists.mozilla.org planning

On Oct 31, 2013, at 12:57 PM, Jorge Villalobos <jo...@mozilla.com> wrote:

> We understand that because of various reasons, like business policies of
> not sharing certain information under any circumstances, registering
> add-ons for internal use will not be possible. In those cases you will
> have the option of just blocking the API in your internal network. API
> failures will allow the add-on to be installed. We're also looking into
> locked preferences to whitelist a set of IDs.

I’m not convinced this will create a significant barrier, since a side loaded add-on is already coming in with admin privs, and thus can trivially subvert things like host resolution.

Is there a reason we haven’t just gone with signing of XPIs (submit unsigned XPIs, get back signed XPIs) rather than a network-driven API? It’d be good to understand which options were considered over the last year.

— Mike

Boris Zbarsky

unread,
Oct 31, 2013, 1:19:37 PM10/31/13
to
On 10/31/13 1:06 PM, Avi Hal wrote:
> However, consider what happens if Microsoft or Apple would not let any app run unless it's approved on their main desktop OS?

Just for what it's worth, Apple already does this by default, starting
with OS X 10.8. It's possible to change the default behavior by going
into the OS security settings, for now; we'll see if that lasts.

There was a bit of stink about it at the time on tech sites, but mostly
it's been business as usual for Apple.

Not that I think we should use Apple as a role model for this stuff, of
course. ;)

-Boris

4mr....@gmail.com

unread,
Oct 31, 2013, 3:10:52 PM10/31/13
to
Well, this is annoying...

I don't want to see a stupid popup every time I restart FF/open a different profile.

>AMO will also expose a file upload API, to facilitate automated file registration.

We make daily builds in two branches: beta and nightly. They are used by up to 10% of our users. Needless to say, this gets a strong down-vote unless it will be easy to integrate in build scripts.
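A registration step like this could presumably be folded into the same build scripts that produce the nightly and beta XPIs. The sketch below assumes a hypothetical authenticated upload endpoint and field names; the real upload API mentioned in the PRD may look quite different.

# Hypothetical build-script integration: the endpoint, auth scheme, and
# field names are placeholders, not the documented upload API.
import os
import sys
import requests  # third-party: pip install requests

UPLOAD_URL = "https://registration.example.invalid/api/v1/register"
API_KEY = os.environ["ADDON_REG_API_KEY"]  # hypothetical credential

def register_build(xpi_path, addon_id, channel):
    # Upload one freshly built XPI (e.g. a nightly) for registration.
    with open(xpi_path, "rb") as f:
        resp = requests.post(
            UPLOAD_URL,
            headers={"Authorization": "Bearer " + API_KEY},
            data={"id": addon_id, "channel": channel},
            files={"xpi": (os.path.basename(xpi_path), f)},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Usage: python register_build.py dist/myaddon-nightly.xpi nightly
    print(register_build(sys.argv[1], "myaddon@example.com", sys.argv[2]))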

Jorge Villalobos

unread,
Oct 31, 2013, 4:50:01 PM10/31/13
to
This is the discussion. We need to determine if there are drawbacks we
aren't aware of, and what people think about the drawbacks we know
about, specifically the openness issue.

It's difficult to say how many people have been involved with this idea,
since it has existed for a couple of years. At least a couple dozen
people from various teams within Mozilla have known about this and
shared their thoughts. Now it's time for the whole Mozilla community to
have their say.

However, I think it's unlikely that we just decide not to do anything.
It will either be this or something similar. The security risks of the
current system are pretty large.

Jorge

Jorge Villalobos

unread,
Oct 31, 2013, 4:56:58 PM10/31/13
to
There are a couple of reasons why signing isn't in this proposal. The
first one is that we would need to set a cut-off point after which we
only allow signed files. That would exclude all installations of
previously unsigned files, which would be massive. In this proposal,
developers have the possibility of registering their old files, so
current users can continue using them. Even for old, abandoned add-ons,
either we or one of their users can register old files so that they
continue working. On the AMO side we can automatically register all
files, old and new, fairly easily.

Signing XPIs would require us to repackage the files to add the
signature, and that's something we've had problems with in the past
(like with SDK repacks). I don't have confidence that it would work for
the huge back-catalog of add-on files we have. And, even if we did
sign them all, that still excludes all installs of old files.

Jorge

Jorge Villalobos

unread,
Oct 31, 2013, 4:59:16 PM10/31/13
to 4mr....@gmail.com
The upload API is meant to handle this use case specifically, and we'll
make sure it works well.

Jorge

a.very.l...@gmail.com

unread,
Oct 31, 2013, 5:30:17 PM10/31/13
to
The command line switch (-noaddonreg) (which I didn't notice, sorry about that) is a much better idea. A whitelist is unworkable in a corporate situation simply because the amount of effort it would take to roll it out to 200+ isn't worth the money it would cost. -noaddonreg would solve the problem nicely since it would be a very easy way to take Mozilla out of second-guessing our IT department's choices.

Gijs Kruitbosch

unread,
Oct 31, 2013, 5:33:07 PM10/31/13
to
The thing is, you're implicitly signing things now, in a certain way.
All a signature really is, is a guarantee from an identity that it knows
about some thing (usually based on a hash of the thing).

If we allowed signed add-ons without the hash being registered
centrally, that'd likely alleviate the burden somewhat for people
wanting to locally deploy stuff (and not wanting the
signature-checking-style-things going in AMO's direction) and do good
things in terms of the openness discussion as well: all we're enforcing
is some kind of voucher for the correctness of the package, which would
be either the certificate chain for a signature of a signed xpi, or AMO
in the case of an unsigned one.

~ Gijs
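The distinction Gijs draws can be sketched as two different checks over the same XPI bytes: a registry lookup keyed by the package hash, versus verifying a detached signature against a developer's public key. The sketch below uses the third-party `cryptography` package and assumes an RSA key with PKCS#1 v1.5 padding; the key handling is illustrative, not a description of any planned mechanism.

# Two ways to vouch for the same XPI bytes: a central hash registry, or a
# signature verified against a developer's public key. Key format and
# padding here are illustrative assumptions (RSA, PKCS#1 v1.5).
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def registry_check(xpi_bytes, registered_hashes):
    # Central-registry model: is the package's hash in the known-good set?
    return hashlib.sha256(xpi_bytes).hexdigest() in registered_hashes

def signature_check(xpi_bytes, signature, pem_public_key):
    # Signing model: does a trusted key vouch for exactly these bytes?
    key = serialization.load_pem_public_key(pem_public_key)
    try:
        key.verify(signature, xpi_bytes, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False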

Gijs Kruitbosch

unread,
Oct 31, 2013, 5:35:01 PM10/31/13
to
On 31/10/13, 22:30, a.very.l...@gmail.com wrote:
> On Thursday, October 31, 2013 9:57:48 AM UTC-7, jorgev wrote:
>> On 10/30/13 7:57 PM, a.very.l...@gmail.com wrote:
>>> On Wednesday, October 30, 2013 9:55:21 PM UTC, jorgev wrote:
>>>> Cross posting to dev.planning, where I originally intended this to be.
>>>
>>> There are two large problems with this.
>>> 1) Since Firefox will not allow any unregistered add-on to be installed then creating a new extension or theme will become impossible unless you allow blank add-ons to be registered. If I create a new extension it cannot be installed and tested until it is registered.
>>
>> As posted by someone else, the proposal addresses the developer flow.
>> Not only new add-ons but also existing add-ons under development need a
>> way to bypass registration to avoid having to register every single build.
>
> The command line switch (-noaddonreg) (which I didn't notice, sorry about that) is a much better idea. A whitelist is unworkable in a corporate situation simply because the amount of effort it would take to roll it out to 200+ isn't worth the money it would cost. -noaddonreg would solve the problem nicely since it would be a very easy way to take Mozilla out of second-guessing our IT department's choices.
>

How do we envision this working? For it to work for people in enterprise
situations, a click-through UI is a no-no. As an add-on dev,
I'd get pretty tired of having to click through it all the time. If
there's no warning at all, I'd imagine malware will have started using
it as a startup parameter for Firefox before it's been through the 12
weeks from nightly to release...

~ Gijs

Gavin Sharp

unread,
Oct 31, 2013, 5:40:11 PM10/31/13
to J. Ryan Stinnett, dev. planning
On Thu, Oct 31, 2013 at 7:49 AM, J. Ryan Stinnett <jry...@gmail.com> wrote:
> This seems like it will just escalate an "arms race" with malware developers.
> It will simply take them a bit of time to find clever ways around whatever
> scanning is being done.

That the solution isn't perfect and results in an "arms race" doesn't
mean that it has no value. Sometimes it makes sense to "get into the
arms race" if it means that we protect more of our users in practice.

(Of course, the specifics of how we approach this are complicated and
worthy of debate. Sometimes the costs of being in the arms race
outweigh the benefits, but that's not always the case.)

Gavin

Jorge Villalobos

unread,
Oct 31, 2013, 7:54:30 PM10/31/13
to
So, you're suggesting to have signing in addition to the hashes, as a
way to opt-out?

Jorge

Jorge Villalobos

unread,
Oct 31, 2013, 7:56:31 PM10/31/13
to
There would be a keyboard shortcut, of course. At the very least
something like "Left arrow + Return", which is something that a
developer should be able to do without even thinking about it.

Jorge

bre...@gmail.com

unread,
Oct 31, 2013, 8:38:34 PM10/31/13
to
Well, I guess an apology is in order. I had not envisioned a way like this for you to mitigate security risks without disabling third-party add-ons, so I see now that your rejection of my add-on, AsYouWish[1], from AMO was not a result of arbitrariness.

However, I am concerned that my platform, which allows websites to gain access to Addon SDK privileges upon user approval, may be treated as malware (since sites may be able to use it as such).

At an earlier time, Mozilla felt that enablePrivilege was a fair enough mechanism (e.g., to allow more rapid deployment by developers and to spare users an installation step, something add-ons cannot provide), and even now, with other third-party add-ons, there can be a spectrum of privilege escalation.

For example, I have completed initial work on WebAppFind[2], which allows executables to be associated with file extensions on the desktop (currently Windows only). When a user right-clicks "Open With..." or assigns the executable as the default handler for a file on their desktop and double-clicks the file, Firefox is invoked with command line instructions sent by the executable, which my add-on receives; it then grants a suitable website the message listening and posting capabilities needed to read and potentially overwrite the contents of the opened file. I think this is a far more circumscribed escalation of privileges than AsYouWish, but I'm not sure whether you would block this from AMO or, worse, treat it as malware because it enables a malicious (albeit user-approved) website to overwrite the contents of a clicked file.

I would hope you could clarify in plain language what the attempts at "malware" exclusion mean as far as those add-ons which deliberately facilitate an escalation of privileges. I would think that any postMessage communication between websites and add-ons (as is already a part of your SDK API) is likely to be used to provide some form of escalation of privileges, so if you are going to be policing this, I hope you can define a consistent (and hopefully not unduly constraining) policy in this regard.

The web is moving forward with even more dangerous capabilities like user-approved Geo-location, so I hope that efforts for minimization of risk will not unduly hold back an increase of opportunities.

[1] https://github.com/brettz9/asyouwish/
[2] https://github.com/brettz9/webappfind

ert...@gmail.com

unread,
Oct 31, 2013, 9:07:54 PM10/31/13
to
I develop an extension for use on an intranet that has no access to the Internet. Will this affect my extension?

Eric Moore

unread,
Nov 1, 2013, 12:33:26 AM11/1/13
to
On Wednesday, October 30, 2013 5:55:21 PM UTC-4, jorgev wrote:
> Cross posting to dev.planning, where I originally intended this to be.

The proposal describes it as something that would be implemented in Firefox. My concern is what impact this might have on Thunderbird.

1) Would this only be implemented in Firefox or might it actually be implemented in Gecko or some other shared library that forces the Thunderbird developers to support this proposal unless they want to fork and maintain their own version of that component?

2) The "series of automatic malware checks" might block some Thunderbird add-ons because whoever designed the malware checks is focused strictly on Firefox.

3) Mozilla has stopped development of Thunderbird so the development process no longer mimics Firefox's. Thunderbird has only one release a year with new features (implemented by the community) so users are very dependent upon add-ons for new features/functionality. Many popular add-ons haven't been updated for several years.

This means that any additional obstacles have a disproportionate effect on Thunderbird users. Please don't evaluate the impact just in terms of what it means to Firefox users.

David E. Ross

unread,
Nov 1, 2013, 12:59:32 AM11/1/13
to
What is needed is an end-user option that says "I accept the risk. Let
me install and enable the add-ons I want." There is already a similar
option for about:config.

Steve Fink

unread,
Nov 1, 2013, 2:27:05 AM11/1/13
to dev-pl...@lists.mozilla.org
On 10/31/2013 07:49 AM, J. Ryan Stinnett wrote:
>> This repository won't publish any of the files, and inclusion won't
>> require more than passing a series of automatic malware checks. We will
>> store the files and generated hashes for them.
> This seems like it will just escalate an "arms race" with malware developers. It will simply take them a bit of time to find clever ways around whatever scanning is being done.

Escalate, yes. But we're already in that arms race, and there's no way
to avoid it. We provide the capability to modify the operation of users'
browsers. It is inevitably exploited for both good and harmful purposes.
We do what we can to keep the balance on the good side without choking
it off entirely. Barriers to side-loaded extensions and AMO review are
just two example attempts we've made to swing things towards the good
side. This proposal pushes harder on the good side, at the risk of
choking off the supply (it improves quality at the cost of quantity, and
it's hard to say what the end result is.)

Steve Fink

unread,
Nov 1, 2013, 2:55:44 AM11/1/13
to dev-pl...@lists.mozilla.org
This system involves permanently storing IP addresses and the full text
of all addon files. That's a dangerous pile of data to be sitting on. Do
we want to handle subpoenas for IP addresses known to belong to
organizations and individuals under investigation? Do we want to risk a
security breach exposing a whole lot of data that doesn't belong to us?

Some organizations will have policies that forbid exposing these data
(in particular, the addon files) to external third parties, no matter
how convincingly they promise to take care of them. We would lose those
users unless we came up with a better mechanism for local registration.
But such a mechanism might defeat part of the desired benefits of this
proposal (eg having better contact info for addon authors.)

I don't know what the automatic malware analysis is, but every similar
analysis that I've ever worked on has required constant change,
especially when it's exposed as a tool in the arms race against people
motivated to subvert it (eg malware authors.) I'm skeptical that we
could rerun the analysis on the entire registry, every time the analysis
changes. The registry will at least have to track what version of the
analysis was run for a given entry, and the browser UI would need to
accept a registered addon becoming unregistered at any time.

The proposal would slow down startup, since all enabled addons will need
to be hashed before loading them "for real". (And those hashes would
need to be looked up in a local cache of the registry, assuming you
don't want to wait on a network hit during startup.)

An alternative to consider would be to run the analysis on the client
side. The code for the analysis could be updated with every Firefox
release or more frequently. The perf hit could be mitigated by running
the analysis for popular addons on the AMO side, and using the registry
sketched out in the proposal for those. Sure, malware could subvert the
local analysis, but it could just as easily lie about the content hash
used to query the registry. (The client side would still probably want
to cache the analysis results across browser restarts, so it'd still
need to checksum the file and look up the checksum in a local registry
to avoid redoing the analysis during startup.)
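A local cache along the lines Steve sketches might look roughly like this: remember the hash per file, keyed on size and mtime, so unchanged add-ons are not re-hashed (or re-queried) at every startup. The cache location and JSON format are assumptions for illustration.

# Rough sketch of a local checksum cache: skip re-hashing (and re-querying)
# add-ons whose files haven't changed on disk since the last check.
import hashlib
import json
import os

CACHE_PATH = os.path.expanduser("~/.addon-check-cache.json")  # placeholder

def load_cache():
    try:
        with open(CACHE_PATH) as f:
            return json.load(f)
    except (OSError, ValueError):
        return {}

def cached_sha256(path, cache):
    st = os.stat(path)
    entry = cache.get(path)
    if entry and entry["size"] == st.st_size and entry["mtime"] == st.st_mtime:
        return entry["sha256"]  # unchanged since last run: reuse stored hash
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    cache[path] = {"size": st.st_size, "mtime": st.st_mtime,
                   "sha256": digest.hexdigest()}
    return cache[path]["sha256"]

def save_cache(cache):
    with open(CACHE_PATH, "w") as f:
        json.dump(cache, f)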

Gijs Kruitbosch

unread,
Nov 1, 2013, 4:29:33 AM11/1/13
to Jorge Villalobos
Yes, but to shoot my own plan down a little bit, I'm not sure how much
that gets you in the malware department. Getting a cert that lets you
sign packages is just a question of money, not of the reputability of
your business. On the other hand, it *would* make it harder to have
random IDs for signed add-ons, as you'd need to sign each one when you
change the ID (as that'd change the file hash and thus the signature).

We could also consider adding blocklist features that block based on the
signature/cert chain used, so that even if malware providers obtain a
cert, we could decide to block add-ons signed with that cert.

~ Gijs

Philip Chee

unread,
Nov 1, 2013, 8:19:48 AM11/1/13
to
On 01/11/2013 01:06, Jorge Villalobos wrote:

> The proposal mentions a way to bypass checks for add-ons that are under
> development, and possibly an add-on ID whitelist. It's also possible to
> just block the API calls, since network errors won't prevent add-ons
> from being installed.
>
>> This appears to be a total reversal of past Mozilla philosophy, which
>> not only encouraged the development of extensions but also strongly
>> advocated the use of extensions where users wanted features that the
>> developers were not interested in providing.
>
> We're adding a malware detection step in the development process. I
> don't think that's a complete reversal. What would be a complete
> reversal in our philosophy would be to only allow AMO add-ons, and
> that's something I've been fighting for years so it doesn't happen.
> However, there are real problems and risks in the way we currently deal with
> add-ons, which is why we're putting this proposal forward. This is a
> middle ground where we can have some control over malware distribution
> while still allowing developers to create whatever they want (that isn't
> malicious) outside of AMO.

Here is my usecase:

I run http://xsidebar.mozdev.org/modifiedmisc.html

This is a collection of Firefox and Thunderbird extensions that I have
modified to work with SeaMonkey.

These extensions fall into various categories:

1. Orphaned extensions that I have been keeping alive by updating the
code. The original authors aren't around any more, and even if they are
they probably aren't interested in any more involvement with Mozilla.
These extensions may or may not be registered with AMO. How would I get
AMO to accept my mods?

2. Active extensions from authors who aren't interested in supporting
SeaMonkey - as is their right. Assuming that these authors are still
using AMO, how do I get my forks registered? Without changing the
extension ID.

An additional usecase:

3. Extensions that aren't on AMO (due to some disagreements with AMO
policies or some other dispute) but are hosted on mozdev, or
extensions-mirror, or code.google.com or github, or bitbucket, or
sourceforge, or the author's blog. If these authors aren't interested in
involvement with AMO, forcing them to register their addons is just
going to make them even more annoyed with AMO.

You have a problem but your proposed cure is worse than the disease.
Stop listening to the echo chamber and pay more attention to extension
developers please.

Phil

--
Philip Chee <phi...@aleytys.pc.my>, <phili...@gmail.com>
http://flashblock.mozdev.org/ http://xsidebar.mozdev.org
Guard us from the she-wolf and the wolf, and guard us from the thief,
oh Night, and so be good for us to pass.

Henri Sivonen

unread,
Nov 1, 2013, 8:42:23 AM11/1/13
to dev-pl...@lists.mozilla.org
On Wed, Oct 30, 2013 at 11:42 PM, Jorge Villalobos <jo...@mozilla.com> wrote:
> As many of you know, the Add-ons Team, User Advocacy Team, Firefox Team
> and others have been collaborating for over a year in a project called
> Squeaky [1].

(Actually, many of us probably didn't know.)

> Part of our work consists of pushing forward improvements in Firefox
> that we think will significantly achieve our goals, which is why I'm
> submitting this spec for discussion:
>
> https://docs.google.com/document/d/1SZx7NlaMeFxA55-u8blvgCsPIl041xaJO5YLdu6HyOk/edit?usp=sharing
...
> We believe this strikes the right balance between a completely closed
> system (where only AMO add-ons are allowed) and the completely open but
> risky system we currently have in place.

I disagree on the point of right balance. I think this is at the same
time too Draconian and too lenient.

I think this is too Draconian, because it prevents organizations
("enterprises") from deploying add-ons within the organization without
having to coordinate with an outside entity. Based on what Mike Kaply
said on the add-on experience mailing list, I believe this would be a
problem for real organizations that deploy Firefox today. I think it
would be a big mistake to add another round of fuel to the meme that
Mozilla is hostile to the enterprise sector.

Undermining enterprise deployment undermines the Mozillians who've
fought hard to be able to use and let others use Firefox in their
workplaces. In general, the home use sector and the enterprise sector
spill over in both directions: People who are happy with Firefox at
home want to use it at work and will be less happy if they can't. On
the other hand, if people are forced to use something other than
Firefox at work, they might get accustomed to it or even like it and
move to the same non-Firefox browser at home.

So I think it would be a very bad idea to proceed with this plan
without solving the enterprise deployment concerns first. (That is,
proceeding and maybe solving the enterprise concerns later would mean
the damage would be done by the time the solution arrives.)

On the other hand, I think this proposal is too lenient, because it
wouldn't solve the problem of search hijacks and unwanted toolbars. The
mockups for this project suggest that search hijacks and unwanted
toolbars could still be registered.
(https://wiki.mozilla.org/Add-ons/FileRegistration/Mockups)

That is, it seems to me that this proposal would have significant
downsides but not be a major win in terms of the most common add-on
annoyances.

On Thu, Oct 31, 2013 at 7:19 PM, Boris Zbarsky <bzba...@mit.edu> wrote:
> Just for what it's worth, Apple already does this by default, starting with
> OS X 10.8.

What Apple is doing by default on OS X is substantially different from
this proposal. By default, Apple requires OS X *developers* to be
registered and requires the developers to sign the apps they
distribute. This means that the apps don't need to be disclosed to
Apple prior to distribution, but ex-post revocation in case of
misbehavior is possible. This proposal involves disclosing the add-ons
to Mozilla prior to distribution.

As far as I can tell, the requirement to disclose add-ons to Mozilla
ex-ante provides Mozilla with the opportunity to run a malware scan
and to put the code on file in case there's a need to study the code
more carefully for the purpose of ex-post policy enforcement.

Are a malware scan and having the code on file really worth the
downsides of the proposed system when the definition of "malware"
doesn't even include the most typical sorts of unwanted add-ons?

--
Henri Sivonen
hsiv...@hsivonen.fi
http://hsivonen.fi/

J. Ryan Stinnett

unread,
Nov 1, 2013, 11:34:08 AM11/1/13
to
Jorge Villalobos wrote:
> However, I think it's unlikely that we just decide not to do anything.
> It will either be this or something similar. The security risks of the
> current system are pretty large.

Perhaps it is because I am newer here, but part of what made it so hard for me to read your initial post was that I personally do not perceive any malware risk, so it came across as many developer restrictions for a threat that did not even seem to be real.

I would suggest adding more information about what Mozilla considers to be "malware" and how prevalent it seems to be today.

Jorge Villalobos wrote:
> If a user installs a file that is not registered, a message will be shown
> explaining that the file can’t be installed for their protection.

It looks as if there is no way to get around the decision received from the registration service once a check has been made. Why is that? Alerting the user that Mozilla considers an add-on to be unsafe seems okay, but it just does not seem possible to truly know whether *each user* would agree the extension is unsafe if they knew what it was doing.

Could there be a button to continue anyway? Or at least a link to some documentation explaining how to white list IDs?

Jorge Villalobos wrote:
> ...you will have the option of just blocking the API in your internal
> network. API failures will allow the add-on to be installed.

Do you plan to publish the host names of the servers used in this process so that they can easily be blocked for people / organizations who choose this option? Perhaps they could be part of the same documentation that I was suggesting above that would explain how to white list.

Jorge Villalobos wrote:
> So, you're suggesting to have signing in addition to the hashes, as a
> way to opt-out?

I do think this is valuable to explore as a lighter-weight solution for any developers that do not feel comfortable handing over their code. As you mentioned earlier, supporting signing *only* seems bad, since it would lead to many broken extensions. But a "dual path" approach that also made signing available as an option seems more developer friendly. I hope that this approach can be evaluated.

- Ryan

Dave Townsend

unread,
Nov 1, 2013, 11:45:47 AM11/1/13
to Eric Moore, dev-pl...@lists.mozilla.org
On Thu, Oct 31, 2013 at 9:33 PM, Eric Moore <erm...@gmail.com> wrote:

> On Wednesday, October 30, 2013 5:55:21 PM UTC-4, jorgev wrote:
> > Cross posting to dev.planning, where I originally intended this to be.
>
> The proposal describes it as something that would be implemented in
> Firefox. My concern is what impact this might have on Thunderbird.
>
> 1) Would this only be implemented in Firefox or might it actually be
> implemented in Gecko or some other shared library that forces the
> Thunderbird developers to support this proposal unless they want to fork
> and maintain their own version of that component?
>

Due to the nature of the code this will certainly be implemented
predominantly in toolkit, but it will be done in a way that allows
applications to opt in to it, similar to other things that we have added
in the past like the sideloading opt-in screen and hotfix support.


> 2) The "series of automatic malware checks" might block some Thunderbird
> add-ons because whoever designed the malware checks is focused strictly on
> Firefox.
>

I don't know the specifics of the checks but I believe it is more about
checking for the existence of actual binary malware that the add-on might
try to execute. This would apply equally to Thunderbird add-ons as well as
Firefox add-ons. Regardless, I'm sure the AMO team will be open to making
changes if this becomes a problem.

Dave Townsend

unread,
Nov 1, 2013, 11:55:43 AM11/1/13
to Steve Fink, dev-pl...@lists.mozilla.org
On Thu, Oct 31, 2013 at 11:55 PM, Steve Fink <sf...@mozilla.com> wrote:

> The proposal would slow down startup, since all enabled addons will need
> to be hashed before loading them "for real". (And those hashes would
> need to be looked up in a local cache of the registry, assuming you
> don't want to wait on a network hit during startup.)
>

The intention is not to hash the add-ons during startup. We've talked about
the option of hashing some time later to verify that all the installed
add-ons are still valid. This does create a way for an application on the
user's system to get an unknown add-on into Firefox but that seems like an
ok trade-off against saving startup time.


> An alternative to consider would be to run the analysis on the client
> side. The code for the analysis could be updated with every Firefox
> release or more frequently. The perf hit could be mitigated by running
> the analysis for popular addons on the AMO side, and using the registry
> sketched out in the proposal for those. Sure, malware could subvert the
> local analysis, but it could just as easily lie about the content hash
> used to query the registry. (The client side would still probably want
> to cache the analysis results across browser restarts, so it'd still
> need to checksum the file and look up the checksum in a local registry
> to avoid redoing the analysis during startup.)
>

The malware check itself is not the most important part of this proposal.
Being able to get information about and contact the developers of
potentially harmful extensions and blocking the ability of malware to
create per-install versions of their extensions that would be difficult for
us to block is what we're trying to solve here. Local file checks just
don't help.

Dave Townsend

unread,
Nov 1, 2013, 12:04:31 PM11/1/13
to Henri Sivonen, dev-pl...@lists.mozilla.org
Earlier comments pointed out that enterprises can just block access to the
API which would effectively turn off this feature. Do you think that isn't
good enough to solve the problem?


> On the other hand, I think this proposal is too lenient, because it
> wouldn't solve the problem of search hijacks an unwanted toolbars. The
> mockups for this project suggest that search hijacks and unwanted
> toolbars could still be registered.
> (https://wiki.mozilla.org/Add-ons/FileRegistration/Mockups)
>
> That is, it seems to me that this proposal would have significant
> downsides but not be a major win in terms of the most common add-on
> annoyances.
>

This is a general tool to help combat the prevalence of malware add-ons by
blocking the install of add-ons that we don't know about and giving us the
ability to look at add-ons after the fact to evaluate them for potential
blocklisting or developer outreach. That doesn't mean we can't also work on
specific types of malware. We have already added some tools to mitigate
against search engine hijacking in Firefox.

and...@ducker.org.uk

unread,
Nov 1, 2013, 12:18:14 PM11/1/13
to
On Friday, 1 November 2013 16:04:31 UTC, Dave Townsend wrote:
> Earlier comments pointed out that enterprises can just block access to the
> API which would effectively turn off this feature. Do you think that isn't
> good enough to solve the problem?

I'd say so. You're asking people to add rules to their firewall in order to solve an application usability issue. This means that (a) there's going to be a pain point in tracking down the firewall team and getting them to make a change in the first place (always frustrating) and then (b) when someone is cleaning up the firewall and can't work out why a rule is there, and removes it, suddenly your internal-only Firefox addons stop working, and nobody can work out why!

This should be something that users can override. If my Android phone can let me install applications from anywhere on earth, once I tell it I understand the risks, then my browser should be able to do likewise. Locking me into a central, single, blessed repository, that I cannot override as a user, is what put me off ever getting an iPhone. This is _not_ openness.

Andy

Jorge Villalobos

unread,
Nov 1, 2013, 1:24:04 PM11/1/13
to
On 11/1/13 6:19 AM, Philip Chee wrote:
> 1. Orphaned extensions that I have been keeping alive by updating the
> code. The original authors aren't around any more, and even if they are
> they probably aren't interested in any more involvement with Mozilla.
> These extensions may or may not be registered with AMO. How would I get
> AMO to accept my mods?

If they're abandoned you can just register them without changing their
ID. If the original developer wanted to register them and complained to
us, we would need to give him/her the rights to the ID, and then we
would be in case (2).

> 2. Active extensions from authors who aren't interested in supporting
> SeaMonkey - as it their right. Assuming that these authors are still
> using AMO, how do I get my forks registered? Without changing the
> extension ID.

You can ask the developers of the original add-on to add you as a
(possibly hidden) owner or developer of the extension, so you can
register your versions under their ID. They wouldn't need to be hosted
on AMO in order to be registered.

If the developer refuses, then you would have no choice other than using
a different ID.

> An additional usecase:
>
> 3. Extensions that aren't on AMO (due to some disagreements with AMO
> policies or some other dispute) but are hosted on mozdev, or
> extensions-mirror, or code.google.com or github, or bitbucket, or
> sourceforge, or the authors blog. If these authors aren't interested in
> involvement with AMO, forcing them to register their addons are just
> going to make them even more annoyed with AMO.

Yes, that's a problem we can't really avoid if we want to make any
useful changes. This system is designed to minimize the amount of
contact developers have with AMO and the AMO-specific policies that are
usually the reason they host their add-ons elsewhere.

> You have a problem but your proposed cure is worse than the disease.
> Stop listening to the echo chamber and pay more attention to extension
> developers please.

We're in a position where something needs to change. If you have any
alternative to offer other than "let's keep things like they are", this
is the time and place to do it. We're well aware of the impact this will
have on developers, and we're doing our best to keep it to a minimum.

Jorge

Steve Fink

unread,
Nov 1, 2013, 1:30:50 PM11/1/13
to Dave Townsend, dev-pl...@lists.mozilla.org
On 11/01/2013 08:55 AM, Dave Townsend wrote:
> On Thu, Oct 31, 2013 at 11:55 PM, Steve Fink <sf...@mozilla.com> wrote:
>> The proposal would slow down startup, since all enabled addons will need
>> to be hashed before loading them "for real". (And those hashes would
>> need to be looked up in a local cache of the registry, assuming you
>> don't want to wait on a network hit during startup.)
>
> The intention is not to hash the add-ons during startup. We've talked
> about the option of hashing some time later to verify that all the
> installed add-ons are still valid. This does create a way for an
> application on the user's system to get an unknown add-on into Firefox
> but that seems like an ok trade-off against saving startup time.

So as a malware author, I would need to be sure that on first run my
addon disables the check either by adding a whitelist pref, preventing
the API from being called, or blocking that network address; or that it
does its damage then uninstalls itself (or replaces itself with a
registered addon). That's still *some* additional protection, I suppose,
since we can at least detect most of those (at least until the next turn
in the arms race game.)

>
> An alternative to consider would be to run the analysis on the client
> side. The code for the analysis could be updated with every Firefox
> release or more frequently. The perf hit could be mitigated by running
> the analysis for popular addons on the AMO side, and using the registry
> sketched out in the proposal for those. Sure, malware could subvert the
> local analysis, but it could just as easily lie about the content hash
> used to query the registry. (The client side would still probably want
> to cache the analysis results across browser restarts, so it'd still
> need to checksum the file and look up the checksum in a local registry
> to avoid redoing the analysis during startup.)
>
>
> The malware check itself is not the most important part of this
> proposal. Being able to get information about and contact the
> developers of potentially harmful extensions and blocking the ability
> of malware to create per-install versions of their extensions that
> would be difficult for us to block is what we're trying to solve here.
> Local file checks just don't help.

The benefits of client-side analysis are (1) solving the scaling problem
and (2) providing a fallback for (good) addons that are intentionally
absent from the registry. It sounds like you don't really care about #1,
presumably because you'd be fine with the registry containing lots of
"outdated" entries (as in, entries only validated against old malware
lists.) This makes sense if the primary purposes are contact info and
blacklist-by-default. I still think that #2 is a major deal, because I
imagine that there are many many situations where uploading the full
code forever and having your IP stored forever are unwanted.

#2 is pretty antagonistic to the contact info goal. It still feels to me
like there ought to be some way to gain more contact info without doing
the full current proposal. For example, we could have a registry in
addition to or instead of the proposed registry, where nothing is stored
permanently but the author must respond to the given contact email
address before their hash is added to the registry.

To be clear, I still feel like my objections to the current proposal are
each individually enough to kill it in its current form, and I'd like to
see a response to them. I imagine the people who have put this proposal
together disagree with me. I am personally very sympathetic to attempts
to tighten up the addon ecosystem somewhat, because I get the general
anecdotal impression that unwanted addons are a serious problem -- and
so too is the problem of bad behavior inadvertently triggered by
*wanted* addons, which the contact info aspect of this proposal would
help to address. (And I'd love to have the source code of a large
percentage of the addons used in the field, so that we could check
whether we can remove problematic features without endless deprecation
cycles. But I don't think that's worth the cost of requiring all the
source to be registered.)

I'm also feeling like addon acceptance should be modular, so that the
Mozilla whitelist is one source of known-acceptable addons but
individuals and enterprises could add additional sources of truth
easily. For example, if it were straightforward to set up additional
local whitelist servers. Perhaps the community could stand up additional
ones for specific purposes. Obviously, you'd need to make it hard for
unwanted whitelists to get installed.

Matt Basta

unread,
Nov 1, 2013, 1:48:44 PM11/1/13
to Jorge Villalobos, dev-pl...@lists.mozilla.org
To fork the discussion on this towards the technical feasibility (let's assume we decided to go down this path):

1. Storing the files would be a daunting task.

Each file would need to be stored in some sort of key-value database which we don't presently have infrastructure for. If we use sha256 (as the PRD recommends), there's a very likely chance that eventually we'll hit a collision. That means we'd need to be able to do lookups by add-on ID + file path, hash + add-on ID + file path, etc. Every means of lookup would require an additional index (if we use an RDBMS like MySQL, which is what AMO uses now). We could conceivably start using HBase, but I know of only one team that uses it now (perhaps not anymore?) and they don't like it at all.

The infrastructure aside, this would require a non-trivial amount of space. If we're storing hashes for every file for all of time, that's going to get REALLY big very quickly. I don't think it's improbable that this would need a dedicated cluster to be able to store that much data.

2. The bandwidth required would be non-trivial.

Many add-ons contain large numbers of files. We'd need to register every file (you could just rename "malicious_file.js" to "completely_innocent.jpg" and `eval()` it). On the client side, that means pinging the API with hundreds, if not thousands (tens of thousands? more?) of hashes on a regular basis. Hashes are mostly random, so they won't compress well. This means that someone with a few dozen completely legitimate add-ons might get swamped with these multi-megabyte requests to our servers.

What happens if that request fails? We know it will, that's why we've started distributing Firefox with a CDN. The more you're pumping down the wire, the more likely the system is to fail. This issue could be solved by chunking the request, but that does not address the bandwidth issues.

3. Hackability

Sha256 is obviously strong, but in this day and age (especially going forward) it's not inconceivable that an attacker could buy a crapload of Butterfly Labs boxes (or botnet time) and crunch out a malicious file that has the same hash as jQuery or another very common file in add-ons. Countering this would require changes to Firefox to detect, but by the time we respond to the issue, the malicious add-on could have done any number of nefarious things to block whatever fix we deploy.

We could solve this by using a hashing algorithm with less probability of collisions and greater computational requirements, but that won't be a completely future-proof solution. Furthermore, the greater the size of the hash, the harder the above two issues are to solve.

4. Sheer brute force feasibility.

What's to stop an attacker from doing the following:

- Buy time on a botnet
- Create hundreds of thousands of permutations of an obfuscated malicious file and submit it to the automated API with different IDs (from different IPs)
- Distribute a unique piece of add-on malware to each victim

I can't see how we'd be able to seek out and/or blacklist a massive amount of malicious scripts, especially if they're mixed in with a massive amount of legitimate scripts. Munging a piece of JavaScript AST to be undetectably unique is not a hard problem to solve.
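Regarding point 2 above, the request-size and failure concerns could be reduced (though not eliminated) by batching hash lookups into chunks. A rough sketch follows; the bulk-lookup endpoint, payload shape, and chunk size are assumptions, not a specified API.

# Sketch of chunked bulk hash lookups (point 2 above).
import json
import urllib.request

BULK_URL = "https://registration.example.invalid/api/v1/hashes/check"  # placeholder

def check_hashes(hashes, chunk_size=500):
    # Return the subset of hashes the service reports as registered.
    # One failed chunk shouldn't invalidate the others; the proposal
    # treats API failures as non-blocking anyway.
    registered = set()
    for start in range(0, len(hashes), chunk_size):
        chunk = hashes[start:start + chunk_size]
        body = json.dumps({"hashes": chunk}).encode("utf-8")
        req = urllib.request.Request(
            BULK_URL, data=body,
            headers={"Content-Type": "application/json"}, method="POST")
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                registered.update(json.load(resp).get("registered", []))
        except OSError:
            continue
    return registered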

Jorge Villalobos

unread,
Nov 1, 2013, 1:51:58 PM11/1/13
to
Having a profile-level preference would make this trivially easy to
bypass, and we already have an installation warning that clearly is not
sufficient to deter users from installing malware.

There will be ways to disable this feature, as explained in the spec,
but they won't be as easy to get to.

Jorge

Jorge Villalobos

unread,
Nov 1, 2013, 1:57:11 PM11/1/13
to
I think that's a good idea, and I'll give it some more thought. One
problem is that we need to set some bar of legitimacy for the developers
we're handing out the certs to, since it needs to be much higher than
the one we have for registration. Otherwise we would be back at where we
are now. It should be possible to design it in a way we can drop
specific certs if we learn they are being used for malicious purposes.

There's also a problem of perception, where this would look like big
businesses would be able to "buy" an exception to the rules that apply
to the majority. We would need to make the scope of this very clear to
avoid confusion.

Thanks!

Jorge

Dave Townsend

unread,
Nov 1, 2013, 1:57:27 PM11/1/13
to Steve Fink, dev-pl...@lists.mozilla.org
On Fri, Nov 1, 2013 at 10:30 AM, Steve Fink <sf...@mozilla.com> wrote:

> On 11/01/2013 08:55 AM, Dave Townsend wrote:
>
> On Thu, Oct 31, 2013 at 11:55 PM, Steve Fink <sf...@mozilla.com> wrote:
>
>> The proposal would slow down startup, since all enabled addons will need
>> to be hashed before loading them "for real". (And those hashes would
>> need to be looked up in a local cache of the registry, assuming you
>> don't want to wait on a network hit during startup.)
>>
>
> The intention is not to hash the add-ons during startup. We've talked
> about the option of hashing some time later to verify that all the
> installed add-ons are still valid. This does create a way for an
> application on the user's system to get an unknown add-on into Firefox but
> that seems like an ok trade-off against saving startup time.
>
>
> So as a malware author, I would need to be sure that on first run my addon
> disables the check either by adding a whitelist pref, preventing the API
> from being called, or blocking that network address; or that it does its
> damage then uninstalls itself (or replaces itself with a registered addon).
> That's still *some* additional protection, I suppose, since we can at least
> detect most of those (at least until the next turn in the arms race game.)
>

You have to get your add-on installed in the first place, of course, but
it's true that once you have code running on the system, circumventing
this wouldn't be all that difficult.
It isn't just blacklist-by-default; it also gives us a far more flexible
blacklisting capability than we have now, allowing us to quickly target
entire classes of add-ons if needed.

#2 is pretty antagonistic to the contact info goal. It still feels to me
> like there ought to be some way to gain more contact info without doing the
> full current proposal. For example, we could have a registry in addition to
> or instead of the proposed registry, where nothing is stored permanently
> but the author must respond to the given contact email address before their
> hash is added to the registry.
>

I know we've gone back and forth over whether to retain copies of the files
or not. The benefits are clear: we get the ability to re-check existing
add-ons for new forms of malware as they are found. It could also give
Mozilla a lot more information than we currently have about how prevalent
API use is in extensions in the wild, to help us understand the impact of
code changes. Right now we have to rely only on the add-ons on AMO, but
this isn't the full picture.


>
> To be clear, I still feel like my objections to the current proposal are
> each individually enough to kill it in its current form, and I'd like to
> see a response to them. I imagine the people who have put this proposal
> together disagree with me. I am personally very sympathetic to attempts to
> tighten up the addon ecosystem somewhat, because I get the general
> anecdotal impression that unwanted addons are a serious problem -- and so
> too is the problem of bad behavior inadvertently triggered by *wanted*
> addons, which the contact info aspect of this proposal would help to
> address. (And I'd love to have the source code of a large percentage of the
> addons used in the field, so that we could check whether we can remove
> problematic features without endless deprecation cycles. But I don't think
> that's worth the cost of requiring all the source to be registered.)
>

The enterprise issue is certainly a problem that might warrant a change to
this proposal but for others I just don't see that the cost of having to
create an AMO account and upload a copy of your add-on (which should be
doable with a simple command line) is in any way onerous and the potential
benefits for users far outweigh it.


>
> I'm also feeling like addon acceptance should be modular, so that the
> Mozilla whitelist is one source of known-acceptable addons but individuals
> and enterprises could add additional sources of truth easily. For example,
> if it were straightforward to set up additional local whitelist servers.
> Perhaps the community could stand up additional ones for specific purposes.
> Obviously, you'd need to make it hard for unwanted whitelists to get
> installed.
>
>
I don't think that that is something enterprises would want to spend time
doing, do you?

Dave Townsend

unread,
Nov 1, 2013, 2:02:56 PM11/1/13
to Matt Basta, dev-pl...@lists.mozilla.org, Jorge Villalobos
On Fri, Nov 1, 2013 at 10:48 AM, Matt Basta <mba...@mozilla.com> wrote:

> To fork the discussion on this towards the technical feasibility (let's
> assume we decided to go down this path):
>
> 1. Storing the files would be a daunting task.
>
> Each file would need to be stored in some sort of key-value database which
> we don't presently have infrastructure for. If we use sha256 (as the PRD
> recommends), there's a very likely chance that eventually we'll hit a
> collision. That means we'd need to be able to do lookups by add-on ID +
> file path, hash + add-on ID + file path, etc. Every means of lookup would
> require an additional index (if we use an RDBMS like MySQL, which is what
> AMO uses now). We could conceivably start using HBase, but I know of only
> one team that uses it now (perhaps not anymore?) and they don't like it at
> all.
>
> The infrastructure aside, this would require a non-trivial amount of
> space. If we're storing hashes for every file for all of time, that's going
> to get REALLY big very quickly. I don't think it's improbable that this
> would need a dedicated cluster to be able to store that much data.
>

I'm not sure why we wouldn't just store the files in a filesystem rather
than a database.


> 2. The bandwidth required would be non-trivial.
>
> Many add-ons contain large numbers of files. We'd need to register every
> file (you could just rename "malicious_file.js" to
> "completely_innocent.jpg" and `eval()` it). On the client side, that means
> pinging the API with hundreds, if not thousands (tens of thousands? more?)
> of hashes on a regular basis. Hashes are mostly random, so they won't
> compress well. This means that someone with a few dozen completely
> legitimate add-ons might get swamped with these multi-megabyte requests to
> our servers.
>

To be clear, this proposal doesn't talk about hashing every file; it's a
hash for the whole add-on, so if you have 10 add-ons installed it would
ping for 10 hashes.
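
For concreteness, a minimal sketch (in Python, purely illustrative; the
directory layout and function names are my own assumptions, not anything
from the PRD or Firefox itself) of what "one hash per installed add-on"
looks like:

# Purely illustrative sketch, not Firefox code: one SHA-256 digest per
# installed XPI, so 10 add-ons means 10 hashes.
import hashlib
from pathlib import Path

def hash_xpi(path, chunk_size=64 * 1024):
    """Return the hex SHA-256 digest of a whole add-on package."""
    digest = hashlib.sha256()
    with open(path, "rb") as xpi:
        for chunk in iter(lambda: xpi.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def hashes_for_profile(extensions_dir):
    """One hash per installed add-on, keyed by file name."""
    return {p.name: hash_xpi(p) for p in Path(extensions_dir).glob("*.xpi")}

So a profile with a few dozen add-ons sends a few dozen 64-character
strings, not thousands of per-file hashes.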


> What happens if that request fails? We know it will, that's why we've
> started distributing Firefox with a CDN. The more you're pumping down the
> wire, the more likely the system is to fail. This issue could be solved by
> chunking the request, but that does not address the bandwidth issues.
>

On failure, or when offline, the client assumes acceptance.
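
A minimal sketch of that fail-open behaviour (again purely illustrative;
the endpoint URL and the response format are invented for the example):

# Purely illustrative sketch: ask a hypothetical registration endpoint
# about the installed hashes, and fail open (allow) on any network error.
import requests  # any HTTP client would do; this one is just for the sketch

REGISTRY_URL = "https://example.invalid/addon-registry/check"  # invented

def check_registered(hashes, timeout=10):
    """Return {hash: allowed}. Offline or on error, everything is allowed."""
    try:
        response = requests.post(REGISTRY_URL, json={"hashes": hashes},
                                 timeout=timeout)
        response.raise_for_status()
        registered = set(response.json().get("registered", []))
        return {h: h in registered for h in hashes}
    except requests.RequestException:
        # Blocked, offline, or server failure: assume acceptance, per the spec.
        return {h: True for h in hashes}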


> 3. Hackability
>
> Sha256 is obviously strong, but in this day and age (especially going
> forward) it's not inconceivable that an attacker could buy a crapload of
> Butterfly Labs boxes (or botnet time) and crunch out a malicious file that
> has the same hash as jQuery or another very common file in add-ons.
> Countering this would require changes to Firefox to detect, but by the time
> we respond to the issue, the malicious add-on could have done any number of
> nefarious things to block whatever fix we deploy.
>
> We could solve this by using a hashing algorithm with less probability of
> collisions and greater computational requirements, but that won't be a
> completely future-proof solution. Furthermore, the greater the size of the
> hash, the harder the above two issues are to solve.
>

Do we think people are really going to go to those lengths just to get an
add-on installed into Firefox? There are probably easier ways to attack a
user's machine.

4. Sheer brute force feasibility.
>
> What's to stop an attacker from doing the following:
>
> - Buy time on a botnet
> - Create hundreds of thousands of permutations of an obfuscated malicious
> file and submit it to the automated API with different IDs (from different
> IPs)
>

I believe there will be a limit on how many add-ons a user can register
over a time period.
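
Something as simple as a sliding-window counter per account would do. A
sketch (the window and the cap below are invented numbers, not anything
from the spec):

# Purely illustrative sketch of a per-account registration limit.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 24 * 60 * 60   # invented: one day
MAX_REGISTRATIONS = 20          # invented: cap per account per window

_recent = defaultdict(deque)    # account id -> timestamps of registrations

def may_register(account_id, now=None):
    now = time.time() if now is None else now
    window = _recent[account_id]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REGISTRATIONS:
        return False
    window.append(now)
    return True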


> - Distribute a unique piece of add-on malware to each victim
>
> I can't see how we'd be able to seek out and/or blacklist a massive amount
> of malicious scripts, especially if they're mixed in with a massive amount
> of legitimate scripts. Munging a piece of JavaScript AST to be
> undetectably unique is not a hard problem to solve.
>
>
> ----- Original Message -----
> From: "Jorge Villalobos" <jo...@mozilla.com>
> To: dev-pl...@lists.mozilla.org
> Sent: Friday, November 1, 2013 10:24:04 AM
> Subject: Re: Add-on File Registration PRD
>

Steve Fink

unread,
Nov 1, 2013, 2:02:22 PM11/1/13
to dev-pl...@lists.mozilla.org
On 11/01/2013 10:48 AM, Matt Basta wrote:
> 4. Sheer brute force feasibility.
>
> What's to stop an attacker from doing the following:
>
> - Buy time on a botnet
> - Create hundreds of thousands of permutations of an obfuscated malicious file and submit it to the automated API with different IDs (from different IPs)
> - Distribute a unique piece of add-on malware to each victim
>
> I can't see how we'd be able to seek out and/or blacklist a massive amount of malicious scripts, especially if they're mixed in with a massive amount of legitimate scripts. Munging a piece of JavaScript AST to be undetectably unique is not a hard problem to solve.
>
I don't understand. The proposal is a whitelist, not a blacklist. So if
we can detect the malware, none of these hashes would get registered. If
we can't detect the malware, there's no need to generate permutations of
it. (Well, unless you're worried that we will later detect the malware
and remove that hash.)

Are you just saying that the registry won't solve the malware problem
because the analysis won't work? That's believable.

Jorge Villalobos

unread,
Nov 1, 2013, 2:04:21 PM11/1/13
to
Your add-on won't be flagged as malware.

I can't be very specific about the rules that we will use to qualify
malware, as they aren't fully written yet, but they would initially
target the forms of malware we know about:
* Impersonation of known developers. There are many malware add-ons that
claim to be built by Adobe or Mozilla, so they don't look suspicious in
the Add-ons Manager.
* Add-ons that through remote scripts hijack Facebook and other social
media accounts. These are generally built as Greasemonkey scripts and
follow certain patterns we should be able to detect.

We're not planning on using this to restrict security-sensitive APIs or
enforce other quality requirements on existing add-ons. Non-AMO
developers should be able to experiment and create potential footguns,
as long as they are properly presented to potential users.

In a nutshell, the malware checks will be aimed at add-ons with obvious
malicious intent.

Jorge

Jorge Villalobos

unread,
Nov 1, 2013, 2:04:56 PM11/1/13
to
On 10/31/13 7:07 PM, ert...@gmail.com wrote:
> I develop an extension for use on an intranet that has no access to the Internet. Will this affect my extension?
>

No. Failure to contact the API will allow the installation to happen.

Jorge

Jorge Villalobos

unread,
Nov 1, 2013, 2:11:48 PM11/1/13
to
On 11/1/13 9:45 AM, Dave Townsend wrote:
> On Thu, Oct 31, 2013 at 9:33 PM, Eric Moore <erm...@gmail.com> wrote:
>
>> On Wednesday, October 30, 2013 5:55:21 PM UTC-4, jorgev wrote:
>>> Cross posting to dev.planning, where I originally intended this to be.
>>
>> The proposal describes it as something that would be implemented in
>> Firefox. My concern is what impact this might have on Thunderbird.
>>
>> 1) Would this only be implemented in Firefox or might it actually be
>> implemented in Gecko or some other shared library that forces the
>> Thunderbird developers to support this proposal unless they want to fork
>> and maintain their own version of that component?
>>
>
> Due to the nature of the code this will certainly be implemented
> predominantly in toolkit, but it will be done in a way that allows
> applications to opt in to it, similar to other things that we have added
> in the past like the sideloading opt-in screen and hotfix support.

To add to this, I'm not aware of large malware problems in Thunderbird,
so I suspect that it wouldn't make sense to enable file registration for
this particular application. I can see it applying for SeaMonkey, though.

On AMO the plan is to register all add-ons, regardless of application
compatibility, just in case these applications decide to activate the
client-side portion. So that would cover Thunderbird add-ons on AMO, at
least.

>> 2) The "series of automatic malware checks" might block some Thunderbird
>> add-ons because whoever designed the malware checks is focused strictly on
>> Firefox.
>>
>
> I don't know the specifics of the checks but I believe it is more about
> checking for the existence of actual binary malware that the add-on might
> try to execute. This would apply equally to Thunderbird add-ons as well as
> Firefox add-ons. Regardless I'm sure the AMO team will be open to make
> changes if this became a problem.

The kinds of code checks we do on AMO are generally
application-agnostic. When they are biased towards Firefox, it just means
that the check won't apply to Thunderbird. Since we've been doing these
sorts of checks for a few years without any incidents for Thunderbird
add-ons (that I can recall), I don't expect any big problems to come out
of this new set of checks we'll create. If they do, we'll deal with them,
of course.

Jorge

Matt Basta

unread,
Nov 1, 2013, 2:17:30 PM11/1/13
to Dave Townsend, dev-pl...@lists.mozilla.org, Jorge Villalobos
> Do we think people are really going to go to those lengths just to get an add-on installed into Firefox? There are probably easier ways to attack a user's machine.

If it's possible to crunch numbers a single time and produce an undetectable add-on that we can't block because it matches, say, Firebug's or Adblock's hash, and that can be installed on large numbers of users' machines, attackers are definitely going to do it. Couple that with a DNS attack or a Google bomb to target users looking to install legitimate add-ons and you've just found a sneaky way into users' machines.

We live in an age where leaked passwords hashed and salted using HMAC trigger mayhem; we shouldn't underestimate the cunning of an attacker just because a potential attack sounds implausible.

Steve Fink

unread,
Nov 1, 2013, 2:19:01 PM11/1/13
to Dave Townsend, dev-pl...@lists.mozilla.org
Right, but whether the analysis is server-side or client-side doesn't
affect that benefit very much.

>
> #2 is pretty antagonistic to the contact info goal. It still feels
> to me like there ought to be some way to gain more contact info
> without doing the full current proposal. For example, we could
> have a registry in addition to or instead of the proposed
> registry, where nothing is stored permanently but the author must
> respond to the given contact email address before their hash is
> added to the registry.
>
>
> I know we've gone back and forth over whether to retain copies of the
> files or not. The benefits are clear, we get the ability re-check
> existing add-ons for new forms of malware as they are found. It could
> also give Mozilla a lot more information that we currently have about
> how prevalent API use is in extensions in the wild to help us
> understand the impact of code changes. Right now we have to rely only
> on the add-ons on AMO but this isn't the full picture.

Do we know a rough distribution of reasons for why addon authors host
them on non-AMO sites? (With the goal of estimating how many of these
addons would be willing to go through the registry.)

>
>
>
> To be clear, I still feel like my objections to the current
> proposal are each individually enough to kill it in its current
> form, and I'd like to see a response to them. I imagine the people
> who have put this proposal together disagree with me. I am
> personally very sympathetic to attempts to tighten up the addon
> ecosystem somewhat, because I get the general anecdotal impression
> that unwanted addons are a serious problem -- and so too is the
> problem of bad behavior inadvertently triggered by *wanted*
> addons, which the contact info aspect of this proposal would help
> to address. (And I'd love to have the source code of a large
> percentage of the addons used in the field, so that we could check
> whether we can remove problematic features without endless
> deprecation cycles. But I don't think that's worth the cost of
> requiring all the source to be registered.)
>
>
> The enterprise issue is certainly a problem that might warrant a
> change to this proposal but for others I just don't see that the cost
> of having to create an AMO account and upload a copy of your add-on
> (which should be doable with a simple command line) is in any way
> onerous and the potential benefits for users far outweigh it.

My objections were: (1) subpoenas and privacy, (2) organizational
policies against exposing internal code, (3) scaling, and (4) perf. We
were discussing perf (startup costs vs subverting the restrictions by
allowing a first run.) The developer friction point is important, but I
suspect it can be managed adequately.

>
>
>
> I'm also feeling like addon acceptance should be modular, so that
> the Mozilla whitelist is one source of known-acceptable addons but
> individuals and enterprises could add additional sources of truth
> easily. For example, if it were straightforward to set up
> additional local whitelist servers. Perhaps the community could
> stand up additional ones for specific purposes. Obviously, you'd
> need to make it hard for unwanted whitelists to get installed.
>
>
> I don't think that that is something enterprises would want to spend
> time doing, do you?

Depends on how juicy the carrot is. Enterprise IT departments really
don't like fixing problems caused by their users installing malware, and
having the Mozilla registry available to help with the problem is a big
plus from that perspective. If this just involved hosting a JSON file or
something, I could imagine enterprises very much liking this. (Though
perhaps it'd need to be integrated with whatever Enterprise IT
Architecture Synergy Management Command and Control Server bullshit they
already use.)

Jorge Villalobos

unread,
Nov 1, 2013, 2:28:31 PM11/1/13
to
One of the most important goals for the admin tools in this system is to
be able to monitor and deter this sort of spamming. We should be able to
detect mass registrations coming from a specific account or IP address /
IP range in order to look into it and see if it's legitimate or not.
Having the malicious files will also allow us to add checks for them so
similar patterns aren't allowed in the future, hopefully making Firefox
a much less attractive attack vector for malware authors.
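
As a sketch of the kind of check the admin tools could run (the threshold
and data shapes are invented for illustration, and this assumes plain
IPv4 addresses):

# Purely illustrative sketch: flag accounts or IPv4 /24 ranges responsible
# for an unusually large share of recent registrations.
from collections import Counter

def suspicious_sources(recent_registrations, threshold=100):
    """recent_registrations: iterable of (ip_address, account_id) pairs."""
    regs = list(recent_registrations)
    by_account = Counter(account for _, account in regs)
    by_range = Counter(ip.rsplit(".", 1)[0] + ".0/24" for ip, _ in regs)
    flagged_accounts = [a for a, n in by_account.items() if n >= threshold]
    flagged_ranges = [r for r, n in by_range.items() if n >= threshold]
    return flagged_accounts, flagged_ranges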

Mike Connor

unread,
Nov 1, 2013, 2:34:17 PM11/1/13
to Jorge Villalobos, dev-planning@lists.mozilla.org planning

On Nov 1, 2013, at 1:57 PM, Jorge Villalobos <jo...@mozilla.com> wrote:

> I think that's a good idea, and I'll give it some more thought. One
> problem is that we need to set some bar of legitimacy for the developers
> we're handing out the certs to, since it needs to be much higher than
> the one we have for registration. Otherwise we would be back at where we
> are now. It should be possible to design it in a way we can drop
> specific certs if we learn they are being used for malicious purposes.

That’s how most of our crypto infrastructure works. We don’t need to reinvent the wheel here, we can just block specific certs, and (as a more nuclear option) distrust any roots that repeatedly provide certs to bad actors. There’s no need for us to wade into the world of providing code signing certs.

> There's also a problem of perception, where this would look like big
> businesses would be able to "buy" an exception to the rules that apply
> to the majority. We would need to make the scope of this very clear to
> avoid confusion.

I don’t think we should look at the file registration service as a rule, but as a mechanism to comply with a rule that all add-ons must be trackable and blockable through some Mozilla-controllable mechanism. If a file is registered with AMO, we can block/revoke that hash. If it’s a signed XPI, we can block the cert for all installs (if the blocklist isn’t sufficient here). If the point is to protect users and defend against bad actors, I don’t think it’s necessary for us to get all files.

As for an exception to the rules, I’d actually argue that blocking a cert is a much more effective mechanism than blocking a file hash, since I can create infinite free AMO accounts, but every cert that gets picked off costs me a couple of hundred dollars. My prediction is that bad actors will look to trick the automated tests through code obfuscation rather than spend cash on a large set of certs. If that leaves otherwise good actors with the option of buying a cert (or, more likely, using a cert they already have) and having a minimal amount of overhead, so much the better.

— Mike

Jorge Villalobos

unread,
Nov 1, 2013, 2:39:04 PM11/1/13
to
On 11/1/13 12:55 AM, Steve Fink wrote:
> This system involves permanently storing IP addresses and the full text
> of all addon files. That's a dangerous pile of data to be sitting on. Do
> we want to handle subpoenas for IP addresses known to belong to
> organizations and individuals under investigation? Do we want to risk a
> security breach exposing a whole lot of data that doesn't belong to us?

The collection of IP addresses is meant to fight spamming of the system,
which is a scenario that Matt Basta just brought up in a different
response. We want to be able to detect multiple account registrations and
multiple file registrations from the same location, in order to deter
system abuse.

This doesn't require us to maintain IP address information permanently,
so we can work out a retention policy that allows us to continue to
benefit from its advantages while storing IP addresses for as little
time as possible. The document doesn't contemplate this, so thank you
for bringing it up.
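
Such a retention policy could be as simple as scrubbing the stored address
after a fixed window while keeping the rest of the registration record. A
sketch, with an invented 30-day window:

# Purely illustrative sketch: keep registration records but drop the
# stored IP address once it is older than a retention window.
import datetime

RETENTION = datetime.timedelta(days=30)  # invented retention period

def scrub_old_ips(records, now=None):
    """records: dicts with 'ip' and 'registered_at' (a datetime)."""
    now = now or datetime.datetime.utcnow()
    for record in records:
        if record.get("ip") and now - record["registered_at"] > RETENTION:
            record["ip"] = None  # keep the registration, forget the address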

> Some organizations will have policies that forbid exposing these data
> (in particular, the addon files) to external third parties, no matter
> how convincingly they promise to take care of them. We would lose those
> users unless we came up with a better mechanism for local registration.
> But such a mechanism might defeat part of the desired benefits of this
> proposal (eg having better contact info for addon authors.)

The spec already mentions a few ways for enterprises to bypass the
system if needed, and Gijs proposed an interesting idea where we would
be able to give them signing certs for their internal add-ons. There
will definitely be ways to deal with this for enterprises, we haven't
ignored their requirements.

> I don't know what the automatic malware analysis is, but every similar
> analysis that I've ever worked on has required constant change,
> especially when it's exposed as a tool in the arms race against people
> motivated to subvert it (eg malware authors.) I'm skeptical that we
> could rerun the analysis on the entire registry, every time the analysis
> changes. The registry will at least have to track what version of the
> analysis was run for a given entry, and the browser UI would need to
> accept a registered addon becoming unregistered at any time.

We've contemplated those situations and we believe we can support them.
We already have ways to search the source code of all AMO add-ons for
specific code patterns. The magnitude of non-AMO add-ons is unknown,
though, but I believe the scalability of these tools is mostly a matter
of hardware.

> The proposal would slow down startup, since all enabled addons will need
> to be hashed before loading them "for real". (And those hashes would
> need to be looked up in a local cache of the registry, assuming you
> don't want to wait on a network hit during startup.)

As Dave mentioned, we won't do a check that blocks startup.

> An alternative to consider would be to run the analysis on the client
> side. The code for the analysis could be updated with every Firefox
> release or more frequently. The perf hit could be mitigated by running
> the analysis for popular addons on the AMO side, and using the registry
> sketched out in the proposal for those. Sure, malware could subvert the
> local analysis, but it could just as easily lie about the content hash
> used to query the registry. (The client side would still probably want
> to cache the analysis results across browser restarts, so it'd still
> need to checksum the file and look up the checksum in a local registry
> to avoid redoing the analysis during startup.)

This would only address part of the problem, as Dave also pointed out.
Knowing what is out there and how to address it is also a very big part of it.

Jorge

Mike Connor

unread,
Nov 1, 2013, 2:41:28 PM11/1/13
to Matt Basta, dev-planning@lists.mozilla.org planning, Jorge Villalobos, Dave Townsend

On Nov 1, 2013, at 2:17 PM, Matt Basta <mba...@mozilla.com> wrote:

>> Do we think people are really going to go to those lengths just to get an add-on installed into Firefox? There are probably easier ways to attack a user's machine.
>
> If crunching numbers a single time to produce an undetectable add-on that we can't block because it matches, say, Firebug or Adblock's hash and can be installed on large numbers of users machines is possible, attackers are definitely going to do it. Couple that with a DNS attack or a Google bomb to target users looking to install legitimate add-ons and you've just found a sneaky way into users' machines.

Generating a collision would be very far from free, even on a botnet, and it’d be trivial to publish an update with a different hash. I’d assume that the “block” response could be “block this version but grab the update” in this model.

Assuming we make the hashes reasonably non-trivial to calculate, I think the deck will be firmly stacked against this being an economically viable attack vector.

— Mike

Mike Connor

unread,
Nov 1, 2013, 2:43:12 PM11/1/13
to Jorge Villalobos, dev-planning@lists.mozilla.org planning

On Oct 31, 2013, at 4:56 PM, Jorge Villalobos <jo...@mozilla.com> wrote:

> On 10/31/13 11:16 AM, Mike Connor wrote:
>>
>> On Oct 31, 2013, at 12:57 PM, Jorge Villalobos <jo...@mozilla.com> wrote:
>>
>>> We understand that because of various reasons, like business policies of
>>> not sharing certain information under any circumstances, registering
>>> add-ons for internal use will not be possible. In those cases you will
>>> have the option of just blocking the API in your internal network. API
>>> failures will allow the add-on to be installed. We're also looking into
>>> locked preferences to whitelist a set of IDs.
>>
>> I’m not convinced this will create a significant barrier, since a side loaded add-on is already coming in with admin privs, and thus can trivially subvert things like host resolution.
>>
>> Is there a reason we haven’t just gone with signing of XPIs (submit unsigned XPIs, get back signed XPIs) rather than a network-driven API? It’d be good to understand which options were considered over the last year.
>>
>> — Mike
>>
>
> There are a couple of reasons why signing isn't in this proposal. The
> first one is that we would need to set a cut-off point after which we
> only allow signed files. That would exclude all installations of
> previously unsigned files, which would be massive. In this proposal,
> developers have the possibility of registering their old files, so
> current users can continue using them. Even for old, abandoned add-ons,
> either us or one of their users can register old files so that they
> continue working. On the AMO side we can automatically register all
> files old, and new, fairly easily.

I’m curious why you think the use-case of installing old versions of add-ons would be “massive” since it seems like a pretty small corner case. I doubt either of us have data, but the idea that a significant number of users are archiving old versions seems somewhat far-fetched. Some do, for sure, but that doesn’t necessarily mean it’s a use-case we need to protect. I’m actually of the opinion that we would lose very little for the vast majority of users if we stop supporting unmaintained add-ons.

For this new model to be effective I believe we need to raise the bar on what we allow to be installed. Grandfathering everything doesn’t really solve that.

> Signing XPIs would require us to repackage the files to add the
> signature, and that's something we've had problems with in the past
> (like with SDK repacks). I don't have confidence that it would work for
> the huge back-catalog of add-on files we have. And, even if we did
> sign them all, that still excludes all installs of old files.

I don’t know the nature of the issues you’ve experienced previously, but what you’re talking about here is just an automated repack system. We do those reliably for localized and custom partner builds already, we should be able to do the same thing for add-ons.

— Mike

Jorge Villalobos

unread,
Nov 1, 2013, 2:49:57 PM11/1/13
to
On 11/1/13 9:34 AM, J. Ryan Stinnett wrote:
> Jorge Villalobos wrote:
>> However, I think it's unlikely that we just decide not to do anything.
>> It will either be this or something similar. The security risks of the
>> current system are pretty large.
>
> Perhaps it is because I am newer here, but part of what made it so hard for me to read your initial post was that I personally do not perceive any malware risk, so it came across as many developer restrictions for a threat that did not even seem to be real.
>
> I would suggest adding more information about what Mozilla considers to be "malware" and how prevalent it seems to be today.

You're right, and that's partly intentional. There are certain risks
that I'd rather not mention publicly because I don't want to encourage
them to happen.

One thing I can point out (and will probably add to the doc) is the
amount of malware add-ons we block:
https://addons.mozilla.org/firefox/blocked/. They're all labeled
"(malware)". As you can see, we block at least a few instances per
month, and those are only the ones we hear about. They are constantly
mutating and reappearing, and there's very little we can do now to
prevent that.

We know this is a problem for Facebook and Google, and we have had
conversations with them about dealing with some of these problems.

> Jorge Villalobos wrote:
>> If a user installs a file that is not registered, a message will be shown
>> explaining that the file can’t be installed for their protection.
>
> It looks as if there is no way to get around the decision received from the registration service once a check has been made. Why is that? Alerting the user that Mozilla considers an add-on to be unsafe seems okay, but it just does not seem possible to truly know whether *each user* would agree the extension is unsafe if they knew what it was doing.
>
> Could there be a button to continue anyway? Or at least a link to some documentation explaining how to white list IDs?

The UX design for this hasn't even started, so I can't really say how it
will look. However, we don't want it to be easy to disable. We want
knowledgeable people to be able to override it because they're either
developers or have other environmental restrictions (like in
enterprises), but this will only be effective if the majority of users
keep it turned on.

>
> Jorge Villalobos wrote:
>> ...you will have the option of just blocking the API in your internal
>> network. API failures will allow the add-on to be installed.
>
> Do you plan to publish the host names of the servers used in this process so that they can easily be blocked for people / organizations who choose this option? Perhaps they could be part of the same documentation that I was suggesting above that would explain how to white list.

Like with the blocklist, this will just be one URL you'll need to block,
so it should be really easy to do.

> Jorge Villalobos wrote:
>> So, you're suggesting to have signing in addition to the hashes, as a
>> way to opt-out?
>
> I do think this is valuable to explore as a lighter-weight solution for any developers that do not feel comfortable handing over their code. As you mentioned earlier, supporting signing *only* seems bad, since it would lead to many broken extensions. But a "dual path" approach that also made signing available as an option seems more developer friendly. I hope that this approach can be evaluated.
>
> - Ryan

Yes, I agree this is a nice idea. We'll definitely give it more thought.

Jorge

Andrew Sutherland

unread,
Nov 1, 2013, 5:15:07 PM11/1/13
to dev-pl...@lists.mozilla.org
On 11/01/2013 02:49 PM, Jorge Villalobos wrote:
> On 11/1/13 9:34 AM, J. Ryan Stinnett wrote:
>> Do you plan to publish the host names of the servers used in this process so that they can easily be blocked for people / organizations who choose this option? Perhaps they could be part of the same documentation that I was suggesting above that would explain how to white list.
> Like with the blocklist, this will just be one URL you'll need to block,
> so it should be really easy to do.

Wouldn't it be an HTTPS URL and therefore harder to block?

http://mxr.mozilla.org/mozilla-central/source/browser/app/profile/firefox.js#53
is:

pref("extensions.blocklist.url", "https://addons.mozilla.org/blocklist/3/...");

That seems like the network would need to have addons.mozilla.org
black-holed with a fake DNS entry, or all of the IP addresses it could
resolve to would need to be blacklisted, or the Enterprise would need to
be actively MITM-ing all SSL connections on the network. The last one
is not something we would want to encourage people to do.

I'm assuming a non-addons.mozilla.org domain would be used with distinct
IP addresses so the organization wouldn't have to block AMO entirely?

Andrew



magli...@gmail.com

unread,
Nov 1, 2013, 5:17:46 PM11/1/13
to
I'm not sure if the PRD mentions this, but while it should allow the installation to happen, there should be a warning that the add-on can't be verified.

bre...@gmail.com

unread,
Nov 1, 2013, 6:46:10 PM11/1/13
to
Excellent, thanks for the reply and very good to see what appears to me to be a lot of effort toward balancing security with the needs of companies, experimenters, etc.

David E. Ross

unread,
Nov 1, 2013, 8:20:03 PM11/1/13
to
In other words, this "feature" will not be user-friendly. Instead, it
will be user-hostile.


--
David E. Ross
<http://www.rossde.com/>

Where does your elected official stand? Which
politicians refuse to tell us where they stand?
See the non-partisan Project Vote Smart at
<http://votesmart.org/>.

David E. Ross

unread,
Nov 1, 2013, 8:30:27 PM11/1/13
to
I normally disconnect from the Internet for all software installations,
except where only stub installers requiring further downloads are
involved. If I can install a non-registered extension while
disconnected from the Internet, will I be able to use it -- will my
Mozilla application have the extension's functionality -- after I
reconnect to the Internet?

David E. Ross

unread,
Nov 1, 2013, 8:35:41 PM11/1/13
to
On 11/1/2013 11:49 AM, Jorge Villalobos wrote [in part]:

>
> One thing I can point out (and will probably add to the doc) is the
> amount of malware add-ons we block:
> https://addons.mozilla.org/firefox/blocked/. They're all labeled
> "(malware)". As you can see, we block at least a few instances per
> month, and those are only the ones we hear about. They are constantly
> mutating and reappearing, and there's very little we can do now to
> prevent that.

The Blocklist page only lists plugins. Are you planning to include
plugins in this registration scheme?

If the scheme applies only to extensions, where is the list of
extensions blocked in the past?

David E. Ross

unread,
Nov 1, 2013, 8:44:25 PM11/1/13
to
Why not merely make AMO the repository of analyzed, approved, safe
extensions? Then, you could "advertise" AMO as the place of safety.
Those of us who collect extensions from other sources would not lose
that ability.

How far are you planning to go to protect users? After all, Mozilla
cannot stop us from installing non-Mozilla applications that do not
directly interface with Mozilla applications. Just give us a source of
"good" applications without interfering with our ability to choose other
sources.

Robert Strong

unread,
Nov 1, 2013, 9:01:53 PM11/1/13
to dev-pl...@lists.mozilla.org
On 11/1/2013 5:35 PM, David E. Ross wrote:
> On 11/1/2013 11:49 AM, Jorge Villalobos wrote [in part]:
>
>> One thing I can point out (and will probably add to the doc) is the
>> amount of malware add-ons we block:
>> https://addons.mozilla.org/firefox/blocked/. They're all labeled
>> "(malware)". As you can see, we block at least a few instances per
>> month, and those are only the ones we hear about. They are constantly
>> mutating and reappearing, and there's very little we can do now to
>> prevent that.
> The Blocklist page only lists plugins.
No, it also includes extensions. TornTV (the first one listed currently)
is one example.

Dagger

unread,
Nov 4, 2013, 1:17:12 AM11/4/13
to
On 31/10/2013 23:56, Jorge Villalobos wrote:
> On 10/31/13 3:35 PM, Gijs Kruitbosch wrote:
>> On 31/10/13, 22:30 , a.very.l...@gmail.com wrote:
>>> The command line switch (-noaddonreg) (which I didn't notice, sorry
>>> about that) is a much better idea. A whitelist is unworkable in a
>>> corporate situation simply because the amount of effort it would take
>>> to roll it out to 200+ isn't worth the money it would cost.
>>> -noaddonreg would solve the problem nicely since it would be a very
>>> easy way to take Mozilla out of second-guessing our IT department's
>>> choices.
>>>
>> How do we envision this working? For it to work for people in enterprise
>> situations, it having a click-through UI is a no-no. As an add-on dev,
>> I'd get pretty tired of having to click through it all the time. If
>> there's no warning at all, I'd imagine malware will have started using
>> it as a startup parameter for Firefox before it's been through the 12
>> weeks from nightly to release...
>>
>> ~ Gijs
>
> There would be a keyboard shortcut, of course. At the very least
> something like "Left arrow + Return", which is something that a
> developer should be able to do without even thinking about it.
>
> Jorge

Speaking with my extension developer hat on: this would get very old
very quickly. Having to click/keyboard through a warning every time I
restart (which happens a lot when developing, and at least daily even
when not) would be extremely irritating, no matter how easy it is to do.

I suggest dropping the persistent nag screen and putting the switch on
the other side of the airtight hatchway: anybody with write access to
the Firefox application folder can already do whatever the heck they
want (including replacing Firefox with a version that silently doesn't
check add-ons), so we may just as well use a file in that folder to turn
development mode on or off, and we won't lose anything by not prompting
on every startup.

Or similarly, the proposal already specifies that anybody with write
access to /etc can disable the checking without getting a nag screen
(via /etc/hosts, or the networking config, or editing the firewall,
or...), so we could use a file in there too.

(We'd also need a per-profile preference to ignore the global setting
and turn checking back on, to give the user some way to override it
without needing write permission to the folder.)
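
To sketch the decision logic I have in mind (the marker file name and the
preference name are invented, and real code would live in Firefox itself
rather than a Python script):

# Purely illustrative sketch of the "file in the application folder"
# switch, with a per-profile preference that forces checking back on.
import os

def registration_checks_enabled(app_dir, profile_prefs):
    marker = os.path.join(app_dir, "skip-addon-registration")  # invented name
    if profile_prefs.get("extensions.registration.force", False):
        return True   # per-profile override: turn checking back on
    if os.path.exists(marker):
        return False  # whoever wrote the app dir is already past the hatchway
    return True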

Onno Ekker

unread,
Nov 4, 2013, 3:41:24 AM11/4/13
to
Jorge Villalobos wrote:
> Cross posting to dev.planning, where I originally intended this to be.
> Please follow up to dev.planning.
>
> Jorge
>

Hi,

I have another use case which isn't clearly described by the current doc.

I have an English version of Firefox/Thunderbird installed with
additional language packs from
<http://ftp.mozilla.org/pub/mozilla.org/%APP%/%CHANNEL%/%VERSION%/%OS%/xpi/>.

After each update I have to manually add the language packs again.
Those files are created by Mozilla but aren't published to AMO.

It would be a real shame if it were no longer possible to add
different languages to your installation.

Onno

Henri Sivonen

unread,
Nov 4, 2013, 4:20:34 AM11/4/13
to dev-pl...@lists.mozilla.org
Dave Townsend <dtow...@mozilla.com> wrote:
> I don't know the specifics of the checks but I believe it is more about
> checking for the existence of actual binary malware that the add-on might
> try to execute.

This implies that Mozilla's malware scanner would be better than
Windows anti-virus software run on the users' computers.

Why would Mozilla's scanner be better for Windows than anti-virus
solutions that users are already supposed to be using?
Is there a big need to run Mac & Linux malware scans and end users
just don't have anti-virus software installed?

> The intention is not to hash the add-ons during startup. We've talked about
> the option of hashing some time later to verify that all the installed
> add-ons are still valid. This does create a way for an application on the
> user's system to get an unknown add-on into Firefox but that seems like an
> ok trade-off against saving startup time.

Wouldn't that mean that by the time the check runs, the malware has
already had the chance to do its misdeeds and to cover its tracks?

Also, the notion that startup might be the first time Firefox sees an
add-on suggests that the add-on hasn't been installed as an .xpi
through Firefox itself but has been dropped on the system by an .exe.
This means that arbitrary malicious code (the dropper/installer .exe)
has already had a chance to run.

Since as a technical matter the solution doesn't address malicious
code in the dropper/installer .exe and as a policy matter the solution
doesn't address unwanted toolbars or search hijacks created by known
companies (as opposed to criminals in the shadows), I think it would
be helpful to have a clearer statement of the threat model to
understand what threat the proposed solution would address.

> The malware check itself is not the most important part of this proposal.
> Being able to get information about and contact the developers of
> potentially harmful extensions and blocking the ability of malware to
> create per-install versions of their extensions that would be difficult for
> us to block is what we're trying to solve here.

If the malware check isn't the most important part and tracing
software to developers is the important part, is there a reason why a
solution like the OS X default mode wouldn't work? That is, instead of
requiring developers to disclose software to Mozilla ahead of time,
would it not be sufficient to require developers to sign the software
with a key that's registered with Mozilla together with contact info
and that Mozilla can revoke?
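
To sketch what I mean (purely illustrative; verify_signature is a
placeholder for whatever crypto layer would actually be used, not a real
API):

# Purely illustrative sketch: allow an add-on only if it is signed by a
# developer key that is registered with Mozilla and not revoked.
def addon_allowed(xpi_bytes, signature, key_fingerprint,
                  registered_keys, revoked_keys, verify_signature):
    if key_fingerprint in revoked_keys:
        return False
    public_key = registered_keys.get(key_fingerprint)
    if public_key is None:
        return False
    # verify_signature(public_key, data, signature) -> bool, provided by
    # whatever crypto layer the real implementation would use.
    return verify_signature(public_key, xpi_bytes, signature)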

>> So I think it would be a very bad idea to proceed with this plan
>> without solving the enterprise deployment concerns before proceeding.
>> (That is, proceeding and maybe solving the enterprise concerns later
>> would mean the damage would be done by the times of the solution
>> arrives.)
>
>
> Earlier comments pointed out that enterprises can just block access to the
> API which would effectively turn off this feature. Do you think that isn't
> good enough to solve the problem?

I think it's not good enough, because it means that the part of the
organization that handles Firefox deployment has to coordinate with
the part of the organization that manages all firewalls. Anecdotes
suggest that it's notoriously difficult to get firewall changes done
organization-wide. Moreover, there might not even be
organization-managed firewalls for remote employees or laptops that
travel outside the organization's internal network.

One option would be making ESR not have this feature, but that would
mean telling organizations that otherwise would be OK with running
up-to-date software to go run out-of-date (+ sec-critical backports)
software.

> This is a general tool to help combat the prevalence of malware add-ons by
> blocking the install of add-ons that we don't know about and giving us the
> ability to look at add-ons after the fact to evaluate them for potential
> blocklisting or developer outreach. That doesn't mean we can't also work on
> specific types of malware. We have already added some tools to mitigate
> against search engine hijacking in Firefox.

As noted, the mockups clearly suggest that unwanted toolbars and
search hijacks would not be categorized as "malware". What's the
reason for not trying to eradicate unwanted toolbars and search
hijacks as part of *this* initiative? If the answer is that it
wouldn't work because of reason X, why wouldn't reason X also make
this solution not work for the extensions that are categorized as
"malware"?

--
Henri Sivonen
hsiv...@hsivonen.fi
http://hsivonen.fi/

Axel Hecht

unread,
Nov 4, 2013, 6:48:50 AM11/4/13
to
Most language packs are featured on AMO these days; please check
https://addons.mozilla.org/En-us/firefox/language-tools/. They're pulled
from FTP by an admin tool in AMO before the release.

But yes, I'll actually need to read the original post with l10n in mind.

Axel

Axel Hecht

unread,
Nov 4, 2013, 8:10:18 AM11/4/13
to
Hi,

I've read the thread now. I ignored it based on the subject, btw; it
didn't seem to affect anything in real life from just glancing at it.

I'd like to get langpacks excluded. Maybe we need to check more robustly
what they're able to do, but for a localizer wanting to test their work,
this is really cumbersome. Note, the default config of those might even
fail the malware checks, as the default identifies the author as
"mozilla.org".

For non-l10n questions:

We'd need developers to grant us a license to their code, right? Do we
know for how many add-ons we'd actually need a license agreement that's
not covered by the EULA of the add-on?

I think that the current proposals for developers and internal org
add-ons are too coarse. The proposal seems to expose them to malware
just for the sake of their own development or deployment.

I think that blocking the install outright is too harsh for non-registered
add-ons. A consistent UI that encourages users to uninstall
non-registered add-ons might be all we need to get developers to
register voluntarily.

Also, the "just break the network" path seems to be easy to get to for
malware installed by .exe installers on windows, at least. Or at least
be open to social engineering as much as dismissing a non-registered
add-on UI.

Axel

Avi Hal

unread,
Nov 4, 2013, 8:44:12 AM11/4/13
to
"Official" opt-out by blocking the network/DNS/etc during install is both uncomfortable (yet can be induced also by malware) and also logically inconsistent since it could be considered a temporary and valid network state, and users don't know what to expect when the network is back on (i.e. would the addon get disabled or not).

Also, as noted earlier, a central repo with so much data is ripe for abuse from various vectors (official or otherwise). We can put as many mechanisms in place to reduce the risk as we like, but the risk is still there.

A voluntary one is reasonable, as is AMO in its current state, but a mandatory one is certain to raise quite a bit of hostility to the idea, both from enterprises and from those with privacy concerns, of which Mozilla surely is one. Collecting data on every developer as a mandatory step for not rejecting her add-on also falls into this category.

I think that the current status quo, where plugins/apps/add-ons can either be opted in voluntarily by the developer (signing/whitelist/etc., without getting into the technical details) or else trigger a clear yet manageable message about the dangers of such an action, is the maximum we should consider.

-avih

Dave Townsend

unread,
Nov 4, 2013, 11:44:41 AM11/4/13
to Henri Sivonen, dev-pl...@lists.mozilla.org
On Mon, Nov 4, 2013 at 1:20 AM, Henri Sivonen <hsiv...@hsivonen.fi> wrote:

> Dave Townsend <dtow...@mozilla.com> wrote:
> > I don't know the specifics of the checks but I believe it is more about
> > checking for the existence of actual binary malware that the add-on might
> > try to execute.
>
> This implies that Mozilla's malware scanner would be better than
> Windows anti-virus software run on the users' computers.
>

I was mistaken on this, there are other checks going on as Jorge mentioned
earlier.


> > The intention is not to hash the add-ons during startup. We've talked
> about
> > the option of hashing some time later to verify that all the installed
> > add-ons are still valid. This does create a way for an application on the
> > user's system to get an unknown add-on into Firefox but that seems like
> an
> > ok trade-off against saving startup time.
>
> Wouldn't that mean that by the time the check runs, the malware has
> already had the chance to do its misdeeds and to cover its tracks?
>

We hash and check the add-on at install time so checks at startup are only
useful to detect the case where something already running on the user's
computer has changed an add-on's files.


> Also, the notion that startup might be the first time Firefox sees an
> add-on suggests that the add-on hasn't been installed as an .xpi
> through Firefox itself but has been dropped on the system by an .exe.
> This means that arbitrary malicious code (the dropper/installer .exe)
> has already had a chance to run.
>

> Since as a technical matter the solution doesn't address malicious
> code in the dropper/installer .exe and as a policy matter the solution
> doesn't address unwanted toolbars or search hijacks created by known
> companies (as opposed to criminals in the shadows), I think it would
> be helpful to have a clearer statement of the threat model to
> understand what threat the proposed solution would address.
>

I think the point is that this can address unwanted toolbars and search
hijacks; it is an extension to the existing blocklisting capabilities that
we have in place, making it easier for us to control the spread of bad
add-ons.


> > The malware check itself is not the most important part of this proposal.
> > Being able to get information about and contact the developers of
> > potentially harmful extensions and blocking the ability of malware to
> > create per-install versions of their extensions that would be difficult
> for
> > us to block is what we're trying to solve here.
>
> If the malware check isn't the most important part and tracing
> software to developers is the important part, is there a reason why a
> solution like the OS X default mode wouldn't work? That is, instead of
> requiring developers to disclose software to Mozilla ahead of time,
> would it not be sufficient to require developers to sign the software
> with a key that's registered with Mozilla together with contact info
> and that Mozilla can revoke?
>

It's the difference between being able to stop an attack before it starts
versus having to clean up after many users have been attacked.


> > This is a general tool to help combat the prevalence of malware add-ons
> by
> > blocking the install of add-ons that we don't know about and giving us
> the
> > ability to look at add-ons after the fact to evaluate them for potential
> > blocklisting or developer outreach. That doesn't mean we can't also work
> on
> > specific types of malware. We have already added some tools to mitigate
> > against search engine hijacking in Firefox.
>
> As noted, the mockups clearly suggest that unwanted toolbars and
> search hijacks would not be categorized as "malware". What's the
> reason for not trying to eradicate unwanted toolbars and search
> hijacks as part of *this* initiative? If the answer is that it
> wouldn't work because of reason X, why wouldn't reason X also make
> this solution not work for the extensions that are categorized as
> "malware"?
>

Which mockups are you referring to here?

David E. Ross

unread,
Nov 4, 2013, 12:43:41 PM11/4/13
to
Is there a bugzilla.mozilla.org bug report for this change? If so, what
is the bug number?

Jorge Villalobos

unread,
Nov 4, 2013, 3:42:50 PM11/4/13
to
On 11/1/13 6:30 PM, David E. Ross wrote:
> On 11/1/2013 11:04 AM, Jorge Villalobos wrote:
>> On 10/31/13 7:07 PM, ert...@gmail.com wrote:
>>> I develop an extension for use on an intranet that has no access to the Internet. Will this affect my extension?
>>>
>>
>> No. Failure to contact the API will allow the installation to happen.
>>
>> Jorge
>>
>
> I normally disconnect from the Internet for all software installations,
> except where only stub installers requiring further downloads are
> involved. If I can install a non-registered extension while
> disconnected from the Internet, will I be able to use it -- will my
> Mozilla application have the extension's functionality -- after I
> reconnect to the Internet?

There will be a regular check for add-on registration, so it would be
disabled (or removed) eventually. If you don't want registration to
apply to you at all, the easiest way would be to block the URL used for
the registration API.

Jorge

Jorge Villalobos

unread,
Nov 4, 2013, 3:48:41 PM11/4/13
to
On 11/1/13 12:43 PM, Mike Connor wrote:
>
> On Oct 31, 2013, at 4:56 PM, Jorge Villalobos <jo...@mozilla.com> wrote:
>
>> On 10/31/13 11:16 AM, Mike Connor wrote:
>>>
>>> On Oct 31, 2013, at 12:57 PM, Jorge Villalobos <jo...@mozilla.com> wrote:
>>>
>>>> We understand that because of various reasons, like business policies of
>>>> not sharing certain information under any circumstances, registering
>>>> add-ons for internal use will not be possible. In those cases you will
>>>> have the option of just blocking the API in your internal network. API
>>>> failures will allow the add-on to be installed. We're also looking into
>>>> locked preferences to whitelist a set of IDs.
>>>
>>> I’m not convinced this will create a significant barrier, since a side loaded add-on is already coming in with admin privs, and thus can trivially subvert things like host resolution.
>>>
>>> Is there a reason we haven’t just gone with signing of XPIs (submit unsigned XPIs, get back signed XPIs) rather than a network-driven API? It’d be good to understand which options were considered over the last year.
>>>
>>> — Mike
>>>
>>
>> There are a couple of reasons why signing isn't in this proposal. The
>> first one is that we would need to set a cut-off point after which we
>> only allow signed files. That would exclude all installations of
>> previously unsigned files, which would be massive. In this proposal,
>> developers have the possibility of registering their old files, so
>> current users can continue using them. Even for old, abandoned add-ons,
>> either us or one of their users can register old files so that they
>> continue working. On the AMO side we can automatically register all
>> files, old and new, fairly easily.
>
> I’m curious why you think the use-case of installing old versions of add-ons would be “massive” since it seems like a pretty small corner case. I doubt either of us have data, but the idea that a significant number of users are archiving old versions seems somewhat far-fetched. Some do, for sure, but that doesn’t necessarily mean it’s a use-case we need to protect. I’m actually of the opinion that we would lose very little for the vast majority of users if we stop supporting unmaintained add-ons.
>
> For this new model to be effective I believe we need to raise the bar on what we allow to be installed. Grandfathering everything doesn’t really solve that.


Add-ons can have a fairly long shelf life. They can be abandoned for
years without their users noticing a problem. Having compatibility on by
default means that users won't notice until the add-on breaks.

And we're not only talking about old add-ons. We would be forcing all
add-on developers to produce new versions of their add-ons to support a
system where signatures are required, and then expect all users to
update to those versions. I think this would introduce a lot of friction
in the transition to the new system.

>> Signing XPIs would require us to repackage the files to add the
>> signature, and that's something we've had problems with in the past
>> (like with SDK repacks). I don't have confidence that it would work for
>> the the huge back-catalog of add-on files we have. And, even if we did
>> sign them all, that still excludes all installs of old files.
>
> I don’t know the nature of the issues you’ve experienced previously, but what you’re talking about here is just an automated repack system. We do those reliably for localized and custom partner builds already, we should be able to do the same thing for add-ons.
>
> — Mike

I can assure you it hasn't worked well when we've had to repack hundreds
of add-ons in the past. I'd rather avoid doing something like this when
the results have been far from ideal.

Jorge

Jorge Villalobos

unread,
Nov 4, 2013, 3:50:12 PM11/4/13
to
That's one of the possible overrides we're looking into: a locked pref
with an ID whitelist. A pref like that can only be set from the
installation directory.
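
For illustration only, a locked pref like that would most likely be
deployed through the existing AutoConfig mechanism. The whitelist pref
name below is a placeholder -- nothing like it has been specified yet.

// defaults/pref/autoconfig.js, inside the Firefox installation directory
pref("general.config.filename", "registration-override.cfg");
pref("general.config.obscure_value", 0);  // read the .cfg as plain text

// registration-override.cfg (the first line must be a comment)
// "extensions.registration.whitelist" is a hypothetical pref name.
lockPref("extensions.registration.whitelist",
         "internal-tool@example.com,other-addon@example.com");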

Jorge

Jorge Villalobos

unread,
Nov 4, 2013, 4:12:40 PM11/4/13
to
On 11/4/13 3:20 AM, Henri Sivonen wrote:
> Dave Townsend <dtow...@mozilla.com> wrote:
>> I don't know the specifics of the checks but I believe it is more about
>> checking for the existance of actual binary malware that the add-on might
>> try to execute.
>
> This implies that Mozilla's malware scanner would be better than
> Windows anti-virus software run on the users' computers.
>
> Why would Mozilla's scanner be better for Windows than anti-virus
> solutions that users are already supposed to be using?
> Is there a big need to run Mac & Linux malware scans and end users
> just don't have anti-virus software installed?

While we'll probably do a virus check on extensions with binaries, the
majority of checks will be focused on malicious JavaScript patterns. Our
knowledge of malicious add-ons in the wild gives us a much better chance
of detecting a malicious add-on than any general antivirus tool.

>> The intention is not to hash the add-ons during startup. We've talked about
>> the option of hashing some time later to verify that all the installed
>> add-ons are still valid. This does create a way for an application on the
>> user's system to get an unknown add-on into Firefox but that seems like an
>> ok trade-off against saving startup time.
>
> Wouldn't that mean that by the time the check runs, the malware has
> already had the chance to do its misdeeds and to cover its tracks?
>
> Also, the notion that startup might be the first time Firefox sees an
> add-on suggests that the add-on hasn't been installed as an .xpi
> through Firefox itself but has been dropped on the system by an .exe.
> This means that arbitrary malicious code (the dropper/installer .exe)
> has already had a chance to run.
>
> Since as a technical matter the solution doesn't address malicious
> code in the dropper/installer .exe and as a policy matter the solution
> doesn't address unwanted toolbars or search hijacks created by known
> companies (as opposed to criminals in the shadows), I think it would
> be helpful to have a clearer statement of the threat model to
> understand what threat the proposed solution would address.

We can't effectively stop an EXE running locally with admin privileges.
We can try to make it harder for these installers to inject malware into
Firefox, but a sufficiently determined attacker will find a way. We're
trying to block the less sophisticated attacks (which we believe are
most of them) and prevent other types of attack that we know are
possible but aren't happening widely (as far as we know).

Unwanted toolbar installs are something we already deal with at a policy
level, using blocklisting as the hammer, and that has yielded positive
results so far.

>> The malware check itself is not the most important part of this proposal.
>> Being able to get information about and contact the developers of
>> potentially harmful extensions and blocking the ability of malware to
>> create per-install versions of their extensions that would be difficult for
>> us to block is what we're trying to solve here.
>
> If the malware check isn't the most important part and tracing
> software to developers is the important part, is there a reason why a
> solution like the OS X default mode wouldn't work? That is, instead of
> requiring developers to disclose software to Mozilla ahead of time,
> would it not be sufficient to require developers to sign the software
> with a key that's registered with Mozilla together with contact info
> and that Mozilla can revoke?

The malware check is also important and necessary. I responded to the
certificate proposals in other replies on this thread.

I don't believe these toolbars are malware if they disclose their
purpose clearly and can be easily removed by users. We have a set of
guidelines they need to follow
(https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/Add-on_guidelines)
in order to not be blocklisted. We have blocked many of them in the past
and will continue to do so.

One of the main problems we've had when dealing with these toolbars is
the lack of information we have about their developers, or even their
add-on ID. Some have taken to randomizing their add-on ID for various
reasons, one of them probably to avoid detection by us. This system will
help us with these problems, provided the developers don't resort to
subverting the entire system (and in that case we can escalate through
other means, since these are for the most part real companies).

So, this proposal will help us in dealing with these add-ons, but the
malware checks won't. Determining whether a greyware toolbar add-on is
within our policies or not requires a more careful look at what it does,
and we won't block all of them by default.

Jorge

Jorge Villalobos

unread,
Nov 4, 2013, 4:18:08 PM11/4/13
to
There aren't any bugs for this at the moment.

Jorge

Mike Connor

unread,
Nov 4, 2013, 5:01:47 PM11/4/13
to Jorge Villalobos, dev-pl...@lists.mozilla.org
On 2013-11-04 3:48 PM, Jorge Villalobos wrote:
> On 11/1/13 12:43 PM, Mike Connor wrote:
>>
>> I’m curious why you think the use-case of installing old versions of add-ons would be “massive” since it seems like a pretty small corner case. I doubt either of us have data, but the idea that a significant number of users are archiving old versions seems somewhat far-fetched. Some do, for sure, but that doesn’t necessarily mean it’s a use-case we need to protect. I’m actually of the opinion that we would lose very little for the vast majority of users if we stop supporting unmaintained add-ons.
>>
>> For this new model to be effective I believe we need to raise the bar on what we allow to be installed. Grandfathering everything doesn’t really solve that.
> Add-ons can have a fairly long shelf life. They can be abandoned for
> years without their users noticing a problem. Having compatibility on by
> default means that users won't notice until the add-on breaks.

None of this really answers my concern, which is that this use-case
matters to relatively few users, and thus doesn't merit weakening the
strength of the system. From what I've seen, the most-used add-ons
still get updated on a somewhat regular basis. I'd like to see an
evaluation of the tradeoffs (stronger system for all users vs.
disruption for a minority).

Have we used FHR data to identify the real-world stats on these tradeoffs?

> And we're not only talking about old add-ons. We would be forcing all
> add-on developers to produce new versions of their add-ons to support a
> system where signatures are required, and then expect all users to
> update to those versions. I think this would introduce a lot of friction
> in the transition to the new system.

I think anything we do to enforce centralization will create friction
with developers not already using AMO. If the solution doesn't require
submission of code to Mozilla and matches how other software is
typically deployed (signing of binaries), I'd expect less resistance to
that solution. (If you're on a closed intranet, you don't even need to
pay for a cert, you can deploy your own root and sign with a
certificate signed by that root.)

>>> Signing XPIs would require us to repackage the files to add the
>>> signature, and that's something we've had problems with in the past
>>> (like with SDK repacks). I don't have confidence that it would work for
>>> the huge back-catalog of add-on files we have. And, even if we did
>>> sign them all, that still excludes all installs of old files.
>> I don’t know the nature of the issues you’ve experienced previously, but what you’re talking about here is just an automated repack system. We do those reliably for localized and custom partner builds already, we should be able to do the same thing for add-ons.
>>
>> — Mike
> I can assure you it hasn't worked well when we've had to repack hundreds
> of add-ons in the past. I'd rather avoid doing something like this when
> the results have been far from ideal.

This sounds like we're making decisions on how to protect users based on
implementation problems we've had on AMO in the past. I'm not sure
that's the right approach. If sign+repack were trivial and reliable,
would we implement that instead? I think it's worth checking the
assumption that this would be hard against building a high-volume
service to validate hashes and store lots of add-ons.

-- Mike

mka...@gmail.com

unread,
Nov 4, 2013, 7:15:18 PM11/4/13
to
On Friday, November 1, 2013 10:55:43 AM UTC-5, Dave Townsend wrote:

> The malware check itself is not the most important part of this proposal.
> Being able to get information about and contact the developers of
> potentially harmful extensions and blocking the ability of malware to
> create per-install versions of their extensions that would be difficult for
> us to block is what we're trying to solve here. Local file checks just
> don't help.

What's to prevent someone from just creating a fake AMO account? You guys don't do any validation.

Henri Sivonen

unread,
Nov 5, 2013, 6:21:50 AM11/5/13
to dev-pl...@lists.mozilla.org
On Mon, Nov 4, 2013 at 11:12 PM, Jorge Villalobos <jo...@mozilla.com> wrote:
> We can't effectively stop an EXE running locally with admin privileges.

Right.

> We can try to make it harder for these installers to inject malware into
> Firefox, but a sufficiently determined attacker will find a way. We're
> trying to block the less sophisticated attacks (which we believe are
> most of them) and prevent other types of attack that we know are
> possible but aren't happening widely (as far as we know).

How likely is it that the outcome of this effort is that the attackers
raise their level of sophistication and Firefox suffers marketshare
and goodwill damage from private extensions becoming too hard to
maintain and deploy (leaving us with both attacks and
marketshare/goodwill damage)?

Is there an articulation of what threats exactly this proposal is
designed to address and why those threats are worth addressing in an
environment where attackers can escalate by putting the malicious code
inside their installer .exes?

> Unwanted toolbar installs is something we already deal with at a policy
> level and using blocklisting as the hammer, and it has yielded positive
> results so far.

In the blocklisting bugs I've followed, the blocklisting process has
been terribly slow. It's been a while though since I've paid
attention. How long does it take these days to get a deceptively
installed toolbar blocked?

> I responded to the
> certificate proposals in other replies on this thread.

Do you mean "The
first one is that we would need to set a cut-off point after which we
only allow signed files. That would exclude all installations of
previously unsigned files, which would be massive."?

Dave Townsend wrote:
> Which mockups are you referring to here?

https://wiki.mozilla.org/Add-ons/FileRegistration/Mockups

Johnathan Nightingale

unread,
Nov 5, 2013, 11:52:32 AM11/5/13
to Henri Sivonen, dev-pl...@lists.mozilla.org
On Nov 5, 2013, at 6:21 AM, Henri Sivonen wrote:

>> We can try to make it harder for these installers to inject malware into
>> Firefox, but a sufficiently determined attacker will find a way. We're
>> trying to block the less sophisticated attacks (which we believe are
>> most of them) and prevent other types of attack that we know are
>> possible but aren't happening widely (as far as we know).
>
> How likely is it that the outcome of this effort is that the attackers
> raise their level of sophistication and Firefox suffers marketshare
> and goodwill damage from private extensions becoming too hard to
> maintain and deploy (leaving us with both attacks and
> marketshare/goodwill damage)?
>
> Is there an articulation of what threats exactly this proposal is
> designed to address and why those threats are worth addressing in an
> environment where attackers can escalate by putting the malicious code
> inside their installer .exes?


Many add-on practices today are in a grey area: packaged with unrelated installers, opt-out, unclear effects. They hurt our experience, and are unwanted. Straight blocklisting as currently implemented is possible, and we use it, but it's not a durable solution since it's tied to addon ID, which can be permuted by the author or, worse, randomly by each installer.

A registration system prevents this obvious approach to avoiding the blocklist (as would some of the other approaches under discussion here). Truly nefarious addons could still try to subvert Firefox by altering our binaries or our prefs, but that is an overt attack and pushes those applications clearly into the realm of malware, at which point they'd run afoul of OS-level malware detection tools as well (which can interdict executables before they run on the system, something a userland app can't do.)

The goal of this proposal is not to solve user-executed malware. The goal is to disambiguate the grey area so that bad actors have no plausible way to claim that this is something we support/permit. This is our product; it's extensible but it's not a free for all. We should and will exercise controls when we feel it's being misused.

J

---
Johnathan Nightingale
VP Firefox
@johnath

Blair McBride

unread,
Nov 5, 2013, 9:43:35 PM11/5/13
to Steve Fink, Dave Townsend, dev-pl...@lists.mozilla.org
On 2/11/2013 7:19 a.m., Steve Fink wrote:
>> >I don't think that that is something enterprises would want to spend
>> >time doing, do you?
> Depends on how juicy the carrot is. Enterprise IT departments really
> don't like fixing problems caused by their users installing malware, and
> having the Mozilla registry available to help with the problem is a big
> plus from that perspective. If this just involved hosting a JSON file or
> something, I could imagine enterprises very much liking this. (Though
> perhaps it'd need to be integrated with whatever Enterprise IT
> Architecture Synergy Management Command and Control Server bullshit they
> already use.)

Hmm, I rather like the base idea there - although I think it would be
best targeted at just enterprise-style deployments. They could set an
alternate URL for the registration server (as part of the deployed
default preferences), and have it point to an in-house proxy
registration server. That would have its own DB - if an add-on wasn't
in their DB, the proxy would just request the info from AMO and relay it
to the client. That would give organisations a central way to easily
manage a list of safe add-ons that are only available inside that
organisation, and they wouldn't need to submit potentially sensitive
add-ons to AMO.

All that is entirely doable in the current proposed spec. However, it
would be nice to have an example/reference (open source) implementation
of a proxy server provided by Mozilla.
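
To make that concrete, here is a very rough Node.js sketch of such a proxy.
The API path and response format are assumptions (the registration API
isn't specified yet); it answers from a local allowlist first and otherwise
relays the lookup to AMO.

// Hypothetical proxy registration server -- API shape assumed, not final.
var http = require("http");
var https = require("https");

// In-house add-ons that should always be reported as registered.
var localRegistry = {
  "3f786850e387550fdab836ed7e6dc881de23001b": "intranet-tool@example.com"
};

http.createServer(function (req, res) {
  // Assumed API shape: GET /registration/<hash-of-xpi>
  var hash = req.url.replace("/registration/", "");

  if (localRegistry[hash]) {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ registered: true, id: localRegistry[hash] }));
    return;
  }

  // Not one of ours: relay the query to the (hypothetical) AMO endpoint.
  https.get("https://addons.mozilla.org/api/registration/" + hash,
    function (amoRes) {
      res.writeHead(amoRes.statusCode, { "Content-Type": "application/json" });
      amoRes.pipe(res);
    }).on("error", function () {
      res.writeHead(502);  // API failures fail open on the client anyway
      res.end();
    });
}).listen(8080);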

- Blair

Blair McBride

unread,
Nov 5, 2013, 9:48:42 PM11/5/13
to mka...@gmail.com, dev-pl...@lists.mozilla.org
On 5/11/2013 1:15 p.m., mka...@gmail.com wrote:
> What's to prevent someone from just creating a fake AMO account? You guys don't do any validation.

What would be the effects of a fake account? Being able to contact the
author is primarily a means of helping resolve issues with an add-on. An
author could simply ignore emails from AMO too.

- Blair

Blair McBride

unread,
Nov 5, 2013, 11:21:38 PM11/5/13
to and...@ducker.org.uk, dev-pl...@lists.mozilla.org
On 2/11/2013 5:18 a.m., and...@ducker.org.uk wrote:
> On Friday, 1 November 2013 16:04:31 UTC, Dave Townsend wrote:
>>> Earlier comments pointed out that enterprises can just block
>>> access to the API which would effectively turn off this feature.
>>> Do you think that isn't good enough to solve the problem?
> I'd say so. You're asking people to add rules to their firewall in
> order to solve an application usability issue. This means that (a)
> there's going to be a pain point in tracking down the firewall team
> and getting them to make a change in the first place (always
> frustrating) and then (b) when someone is cleaning up the firewall
> and can't work out why a rule is there, and removes it, suddenly your
> internal-only Firefox addons stop working, and nobody can work out
> why!

There will also be a preference organisations can deploy to the
application directory to override default behaviour.

- Blair

Blair McBride

unread,
Nov 5, 2013, 11:26:42 PM11/5/13
to David E. Ross, dev-pl...@lists.mozilla.org
Because that approach isn't good enough, as evidenced by the problems
with the current system. I'd like to flag add-ons as AMO-approved as
well - but that's in addition to/complementary to the file registration
system. They're solving two different issues.

- Blair

Blair McBride

unread,
Nov 5, 2013, 11:32:44 PM11/5/13
to Jorge Villalobos, dev-pl...@lists.mozilla.org
On 2/11/2013 6:57 a.m., Jorge Villalobos wrote:
> On 11/1/13 2:29 AM, Gijs Kruitbosch wrote:
>>> So, you're suggesting to have signing in addition to the hashes, as a
>>> way to opt-out?

[...]

> I think that's a good idea

+1, I've thought about this myself as an additional way to prove an
add-on is ok to install. I think it's additional to the current method
though, so not something we need to do right away (especially
considering the extreme void of signed add-ons at the moment).

- Blair

Dagger

unread,
Nov 6, 2013, 2:42:18 AM11/6/13
to
That would do the trick if the whitelist supported wildcards.

Henri Sivonen

unread,
Nov 6, 2013, 8:57:49 AM11/6/13
to Johnathan Nightingale, dev-pl...@lists.mozilla.org
On Tue, Nov 5, 2013 at 6:52 PM, Johnathan Nightingale
<joh...@mozilla.com> wrote:
> The goal of this proposal is not to solve user-executed malware. The goal is
> to disambiguate the grey area so that bad actors have no plausible way to
> claim that this is something we support/permit.

Considering the potential for collateral damage, this is a pretty
heavy way of disambiguating the gray area to accomplish "bad actors
have no plausible way to claim". Making claims isn't a technical thing
but goes back to policy. Why do we need to turn a claiming contest
about what our policy is into a technical escalation instead of merely
making it an escalation of words by just claiming authority over what
Mozilla "support[s]/permit[s]" and proceeding to blocklisting
accordingly? That is, if we are shy to act on the gray area now, why
do we need to shrink the gray area technically to get rid of our
shyness instead of shrinking the gray area on the policy statement
level by saying that previously "gray" extensions are now blocklisted?

As for id morphing that evades the current policy enforcement
mechanism, isn't id morphing already unambiguously in the "bad actor"
category so there's really nothing to disambiguate there? Is there a
reason why OS-level malware detection maintainers are unwilling to
pursue id-morphing add-ons as malware but would be willing to pursue
those add-ons if we forced the add-ons' installer .exes to become more
nefarious if the add-ons no longer could morph their ids?

OTOH, since this really is about *actors*, it seems weird to me to
require the registration of individual software artifacts instead of
requiring the registration of *actors* (OS X default-style), so that
non-bad actors wouldn't need to disclose their artifacts to Mozilla and
all artifacts from a given bad actor could be blocked as one
operation. As Jorge noted, there's a grandfathering problem, but is it
really that much harder for extension authors to issue signed updates
than to upload all their old releases to the registration Web app?

David E. Ross

unread,
Nov 6, 2013, 1:23:15 PM11/6/13
to
I have archived the .xpi files of all extensions that I have installed.
Some of them are "abandoned" but still have the functionality that I
want. If I get a new computer and install Mozilla applications with
which those extensions still work, I will want to install them. I do
not want to lose their functionality. This proposal would block such
installations.

Mike Connor

unread,
Nov 6, 2013, 1:45:58 PM11/6/13
to dev-pl...@lists.mozilla.org
The more I consider this proposal, the more I'm convinced that it's not
quite right yet. That said, there's a path forward I'd like us to consider:

* Signing add-ons should be an acceptable alternative to registration of
add-ons. This is a broadly understood requirement for most developers,
as Windows and OS X already rely on code signing. This would be the
best way to opt out for most of the "internal"/"sensitive" add-on use cases.

* The requirement to pre-register add-ons creates a lot of overhead for
humans. If we're relying on automated scanning anyway, why pre-register
at all? The goal here is to block "bad" things, not to block "unknown"
things on the presumption of guilt. Instead, I'd like us to consider a
model where clients can submit "unknown" add-ons to Mozilla for
verification if they don't match a recognized hash (see the sketch
below). This would avoid
the grandfathering problem (assuming those older add-ons don't get
rejected for Bad Things) while still building a database of "grey"
add-ons for later analysis and remediation.
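
A rough sketch of that client-side flow, with invented endpoint URLs (no
such verification API exists today), might look like this:

// Sketch of the "verify by hash, submit unknowns" model described above.
// Endpoint hostnames, paths and status codes are placeholders.
var crypto = require("crypto");
var fs = require("fs");
var https = require("https");

function sha256OfFile(path) {
  return crypto.createHash("sha256").update(fs.readFileSync(path)).digest("hex");
}

function verifyAddon(xpiPath, done) {
  var hash = sha256OfFile(xpiPath);
  https.get("https://verification.example.org/check/" + hash, function (res) {
    if (res.statusCode === 200) {
      done("known");                     // already on record; install proceeds
    } else {
      submitForAnalysis(xpiPath, done);  // unknown; send it in, don't block
    }
  }).on("error", function () {
    done("offline");                     // fail open, as with the registration API
  });
}

function submitForAnalysis(xpiPath, done) {
  var req = https.request({
    method: "POST",
    hostname: "verification.example.org",
    path: "/submit",
    headers: { "Content-Type": "application/x-xpinstall" }
  }, function () {
    done("submitted");
  });
  fs.createReadStream(xpiPath).pipe(req);
}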

-- Mike
> _______________________________________________
> dev-planning mailing list
> dev-pl...@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-planning

wsha...@gmail.com

unread,
Nov 6, 2013, 7:32:33 PM11/6/13
to
> > What is needed is an end-user option that says "I accept the risk. Let
> > me install and enable the addons I want." There already is a similar
> > existing option for about:config
>
> Having a profile-level preference would make this trivially easy to
> bypass, and we already have an installation warning that clearly is not
> sufficient to deter users from installing malware.
>
> There will be ways to disable this feature, as explained in the spec,
> but they won't be as easy to get to.
>
> Jorge

Is a profile-level preference that has to be entered into about:config manually really not sufficient? People install malware add-ons despite the warning now, but that is a window that appears when the user tries to install something and allows installation with a single click of the "Allow" button. If, when an unregistered add-on tried to install, a popup stated that the add-on could not be installed because it was not registered with Mozilla and made no mention of how to override this behavior, would that not be a sufficient barrier to installation for most users who end up installing malware now?

Jorge Villalobos

unread,
Nov 7, 2013, 11:52:46 AM11/7/13
to
On 11/6/13 12:23 PM, David E. Ross wrote:
> On 11/5/2013 8:26 PM, Blair McBride wrote:
>> Because that approach isn't good enough, as evidenced by the problems
>> with the current system. I'd like to flag add-ons as AMO-approved as
>> well - but that's in addition to/complementary the file registration
>> system. They're solving two different issues.
>>
>> - Blair
>>
>>
>>
>>
>> On 2/11/2013 1:44 p.m., David E. Ross wrote:
>>> Why not merely make AMO the repository of analyzed, approved, safe
>>> extensions? Then, you could "advertise" AMO as the place of safety.
>>> Those of us who collect extensions from other sources would not lose
>>> that ability.
>>>
>>> How far are you planning to go to protect users? After all, Mozilla
>>> cannot stop us from installing non-Mozilla applications that do not
>>> directly interface with Mozilla applications. Just give us a source of
>>> "good" applications without interfering with our ability to choose other
>>> sources.
>>>
>
> I have archived the .xpi files of all extensions that I have installed.
> Some of them are "abandoned" but still have the functionality that I
> want. If I get a new computer and install Mozilla applications with
> which those extensions still work, I will want to install them. I do
> not want to lose their functionality. This proposal would block such
> installations.

If they are indeed abandoned and you know they are safe, you could register
them yourself. We might need to do that ourselves for other popular
abandoned add-ons.

Jorge

Jorge Villalobos

unread,
Nov 7, 2013, 12:02:56 PM11/7/13
to
On 11/6/13 6:32 PM, wsha...@gmail.com wrote:
>>> What is needed is an end-user option that says "I accept the risk. Let
>>> me install and enable the addons I want." There already is a similar
>>> existing option for about:config
>>
>> Having a profile-level preference would make this trivially easy to
>> bypass, and we already have an installation warning that clearly is not
>> sufficient to deter users from installing malware.
>>
>> There will be ways to disable this feature, as explained in the spec,
>> but they won't be as easy to get to.
>>
>> Jorge
>
> Is a profile-level preference that has to be entered into about:config manually really not sufficient?

No. That can be easily subverted by pretty much any software running in
the system. We're planning on having an application-level preference to
override it, so only admin users and software running with admin
privileges will be able to touch it.

> People install malware add-ons despite the warning now, but that is a window that appears when the user tries to install something and allows installation with a single click of the "Allow" button. If, when an unregistered add-on tried to install, a popup stated that the add-on could not be installed because it was not registered with Mozilla and made no mention of how to override this behavior, would that not be a sufficient barrier to installation for most users who end up installing malware now?

That is what is being proposed. Unregistered add-ons would not be
installed and the user would be notified about it. There would be no
visible override.

Putting the override behind a profile-level pref would make things
difficult for users but not difficult at all for malware creators, so it
wouldn't be sufficient.

Jorge

Jorge Villalobos

unread,
Nov 7, 2013, 1:10:59 PM11/7/13
to
On 11/5/13 5:21 AM, Henri Sivonen wrote:
> On Mon, Nov 4, 2013 at 11:12 PM, Jorge Villalobos <jo...@mozilla.com> wrote:
>> We can't effectively stop an EXE running locally with admin privileges.
>
> Right.
>
>> We can try to make it harder for these installers to inject malware into
>> Firefox, but a sufficiently determined attacker will find a way. We're
>> trying to block the less sophisticated attacks (which we believe are
>> most of them) and prevent other types of attack that we know are
>> possible but aren't happening widely (as far as we know).
>
> How likely is it that the outcome of this effort is that the attackers
> raise their level of sophistication

Some probably will, but others won't. It really depends on the incentive
behind their attacks. In some cases it will be worth the extra effort,
but I suspect in most cases it won't. If you have a malicious
binary running with admin privileges in the user's system I think there
are more attractive targets than spamming users' Facebook walls or
changing their search settings or default pages. And even if they all
did, we would not be worse off than we are today.

> and Firefox suffers marketshare
> and goodwill damage from private extensions becoming too hard to
> maintain and deploy (leaving us with both attacks and
> marketshare/goodwill damage)?

There are different categories of private extensions that I think need
to be handled separately.

There are internal enterprise add-ons, which we're trying to address in
this proposal by offering different ways to override the system. There's
the whitelist pref, blocking the API URL, creating a proxy for it (see a
recent response by Blair about it), or possibly offering a signing
certificate. Deploying any of these has some cost, but we're trying to
make it as low as possible.

The majority of non-AMO add-on installs we track could be classified as
greyware. It's mostly the toolbar add-ons that range from the absolutely
unwelcome to the seemingly necessary (like what is bundled with AV
software). Since these are mostly about monetization and most developers
are willing to modify their software to comply with our policies (when
they are threatened with blocklisting), I expect them to just register
their add-ons. If they decide to subvert our system we will have to
chase them around and find ways to make them comply, like we do now. My
only concern for this category is that we get the more user-desired
add-ons (the AV ones) on board so users don't have them blocked.

There are add-ons that are distributed outside of AMO out of convenience
or because the developers chose not to deal with our admittedly slow
review process, or their add-on was rejected by us for some reason.
These are add-ons we don't know much about but for the most part have
relatively little usage. It's possible that some of them decide to let
their add-ons break because they don't want to deal with AMO, but I
wouldn't expect that to have a huge impact in terms of usage.

Then there are also non-AMO abandoned add-ons, where one possible solution is
to just register them ourselves to reduce friction during the transition.

> Is there an articulation of what threats exactly this proposal is
> designed to address and why those threats are worth addressing in an
> environment where attackers can escalate by putting the malicious code
> inside their installer .exes?

The document lists the main motivations behind this proposal:

* There are numerous add-on IDs being distributed and installed in the
wild that the Add-ons Team have no knowledge about. Many are believed to
be generated per-install.
* Finding contact information for add-on developers can be a daunting
task, when sometimes all the Add-ons Team have is an add-on ID and a name.
* Finding file samples of these add-ons can be just as difficult.

I think Johnath put it very well in his message. There will be a group
of developers who will realize they're doing things wrong and fall in
line, while there will be others who will continue avoiding our system,
or avoid it more overtly. If we can use this system to reduce this
problem, and have better control over add-on IDs and malware out there,
I think that's a big gain for our users. We're balancing this with the
inconvenience for non-AMO developers and some non-AMO add-on users, but
we believe the net effect will be positive.

>> Unwanted toolbar installs is something we already deal with at a policy
>> level and using blocklisting as the hammer, and it has yielded positive
>> results so far.
>
> In the blocklisting bugs I've followed, the blocklisting process has
> been terribly slow. It's been a while though since I've paid
> attention. How long does it take these days to get a deceptively
> installed toolbar blocked?

It's variable since it's a manual process where I'm right in the
middle of the critical path. Our resources are extremely limited, so I
haven't been able to reach a point where I can guarantee specific
response times.

In some cases we blocklist on the day the bug is filed, which is
generally when the add-on is clearly malicious or there's no way to
contact the developer. In the murkier cases we contact the developer,
usually as soon as the bug is filed, and then give them 2-3 weeks to
respond. If there's no response, we block. If they respond then this
becomes much less predictable because of development timelines and back
and forth between us and the developers.

The sources of truth here would be the list of blocked add-ons, linking
to the bugs, and the list of open blocklist bugs:

https://addons.mozilla.org/en-US/firefox/blocked/
https://bugzilla.mozilla.org/buglist.cgi?cmdtype=dorem&remaction=run&namedcmd=Blocklist%20pending&sharer_id=189742&list_id=8483185

>> I responded to the
>> certificate proposals in other replies on this thread.
>
> Do you mean "The
> first one is that we would need to set a cut-off point after which we
> only allow signed files. That would exclude all installations of
> previously unsigned files, which would be massive."?

Yes. Mike replied with more comments, which I need to follow up on.

Jorge Villalobos

unread,
Nov 7, 2013, 1:24:24 PM11/7/13
to
On 11/4/13 4:01 PM, Mike Connor wrote:
> On 2013-11-04 3:48 PM, Jorge Villalobos wrote:
>> On 11/1/13 12:43 PM, Mike Connor wrote:
>>>
>>> I’m curious why you think the use-case of installing old versions of
>>> add-ons would be “massive” since it seems like a pretty small corner
>>> case. I doubt either of us have data, but the idea that a
>>> significant number of users are archiving old versions seems somewhat
>>> far-fetched. Some do, for sure, but that doesn’t necessarily mean
>>> it’s a use-case we need to protect. I’m actually of the opinion that
>>> we would lose very little for the vast majority of users if we stop
>>> supporting unmaintained add-ons.
>>>
>>> For this new model to be effective I believe we need to raise the bar
>>> on what we allow to be installed. Grandfathering everything doesn’t
>>> really solve that.
>> Add-ons can have a fairly long shelf life. They can be abandoned for
>> years without their users noticing a problem. Having compatibility on by
>> default means that users won't notice until the add-on breaks.
>
> None of this really answers my concern, which is that this use-case
> matters to relatively few users, and thus doesn't merit weakening the
> strength of the system. From what I've seen, the most-used add-ons
> still get updated on a somewhat regular basis. I'd like to see an
> evaluation of the tradeoffs (stronger system for all users vs.
> disruption for a minority).

Since this will affect all add-ons, we can't just think about the most
used ones. There's a very long tail of add-on usage, and even on AMO we
have very popular add-ons that haven't been updated in years.


> Have we used FHR data to identify the real-world stats on these tradeoffs?

No. We're only beginning to use FHR to look into add-on trends, and it
would be fairly difficult to come up with a report that tells us how
many people have add-ons installed that were created within some time
range, especially if what we want is to evaluate the non-AMO add-on world.

>> And we're not only talking about old add-ons. We would be forcing all
>> add-on developers to produce new versions of their add-ons to support a
>> system where signatures are required, and then expect all users to
>> update to those versions. I think this would introduce a lot of friction
>> in the transition to the new system.
>
> I think anything we do to enforce centralization will create friction
> with developers not already using AMO. If the solution doesn't require
> submission of code to Mozilla and matches how other software is
> typically deployed (signing of binaries), I'd expect less resistance to
> that solution. (If you're on a closed intranet, you don't even need to
> pay for a cert, you can deploy your own root and and sign with a
> certificate signed by that root.)

I think the suggestion of allowing a certificate system limited to
internal deployments is very good. I don't think it's the right way to
go for the entire system because of the reasons I've explained in this
thread, on top of the loss of insight we would have into the add-on
files and potential malware patterns.

>>>> Signing XPIs would require us to repackage the files to add the
>>>> signature, and that's something we've had problems with in the past
>>>> (like with SDK repacks). I don't have confidence that it would work for
>>>> the huge back-catalog of add-on files we have. And, even if we did
>>>> sign them all, that still excludes all installs of old files.
>>> I don’t know the nature of the issues you’ve experienced previously,
>>> but what you’re talking about here is just an automated repack
>>> system. We do those reliably for localized and custom partner builds
>>> already, we should be able to do the same thing for add-ons.
>>>
>>> — Mike
>> I can assure you it hasn't worked well when we've had to repack hundreds
>> of add-ons in the past. I'd rather avoid doing something like this when
>> the results have been far from ideal.
>
> This sounds like we're making decisions on how to protect users based on
> implementation problems we've had on AMO in the past. I'm not sure
> that's the right approach. If sign+repack was trivial and reliable,
> would we implement that instead? I think it's worth checking the
> assumption that this would be hard against building a high-volume
> service to validate hashes and store lots of add-ons.
>
> -- Mike

This isn't the only reason I don't think we should go with this, but it
is one of them. We've had initiatives fail in the past because of
implementation problems and lack of resources to fix them, so I won't
dismiss this concern so easily. If the system we're proposing is
easier to implement, I don't see how that's not a point in its favor.

Jorge

Joshua Cranmer 🐧

unread,
Nov 8, 2013, 12:36:46 AM11/8/13
to
On 11/7/2013 12:10 PM, Jorge Villalobos wrote:
> There are different categories of private extensions that I think need
> to be handled separately.

You neglected one category: add-ons undergoing development (as in the "I
made a typo, let me touch the JS file and restart Firefox" phase of
development). Of all the categories of private extensions, this is by
far the most important one, since it is from this that the entire
ecosystem of add-ons springs into existence.

--
Joshua Cranmer
Thunderbird and DXR developer
Source code archæologist

wsha...@gmail.com

unread,
Nov 8, 2013, 7:25:27 AM11/8/13
to
On Thursday, November 7, 2013 12:02:56 PM UTC-5, jorgev wrote:

> No. That can be easily subverted by pretty much any software running in
> the system. We're planning on having an application-level preference to
> override it, so only admin users and software running with admin
> privileges will be able to touch it.

I guess for the purpose of enabling the preference intentionally it is not too important whether it is profile-level or application-level. I'm still curious about a few things though.

1. If software in the system is changing Firefox preferences in unwanted ways, isn't the system already compromised?

2. Or is the problem that the add-on registration malware analysis would not be able to check if an add-on changed the noaddonreg profile-level preference? So such an add-on could still be installed and could then allow malware add-ons in after it.

3. If the software changing Firefox preferences was installed by the user (because it was thought to be safe), couldn't it just prompt for admin privileges like many installers do?

4. Is the -noaddonreg flag still being considered? Could software in the system change Firefox shortcuts to add this switch without admin privileges?

5. Regardless of how the local override works, it is going to be accompanied by a UI prompt that can not be disabled, correct? And that is why these more elaborate overrides (IP blocking, signing add-ons) are being considered as well?

mka...@gmail.com

unread,
Nov 8, 2013, 8:57:04 AM11/8/13
to
A malware author could create a fake account, create malware, get it registered and then distribute it. When we find it, we could block it, but they could just do it again.

As long as there is a low barrier to entry to getting an account on AMO, the system can be bypassed.

Jorge Villalobos

unread,
Nov 8, 2013, 11:25:45 AM11/8/13
to
On 11/8/13 6:25 AM, wsha...@gmail.com wrote:
> On Thursday, November 7, 2013 12:02:56 PM UTC-5, jorgev wrote:
>
>> No. That can be easily subverted by pretty much any software running in
>> the system. We're planning on having an application-level preference to
>> override it, so only admin users and software running with admin
>> privileges will be able to touch it.
>
> I guess for the purpose of enabling the preference intentionally it is not too important whether it is profile-level or application-level. I'm still curious about a few things though.
>
> 1. If software in the system is changing Firefox preferences in unwanted ways, isn't the system already compromised?

Not necessarily. Any extension or other software running on the system can
change any profile-level preference without necessarily compromising
anything else.

A malicious application running with admin privileges is what would
compromise the system and what we can't battle effectively. The way
we're planning the overrides is so that only these applications are
capable of subverting the system.

>
> 2. Or is the problem that the add-on registration malware analysis would not be able to check if an add-on changed the noaddonreg profile-level preference? So such an add-on could still be installed and could then allow malware add-ons in after it.
>

A malicious installer could change the shortcut to add the flag, which
is why we're also planning on having a startup prompt when the flag is
enabled so it takes additional user action to enable this mode.

> 3. If the software changing Firefox preferences was installed by the user (because it was thought to be safe), couldn't it just prompt for admin privileges like many installers do?
>

Yes, we can't really prevent that and it goes into AV software territory.

> 4. Is the -noaddonreg flag still being considered? Could software in the system change Firefox shortcuts to add this switch without admin privileges?
>

See response for #2.

> 5. Regardless of how the local override works, it is going to be accompanied by a UI prompt that can not be disabled, correct? And that is why these more elaborate overrides (IP blocking, signing add-ons) are being considered as well?
>

The application-level preference is probably going to be the most
practical of all approaches. But having the command line flag might be
necessary for non-admin users. The other overrides we're thinking about
(certs, blocking the API) are aimed at enterprise deployments and
internal add-ons, which we don't need to monitor as closely.

Jorge

Jorge Villalobos

unread,
Nov 8, 2013, 11:27:42 AM11/8/13
to
On 11/7/13 11:36 PM, Joshua Cranmer 🐧 wrote:
> On 11/7/2013 12:10 PM, Jorge Villalobos wrote:
>> There are different categories of private extensions that I think need
>> to be handled separately.
>
> You neglected one category: add-ons undergoing development (as in "I
> made a typo, let me touch the JS file and restart Firefox" phase of
> development). Of all the categories of private extensions, this is by
> far the most important one, since it is from this that the entire
> ecosystem of add-ons springs into existence.
>

True. There are also add-ons that people manually modify to override
compatibility or do minor fixes. For those cases we have various
overrides you can use to bypass the system. The most attractive of them,
I think, is to change an application-level preference that will act as
an ID whitelist.

Jorge