
Proposal for cryptographically secure method of opting out from extension signing requirement


Jeff Lyon

Feb 12, 2015, 6:45:18 PM
to mozilla-addons-...@lists.mozilla.org
In this post Jorge Villalobos explains that the extension signing requirement cannot be overridden by the user: https://blog.mozilla.org/addons/2015/02/10/extension-signing-safer-experience/

I've conferred with Mike Connor, who explains that making this an about:config option would leave it trivially easy to flip, either by malicious software or by an unwitting user.

While I disagree with the reasoning behind the decision to force extension signing (you can read my critique of it here: http://blog.rubbingalcoholic.com/post/110743007958/mozillas-mandatory-add-on-review-considered ), I believe that it is possible to make it optional, while still maintaining overall security.

Basically, I propose making the opt-out setting a Mozilla-signed cryptographic token instead of a boolean flag. The method would be as follows:

1. There would be a link in Firefox's settings to disable the signing requirement.
2. Users who wish to opt out of signing would click the link and be taken to a special page on Mozilla's site, where they can (ironically) generate a signed opt-out token.
3. Before getting the token, the user would have to pass a "human test" and be educated about the dangers of disabling the signing requirement.
4. The token would be a unique checksum representing the user's system environment, signed by Mozilla's private key, and some private mechanism would pull it into the user's settings.
5. Thereafter, whenever the user tried to install an unsigned add-on, the token's signature would be checked against Mozilla's public key; assuming the checksum matched the user's system environment, the user would be able to install after clicking through a warning.

I believe that users must have a choice to disable the code signing requirement in order to preserve the core principles listed in the Mozilla Manifesto and the spirit of Free Software. Please consider my suggestion.

luck...@musites.com

Feb 12, 2015, 7:40:09 PM
to mozilla-addons-...@lists.mozilla.org
On Thursday, 12 February 2015 23:45:18 UTC, Jeff Lyon wrote:
> the user would have to pass a "human test"

What "human test" exists that is not trivial for malware authors to bypass?

Even if (and it's a big if) there is really no automated way to pass this test, it is very cheap to pay real humans to complete these tests on behalf of a malicious actor. That approach has been used to bypass CAPTCHAs for many years.

Unfortunately, that makes the proposal unsound because it can be easily utilised by an automated malware installer.

Thanks,
Chris

Jeff Lyon

Feb 12, 2015, 8:07:50 PM
to mozilla-addons-...@lists.mozilla.org
There are two levels of "bad actors" I have been led to believe are at issue here.

1. "Gray area" crapware bundled with software installers from certain popular download websites.
2. Organized crime / true malware developers.

In the case of the "gray area" people, they can point to checkboxes they put in installers as providing "informed consent" to the changes they make to the browser, which include disabling security settings. These people (ostensibly) operate under the jurisdiction of law, including the Computer Fraud and Abuse Act and the Digital Millennium Copyright Act, which together criminalize unauthorized access to remote systems and bypassing DRM restrictions.

My proposal would absolutely stop these people from being able to (legally) alter anyone's code-signing settings.

For the organized criminals--supposing they have the resources to crack CAPTCHAs, well then they have the resources to do much worse with peoples' computers, as far as sideloading of malicious software goes. There is absolutely no way any DRM protections Mozilla puts in place would stop a sufficiently motivated criminal hacker from compromising Firefox in all sorts of creative ways.

Not giving people the ability to opt-out will cause all sorts of collateral damage in the FOSS community, and you're not correct in thinking that it will prevent any exceptionally bad behavior.

luck...@musites.com

Feb 13, 2015, 5:01:35 AM
to mozilla-addons-...@lists.mozilla.org
On Friday, 13 February 2015 01:07:50 UTC, Jeff Lyon wrote:
> There are two levels of "bad actors" I have been led to believe are at issue here.
>
> 1. "Gray area" crapware bundled with software installers from certain popular download websites.
> 2. Organized crime / true malware developers.
>
> In the case of the "gray area" people, they can point to checkboxes they put in installers as providing "informed consent" to the changes they make to the browser, which include disabling security settings. These people (ostensibly) operate under the jurisdiction of law, including the Computer Fraud and Abuse Act and the Digital Millennium Copyright Act, which together criminalize unauthorized access to remote systems and bypassing DRM restrictions.
>
> My proposal would absolutely stop these people from being able to (legally) alter anyone's code-signing settings.

IANAL, but even setting aside the differences between relevant laws in the USA and the rest of the world, and the difficulty of determining the relevant jurisdiction, I'm not sure whether bypassing your proposed solution could be unequivocally defined as unauthorised access to a remote system or as bypassing a DRM restriction. Such an argument is probably also complicated by the question of whether a computer program or a human being did the actual malicious solving of the CAPTCHA, and whether they even knew that what they were doing was malicious.

Putting the legalities aside, I appreciate that your proposal would raise the bar for those "gray area" product installers, but not nearly as much as Mozilla's current proposal would. Personally, I don't think your proposal raises the bar far enough, but I have no greater say in the matter than you.

> For the organized criminals--supposing they have the resources to crack CAPTCHAs, well then they have the resources to do much worse with peoples' computers, as far as sideloading of malicious software goes. There is absolutely no way any DRM protections Mozilla puts in place would stop a sufficiently motivated criminal hacker from compromising Firefox in all sorts of creative ways.
>
> Not giving people the ability to opt-out will cause all sorts of collateral damage in the FOSS community, and you're not correct in thinking that it will prevent any exceptionally bad behaviour.

I appreciate your comment is also directed at a more general audience but personally I do not think it will prevent exceptionally bad behaviour. I can't speak for Mozilla's official stance but looking through the various responses from Mozilla employees indicates that they too understand the limitations of this single security measure.

I do think that closing off easy attack vectors will help, both by directly addressing "gray area" products and by indirectly addressing more serious crime, since those bad actors would also have to work around third-party security software that the user may have installed as part of a system-wide protection strategy. This recent post also covers this point: https://groups.google.com/d/msg/mozilla.addons.user-experience/2Tb9ndgzBkg/jpG812euPukJ

Thanks,
Chris

Gijs Kruitbosch

Feb 13, 2015, 8:21:07 AM
to mozilla-addons-...@lists.mozilla.org
Assuming a sufficiently complex pageflow that makes this nontrivial for
malware authors to bypass, I think this would work. What I'm confused
about is why such users wouldn't download & use the builds that have the
feature turned off. That would take a lot fewer steps.

~ Gijs

Mike Kaply

Feb 13, 2015, 8:56:22 AM
to mozilla-addons-...@lists.mozilla.org
Why should someone be made to download a new browser just to test someone's add-on? That's what he's trying to solve here.

I send out test versions of my add-ons all the time for logging, to verify a fix, etc.

You're assuming this is all about developers. It's not.

And in the long run, do you really want to live in a world where everyone downloads an unbranded Firefox? :)

JS

Feb 13, 2015, 11:32:50 AM
to mozilla-addons-...@lists.mozilla.org
On 2/13/2015 8:20 AM, Gijs Kruitbosch wrote:

> Assuming a sufficiently complex pageflow that makes this nontrivial for
> malware authors to bypass, I think this would work. What I'm confused
> about is why such users wouldn't download & use the builds that have the
> feature turned off. That would take a lot fewer steps.

If there are many seats, you'd probably need to go the latter route.
Some users might not want Mozilla to know they are making exceptions.
That Mozilla server might not be accessible from the user's machine.
For such reasons, I'm thinking the proposal wouldn't eliminate the
need for alternate builds.

However, what about a case like:

1) User isn't aware of the signing issue, alternate builds, etc
2) User installs normal release and uses it for awhile
3) User runs into the signing problem and wants to make an exception
4) User doesn't know what to do at first. They probably want
the "unbranded version", but given the intention not to advertise
it and to put it someplace where only developers might see it, the
user would have to dig for it, then download it, then install it.
Hopefully an install-over would work, and they would know that, so
they wouldn't end up reinstalling everything, including customizations.

The proposal would seem a bit easier on this user. They might
see the link while using Firefox and become aware of the issue
and its solution ahead of time. Firefox could explicitly point the
user in the right direction when it detects an attempt to install
an unsigned extension. The user is forced to go through a
process that provides clear and unavoidable notice as well as
explicit consent, which is good for them.

Jorge Villalobos

Feb 13, 2015, 4:00:24 PM
to mozilla-addons-...@lists.mozilla.org
Part of the reason we don't want any overrides in release versions is
that a significant portion of our audience will jump through whatever
hoops are in place to get whatever thing they need. This is a huge
vector for malware.

For example, certain malware creators create fake video sites with
catchy titles that tell the user to "install X in order to see the
video", and then the user will do whatever is needed, even if it
involves clicking through a bunch of installer screens without reading
any of them. We have a warning dialog that is shown during add-on
installation, and an additional warning for installs from outside of
AMO, and neither has done much to deter users from installing bad
add-ons.

Whatever "human tests" we add won't be useful against actual humans :).
And, as Gijs pointed out, there's a point after which it's just more
convenient to install and use one of the builds that will allow unsigned
extensions.

Jorge

Botond Ballo

Feb 13, 2015, 4:14:07 PM
to Jorge Villalobos, mozilla-addons-...@lists.mozilla.org
On Fri, Feb 13, 2015 at 3:59 PM, Jorge Villalobos <jo...@mozilla.com> wrote:
> And, as Gijs pointed out, there's a point after which it's just more
> convenient to install and use one of the builds that will allow unsigned
> extensions.

Could you at least point to the unbranded builds when a user tries to
install an unsigned extension?

Regards,
Botond

Kris Maglione

Feb 13, 2015, 4:42:41 PM
to Jorge Villalobos, mozilla-addons-...@lists.mozilla.org
On Fri, Feb 13, 2015 at 02:59:21PM -0600, Jorge Villalobos wrote:
>Whatever "human tests" we add won't be useful against actual humans :).
>And, as Gijs pointed out, there's a point after which it's just more
>convenient to install and use one of the builds that will allow unsigned
>extensions.

I think this rather understates matters. Even leaving aside the
fact that many users would probably enable an override with
little thought, it would also be trivial to enable without user
interaction. Even if we did require a CAPTCHA to generate the
token, CAPTCHA solving farms are readily available and, for all
intents and purposes, free. Which means that it would be
trivially possible for a piece of software to unlock a fully
branded version of Firefox for use with unsigned add-ons.

In the end, I think that this significantly lowers the cognitive
barriers that might prevent most users from doing this, without
making things significantly easier for those with good reason,
and also significantly lowers the technical barriers that might
prevent distributors from doing the same.

--
Kris Maglione
Add-ons Technical Editor
Mozilla Corporation

Lisp has jokingly been called "the most intelligent way to misuse a
computer". I think that description is a great compliment because it
transmits the full flavor of liberation: it has assisted a number of
our most gifted fellow humans in thinking previously impossible
thoughts.
--Edsger W. Dijkstra, CACM, 15:10

Mike Connor

Feb 13, 2015, 5:18:37 PM
to Botond Ballo, mozilla-addons-...@lists.mozilla.org, Jorge Villalobos
That idea presumes that a) it's a good idea to install those add-ons and b)
most users understand the ramifications of installing those builds. I
don't think we can presume either of those things, let alone both.

Botond Ballo

Feb 13, 2015, 7:44:17 PM
to Mike Connor, mozilla-addons-...@lists.mozilla.org, Jorge Villalobos
My motivation in requesting this is that I think it's important that,
*if* the user has an independent reason to trust the extension (for
example, because they wrote it, or because a trusted friend wrote it,
or because it's an add-on internal to their organization), they should
know what they need to do to exercise their right to run software that
they trust. My preference, as I've said elsewhere, would be to have an
"I know what I'm doing" button, but if we *must* set the barrier so
high that they have to install another browser altogether, they should
at least know how to do that.

Regards,
Botond

Mike Connor

Feb 13, 2015, 9:52:52 PM
to Botond Ballo, mozilla-addons-...@lists.mozilla.org, Jorge Villalobos
In all of those situations, the individual or organization distributing the
unsigned add-on should already be telling the user how to use it.

In general, "I know what I'm doing" buttons are easily socially engineered
and/or blindly clicked. This is why SSL error dialogs don't exist in modern
browsers.

Jim Porter

Feb 16, 2015, 3:31:19 PM
to mozilla-addons-...@lists.mozilla.org
On 02/13/2015 02:59 PM, Jorge Villalobos wrote:
> Part of the reason we don't want any overrides in release versions is
> that a significant portion of our audience will jump through whatever
> hoops are in place to get whatever thing they need. This is a huge
> vector for malware.

If you're assuming that, what's to stop users from installing the
unbranded builds to get the thing they want? Presumably the plan is for
these builds to be accessible from Mozilla's servers, so even the
momentary suspicion of "should I really be downloading a new version of
Firefox for this" would be allayed by seeing that it's being served from
a Mozilla server. All the malware author has to do is write a little
blurb about how you're downloading a "jailbroken"/"unlocked" Firefox*
and you'd be back to almost exactly the same position you are now.

About the only case where users would still be protected would be if
they didn't have permissions to install software, but if they're on a
locked-down computer, they could be protected in many other ways instead
(e.g. by using lockPref to ensure that you can't enable the hypothetical
"unsigned extensions" pref).
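
As a side note on the lockPref point: in a managed deployment, an administrator can pin a preference via Firefox's autoconfig mechanism. The pref name below is a hypothetical placeholder for whatever pref would gate unsigned installs:

```javascript
// mozilla.cfg (autoconfig file; the first line must be a comment)
// Hypothetical pref name standing in for the unsigned-extension gate.
lockPref("xpinstall.signatures.required", true);
```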

I suppose the argument against this will be that installing a whole new
version of Firefox increases the "cognitive barriers" of this process,
but has there been any objective research indicating this? Would users
really find installing Firefox to be harder than Jeff Lyon's proposal?
Furthermore, isn't one of our goals to make installing Firefox simple?
How does that interact with the "cognitive barriers" in this situation?

- Jim

* Which is actually pretty much true.

Jeff Lyon

Feb 17, 2015, 11:41:39 AM
to mozilla-addons-...@lists.mozilla.org
On Friday, February 13, 2015 at 8:52:52 PM UTC-6, Mike Connor wrote:
> In all of those situations, the individual or organization distributing the
> unsigned add-on should already be telling the user how to use it.

And, as Jim Porter mentions above, this would include malware developers telling people how to download their "free video downloader" (or whatever): just download the unbranded Firefox. If people are really willing to jump through whatever hoops are in place to get whatever thing they need, as Jorge Villalobos says, then is getting a cool "unlocked" Firefox really such a cognitive barrier?

Whether you allow opt-outs, or whether the user downloads a different browser, the worst-case scenario is the same: the user's browser is compromised by their own poor choices. In both cases, we're still talking about Mozilla's code--the user will blame Mozilla for their bad experience.

My proposal gives Mozilla the opportunity to educate the user about **why** what they're trying to do is probably a really bad idea. Your current plan does no such thing, leaving any explaining to the malware developers, and externalizing the problem of user security onto these forthcoming unbranded builds.

By not giving the user a choice, you're missing a crucial opportunity to warn them. Thus you're actually doing more harm than good, both in terms of user security _and_ adhering to the principles of decentralization and openness that are core to the Mozilla Manifesto.

Mike Connor

Feb 17, 2015, 3:15:46 PM
to Jeff Lyon, mozilla-addons-...@lists.mozilla.org
If they convince a user to install a different browser (one that would look
similar, but without trademarks) and continue to use that browser... none
of this matters. Of course, if you're _really_ malware, you're probably
not using add-ons at all. As I've said a number of times: the point here
is not to stop the actually illegal malware, but to dissuade all of the
greyware/crapware installers from targeting Firefox users. They are our
biggest threat, frankly.

My first concern about this proposal is that "system environment" is
notoriously fragile (Windows XP activation, anyone?) It'd also be
relatively trivial to duplicate that process and integrate that flow into
an installer, without disclosing any of that to the end user. There's no
magic client-side crypto pixie dust here that isn't easily duplicated by
another app.

My second concern is that it appears to assume users "get it" when all
studies have shown they don't. Warnings they can click through are
ineffective (i.e. we killed SSL dialogs because users clicked through >90%
of MITM attacks). Understanding that unsigned add-ons can completely
compromise your browser is probably hard-to-impossible to communicate to
non-technical users. Sometimes we have to act on the user's behalf in
their best interests. This is one of them.

My third concern is that there's no good way to do this selectively, unless
you make the user jump the hoop for every add-on. It's unlocking a door to
let in a friend, then leaving it unlocked forever. Not the best of
security practices.

Ultimately, I don't like the idea of broadly accessible opt-outs, because
it destroys the herd immunity factor. If 99% of installs aren't easily
targeted, the monetization and scale vanishes. There's a massive trade-off
here that I don't love, but I've concluded has the best chance of success.

-- Mike


Jeff Lyon

Feb 17, 2015, 3:38:37 PM
to mozilla-addons-...@lists.mozilla.org
On Tuesday, February 17, 2015 at 2:15:46 PM UTC-6, Mike Connor wrote:

> My first concern about this proposal is that "system environment" is
> notoriously fragile (Windows XP activation, anyone?) It'd also be
> relatively trivial to duplicate that process and integrate that flow into
> an installer, without disclosing any of that to the end user. There's no
> magic client side crypto pixie dust here that isn't easily duplicated by
> another app.

Windows XP activation was a boolean flag in the registry :) You're right that another app could replicate the checksum process. So how about we add in a Step 3.5: user creates an override password that is also signed by Mozilla and required whenever they install an unsigned extension. This would be impossible for a client-side installer to hijack or do without the user's knowledge.

> My third concern is that there's no good way to do this selectively, unless
> you make the user jump the hoop for every add-on. It's unlocking a door to
> let in a friend, then leaving it unlocked forever. Not the best of
> security practices.

Having a password prompt every time the user tries to install an unsigned extension would solve this problem. If the user forgets the password, he or she can start the opt-out process over again.

> Ultimately, I don't like the idea of broadly accessible opt-outs, because
> it destroys the herd immunity factor. If 99% of installs aren't easily
> targeted, the monetization and scale vanishes. There's a massive trade-off
> here that I don't love, but I've concluded has the best chance of success.

My proposal, as amended, would still prevent the crapware sideloading you're concerned about.

Mike Connor

Feb 17, 2015, 3:50:15 PM
to Jeff Lyon, mozilla-addons-...@lists.mozilla.org
Step 3.5 would easily be duplicated and abused as well. For the first
install, they'd simply use a fake password as a part of the signed artifact
flow, and then hook the install API to use that fake password for all
subsequent calls. XPCOM is powerful, for both good and evil.


Jeff Lyon

Feb 17, 2015, 3:59:20 PM
to mozilla-addons-...@lists.mozilla.org
On Tuesday, February 17, 2015 at 2:50:15 PM UTC-6, Mike Connor wrote:
> Step 3.5 would easily be duplicated and abused as well. For the first
> install, they'd simply use a fake password as a part of the signed artifact
> flow, and then hook the install API to use that fake password for all
> subsequent calls. XPCOM is powerful, for both good and evil.

You're worried that "gray area" crapware will bypass CAPTCHAs and use Mozilla's server to sign fake passwords? People have been sent to prison in the United States for less than that. I assume your signing server will be running in the United States?

Robert Kaiser

Feb 17, 2015, 4:22:50 PM
to mozilla-addons-...@lists.mozilla.org
Jorge Villalobos schrieb:
> For example, certain malware creators create fake video sites with
> catchy titles that tell the user to "install X in order to see the
> video", and then the user will do whatever is needed.

And with that, they can easily install a DLL that hooks into the Firefox process and can do *anything* to it, no matter what mechanisms you/we put there for add-ons specifically.
And note that 1) we can only block DLLs at all with patches to our binaries and 2) there are ways to craft DLL injections that we cannot block at all. So forcing people to go for DLL injections/hooks instead of add-ons actually makes the whole situation of blocking them much worse.

KaiRo

Mike Connor

Feb 17, 2015, 4:24:33 PM
to Robert Kaiser, mozilla-addons-...@lists.mozilla.org
At which point we need to work with OS vendors and security software
vendors to start blocking those DLLs. The key part here is to define a
clear line between good and bad so it's easy to convince those vendors to
act. Right now it's a much harder proposition.


Robert Kaiser

Feb 17, 2015, 4:26:44 PM
to mozilla-addons-...@lists.mozilla.org
Mike Connor schrieb:
> In general, "I know what I'm doing" buttons are easily socially engineered
> and/or blindly clicked. This is why SSL error dialogs don't exist in modern
> browsers.

You might have noticed that the "modern browsers" you speak of still have some way for the user to accept/install very specific certificate exceptions for the sites they really want to visit but which have certificate errors.
Why can't something like that be done for add-ons?

KaiRo

Jeff Lyon

Feb 17, 2015, 4:31:04 PM
to mozilla-addons-...@lists.mozilla.org
On Tuesday, February 17, 2015 at 3:24:33 PM UTC-6, Mike Connor wrote:
> At which point we need to work with OS vendors and security software
> vendors to start blocking those DLLs. The key part here is to define a
> clear line between good and bad so it's easy to convince those vendors to
> act. Right now it's a much harder proposition.

The Computer Fraud and Abuse Act defines this line clearly enough to stop the
"gray area" people under the approach I've recommended.

But with your approach:

1. You miss out on the opportunity to warn users why installing unsigned
add-ons is bad

2. You force crapware to start patching Firefox with DLLs, completely negating
any security improvements you attempt to implement

3. You frustrate the FOSS community and violate at least three of the core
principles of the Mozilla Manifesto.

Ultimately you're in a **worse** position with respect to user security, and
you've alienated a lot of technical people who would otherwise evangelize your
software.

Mike Connor

Feb 17, 2015, 4:34:46 PM
to Robert Kaiser, mozilla-addons-...@lists.mozilla.org
It's a different threat model. Sites can't escalate to chrome privs to add
that themselves. Installers with admin rights can exempt themselves.




Mike Connor

Feb 17, 2015, 4:38:38 PM
to Jeff Lyon, mozilla-addons-...@lists.mozilla.org
I'm worried that it's an easily duplicated model that could be implemented
in a way that is legal enough. "To proceed with installation, please
answer this CAPTCHA and enter a password." with a checkbox saying "never
ask for this password again."

Like I keep saying: the goal here is to draw a bright line between okay and
not-okay. It's hard to see how this wouldn't keep the line blurred enough
to leave lots of users vulnerable.

That all said, I don't think you're going to convince me that we can
succeed in protecting our users if we have an easy opt-out/backdoor around
protections. I'm personally not even sure this goes far enough to protect
users, but it's the least-bad idea that has good enough odds.
Replacing the entire add-on ecosystem with a sandboxed and restricted model
like Chrome's would be the next step, and I consider that to be a much
greater restriction on freedom than passing an automated check. But if
this isn't effective, we might need to go there.

In case it's not clear, I consider the current model to be an unacceptable
risk both to our users and to Firefox as a viable product (which also
places the Mozilla Mission in jeopardy). If we decide we've gone too far,
we can figure out how to relax things, but first we have to get there.

-- Mike

On 17 February 2015 at 15:59, Jeff Lyon <je...@rubbingalcoholic.com> wrote:

> On Tuesday, February 17, 2015 at 2:50:15 PM UTC-6, Mike Connor wrote:
> > Step 3.5 would easily be duplicated and abused as well. For the first
> > install, they'd simply use a fake password as a part of the signed
> artifact
> > flow, and then hook the install API to use that fake password for all
> > subsequent calls. XPCOM is powerful, for both good and evil.
>
> You're worried that "gray area" crapware will bypass CAPTCHAs and use
> Mozilla's server to sign fake passwords? People have been sent to prison in
> the United States for less than that. I assume your signing server will be
> running in the United States?
>

Jorge Villalobos

Feb 17, 2015, 4:46:35 PM
to mozilla-addons-...@lists.mozilla.org
On 2/16/15 2:30 PM, Jim Porter wrote:
> On 02/13/2015 02:59 PM, Jorge Villalobos wrote:
>> Part of the reason we don't want any overrides in release versions is
>> that a significant portion of our audience will jump through whatever
> hoops are in place to get whatever they think they need. This is a huge
>> vector for malware.
>
> If you're assuming that, what's to stop users from installing the
> unbranded builds to get the thing they want? Presumably the plan is for
> these builds to be accessible from Mozilla's servers, so even the
> momentary suspicion of "should I really be downloading a new version of
> Firefox for this" would be allayed by seeing that it's being served from
> a Mozilla server. All the malware author has to do is write a little
> blurb about how you're downloading a "jailbroken"/"unlocked" Firefox*
> and you'd be back to almost exactly the same position you are now.

The unbranded versions are intended for developer use, so they won't be
as accessible as regular versions. My guess is we will only link
directly to the installers from MDN, which isn't as user-friendly as "go
to mozilla.com/whatever". Also, the fact that they won't be branded as
Firefox should be another significant hurdle. I doubt this will be 100%
effective, but it should raise the bar significantly.

> About the only case where users would still be protected would be if
> they didn't have permissions to install software, but if they're on a
> locked-down computer, they could be protected in many other ways instead
> (e.g. by using lockPref to ensure that you can't enable the hypothetical
> "unsigned extensions" pref).
>
> I suppose the argument against this will be that installing a whole new
> version of Firefox increases the "cognitive barriers" of this process,
> but has there been any objective research indicating this? Would users
> really find installing Firefox to be harder than Jeff Lyon's proposal?
> Furthermore, isn't one of our goals to make installing Firefox simple?
> How does that interact with the "cognitive barriers" in this situation?

I don't follow. How is this making Firefox more difficult to install?

JS

Feb 17, 2015, 6:25:35 PM
to mozilla-addons-...@lists.mozilla.org
On 2/17/2015 11:41 AM, Jeff Lyon wrote:
> My proposal gives Mozilla the opportunity to educate the
> user about **why** what they're trying to do is probably a
> really bad idea. Your current plan does no such thing,
> leaving any explaining to the malware developers, and
> externalizing the problem of user security onto these
> forthcoming unbranded builds.

Are there not other ways to warn and inform users? You
could arrange for information to surround the download.
You could put HTTPS downloads behind a click through
process as informative as the one your proposal assumed.
You could even put the click through process into the
alternate build itself so it would be encountered when
someone begins the install and/or runs it for the first
time.



Mike Connor

Feb 17, 2015, 6:58:08 PM
to JS, mozilla-addons-...@lists.mozilla.org
It's also worth noting that, in a world where we're blocking these installs,
we'll still have an opportunity to educate those users who are interested
in why we're blocking them. I'd actually expect users who get blocked will
be much more likely to engage with the why.

Telling someone why something is a bad idea while they're doing it doesn't
really achieve much in practice.

On 17 February 2015 at 18:26, JS <js_losxc...@comcast.net> wrote:

> On 2/17/2015 11:41 AM, Jeff Lyon wrote:
>
>> My proposal gives Mozilla the opportunity to educate the
>>
> > user about **why** what they're trying to do is probably a
> > really bad idea. Your current plan does no such thing,
> > leaving any explaining to the malware developers, and
> > externalizing the problem of user security onto these
> > forthcoming unbranded builds.
>
> Are there not other ways to warn and inform users? You
> could arrange for information to surround the download.
> You could put HTTPS downloads behind a click through
> process as informative as the one your proposal assumed.
> You could even put the click through process into the
> alternate build itself so it would be encountered when
> someone begins the install and/or runs it for the first
> time.
>
>
>
>

Jim Porter

Feb 17, 2015, 8:12:43 PM
to mozilla-addons-...@lists.mozilla.org
On 02/17/2015 03:45 PM, Jorge Villalobos wrote:
> On 2/16/15 2:30 PM, Jim Porter wrote:
>> On 02/13/2015 02:59 PM, Jorge Villalobos wrote:
>>> Part of the reason we don't want any overrides in release versions is
>>> that a significant portion of our audience will jump through whatever
>>> hoops are in place to get whatever they think they need. This is a huge
>>> vector for malware.
>>
>> If you're assuming that, what's to stop users from installing the
>> unbranded builds to get the thing they want? Presumably the plan is for
>> these builds to be accessible from Mozilla's servers, so even the
>> momentary suspicion of "should I really be downloading a new version of
>> Firefox for this" would be allayed by seeing that it's being served from
>> a Mozilla server. All the malware author has to do is write a little
>> blurb about how you're downloading a "jailbroken"/"unlocked" Firefox*
>> and you'd be back to almost exactly the same position you are now.
>
> The unbranded versions are intended for developer use, so they won't be
> as accessible as regular versions. My guess is we will only link
> directly to the installers from MDN, which isn't as user-friendly as "go
> to mozilla.com/whatever". Also, the fact that they won't be branded as
> Firefox should be another significant hurdle. I doubt this will be 100%
> effective, but it should raise the bar significantly.

Wouldn't a malware author just link to the installer page directly? Hiding the page doesn't do much good if an external site links to it anyway. Even if the page is from MDN, it's still Mozilla-branded, and putting it there isn't going to protect the users who *think* they know more than they really do. I imagine that the bar isn't really being raised very much, but we'd need to do some objective research to verify whether or not this helps. In the end, *every* solution relies on users realizing that they're doing something dangerous and stopping before they make themselves vulnerable. I don't think it's entirely obvious which solutions are more effective in this regard.

>> I suppose the argument against this will be that installing a whole new
>> version of Firefox increases the "cognitive barriers" of this process,
>> but has there been any objective research indicating this? Would users
>> really find installing Firefox to be harder than Jeff Lyon's proposal?
>> Furthermore, isn't one of our goals to make installing Firefox simple?
>> How does that interact with the "cognitive barriers" in this situation?
>
> I don't follow. How is this making Firefox more difficult to install?

What I mean is that Firefox is (and should be!) easy to install, including the developer versions. If the installation process is really simple, then there's not much of a hurdle if someone tells you, "Go here and download a jailbroken Firefox to use my Totally Not Evil extension." You just download a file and click "Next" a few times. If we assume that users will click past any "This might void your warranty!" alerts, why would they be stopped by installing a different Firefox? This isn't merely a rhetorical question; I'm honestly curious about how people came to the conclusion that this will be an effective solution.

I admit that I haven't read all of the old discussion about this, but I haven't seen any user research to determine if the current proposal prevents users from making themselves vulnerable in practice. Has any been done and I'm just not finding it? I think it's unwise to add extra restrictions to what a user can do with their software if we're not totally sure that it will help protect the majority. That doesn't completely eliminate arguments about where to draw the line here, but I think it would be easier to come to a satisfactory conclusion for everyone if we had more solid facts to work with.

- Jim

johanne...@gmail.com

Feb 18, 2015, 9:49:04 AM
to mozilla-addons-...@lists.mozilla.org
On Friday, February 13, 2015 at 12:45:18 AM UTC+1, Jeff Lyon wrote:
> 4. The token would be a unique checksum representing the user's system environment signed by Mozilla's private key, and some private mechanism would pull it into the user's settings.

How do you define/detect "user's system environment", so that it is unique for every Firefox user or at least the majority? How do you prevent a malicious third-party from impersonating the "user's system environment"?

From what I understand, for the token to be unique there would have to be a unique ID for every Firefox user. This ID would have to be unmodifiable by malware so that malware cannot replace it along with a matching token.

So a malware/adware developer would just have to get one token for his ID and then replace both on the user's system -> bypassed security measure.

I know there are several unique IDs per user already used for health report and other things by Mozilla, but afaik none of these meets the "unmodifiable" requirement.
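To make the replay concern above concrete, here is a minimal editorial sketch (Python). An HMAC with a server-held key stands in for Mozilla's asymmetric signature, and all IDs and names are invented for illustration: if the ID the token is bound to lives somewhere malware can overwrite, one legitimately issued ID/token pair passes the check on any machine.

```python
# Editorial sketch: HMAC with a server-held key stands in for Mozilla's
# asymmetric signature; all IDs and names are invented for illustration.
import hashlib
import hmac

MOZILLA_KEY = b"server-side signing key"  # would be Mozilla's private key

def issue_token(client_id: str) -> str:
    """What the hypothetical opt-out server returns for a given client ID."""
    return hmac.new(MOZILLA_KEY, client_id.encode(), hashlib.sha256).hexdigest()

def allow_unsigned(stored_id: str, stored_token: str) -> bool:
    """Firefox-side check: the stored token must match the stored ID."""
    return hmac.compare_digest(issue_token(stored_id), stored_token)

# An honest user opts out once; the pair validates.
assert allow_unsigned("profile-1234", issue_token("profile-1234"))

# But if the ID lives in a file malware can overwrite, the attacker fetches
# ONE valid pair for its own ID and ships both, passing the same check:
assert allow_unsigned("malware-id", issue_token("malware-id"))

# The signature only fails when ID and token don't match:
assert not allow_unsigned("profile-1234", issue_token("malware-id"))
```

The signature proves the token came from the server, but nothing binds it to this particular machine unless the ID itself is unforgeable and unreplaceable, which is exactly the missing piece identified here.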

Jorge Villalobos

Feb 18, 2015, 11:11:19 AM
to mozilla-addons-...@lists.mozilla.org
On 2/17/15 7:11 PM, Jim Porter wrote:
> On 02/17/2015 03:45 PM, Jorge Villalobos wrote:
>> On 2/16/15 2:30 PM, Jim Porter wrote:
>>> On 02/13/2015 02:59 PM, Jorge Villalobos wrote:
>>>> Part of the reason we don't want any overrides in release versions is
>>>> that a significant portion of our audience will jump through whatever
>>>> hoops are in place to get whatever they think they need. This is a huge
>>>> vector for malware.
>>>
>>> If you're assuming that, what's to stop users from installing the
>>> unbranded builds to get the thing they want? Presumably the plan is for
>>> these builds to be accessible from Mozilla's servers, so even the
>>> momentary suspicion of "should I really be downloading a new version of
>>> Firefox for this" would be allayed by seeing that it's being served from
>>> a Mozilla server. All the malware author has to do is write a little
>>> blurb about how you're downloading a "jailbroken"/"unlocked" Firefox*
>>> and you'd be back to almost exactly the same position you are now.
>>
>> The unbranded versions are intended for developer use, so they won't be
>> as accessible as regular versions. My guess is we will only link
>> directly to the installers from MDN, which isn't as user-friendly as "go
>> to mozilla.com/whatever". Also, the fact that they won't be branded as
>> Firefox should be another significant hurdle. I doubt this will be 100%
>> effective, but it should raise the bar significantly.
>
> Wouldn't a malware author just link to the installer page directly? Hiding the page doesn't do much good if an external site links to it anyway. Even if the page is from MDN, it's still Mozilla-branded, and putting it there isn't going to protect the users who *think* they know more than they really do. I imagine that the bar isn't really being raised very much, but we'd need to do some objective research to verify whether or not this helps. In the end, *every* solution relies on users realizing that they're doing something dangerous and stopping before they make themselves vulnerable. I don't think it's entirely obvious which solutions are more effective in this regard.

Silently dropping an add-on in a user's profile vs. having to install
and run a new program that isn't branded as Firefox AND changing the
signing pref AND silently installing the add-on AND having the user be
warned about an unsigned extension. I think that's a big difference. If
a user is adamant about using an unsigned extension they will surely find
a way, but there are many more hoops to jump through.

>>> I suppose the argument against this will be that installing a whole new
>>> version of Firefox increases the "cognitive barriers" of this process,
>>> but has there been any objective research indicating this? Would users
>>> really find installing Firefox to be harder than Jeff Lyon's proposal?
>>> Furthermore, isn't one of our goals to make installing Firefox simple?
>>> How does that interact with the "cognitive barriers" in this situation?
>>
>> I don't follow. How is this making Firefox more difficult to install?
>
> What I mean is that Firefox is (and should be!) easy to install, including the developer versions. If the installation process is really simple, then there's not much of a hurdle if someone tells you, "Go here and download a jailbroken Firefox to use my Totally Not Evil extension." You just download a file and click "Next" a few times. If we assume that users will click past any "This might void your warranty!" alerts, why would they be stopped by installing a different Firefox? This isn't merely a rhetorical question; I'm honestly curious about how people came to the conclusion that this will be an effective solution.

There were concerns at some point that users would just migrate to ESR
because they don't want to be updated every 6 weeks. ESR has a solid
audience, but users aren't moving in droves. A similar argument could be
used for alternative browsers like Palemoon and major changes like
Australis.

I don't think we would complain much if we got more usage on Nightly and
Dev Edition, even if it's for the wrong reasons. If we found that
migration to be much larger than what we want (which seems extremely
unlikely), we would certainly revisit this approach. But, as Tyler
pointed out in a different thread, the expectation is that most users
won't even notice this change.

> I admit that I haven't read all of the old discussion about this, but I haven't seen any user research to determine if the current proposal prevents users from making themselves vulnerable in practice. Has any been done and I'm just not finding it? I think it's unwise to add extra restrictions to what a user can do with their software if we're not totally sure that it will help protect the majority. That doesn't completely eliminate arguments about where to draw the line here, but I think it would be easier to come to a satisfactory conclusion for everyone if we had more solid facts to work with.
>
> - Jim

I'm not sure what kind of data you would like to have here. We have data
on users affected by malware in the wild, and we have data on ways that
malware developers are attacking users and avoiding our blocking
mechanisms. As to whether most users will be less willing to jump through
various hoops, rather than almost none, in order to install malware on
their system, I don't think we've done any formal research on that.


Jorge

Jim Porter

Feb 18, 2015, 3:25:02 PM
to mozilla-addons-...@lists.mozilla.org
On 02/18/2015 10:10 AM, Jorge Villalobos wrote:
> Silently dropping an add-on in a user's profile vs. having to install
> and run a new program that isn't branded as Firefox AND changing the
> signing pref AND silently installing the add-on AND having the user
> be warned about an unsigned extension. I think that's a big
> difference. If a user is adamant on using an unsigned extension they
> will surely find the way, but there are much more hoops to jump
> through.

I think most of the arguments against this proposal are only against the first step ("having to install and run a new program that isn't branded as Firefox"). I haven't seen any objections to the other steps. If we're concerned about malware authors automating all the other steps, what's to stop them from telling you to install Firefox Developer Edition and then automating those steps anyway? The end result is that the user only faces a single hurdle.

To be clear: I don't think anyone is arguing against requiring add-ons to be signed by default; they're only arguing against requiring users to download a different build to run unsigned add-ons.

These proposals seem to be assuming that malware authors don't have access to Firefox's Program Files folder (or similar for other platforms, but let's focus on Windows since that's where the userbase is), or at least that we have some recourse if malware authors just patched the Firefox binary. Doesn't this imply that *any* solution that requires write access to Firefox's Program Files folder would be equally good?

For instance, if we took something like Jeff Lyon's proposal of generating a crypto token from a Mozilla website, but required users to personally place the token in Firefox's Program Files folder, doesn't this provide as much security as downloading a new build? In either case, Windows' UAC will prevent this activity from being automated. In effect, by adding the token, you're installing a new "build" of Firefox, but with the benefit that you're still on the same version as before. This way, users/developers could still use the release channel as needed and be able to run unsigned code, which I see as a hard requirement for add-on development.

> As to whether most users will be less willing to jump through various
> hoops, rather than almost none, in order to install malware on their
> system, I don't think we've done any formal research on that.

That's the kind of research that I think needs to happen, e.g. getting some users into an office and trying to convince them to install malware under a few different systems (the status quo, this new proposal, something like Jeff Lyon's proposal, etc.). I think it's extremely risky to assume that we can actually predict how persistent users will be (and how much they'll refuse to read anything that we say to warn them!).

Without this kind of research, we're choosing to restrict the freedoms of other users based on only a guess. I think we have a responsibility to do better than that. We should only restrict the freedoms of a class of users because we *know* it will help the majority.*

- Jim

* And even then, we should try our best to minimize the restrictions provided we can get similar levels of protection.

Robert Kaiser

Feb 18, 2015, 4:29:51 PM
to mozilla-addons-...@lists.mozilla.org
Mike Connor schrieb:
> At which point we need to work with OS vendors and security software
> vendors to start blocking those DLLs.

If we foresee that we'll need to do that, then we had better start right now, as we already have a large number of issues with those malware-injected DLLs. I'm pretty sure the crash issues with them that we see flare up every now and then are only the tip of the iceberg, as those authors want to have them run without crashes as well.

Also, there's already a huge number of DLLs running around that are in the grey zone between good and bad. I do not think we should push more into that realm.

KaiRo

Robert Kaiser

Feb 18, 2015, 4:33:12 PM
to mozilla-addons-...@lists.mozilla.org
Installers with admin rights can undermine any bit of code we put in there, including the stuff we do for checking signatures.

KaiRo



Mike Connor

Feb 18, 2015, 4:45:18 PM
to Robert Kaiser, mozilla-addons-...@lists.mozilla.org
There's been early conversations. There will be more. This is all part of
the overall sequence we planned out a long time ago.

On 18 February 2015 at 16:28, Robert Kaiser <ka...@kairo.at> wrote:

> Mike Connor schrieb:
>
>> At which point we need to work with OS vendors and security software
>> vendors to start blocking those DLLs.
>>
>
> If we foresee that we'll need to do that, then we had better start right
> now, as we already have a large number of issues with those malware-injected
> DLLs. I'm pretty sure the crash issues with them that we see flare up every
> now and then are only the tip of the iceberg, as those authors want to have
> them run without crashes as well.
>
> Also, there's already a huge number of DLLs running around that are in the
> grey zone between good and bad. I do not think we should push more into
> that realm.
>
>

Mike Connor

Feb 18, 2015, 4:46:41 PM
to Robert Kaiser, mozilla-addons-...@lists.mozilla.org
Not legally. See many previous statements about clear delineation between
"okay" and "not okay" in terms of installer behaviour.

Though I don't see the point you're making. A user-controlled override
only works if the attacker can't override it as well.

On 18 February 2015 at 16:32, Robert Kaiser <ka...@kairo.at> wrote:

> Installers with admin rights can undermine any bit of code we put in
> there, including the stuff we do for checking signatures.
>
> KaiRo
>
>
> Mike Connor schrieb:
>

Robert Kaiser

Feb 18, 2015, 5:47:27 PM
to mozilla-addons-...@lists.mozilla.org
IMHO, that just shows that the proposed solution isn't complete yet, as we still need to find a way for a user to override safely and keep control of their experience and their browser without the attacker being able to override. There must be a good solution for that.

KaiRo



Mike Connor

Feb 18, 2015, 6:02:12 PM
to Robert Kaiser, mozilla-addons-...@lists.mozilla.org
We've yet to come up with something a user can do that an installer can't
trivially duplicate. Especially since we'll publish the source code that
tells them exactly what we're doing in those cases.

I'd love to see a solution, but I don't believe in magic or golden keys.

On 18 February 2015 at 17:46, Robert Kaiser <ka...@kairo.at> wrote:

> IMHO, that just shows that the proposed solution isn't complete yet, as we
> still need to find a way for a user to override safely and keep control of
> their experience and their browser without the attacker being able to
> override. There must be a good solution for that.
>
> KaiRo
>
>
> Mike Connor schrieb:
>

Robert Kaiser

Feb 18, 2015, 6:18:27 PM
to mozilla-addons-...@lists.mozilla.org
Mike Connor schrieb:
> We've yet to come up with something a user can do that an installer can't
> trivially duplicate. Especially since we'll publish the source code that
> tells them exactly what we're doing in those cases.

We could, for example, tie the exceptions to their Firefox Account. Encrypt them with a key from the account? Store them on the web in some form tied to the account?

KaiRo

Mike Connor

Feb 18, 2015, 6:42:07 PM
to Robert Kaiser, mozilla-addons-...@lists.mozilla.org
And that isn't trivially duplicated how?

On 18 February 2015 at 18:17, Robert Kaiser <ka...@kairo.at> wrote:

> Mike Connor schrieb:
>
>> We've yet to come up with something a user can do that an installer can't
>> trivially duplicate. Especially since we'll publish the source code that
>> tells them exactly what we're doing in those cases.
>>
>
> We could, for example, tie the exceptions to their Firefox Account. Encrypt
> them with a key from the account? Store them on the web in some form tied
> to the account?
>
>

Robert Kaiser

Feb 18, 2015, 6:57:21 PM
to mozilla-addons-...@lists.mozilla.org
How would something duplicate things that are only stored server-side?

KaiRo



Mike Connor

Feb 18, 2015, 7:15:26 PM
to Robert Kaiser, mozilla-addons-...@lists.mozilla.org
By duplicating the client code that talks to the server to fetch the key,
add the add-on to the server list, or whatever other mechanism the client
uses.

Of course, that doesn't solve the "private" case, especially for mass
deployments.
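The duplication objection can be sketched concretely (Python; the endpoint, token handling, and every name here are hypothetical, not a real Firefox Accounts API): any exception-granting call the browser can make, malware running as the same user can make too, by replaying the published client code with the same credentials.

```python
# Editorial sketch of the objection: a server-side exception list is only as
# strong as the client authentication in front of it. All names here are
# hypothetical; this is not a real FxA API.

EXCEPTIONS = {}  # account_id -> set of exempted add-on IDs (server-side store)

def add_exception(account_id: str, addon_id: str, session_token: str) -> None:
    """The endpoint the browser would call. Anything holding a valid session
    token (including malware running as the user) can call it equally well."""
    if not session_token:  # stand-in for real account authentication
        raise PermissionError("not signed in")
    EXCEPTIONS.setdefault(account_id, set()).add(addon_id)

def is_exempt(account_id: str, addon_id: str) -> bool:
    """The check Firefox would perform before loading an unsigned add-on."""
    return addon_id in EXCEPTIONS.get(account_id, set())

# Legitimate flow: the user whitelists their own add-on.
add_exception("user@example.com", "my-addon@dev", session_token="session123")
assert is_exempt("user@example.com", "my-addon@dev")

# Duplication: malware replays the same published client code with the same
# local session, and the server cannot tell the difference.
add_exception("user@example.com", "evil-addon@mal", session_token="session123")
assert is_exempt("user@example.com", "evil-addon@mal")
```

Storing the list server-side moves the data, but not the trust boundary: the server still has to believe whatever client presents the session.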

On 18 February 2015 at 18:56, Robert Kaiser <ka...@kairo.at> wrote:

> How would something duplicate things that are only stored server-side?
>
> KaiRo
>
>
> Mike Connor schrieb:
>

Robert Kaiser

Feb 18, 2015, 7:33:25 PM
to mozilla-addons-...@lists.mozilla.org
Mike Connor schrieb:
> By duplicating the client code that talks to the server to fetch the key,
> add the add-on to the server list, or whatever other mechanism the client
> uses.

If they can do that, they can evade the code that does signature checking for the add-on anyhow, or not?

> Of course, that doesn't solve the "private" case, especially for mass
> deployments.

Actually, we have documentation and code to deploy your own FxA server.

KaiRo

Jorge Villalobos

Feb 19, 2015, 4:06:46 PM
to mozilla-addons-...@lists.mozilla.org
On 2/18/15 2:24 PM, Jim Porter wrote:
> On 02/18/2015 10:10 AM, Jorge Villalobos wrote:
>> Silently dropping an add-on in a user's profile vs. having to install
>> and run a new program that isn't branded as Firefox AND changing the
>> signing pref AND silently installing the add-on AND having the user
>> be warned about an unsigned extension. I think that's a big
>> difference. If a user is adamant about using an unsigned extension they
>> will surely find a way, but there are many more hoops to jump
>> through.
>
> I think most of the arguments against this proposal are only against the first step ("having to install and run a new program that isn't branded as Firefox"). I haven't seen any objections to the other steps. If we're concerned about malware authors automating all the other steps, what's to stop them from telling you to install Firefox Developer Edition and then automating those steps anyway? The end result is that the user only faces a single hurdle.

The argument is not only about what the user is willing to do, but also
about how far a malware developer is going to go to accomplish what
they want. Is this approach sufficiently effective to be worth the cost?
There's also the point that Mike has brought up a couple of times on
this thread, which is that the line between acceptable and unacceptable
behavior is much clearer after we implement this. A piece of software
that goes into all of this effort to trick the user is clearly malicious
and should be dealt with at the AV / OS level.

>
> To be clear: I don't think anyone is arguing against requiring add-ons to be signed by default; they're only arguing against requiring users to download a different build to run unsigned add-ons.

I think there are many objections about other parts of this proposal,
like the centralization of control and the efficiency of the signing
process.

> These proposals seem to be assuming that malware authors don't have access to Firefox's Program Files folder (or similar for other platforms, but let's focus on Windows since that's where the userbase is), or at least that we have some recourse if malware authors just patched the Firefox binary. Doesn't this imply that *any* solution that requires write access to Firefox's Program Files folder would be equally good?

1) There's plenty of malware that acts without admin access.
2) Even if the malware runs with admin access, there are some things we
can do about it, like making it sufficiently difficult for the malware
to be practical, and making it sufficiently clear that the malware is
hacking into Firefox or tricking the user into installing other software
rather than just changing a configuration setting that happens to be in
the programs folder.

>
> For instance, if we took something like Jeff Lyon's proposal of generating a crypto token from a Mozilla website, but required users to personally place the token in Firefox's Program Files folder, doesn't this provide as much security as downloading a new build? In either case, Windows' UAC will prevent this activity from being automated. In effect, by adding the token, you're installing a new "build" of Firefox, but with the benefit that you're still on the same version as before. This way, users/developers could still use the release channel as needed and be able to run unsigned code, which I see as a hard requirement for add-on development.
>
>> As to whether most users will be less willing to go through various
>> hoops rather than almost none in order to install malware on their
>> system, I don't think there's formal research we've done on that.
>
> That's the kind of research that I think needs to happen, e.g. getting some users into an office and trying to convince them to install malware under a few different systems (the status quo, this new proposal, something like Jeff Lyon's proposal, etc). I think it's extremely risky to assume that we can actually predict how persistent users will be (and in how much they'll refuse to read anything that we say to warn them!).
>
> Without this kind of research, we're choosing to restrict the freedoms of other users based on only a guess. I think we have a responsibility to do better than that. We should only restrict the freedoms of a class of users because we *know* it will help the majority.*
>
> - Jim
>

I don't think that getting a few users to try this out is enough to
reach any useful conclusions. There's certainly value in doing user
research, but I think that the kind of study that would be needed to
confirm this would be costlier and take much more time. I'm not in a
position to say if that's something we could invest money on, but I
wouldn't be in favor of delaying this project any longer (it has existed in
one form or another for years).

> * And even then, we should try our best to minimize the restrictions
> provided we can get similar levels of protection.

If you look at the past history of this list, you'll see how I've been
fighting to do that for a very long time. I think this signature
proposal strikes a very good balance of giving developers the freedom to
create and distribute the add-ons they want, giving users the freedom to
install what they want, and giving us some level of control over
malicious add-on distribution.

Jorge

Jim Porter

Feb 20, 2015, 5:48:28 AM
to mozilla-addons-...@lists.mozilla.org
On 02/19/2015 03:05 PM, Jorge Villalobos wrote:
> The argument is not only about what the user is willing to do, but
> also about how far a malware developer is going to go to accomplish
> what they want. Is this approach sufficiently effective to be worth
> the cost? There's also the point that Mike has brought up a couple of
> times on this thread, which is that the line between acceptable and
> unacceptable behavior is much clearer after we implement this. A
> piece of software that goes into all of this effort to trick the user
> is clearly malicious and should be dealt with at the AV / OS level.

I'm not sure I agree. A developer who simply chooses not to comply with these restrictions (e.g. for ideological reasons) and tells users to install an unbranded build isn't malicious. I think it's a pretty fine line between that and a developer who tells users to install an unbranded build because they want to do something nefarious. In terms of how the information is presented to the user, these two cases could be nearly identical.

To be fair, it might not be worth accommodating developers who are ideologically opposed to this, but I can certainly imagine a number of cases where it wouldn't be as clear that the software is malicious (at least from a legal perspective). For instance, is it sufficiently-malicious if the software just provides a link to the unbranded build for the user to install themselves?

> 1) There's plenty of malware that acts without admin access.

It should be possible to eliminate this kind of malware with a much less-restrictive system, so I don't see this as an argument in favor of the current proposal by itself.

> 2) Even if the malware runs with admin access, there are some things we can
> do about it, like making it sufficiently difficult for the malware to
> be practical, and making it sufficiently clear that the malware is
> hacking into Firefox or tricking the user into installing other
> software rather than just changing a configuration setting that
> happens to be in the programs folder.

I think there are many things that are more clearly-evil than just swapping a pref, such as installing a cryptographic certificate to subvert a system's security. In fact, this sort of tactic is getting quite a bit of news lately with Superfish, so I think it would be pretty uncontroversial to claim that the addition of a cert is "evil".

Using this as a guide, you could draft a proposal requiring the installation of a cert (or certs; perhaps one for each unsigned add-on) to Firefox's programs folder in order to allow unsigned add-ons. This might be getting close to the existing plan for "extensions that will never be publicly distributed and will never leave an internal network", but I'm not entirely clear on what that plan is.
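[A hypothetical sketch of the idea above, not Firefox's actual mechanism: gate each unsigned add-on on a per-add-on artifact that must be placed in the (elevation-protected) program folder. For simplicity the artifact here is a file named after the package's SHA-256 digest; a real design would use an actual certificate and a proper signature check. The function names and directory layout are invented for illustration.]

```python
import hashlib
from pathlib import Path


def addon_fingerprint(xpi_bytes: bytes) -> str:
    """Content hash identifying one specific unsigned add-on package."""
    return hashlib.sha256(xpi_bytes).hexdigest()


def unsigned_addon_allowed(xpi_bytes: bytes, program_dir: Path) -> bool:
    """Allow the add-on only if a matching opt-in file exists in program_dir.

    Writing to program_dir requires elevation (UAC on Windows), so malware
    running without admin rights cannot grant itself this exception; granting
    it requires the same privilege level as installing a different build.
    """
    marker = program_dir / "allowed-addons" / f"{addon_fingerprint(xpi_bytes)}.pem"
    return marker.is_file()
```

The point of the sketch is only that the opt-in lives behind the same write barrier as the Firefox binary itself, which is the property the "unbranded build" plan relies on.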

> I don't think that getting a few users to try this out is enough to
> reach any useful conclusions. There's certainly value in doing user
> research, but I think that the kind of study that would be needed to
> confirm this would be costlier and take much more time. I'm not in a
> position to say if that's something we could invest money on, but I
> wouldn't be in favor of delaying this project anymore (it has existed
> in one form or another for years).

I just don't think it's acceptable to make guesses in this situation. If we intend to protect users from themselves*, we should be *sure* we're protecting them. It's possible this solution goes too far, or not far enough, but without testing, we can't know. I know some folks have suggested that we can dial the restrictions up or down after we've deployed this, but given that this is a rather inflammatory issue (a fact that I'm sure did not surprise you in the least), I think we should try to get it right the first time. It doesn't exactly inspire confidence in me if this system keeps changing after it's deployed.

> If you look at the past history of this list, you'll see how I've
> been fighting to do that for a very long time. I think this
> signature proposal strikes a very good balance of giving developers
> the freedom to create and distribute the add-ons they want, giving
> users the freedom to install what they want, and giving us some level
> of control over malicious add-on distribution.

As I'm sure you can tell, I (and others) disagree. Had I known such discussions were happening, I'd have gotten involved sooner, but unfortunately, discussions like this never seem to escape to the rest of Mozilla until they're nearing the end of the design phase (or else I'm just really out of the loop, which is entirely possible).

I think some of the issue is that the original blog post is rather short on details (e.g. what exactly the requirements are, what proposals were rejected, etc). This makes it harder for anyone to submit counter-proposals, because now they have to read the entire archives or risk proposing something that doesn't meet the requirements, and it also serves to make people even more upset, because they won't be aware of what all the requirements are. Perhaps there really is a reason that no other solution on earth would work, but I can't tell that from the blog post.

This sort of thing happened with the Firefox Tiles announcement too: the initial blog post was light on details, leading people to fear the worst. I was critical of that at the time too (although half of my criticism was the way the announcement was made), and I recall someone asking me, "Don't you trust the Firefox team to make the best decision?" In that case, as well as this, the answer is "No." That's not to say that I *distrust* my coworkers, but in the absence of evidence, I choose to remain skeptical. I think the same holds true for a lot of people, both in and out of Mozilla. I imagine the backlash would have been softened had more details been given in the beginning.

- Jim

* Sorry, I couldn't help myself.

The Wanderer

Feb 20, 2015, 1:05:49 PM2/20/15
to mozilla-addons-...@lists.mozilla.org
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

On 02/18/2015 at 05:46 PM, Robert Kaiser wrote:

> IMHO, that just shows that the proposed solution isn't complete
> yet, as we still need to find that way of how a user can override
> safely and keep control of their experience and their browser
> without the attacker being able to override. There must be a good
> solution for that.

There is no such solution - or rather, such a solution would require
"strong AI", in order to distinguish attacker from user. (And even then,
there's the question of whether to trust that AI.) Anything which a user
can do, a malicious attacker which has gained sufficient control over
the rest of the system (or tricked the user into thinking it's
non-malicious) can also do.

The question therefore becomes which is more important: enabling the
user to do what they want, or preventing the attacker from doing what
they want.

Many people (including me) say that it's the former. This proposal, and
some of the ways in which it is being defended, would appear to say that
it's the latter.

- --
The Wanderer

The reasonable man adapts himself to the world; the unreasonable one
persists in trying to adapt the world to himself. Therefore all
progress depends on the unreasonable man. -- George Bernard Shaw
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1

iQIcBAEBCgAGBQJU53c8AAoJEASpNY00KDJrYccP/iC00YEb0OFYt1ToIlc1REqB
HzNhtu/r9SvDXr/1y3r4KLX7fiNJYdXwUIxS8Ae/eKI77p5DSKpwZOVD28ZZh1Qv
aL+/ry5SviMTW5ttTo/sh3d7t5UW+33InUEF7aWW/mEKywehwpUngjMWNjRyKXjP
ClHhSF5mevIdsr5vJB0Y1laat1PdXCldf3izrMtCtP+Hbtbg4i0375CGGyC1i+JY
uTEIBtIujbMfA/GcF3ddrQRsD0Edku8JrLZikGJy/+5yYZ0sNa3I4/3RfsZFwIfY
2jS6wlbSP2pdghBcyn3pdodPthF/4LLtG2MhUTsE3/f+R5R086t6Dy1vGfxp+oDW
vA9KKcBEZ4RdRTF+qchy7QlCyF5A3R7DJFikZWwvYgxQNYKaerUFgZaB402Y71oI
ek1U93e9sWm0XKheYilyEnyS4qwylPWh/nCBfCU1GULrZW3AbpqLWvqMi4BoYfBf
uRs+QthhAfqSxkjyrlPVZ7SsJiOhcFVk3qFgk185+D10w/WXZzHEAzqAixSoHQBt
wQg+c/300h+roXRR7IS8K+u1qJzdzWutlZb5+rDO+KRfp6Dj4/R+u9pFTWjFEyKK
UlEOQpCTBSvrB/B/Fq6uJ5Pot0POt7mRYgxN1KyjE6y49c9xOtk39kEdufkfyMdI
VxPfZnAlvaTJlJOIBFmO
=K014
-----END PGP SIGNATURE-----

Jorge Villalobos

Feb 23, 2015, 1:35:18 PM2/23/15
to mozilla-addons-...@lists.mozilla.org
On 2/20/15 4:47 AM, Jim Porter wrote:
> On 02/19/2015 03:05 PM, Jorge Villalobos wrote:
>> The argument is not only about what the user is willing to do, but
>> also to what extent is a malware developer going to go to accomplish
>> what they want. Is this approach sufficiently effective to be worth
>> the cost? There's also the point that Mike has brought up a couple of
>> times on this thread, which is that the line between acceptable and
>> unacceptable behavior is much clearer after we implement this. A
>> piece of software that goes into all of this effort to trick the user
>> is clearly malicious and should be dealt with at the AV / OS level.
>
> I'm not sure I agree. A developer who simply chooses not to comply with these restrictions (e.g. for ideological reasons) and tells users to install an unbranded build isn't malicious. I think it's a pretty fine line between that and a developer who tells users to install an unbranded build because they want to do something nefarious. In terms of how the information is presented to the user, these two cases could be nearly identical.
>
> To be fair, it might not be worth accommodating developers who are ideologically opposed to this, but I can certainly imagine a number of cases where it wouldn't be as clear that the software is malicious (at least from a legal perspective). For instance, is it sufficiently-malicious if the software just provides a link to the unbranded build for the user to install themselves?

If the user is making an informed choice of installing one of those
builds in order to use some add-on, I don't think that's malicious
behavior. If a piece of software auto-installs one of those builds and
its extension, and attempts to make it the default application to
override Firefox, then I think that's malicious.

>
>> 1) There's plenty of malware that acts without admin access.
>
> It should be possible to eliminate this kind of malware with a much less-restrictive system, so I don't see this as an argument in favor of the current proposal by itself.

We're open to suggestions. I'll be posting a followup this week (I hope)
in which I'll explain the restrictions we're working with.

>> 2) Even if the malware runs with admin access, there are some things we can
>> do about it, like making it sufficiently difficult for the malware to
>> be practical, and making it sufficiently clear that the malware is
>> hacking into Firefox or tricking the user into installing other
>> software rather than just changing a configuration setting that
>> happens to be in the programs folder.
>
> I think there are many things that are more clearly-evil than just swapping a pref, such as installing a cryptographic certificate to subvert a system's security. In fact, this sort of tactic is getting quite a bit of news lately with Superfish, so I think it would be pretty uncontroversial to claim that the addition of a cert is "evil".
>
> Using this as a guide, you could draft a proposal requiring the installation of a cert (or certs; perhaps one for each unsigned add-on) to Firefox's programs folder in order to allow unsigned add-ons. This might be getting close to the existing plan for "extensions that will never be publicly distributed and will never leave an internal network", but I'm not entirely clear on what that plan is.
>
>> I don't think that getting a few users to try this out is enough to
>> reach any useful conclusions. There's certainly value in doing user
>> research, but I think that the kind of study that would be needed to
>> confirm this would be costlier and take much more time. I'm not in a
>> position to say if that's something we could invest money on, but I
>> wouldn't be in favor of delaying this project anymore (it has existed
>> in one form or another for years).
>
> I just don't think it's acceptable to make guesses in this situation. If we intend to protect users from themselves*, we should be *sure* we're protecting them. It's possible this solution goes too far, or not far enough, but without testing, we can't know. I know some folks have suggested that we can dial the restrictions up or down after we've deployed this, but given that this is a rather inflammatory issue (a fact that I'm sure did not surprise you in the least), I think we should try to get it right the first time. It doesn't exactly inspire confidence in me if this system keeps changing after it's deployed.

While I agree that working with more data would be good, I don't agree
we're just working based on guesses.

>> If you look at the past history of this list, you'll see how I've
>> been fighting to do that for a very long time. I think this
>> signature proposal strikes a very good balance of giving developers
>> the freedom to create and distribute the add-ons they want, giving
>> users the freedom to install what they want, and giving us some level
>> of control over malicious add-on distribution.
>
> As I'm sure you can tell, I (and others) disagree. Had I known such discussions were happening, I'd have gotten involved sooner, but unfortunately, discussions like this never seem to escape to the rest of Mozilla until they're nearing the end of the design phase (or else I'm just really out of the loop, which is entirely possible).
>
> I think some of the issue is that the original blog post is rather short on details (e.g. what exactly the requirements are, what proposals were rejected, etc). This makes it harder for anyone to submit counter-proposals, because now they have to read the entire archives or risk proposing something that doesn't meet the requirements, and it also serves to make people even more upset, because they won't be aware of what all the requirements are. Perhaps there really is a reason that no other solution on earth would work, but I can't tell that from the blog post.

Yes, most of this should be covered in the followup. This was a big
announcement and we wanted to keep it focused on what it means for
developers and Firefox users. The followup will have more details and
should hopefully make it easier to discuss this on a technical level.

Robert Kaiser

Feb 27, 2015, 3:55:14 PM2/27/15
to mozilla-addons-...@lists.mozilla.org
Jorge Villalobos schrieb:
> 2) Even if the malware runs with admin access, there are some things we
> can do about it

I think that's a major point of disagreement here. We may be able to do a few things to push greyware/adware authors further into malware territory, but IMHO, if they run with admin access, we have pretty much lost already. They can hook into our process and manipulate it in whatever fashion they want, they can run keyloggers, they can emulate mouse and keyboard actions. I do not think it's feasible to fight against that.

KaiRo

Robert Kaiser

Feb 27, 2015, 3:59:03 PM2/27/15
to mozilla-addons-...@lists.mozilla.org
The Wanderer schrieb:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA512
>
> On 02/18/2015 at 05:46 PM, Robert Kaiser wrote:
>
>> IMHO, that just shows that the proposed solution isn't complete
>> yet, as we still need to find that way of how a user can override
>> safely and keep control of their experience and their browser
>> without the attacker being able to override. There must be a good
>> solution for that.
>
> There is no such solution - or rather, such a solution would require
> "strong AI", in order to distinguish attacker from user. (And even then,
> there's the question of whether to trust that AI.) Anything which a user
> can do, a malicious attacker which has gained sufficient control over
> the rest of the system (or tricked the user into thinking it's
> non-malicious) can also do.

If they have "gained sufficient control over the rest of the system", then I think it's not reasonable to even think we can fight them.

> The question therefore becomes which is more important: enabling the
> user to do what they want, or preventing the attacker from doing what
> they want.
>
> Many people (including me) say that it's the former. This proposal, and
> some of the ways in which it is being defended, would appear to say that
> it's the latter.

https://www.mozilla.org/en-US/about/manifesto/

"05: Individuals must have the ability to shape the Internet and their own experiences on it."

That says enough for me about where Mozilla *has* to stand, or it's not true to its own principles.

KaiRo

Alex

Jun 3, 2015, 12:05:42 PM6/3/15
to mozilla-addons-...@lists.mozilla.org
On Wednesday, 18 February 2015 15:25:02 UTC-5, Jim Porter wrote:
> To be clear: I don't think anyone is arguing against requiring add-ons to be signed by default; they're only arguing against requiring users to download a different build to run unsigned add-ons.
>
> These proposals seem to be assuming that malware authors don't have access to Firefox's Program Files folder (or similar for other platforms, but let's focus on Windows since that's where the userbase is), or at least that we have some recourse if malware authors just patched the Firefox binary. Doesn't this imply that *any* solution that requires write access to Firefox's Program Files folder would be equally good?
>
> For instance, if we took something like Jeff Lyon's proposal of generating a crypto token from a Mozilla website, but required users to personally place the token in Firefox's Program Files folder, doesn't this provide as much security as downloading a new build? In either case, Windows' UAC will prevent this activity from being automated. In effect, by adding the token, you're installing a new "build" of Firefox, but with the benefit that you're still on the same version as before. This way, users/developers could still use the release channel as needed and be able to run unsigned code, which I see as a hard requirement for add-on development.

This is an excellent suggestion that should be addressed.
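[The token variant quoted above, and Jeff Lyon's original proposal at the top of the thread, can be sketched as follows. This is illustrative only: Mozilla's asymmetric signature is modeled by an HMAC under a placeholder key so the sketch stays self-contained, whereas a real implementation would sign with Mozilla's private key server-side and verify against a public key shipped in Firefox. All names here are invented for illustration.]

```python
import hashlib
import hmac
import platform

# Stand-in for Mozilla's signing key; in the real proposal this would be a
# private key held server-side, with only the public half in the browser.
MOZILLA_KEY = b"placeholder-signing-key"


def environment_checksum() -> bytes:
    """Fingerprint of the user's system environment (step 4 of the proposal)."""
    blob = "|".join([platform.system(), platform.machine(), platform.node()])
    return hashlib.sha256(blob.encode()).digest()


def issue_token(checksum: bytes) -> bytes:
    """Server side: sign the checksum after the human test (steps 2-4)."""
    return hmac.new(MOZILLA_KEY, checksum, hashlib.sha256).digest()


def token_valid(token: bytes) -> bool:
    """Client side: on each unsigned-add-on install, check that the token
    is a valid signature over *this* machine's checksum (step 5)."""
    expected = issue_token(environment_checksum())
    return hmac.compare_digest(token, expected)
```

Because the token is bound to one machine's fingerprint, a valid token copied by malware from another system would fail verification; and per Jim's variant, requiring the token file to be placed in the program folder would additionally put it behind UAC.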