
Enabling certified app debugging on production phones.


Paul Theriault

Sep 8, 2014, 5:20:02 AM
to Moz dev-b2g
Currently, in order to debug certified apps (i.e., Gaia apps) you need a rooted phone, so that you can set the "devtools.debugger.forbid-certified-apps" preference to false. Having this preference set to true is required on production phones, since setting it to false allows basically root-level access through the remote debugger. But this leaves the strange situation where you can install certified apps but can't debug them, which isn't particularly useful, and it also means a large attack surface for an attacker with physical access.
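
For context, this is roughly what the rooted-phone workaround looks like today, sketched as a small Python wrapper around adb. It is a sketch only: it assumes an engineering/rooted build, adb on PATH, and the usual B2G profile path, which can vary by device.

    # Sketch of the current rooted-device workaround: flip the Gecko pref by
    # appending to prefs.js in the B2G profile while b2g is stopped, so the
    # file isn't rewritten on exit.
    import subprocess

    def adb(*args):
        subprocess.check_call(["adb"] + list(args))

    def enable_certified_app_debugging():
        adb("root")                  # only succeeds on rooted/engineering builds
        adb("shell", "stop", "b2g")  # stop Gecko so prefs.js isn't overwritten
        adb("shell",
            "echo 'user_pref(\"devtools.debugger.forbid-certified-apps\", false);'"
            " >> /data/b2g/mozilla/*.default/prefs.js")
        adb("shell", "start", "b2g")

    if __name__ == "__main__":
        enable_certified_app_debugging()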

The challenge we had when talking through this situation previously was that it's difficult to distinguish between the device's owner and someone who has just found the phone and wants to take advantage of developer mode to compromise the phone and/or its data.

My team has been working on a proposal to remedy this situation (a rough sketch of the resulting rules follows the list below):
- Introduce an “os-developer” mode
- Provide a way in FTU to have the user choose a lockscreen pass code (not necessarily enabled, just chosen)
- Add UI into developer settings to enable os-developer mode, which requires the user to enter their passcode
- When enabled, this mode allows installing and debugging certified apps. When disabled, certified app installation & debugging is forbidden.
- The user MUST set a lockscreen code during FTU for os-developer to be available. If they do not, os-developer mode is disabled, and it can only be enabled later by doing a factory reset and then redoing the FTU.
- Note that the user does not have to ENABLE the lockscreen during FTU; they just have to at least choose a passcode. But encouraging users to set a passcode comes with its own benefits.
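
For concreteness, here is a small model of those rules in Python. This is purely illustrative; the names are hypothetical and none of this is real Gecko/Gaia code.

    # Illustrative model of the proposed gating rules: certified-app
    # install/debug is only allowed while os-developer mode is on, and the
    # mode can only be turned on if a passcode was chosen at FTU and is
    # re-entered in the developer settings.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class DeviceState:
        passcode_chosen_at_ftu: bool        # passcode chosen during FTU (lockscreen may be off)
        passcode: Optional[str] = None      # the chosen passcode, if any
        os_developer_mode: bool = False     # the proposed "os-developer" flag

    def enable_os_developer_mode(state: DeviceState, entered_code: str) -> bool:
        """Settings flow: requires re-entering the passcode chosen at FTU."""
        if not state.passcode_chosen_at_ftu:
            # Per the proposal, only a factory reset + redoing FTU can fix this.
            return False
        if entered_code != state.passcode:
            return False
        state.os_developer_mode = True
        return True

    def certified_install_and_debug_allowed(state: DeviceState) -> bool:
        return state.os_developer_mode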

Pros:
- Provides a way to enable development of certified apps & Gaia hacking on production, unrooted phones while protecting the user's data

Cons:
- A user must set a passcode at FTU (and remember it!), or else they won't be able to use this mode without a factory reset
- In the past there has been pushback on having passcode selection in FTU

There are a lot of other details and considerations, but I'll keep it short(er) for now to start the discussion. Does anyone think this is a useful change, or is there a better way to enable certified app debugging whilst protecting user data? If you are interested, there is a more detailed proposal here: [1]

Thoughts & suggestions welcome.

- Paul

[1] https://docs.google.com/a/mozilla.com/document/d/11Q1_fj2nKciVyG2PdGH_LuiH09BhZJKzFjBuIcaHKqs/edit#


Jan Jongboom

Sep 9, 2014, 8:16:33 AM
to mozilla...@lists.mozilla.org
Wow, interesting catch. I always assumed that it was not possible to install certified apps on a non-rooted phone. So yeah, any way we can make this possible on non-rooted phones will get applause.

A PIN code sounds like a proper way of enabling this on consumer phones.

Kartikaya Gupta

Sep 9, 2014, 9:00:12 AM
to Paul Theriault
On 8/9/2014, 5:20, Paul Theriault wrote:
> The challenge we had when talking through this situation previously was that its difficult to distinguish between the device's owner & someone who has just found your phone, and wants to take advantage of developer mode to compromise your phone and/or data.

Thanks for pointing this out, as it is an important distinction that is
at the heart of the problem.

> Cons:
> - A user must set passcode at FTU (and remember it!), else they wont be able to use this mode without a factory reset

When they do a factory reset, is there a mechanism available for them to
back up and restore their data? (I admit I'm unfamiliar with what the
average user would use for this; a quick search online seems to
indicate you have to use adb to do it.) If there is a mechanism, what
prevents the "malicious person who just found your phone" from doing
this data backup and stealing your data? Is this somehow a less-bad
scenario than the malicious person being able to enable os-developer mode?

I just worry that forcing a factory reset in this scenario is going to
place a big barrier in the way of our users organically growing from
"users" to "webmakers". That is, they will find it much harder to learn
and hack their phones in ways that we should be actively encouraging.

Seeing as the heart of the problem is distinguishing the device owner
and Mr. Malicious, perhaps we could ask for some piece of information
the device owner is much more likely to have. The SIM PIN might be such
a thing, or maybe some other unique identifier that comes with the phone
but isn't physically present or accessible on the handset itself.

Cheers,
kats

Stéphanie Ouillon

Sep 9, 2014, 9:53:11 AM
to dev...@lists.mozilla.org
Hi,
On 09/09/2014 15:00, Kartikaya Gupta wrote:
> On 8/9/2014, 5:20, Paul Theriault wrote:
>> The challenge we had when talking through this situation previously
>> was that its difficult to distinguish between the device's owner &
>> someone who has just found your phone, and wants to take advantage of
>> developer mode to compromise your phone and/or data.
>
> Thanks for pointing this out, as it is an important distinction that is
> the heart of the problem.
>
>> Cons:
>> - A user must set passcode at FTU (and remember it!), else they wont
>> be able to use this mode without a factory reset
>
> When they do a factory reset, is there a mechanism available for them to
> backup and restore their data? (I admit I'm unfamiliar with what the
> average user would use for this - a quick search online seems to
> indicate you have to use adb to do this). If there is a mechanism, what
> prevents the "malicious person who just found your phone" from doing
> this data backup and stealing your data? Is this somehow a less-bad
> scenario than the malicious person being able to enable os-developer mode?

Definitely not, since what we want to achieve ultimately is protecting
the user's data. But I don't know the details of the possible solutions
for the backup and restore mechanism, so I'll let better informed people
answer this.

>
> I just worry that forcing a factory reset in this scenario is going to
> place a big barrier to allowing our users to organically grow from
> "users" to "webmaker". That is, they will find it much harder to learn
> and hack their phones in ways that we should be should be actively
> encouraging.
>

This 'os-developer' mode is meant for people who want to write and debug
certified apps. This factory reset scenario won't impact web app
developers (privileged, web). Are would-be Gaia developers the target
you're concerned about?


> Seeing as the heart of the problem is distinguishing the device owner
> and Mr. Malicious, perhaps we could ask for some piece of information
> the device owner is much more likely to have. The SIM PIN might be such
> a thing, or maybe some other unique identifier that comes with the phone
> but isn't physically present or accessible on the handset itself.

Since the SIM can be removed and replaced by the attacker's SIM, it
doesn't look like the right candidate. That's why we're considering the
device PIN code instead.
The issue we're hitting is always the same: how to make sure it's the
actual owner of the device who _first_ initializes the authentication
service (setting a PIN code, synchronizing to a backup service, etc.)
while protecting the data. Hence the factory-reset solution...


Stéphanie

Dale Harvey

Sep 9, 2014, 10:25:28 AM
to Stéphanie Ouillon, dev-b2g
I am likely ignorant about the reasoning behind some of the security decisions made for our device; however, I have been frustrated by them. In previous lives doing Android and iOS development, I have found FxOS about as frustrating as iOS in various areas, and Android comparatively a huge amount easier.

It's possible I am entirely off base and there are reasons we can't make life this easy, but at least having them explained would maybe help ease the frustration.

With Android development, I get my device (user build), enable the developer menu (it used to be tapping a button 7 times; now it's shaking it, I think), and turn on debugging. When my phone connects to my computer, I accept a prompt saying that computer is allowed to access my device. At that point I have unfettered access, and adb continues to work whether my screen is switched off, the phone is restarted, or it is in various other states in which we currently lose adb access.

If my phone has a PIN, then the only way to get adb access is to get access to my device unlocked; at that point all bets are off, which I think is entirely reasonable.

Even in development builds, but particularly with user builds, adb access to the device is extremely flaky, and I routinely have to go into fastboot mode to reflash my entire device. For example, if I push a syntax error in the System app in a user build, then adb will never be re-enabled.

Having to do a factory reset, or, as was mentioned in the Google doc, signing up for some Firefox online account, seems straight-up developer-hostile to me.


Stéphanie Ouillon

Sep 9, 2014, 11:28:13 AM
to Dale Harvey, dev-b2g
On 09/09/2014 16:25, Dale Harvey wrote:

> Even in development builds but particularly with user builds adb access
> to the device is extremely flakey and I routinely have to go into
> fastboot mode to reflash my entire device, like if I push a syntax error
> in the system app in a user build then adb will never be reenabled.
>


Sounds like an issue unrelated to the security decisions, but I see your
point. To understand the UX flow a bit better: if you flash your phone,
then either

1) you have a standard build, and you have to go through the FTU. You
can then enable the 'os-developer' mode without having to add too many
steps (well, this is a matter of discussion, of course). And since I'm
not sure it was stated clearly enough in the first email: you would only
need to go through this procedure once, provided you remember the PIN
code you set the first time.

2) or you flash your custom build, in which case maybe it would be
possible to set a pref to enable os-developer mode by default if you
have already configured it to skip the FTU, activate adb, etc.?


> Having to do a factory reset, or as was mentioned in the google doc sign
> up to some firefox online account seems straight up developer hostile to me.
>

The thing is, if the use case is a developer who is often flashing their
device for testing or development, then the security risk related to
(security-sensitive) data loss is pretty low, imho.
But if you're an attacker, or an ordinary user deciding to switch your
everyday phone into os-developer mode (which currently is done
through the same routine of stopping/starting b2g via adb and setting a
pref, assuming you have a rooted device), then it's another story.


As a side note, for people debugging with the help of the App
Manager/WebIDE, the devtools team is working on implementing remote
debugging over Wi-Fi (without using adb at all) [1] .

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=962308

Fabrice Desré

Sep 9, 2014, 12:00:56 PM
to dev...@lists.mozilla.org
On 09/09/2014 06:53 AM, Stéphanie Ouillon wrote:

> Definitely not, since what we want to achieve ultimately is protecting
> the user's data. But I don't know the details of the possible solutions
> for the backup and restore mechanism, so I'll let better informed people
> answer this.

Right now we don't have any good solution. We already discussed possible
use cases/APIs on this mailing list, and maybe we'll have time to
implement something for the next release!

Fabrice
--
Fabrice Desré
b2g team
Mozilla Corporation

Kartikaya Gupta

Sep 9, 2014, 1:32:49 PM
to Stéphanie Ouillon
On 9/9/2014, 9:53, Stéphanie Ouillon wrote:
>> I just worry that forcing a factory reset in this scenario is going to
>> place a big barrier to allowing our users to organically grow from
>> "users" to "webmaker". That is, they will find it much harder to learn
>> and hack their phones in ways that we should be should be actively
>> encouraging.
>>
>
> This 'os-developer' mode is meant for people who want to write and debug
> certified apps. This factory reset scenario won't impact web app
> developers (privileged, web). Are would-be Gaia developers the target
> you're concerned about?
>

I'm concerned about all users. It may seem like the number of users who
would want to debug certified apps is small, but consider that many
developers start because of the "scratch your own itch" paradigm - that
is, many developers start digging, debugging and hacking because there's
some deficiency in the app that bothers them and that they actually want
to fix. Given that most of their interaction will be with the core
built-in apps, which are mostly (entirely?) certified apps, it makes
sense that most of the itches they will want to scratch will be in this
category.

> Since the SIM can be removed and replaced by the attacker's SIM, it
> doesn't look like a right candidate. That's why we consider the device
> PIN code instead.

Good point, the SIM is probably not the right thing then. I don't have
any better ideas :(

kats

Kartikaya Gupta

Sep 9, 2014, 1:34:29 PM
to Fabrice Desré
On 9/9/2014, 12:00, Fabrice Desré wrote:
> On 09/09/2014 06:53 AM, Stéphanie Ouillon wrote:
>
>> Definitely not, since what we want to achieve ultimately is protecting
>> the user's data. But I don't know the details of the possible solutions
>> for the backup and restore mechanism, so I'll let better informed people
>> answer this.
>
> Right now we don't have any good solution. We discussed possible use
> case/api on this mailing list already, and maybe we'll have time to
> implement for the next release!

Ok, it just feels premature to me to make a decision on requiring a
factory reset without having decided how the user will do a
backup/restore. Particularly if the backup/restore allows malicious
people to grab the user data anyway.

kats

Jared Hirsch

Sep 9, 2014, 4:05:46 PM
to Jan Jongboom, mozilla...@lists.mozilla.org
Hi Paul,

Nice work on the proposal! I would love to see us lower the barrier to hacking on Gaia; I have some feedback below.

BTW, I work on Gaia stuff for Cloud Services; this includes Firefox Accounts, Find My Device, and prototyping work for backup/restore (though it seems other people are working on this independently, too). I'm very happy to discuss user/device security and user identity any time. I'm usually in #gaia during Pacific business hours.

On Sep 9, 2014, at 5:16 AM, Jan Jongboom <janjo...@gmail.com> wrote:

> On Monday, September 8, 2014 11:20:02 AM UTC+2, Paul Theriault wrote:
>>
>> The challenge we had when talking through this situation previously was that its difficult to distinguish between the device's owner & someone who has just found your phone, and wants to take advantage of developer mode to compromise your phone and/or data.

Find My Device allows users to remotely lock or wipe a lost device. It shipped in 2.0.

It seems to me that FMD takes care of this particular threat (malicious person compromises lost device).

So, maybe the user doesn't need to prove device ownership before enabling certified debugging?

>>
>>
>>
>> My team has been working on a proposal to remedy this situation:
>>
>> - Introduce an "os-developer" mode
>>
>> - Provide a way in FTU to have the user choose a lockscreen pass code (not necessarily enabled, just chosen)
>>
>> - Add UI into developer settings to enable os-developer mode, which requires the user to enter their passcode
>>
>> - When enabled, this mode allows installing and debugging certified apps. When disabled, certified app installation & debugging is forbidden.
>>
>> - The user MUST set a lockscreen code during FTU for os-developer to be available. If they do not, os-developer mode is disabled, and can only be re-enabled through the process of a factory-reset then redoing FTU.
>>
>> - Note that the user do not have to ENABLE the lockscreen during FTU, they just have to at least choose a passcode. But encouraging users to set a passcode comes with its own benefits.

The "developer PIN" concept and UX seem quite complex.

What if we just add an "enable certified app debugging" checkbox to the developer menu?

The two goals in the linked Google doc are (1) managing the security risk from lost devices (see my FMD comments above), and (2) giving users full access if they want to hack on Gaia. I think my counterproposal here achieves (2) with a lot less work and a lower barrier to user experimentation (no passcode, no need to factory reset if you didn't set a flag during FTU).

Cheers,

Jared



Jan Jongboom

Sep 9, 2014, 4:31:46 PM
to Jared Hirsch, mozilla...@lists.mozilla.org
Well, you need to enforce a PIN, because otherwise everyone who finds your phone can grab all the data; or you would have to wipe it whenever someone enables that menu, but you don't want that either, I'd say.


Jared Hirsch

Sep 9, 2014, 4:49:29 PM
to Jan Jongboom, mozilla...@lists.mozilla.org
On Sep 9, 2014, at 1:31 PM, Jan Jongboom <janjo...@gmail.com> wrote:

> Well you need to enforce PIN because otherwise everyone who finds your phone can grab all the data, or you should wipe it out whenever someone enables that menu but you don't want that either I'd say.

Actually, I think we're fine here - I don't see how this differs from the threat mentioned already.

If an attacker finds your phone, then presumably you have lost it. In that case, you can use FMD to lock or wipe it remotely.

I'm specifically suggesting that we don't need the "developer PIN", but maybe you're referring to the lockscreen PIN? I definitely do think the users should have their lockscreen PIN enabled ^_^


Paul Theriault

Sep 9, 2014, 11:14:18 PM
to Dale Harvey, Stéphanie Ouillon, dev-b2g
On 10 Sep 2014, at 12:25 am, Dale Harvey <da...@arandomurl.com> wrote:

> I am likely ignorant about the reasoning behind some of the security decisions made for our device; however, I have been frustrated by them. In previous lives doing Android and iOS development, I have found FxOS about as frustrating as iOS in various areas, and Android comparatively a huge amount easier.
>
> It's possible I am entirely off base and there are reasons we can't make life this easy, but at least having them explained would maybe help ease the frustration.
>
> With Android development, I get my device (user build), enable the developer menu (it used to be tapping a button 7 times; now it's shaking it, I think), and turn on debugging. When my phone connects to my computer, I accept a prompt saying that computer is allowed to access my device. At that point I have unfettered access, and adb continues to work whether my screen is switched off, the phone is restarted, or it is in various other states in which we currently lose adb access.

This is useful feedback, but I think adb access is separate from the discussion here. From a security perspective, the only reason to prevent new adb connections is when the device is locked with a passcode, so anything else we should consider a bug and fix. I think that would alleviate most of your pain points above.

But that is a little separate from what I was proposing.

What I am talking about here is access to debug Firefox OS itself, without requiring the phone to be rooted, but while also protecting user data.

To debug the main process or certified apps, you currently need root access, which means you have to obtain, and flash, a build with root enabled. I'm trying to improve this situation by providing a way to get this access, while also protecting the user data, which today is only protected by virtue of root access not being available.

> If my phone has a PIN, then the only way to get adb access is to get access to my device unlocked; at that point all bets are off, which I think is entirely reasonable.
>
> Even in development builds, but particularly with user builds, adb access to the device is extremely flaky, and I routinely have to go into fastboot mode to reflash my entire device. For example, if I push a syntax error in the System app in a user build, then adb will never be re-enabled.
>
> Having to do a factory reset, or, as was mentioned in the Google doc, signing up for some Firefox online account, seems straight-up developer-hostile to me.

I don't see why. Enabling this level of debugging is equivalent to rooting, and I'm struggling to see how providing a way for developers to have effectively root access to production phones is MORE hostile than the current situation, where it's not possible at all.

AFAIK Android does exactly the same thing, btw, with "fastboot oem unlock" when rooting.




Paul Theriault

Sep 9, 2014, 11:17:30 PM
to Jared Hirsch, mozilla...@lists.mozilla.org, Jan Jongboom

On 10 Sep 2014, at 6:49 am, Jared Hirsch <6a...@mozilla.com> wrote:

> On Sep 9, 2014, at 1:31 PM, Jan Jongboom <janjo...@gmail.com> wrote:
>
>> Well you need to enforce PIN because otherwise everyone who finds your phone can grab all the data, or you should wipe it out whenever someone enables that menu but you don't want that either I'd say.
>
> Actually, I think we're fine here - I don't see how this differs from the threat mentioned already.
>
> If an attacker finds your phone, then presumably you have lost it. In that case, you can use FMD to lock or wipe it remotely.

If you actually set up FMD. If it has battery. If it has network connectivity etc.
>
> I'm specifically suggesting that we don't need the "developer PIN", but maybe you're referring to the lockscreen PIN? I definitely do think the users should have their lockscreen PIN enabled ^_^

Yeah, I agree that is confusing in the doc: we talk about a developer PIN code which is kept in sync with the lockscreen passcode. Overly complex, I think. It should just BE the lockscreen passcode.


Andrew Sutherland

Sep 10, 2014, 1:30:49 AM
to dev...@lists.mozilla.org
This seems like a good idea, but I think the approach may not go far
enough. I have some suggestions.

I think there are a few scenarios that interact with the proposed
functionality:
1: Lost, locked device found by a nefarious person with no plans to
return it
2: Device in the possession of a nefarious person who intends a
persistent attack on the owner of the device.

1) In the forever-lost-to-evil case we want the attacker to not be able
to get at the data, so wiping the data as a prerequisite to gaining
root-ish access seems absolutely correct. And as Paul notes, it's quite
conceivable for even a limited-capability attacker to keep the device
out-of-touch via network, etc., making remote wipe insufficient on its
own. Of course, since the data is not encrypted on the flash, we will
lose to more capable attackers at this time.

2) In the root-and-return case we want the true owner of the device to
be able to know that their device has been tampered with. The problem
is that once a device has been rooted, the device can no longer be
trusted to indicate that it's been tampered with at this time.
Obviously if we do enough trusted-computing stuff we could get the boot
process to indicate rooting (a Firefox with a robber's mask!), but I
think we're pretty far from that right now.

Nuking the user's data is an excellent indicator of potential
tampering. But that's not reliably being proposed here. The current
proposal as I read it asks the user "want a lock screen code?" at
first-run but the actual decision is "want a lock screen code and for it
also to enable a super-powerful developer mode that could let someone
persistently pwn your device?"

It seems like it would be better to just ask the user outright whether
they'd like to enable super-dangerous debugging mode and how they'd like
to secure that debugging mode. For example, I would personally prefer
that the code that would let an attacker persistently root my device be
more sophisticated than the 4-digit code I'm potentially typing in front
of everyone all the time and smudging onto the screen with my
fingerprints. Or, since people hate FTU screens, make it an opt-in
under an "advanced..." or "developer..." call-out in the flow that will
catch the eye of developers/tinkerers but not infuriate most users.


There are other possible tamper indication options, and using those
could let the user upgrade to "super developer mode" without nuking all
their data. For example, if the device is bound to a Firefox account,
the Mozilla server could potentially generate a rooting-unlock
authorization for the device if-and-only-if the account has been bound
to the device for some number of days. The user hits the "hey, I wanna
be a super-fancy developer and do dangerous stuff" button. We set some
arbitrary delay on this, N hours:
- send out an email to the associated email address immediately, and
then randomly at some point in the next N/2 hours (the idea being not to
be predictable so if the attacker is able to use the email app to delete
the email they have to be at least somewhat competent rather than just
waiting exactly 2 hours).
- present a persistent notification in the tray "still want to root your
device?" for the duration of the N hours.
At the end of the time period the device gets unlocked and a persistent
note is made on the Firefox Account for the device (a rough sketch of
this flow is below).
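
To make the timing concrete, here's a rough sketch of that flow. All the names are hypothetical and no real FxA API is implied; it's just the scheduling logic.

    # Illustrative sketch of the server-side unlock flow described above.
    import random
    import time

    BIND_AGE_DAYS = 7          # assumed minimum account-to-device binding age
    UNLOCK_DELAY_HOURS = 24    # the "N hours" in the proposal

    def plan_unlock(bound_since, now=None):
        """Schedule a pending os-developer unlock request, or refuse it."""
        now = time.time() if now is None else now
        if now - bound_since < BIND_AGE_DAYS * 86400:
            raise PermissionError("account not bound to this device long enough")
        return {
            # warning email fires at an unpredictable point in the first N/2
            # hours, so an attacker reading the owner's mail can't just wait it out
            "email_at": now + random.uniform(0, UNLOCK_DELAY_HOURS * 3600 / 2),
            "unlock_at": now + UNLOCK_DELAY_HOURS * 3600,
        }

    def unlock_ready(plan, now=None):
        now = time.time() if now is None else now
        return now >= plan["unlock_at"]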

The general idea is that you have to lose control of your device for an
extended period of time and our web services infrastructure can help
provide notifications via other channels if we have them. (In an ideal
world everyone has both a Firefox OS phone on them and a Firefox OS
tablet at home, right?)

Honestly, the bang/buck effort seems way off for this compared to "opt
in to developer mode at first-run, potentially having to wipe your
device." And until we provide more support for layered
security/encryption, in many cases there isn't much of a point since the
weak 4-digit pass-code is all that's standing between the attacker and
the user's email account(s)/etc.

Also, many interesting permutations of this potentially want the
processor/chipset to have a non-extractable private crypto key that can
be used to prove the device is who it says it is. Various things using
serial numbers/MACs/etc. are too predictable or just accessible to
would-be attackers on the back of the box or inside the battery case. I
think many interesting server-assisted mechanisms depend on a
non-forge-able device id where the initial owner of the device can
reliably bind the device to some other authentication factors. (So it
becomes "*initial* possession is nine tenths of the law" rather than
just "possession".)

Andrew

Jan Jongboom

Sep 10, 2014, 3:46:48 AM
to Paul Theriault, Jared Hirsch, mozilla...@lists.mozilla.org
+1

Paul Theriault

Sep 10, 2014, 3:48:12 AM
to Andrew Sutherland, dev...@lists.mozilla.org

On 10 Sep 2014, at 3:30 pm, Andrew Sutherland <asuth...@asutherland.org> wrote:

> This seems like a good idea, but I think the approach may not go far enough. I have some suggestions.
>
> I think there are a few scenarios that interact with the proposed functionality:
> 1: Lost, locked device found by a nefarious person with no plans to return it
> 2: Device in the possession of a nefarious person who intends a persistent attack on the owner of the device.
>
> 1) In the forever-lost-to-evil case we want the attacker to not be able to get at the data, so wiping the data as a prerequisite to gaining root-ish access seems absolutely correct. And as Paul notes, it's quite conceivable for even a limited-capability attacker to keep the device out-of-touch via network, etc., making remote wipe insufficient on its own. Of course, since the data is not encrypted on the flash, we will lose to more capable attackers at this time.
>
> 2) In the root-and-return case we want the true owner of the device to be able to know that their device has been tampered with. The problem is that once a device has been rooted, the device can no longer be trusted to indicate that it's been tampered with at this time. Obviously if we do enough trusted-computing stuff we could get the boot process to indicate rooting (a Firefox with a robber's mask!), but I think we're pretty far from that right now.
>
> Nuking the user's data is an excellent indicator of potential tampering. But that's not reliably being proposed here. The current proposal as I read it asks the user "want a lock screen code?" at first-run but the actual decision is "want a lock screen code and for it also to enable a super-powerful developer mode that could let someone persistently pwn your device?"
>
> It seems like it would be better to just ask the user outright whether they'd like to enable super-dangerous debugging mode and how they'd like to secure that debugging mode. For example, I would personally prefer that the code that would let an attacker persistently root my device be more sophisticated than the 4-digit code I'm potentially typing in front of everyone all the time and smudging onto the screen with my fingerprints. Or, since people hate FTU screens, make it an opt-in under an "advanced..." or "developer..." call-out in the flow that will catch the eye of developers/tinkerers but not infuriate most users.

Both good points...

>
>
> There are other possible tamper indication options, and using those could let the user upgrade to "super developer mode" without nuking all their data. For example, if the device is bound to a Firefox account, the Mozilla server could potentially generate a rooting-unlock authorization for the device if-and-only-if the account has been bound to the device for some number of days. The user hits the "hey, I wanna be a super-fancy developer and do dangerous stuff" button. We set some arbitrary delay on this, N hours:
> - send out an email to the associated email address immediately, and then randomly at some point in the next N/2 hours (the idea being not to be predictable so if the attacker is able to use the email app to delete the email they have to be at least somewhat competent rather than just waiting exactly 2 hours).
> - present a persistent notification in the tray "still want to root your device?" for the duration of the N hours.
> At the end of the time period the device gets unlocked and a persistent note is made on the Firefox Account for the device.

This is something I considered, and we already have Firefox Account signup in the FTU (hence my comment in the Google doc about this). But unless you do this on the _actual_ first run (or after a factory reset), you have to wipe the data, right? Otherwise it's just the attacker setting up their own Firefox Account, not the device's actual owner. Or am I missing something?

But I do like the idea of tying this dangerous functionality to a Firefox Account instead of a PIN (hopefully the Firefox Account is more secure). I'm not sure developers will agree with having to have a Firefox Account, though.


> The general idea is that you have to lose control of your device for an extended period of time and our web services infrastructure can help provide notifications via other channels if we have them. (In an ideal world everyone has both a Firefox OS phone on them and a Firefox OS tablet at home, right?)
>
> Honestly, the bang/buck effort seems way off for this compared to "opt in to developer mode at first-run, potentially having to wipe your device." And until we provide more support for layered security/encryption, in many cases there isn't much of a point since the weak 4-digit pass-code is all that's standing between the attacker and the user's email account(s)/etc.
>
> Also, many interesting permutations of this potentially want the processor/chipset to have a non-extractable private crypto key that can be used to prove the device is who it says it is. Various things using serial numbers/MACs/etc. are too predictable or just accessible to would-be attackers on the back of the box or inside the battery case. I think many interesting server-assisted mechanisms depend on a non-forge-able device id where the initial owner of the device can reliably bind the device to some other authentication factors. (So it becomes "*initial* possession is nine tenths of the law" rather than just "possession”.)

Other options for user authentication I had been thinking about were:

- Pairing the phone with the computer it is going to be plugged into, maybe via adb (perhaps using bug 842747) or Wi-Fi (with the upcoming Wi-Fi debugging)
- Shipping phones with a “developer NFC sticker”: basically an NFC tag which is proof of ownership (only works for NFC devices, obviously)
- Pairing the phone with a computer via Bluetooth during FTU; access to developer options later requires you to pair with the computer again


>
> Andrew

Paul Theriault

Sep 10, 2014, 3:57:12 AM
to Andrew Sutherland, dev...@lists.mozilla.org

> Other options for user authentication I had been thinking about were:
>
> - Pairing the phone with the computer it is going to be plugged into, maybe via adb (perhaps using bug 842747) or Wi-Fi (with the upcoming Wi-Fi debugging)
> - Shipping phones with a “developer NFC sticker”: basically an NFC tag which is proof of ownership (only works for NFC devices, obviously)
> - Pairing the phone with a computer via Bluetooth during FTU; access to developer options later requires you to pair with the computer again


Which are all, on second thoughts, just overly complex ways of setting up a password to access developer options at first use...



Andrew Sutherland

Sep 10, 2014, 4:36:38 AM
to dev...@lists.mozilla.org
Well, passwords do suck, though. And so does manually needing to turn os-debugging mode on/off, or dealing with that frustrating 12-hour adb kill-timer. All of the options you list above sound like great ways to help me make sure that my phone is only in development mode when I'm at home at my dev machine, but is safe out in the world.

Andrew

Stephanie Ouillon

Sep 10, 2014, 5:16:56 AM
to Andrew Sutherland, dev...@lists.mozilla.org
>
> Other options for user authentication I had been thinking about were:
>
> - pairing the phone with the computer it is going to be plugged into - maybe
> via adb (maybe by use of 842747) or wifi (with upcoming wifi debugging)
> - Ship phones with “developer NFC sticker” - basically an NFC tag which is
> proof of ownership (only works for NFC devices obviously)
> - Pair the phone with a computer via bluetooth during FTU. Access to
> developer options later requires you to pair to the computer again
>

I don't see how 1) solves the case where a user has never paired his/her device with
a computer and the attacker does it. Or do you mean doing that during the FTU?
3) solves this issue, but the drawback is that you don't necessarily have a computer
nearby when you first start your phone.

For 1) and 3), assuming they solve the issue of an attacker being able
to pair the stolen phone to his/her own computer first, what happens if you
want to help somebody debug his/her phone, or use a device that you don't
necessarily own (I'm thinking of the context of a debugging session,
a workshop, a hackathon, a class...)?
Maybe an option you could set (while the phone is connected to the
legitimately paired computer), such as "enable pairing with one more device",
would solve that.



Paul Theriault

Sep 10, 2014, 6:39:41 AM
to Stéphanie Ouillon, Andrew Sutherland, dev...@lists.mozilla.org

On 10 Sep 2014, at 7:16 pm, Stephanie Ouillon <stepho...@mozilla.com> wrote:

>>
>> Other options for user authentication I had been thinking about were:
>>
>> - pairing the phone with the computer it is going to be plugged into - maybe
>> via adb (maybe by use of 842747) or wifi (with upcoming wifi debugging)
>> - Ship phones with “developer NFC sticker” - basically an NFC tag which is
>> proof of ownership (only works for NFC devices obviously)
>> - Pair the phone with a computer via bluetooth during FTU. Access to
>> developer options later requires you to pair to the computer again
>>
>
> I don't see how 1) solves the case when a user has never paired his/her device with
> a computer and the attacker does it. Or do you mean doing that during FTU?

Yeah, I meant doing all of these during the FTU, as an alternative to a PIN code/password.


Kartikaya Gupta

Sep 10, 2014, 10:30:06 AM
to Paul Theriault, Andrew Sutherland
On 10/9/2014, 3:48, Paul Theriault wrote:
> Other options for user authentication I had been thinking about were:
>
> - pairing the phone with the computer it is going to be plugged into - maybe via adb (maybe by use of 842747) or wifi (with upcoming wifi debugging)
> - Ship phones with “developer NFC sticker” - basically an NFC tag which is proof of ownership (only works for NFC devices obviously)
> - Pair the phone with a computer via bluetooth during FTU. Access to developer options later requires you to pair to the computer again

Of these options I dislike 1 and 3 because (a) you may not have a
computer handy at the time and (b) you may have a different computer
later when you actually want to enable the os-developer mode. I like
option (2) more. In the case of non-NFC devices (or even for NFC
devices) you could just ship phones with a separate unique PIN code to
activate developer mode. If the users lose this they can recover it by
calling their carrier and authenticating themselves via the regular
carrier authentication channel. Users who buy the phone without a
carrier plan are likely to be developers or savvy enough to realize they
should hang on to the code.

kats

Jared Hirsch

Sep 10, 2014, 2:05:03 PM
to Paul Theriault, mozilla...@lists.mozilla.org, Jan Jongboom

On Sep 9, 2014, at 8:17 PM, Paul Theriault <pther...@mozilla.com> wrote:

>>
>> I'm specifically suggesting that we don't need the "developer PIN", but maybe you're referring to the lockscreen PIN? I definitely do think the users should have their lockscreen PIN enabled ^_^
>
> Yeh I agree that is confusing in the doc - we talk about a developer pin code which is kept in sync with the lockscreen passcode. Overly complex i think. It should just BE the lockscreen passcode.

Great! Yeah, I found it hard to understand when lockscreen passcode changes would/wouldn't also change the developer pin.

In that case, do you still think it's necessary to require the passcode be set during FTU?

Stephanie Ouillon

Sep 10, 2014, 2:16:12 PM
to Jared Hirsch, mozilla...@lists.mozilla.org, Paul Theriault, Jan Jongboom
Yes, because a user is not forced to set a PIN code for the lockscreen.
If it is not set, an attacker could just go into the settings, set his
own PIN code, and enable debugging.