
Can we deprecate packaged apps?


Ben Francis

Jul 8, 2013, 5:31:49 PM
to dev-b2g, dev-w...@lists.mozilla.org
Sorry for the typo in the subject line. It wasn't an attempt at a clever
pun...

Hello all,

Apologies for the length of this email, I didn't have time to write a
shorter one.

A year ago Jonas sent out an email [1] outlining the requirements for a new
breed of trusted web apps. He explained some of the challenges of
fulfilling these requirements with hosted apps and set out the rationale
for starting with a packaged app solution, with a view to exploring
something more "webby" later.

Now that we have shipped v1 of Firefox OS [2] with a packaged solution for
trusted apps I would like to re-open this discussion and get your feedback
on whether and how we might make trusted apps more web-like.

In order to gain access to many of the new APIs we have created in Firefox
OS, web content creators must change their entire distribution model and
package the assets of their app into a zip file to be signed and served
from one or more app stores. These assets do not have their own URIs on the
Internet and are served over a local app:// protocol instead of HTTP. This
is similar to how native apps on other mobile platforms work, and is also
similar to the packaged apps used in Chrome & Chrome OS, as well as W3C
widgets. However, it isn't much like how the web works. There is no one
definitive version of the app at a single URI and the update process is
very different.

The System Applications Working Group [3] at the W3C have actually started
some early drafts of specifications to standardise an app manifest/package
format and the app:// URI scheme, based largely on the work done by
Mozilla. Meanwhile at Google I/O, Google showed a version of the Chrome Web
Store [4] where hosted apps are re-branded simply as "web sites", with the
term "app" being limited to packaged apps using their own .crx packaging
format. I have also heard suggestions that support for hosted apps may at
some point be deprecated in Chrome altogether, in favour of supporting only
packaged apps. The message coming from both Mozilla and Google right now is
that all trusted apps must be packaged; hosted web apps simply cannot be
trusted with access to new privileged APIs.

What's sad about this vision of the future is that many of the most
interesting apps that get written using web technologies like HTML, CSS and
JavaScript will not actually be part of the web. As Tim Berners-Lee
recently put it in an interview with the BBC about native apps [5], when
apps and their content don't have a URI on the Internet they are not part
of the "discourse" of the web and are therefore non-web. This was a topic
discussed at the "Meet the TAG" event hosted by Mozilla in London recently,
with members of the W3C's Technical Architecture Group expressing
disappointment in this trend.

Are we happy with a packaged model for trusted apps going forward, or is
now the time to embrace the challenge of making trusted apps genuinely part
of the web? Perhaps we don't even need to restrict our thinking to the
"app" and "app store" model and can explore ways of exposing more
privileged APIs to all web content in a trusted way.

If you're interested in the nitty gritty of this problem, I've tried to
summarise Jonas' original email below. I hope he forgives me if I
mis-represent him in any way, but you can read his original email in the
archives [1].

...

In his email, Jonas proposed the following requirements for trusted apps:
1. The ability for a trusted party to review an app and indicate some level
of trust in the app (or potentially in the app developer).
2. A mechanism for signing an app to verify that the app actually contains
the content that was reviewed.
3. Use of a minimum CSP policy for all pages of an app to ensure only the
reviewed code runs.
4. A separate data jar for local data to ensure a compromised web site can
not write to the local data of an app to alter the way it behaves.
5. A separate origin for the resources of an app so that the app can not be
tricked into running un-reviewed code from the same origin with escalated
privileges.
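
To make requirement 3 concrete, the default CSP that Firefox OS applies to
privileged apps looks roughly like this (quoted from memory, so treat the
exact directives as approximate):

```
Content-Security-Policy: default-src *; script-src 'self';
                         object-src 'none'; style-src 'self' 'unsafe-inline'
```

The effect is that neither inline scripts nor remote scripts can run, so only
the reviewed JavaScript shipped in the package executes.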

Jonas explained that the initial intention was to host trusted apps in the
same way as non-trusted apps, to retrieve the signatures for reviewed files
from an app store, but the files themselves directly from the app's own web
server.

He explained some problems with this approach:
a) HTTPS must be used to ensure proxies don't modify the headers or body of
HTTP responses, invalidating the signature. This would create an overhead
for app developers.
b) If multiple stores host signatures for the same app but review the app
at different speeds, updating of the app resources and signatures must be
synchronised between different stores and be limited to the speed of the
slowest review.
c) Signed resources have to be static because if a resource is dynamically
generated by a server side script, the signature would also need to be
dynamically generated, requiring a private key to be stored on the same
server, which largely defeats the object of the signing system.
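
To illustrate problem (c), here is a hedged sketch (in Python, with invented
function names, not the actual Marketplace code) of why only static files can
be covered by review signatures:

```python
import hashlib

# Hypothetical store-signing flow: the store records a digest per reviewed
# static file, and the client checks each file against those digests before
# running it. (A real store would additionally sign this digest table with
# its private key.)

def sign_resources(files):
    """Store side: one digest per reviewed file."""
    return {path: hashlib.sha256(body).hexdigest()
            for path, body in files.items()}

def verify_resource(signatures, path, body):
    """Client side: a resource only validates if it is byte-for-byte
    identical to what was reviewed -- which is why dynamically generated
    responses (problem c) cannot be covered by the signature."""
    return signatures.get(path) == hashlib.sha256(body).hexdigest()

reviewed = {"index.html": b"<h1>hi</h1>", "app.js": b"console.log(1);"}
sigs = sign_resources(reviewed)
assert verify_resource(sigs, "app.js", b"console.log(1);")
assert not verify_resource(sigs, "index.html", b"<h1>hi, bob</h1>")
```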

It was argued that this would result in a system which, while hosted like
the rest of the web, is not very "webby" and is the worst of both worlds.

This led us to the conclusion that we should package up trusted apps and
serve their resources locally over a new app:// protocol.


My question is what might a hosted solution to this problem look like?

Ben

1. https://groups.google.com/forum/#!topic/mozilla.dev.webapps/hnCzm2RcX5o
2. https://wiki.mozilla.org/WebAPI
3. http://www.w3.org/2012/sysapps/
4. https://twitter.com/bfrancis/status/335728228897550336/photo/1
5. http://www.bbc.co.uk/iplayer/episode/b036zdhg/Click_06_07_2013/

Benjamin Smedberg

Jul 8, 2013, 8:36:17 PM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
On 7/8/2013 5:31 PM, Ben Francis wrote:
> What's sad about this vision of the future is that many of the most
> interesting apps that get written using web technologies like HTML, CSS and
> JavaScript will not actually be part of the web.
I don't think this is true or sad!

One of the fundamental strengths of the web is that you can download an
entire "page" and save it locally, and even view-source and hack it.

With the advent of online web apps, it has become much more difficult to
view-source the web, because dynamic loading and complex multi-page apps
often don't give you a way to download and view-source the entire app.
It also means that there is often no clear separation between the client
logic in the app and the server logic that most apps depend on.

Packaged apps are the most elegant "stupidly simple" solution to the
offline problem that continues to plague those who want unpackaged apps
to work as if they were downloadable entities.

Packaged apps are not a problem or something to be "sad" about, but
something to rejoice in. They are a way of empowering users. We should
be encouraging all app authors to use packaged apps, even if they don't
need any special permissions.

To solve the problems mentioned by TimBL, we should just make sure that
servers can serve up a packaged app at a URL, and clients can just use
it and choose to keep it on their homescreen or not, and their client
would keep using that URL to check for updates. (Ignoring for the moment
the special security requirements with high-privilege APIs, stores,
signing, and all that stuff). Then "app search" is again the same as
"web search".

--BDS

Mark Giffin

Jul 8, 2013, 11:33:53 PM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
I can't see this outside the UK, I'm in the US of A:

5. http://www.bbc.co.uk/iplayer/episode/b036zdhg/Click_06_07_2013/

Alternate available?

Mark

Matt Basta

Jul 8, 2013, 11:46:35 PM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
Without arguing in favor of or against packaged apps and to address the closing question of "what might a hosted solution to this problem look like?", I think the first and most important issue is this: developers need something better than appcache for offline support.

Without packaged apps, the only alternative is appcache, and appcache is bad. It's hard to test, it's buggy, it oftentimes (most of the time) doesn't work as expected, it doesn't work in a deterministic way, there are no dev tools for it in Firefox (!), until very (VERY) recently it wasn't possible to clear it in the FxOS Simulator, and it's a source of general confusion and dislike. If we were to imagine an app ecosystem without packaged apps, offline support would be the most lacking component.
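
For readers who haven't fought with it, a minimal appcache manifest looks
something like this (a sketch; the file names are made up). A notorious source
of the confusion Matt describes is that the cache only refreshes when this
manifest file itself changes byte-for-byte:

```
CACHE MANIFEST
# v12 -- bump this comment to force clients to re-download everything;
# forgetting to do so is a classic appcache bug.

CACHE:
index.html
app.js
style.css

NETWORK:
*

FALLBACK:
/ offline.html
```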

Tim Chien

Jul 9, 2013, 1:31:11 AM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
I want to echo Ben's opinion here. While some of the features are
irreplaceable for the time being (offline access, reviewability by the
marketplace, sandboxing of our experimental APIs, even portability,
etc.), packaged apps themselves are not free:

-- By distributing the app in zip packages and not simply as a URL,
the OS has to handle preloading/updating/deleting/etc.
-- While these were all implemented (in a rushed way for v1.0.1), many
proposals still await implementation, for example shared resource
packages and library packages. I strongly suspect that we will end up
re-implementing the Debian package system in Gecko if we go down this
path.
-- By moving away from a universal URL (ironically, that's what the U
stands for), places where a valid URL is required will fail, e.g. the
OAuth authorization flow.
-- Our data jar policy is not directly related to packaged apps, but
many of the surprises (for users and for developers) come from it too
(e.g. a user has to log in to Facebook at least 3 times in order to use
it in the Browser app, as a website on the home screen, and in the
Facebook app itself).

It would make sense for Mozilla to have a solid roadmap toward a
solution for the Open Web itself, instead of relying on packaged apps
and spending time solving their technical (and non-technical) problems.

Kan-Ru Chen (陳侃如)

Jul 9, 2013, 2:09:12 AM
to Ben Francis, dev-w...@lists.mozilla.org, dev...@lists.mozilla.org
Seconded.

We should move the web forward instead of restricting ourselves to the
packaged app approach.

Ben Francis <bfra...@mozilla.com> writes:

[...]
> Are we happy with a packaged model for trusted apps going forward, or is
> now the time to embrace the challenge of making trusted apps genuinely part
> of the web? Perhaps we don't even need to restrict our thinking to the
> "app" and "app store" model and can explore ways of exposing more
> privileged APIs to all web content in a trusted way.
>
> If you're interested in the nitty gritty of this problem, I've tried to
> summarise Jonas' original email below. I hope he forgives me if I
> mis-represent him in any way, but you can read his original email in the
> archives [1].
>
> ...
>
> In his email, Jonas proposed the following requirements for trusted apps:
> 1. The ability for a trusted party to review an app and indicate some level
> of trust in the app (or potentially in the app developer).
> 2. A mechanism for signing an app to verify that the app actually contains
> the content that was reviewed.

IMO these two are the biggest problems. The current trust model on the
web is to trust the author; how do we extend that to trust the content?

> 3. Use of a minimum CSP policy for all pages of an app to ensure only the
> reviewed code runs.
> 4. A separate data jar for local data to ensure a compromised web site can
> not write to the local data of an app to alter the way it behaves.
> 5. A separate origin for the resources of an app so that the app can not be
> tricked into running un-reviewed code from the same origin with escalated
> privileges.
>
> Jonas explained that the initial intention was to host trusted apps in the
> same way as non-trusted apps, to retrieve the signatures for reviewed files
> from an app store, but the files themselves directly from the app's own web
> server.
>
> He explained some problems with this approach:
> a) HTTPS must be used to ensure proxies don't modify the headers or body of
> HTTP responses, invalidating the signature. This would create an overhead
> for app developers.
> b) If multiple stores host signatures for the same app but review the app
> at different speeds, updating of the app resources and signatures must be
> synchronised between different stores and be limited to the speed of the
> slowest review.

Packaged apps could also suffer from slow review speed. The app has
to handle the co-existence of old and new app versions anyway. If the
app could use a versioned URL then the signing wouldn't have to be
synchronized between stores.

> c) Signed resources have to be static because if a resource is dynamically
> generated by a server side script, the signature would also need to be
> dynamically generated, requiring a private key to be stored on the same
> server, which largely defeats the object of the signing system.
>
> It was argued that this would result in a system which, while hosted like
> the rest of the web, is not very "webby" and is the worst of both worlds.
>
> This led to the conclusion for us to package up trusted apps and to serve
> their resources locally over a new app:// protocol.

Later the app:// protocol also served other purposes like faster app
loading.

Kanru

Brett Zamir

Jul 9, 2013, 5:38:27 AM
to Benjamin Smedberg, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
Hi,

Encouraging to see this discussion as my dissatisfaction with the hassle
of packaged apps led me to my however humble work on
https://github.com/brettz9/asyouwish/ . Replies below...

On 7/9/2013 8:36 AM, Benjamin Smedberg wrote:
> On 7/8/2013 5:31 PM, Ben Francis wrote:
>> What's sad about this vision of the future is that many of the most
>> interesting apps that get written using web technologies like HTML,
>> CSS and
>> JavaScript will not actually be part of the web.
> I don't think this is true or sad!
>
While I agree with some of the motivation behind your arguments (though
see below), I think the situation is indeed sad for a number of reasons.

I have personally witnessed several examples in my or others' proposals
to WhatWG which were rejected because the WhatWG apparently felt that
HTML beyond the Web was not their business or that a server environment
can be taken for granted. One example was the rejection (at least at the
time) of client-side includes, another was for a mechanism of
client-side entities, and another was the cold reassurance that the
adding of new globals like "Worker" (instead of scoping new APIs into
some namespace, as has been done in some cases by adding to say the
"navigator" global) was safe because Google's search engine was used to
check whether a variable had been used frequently or not.

There indeed appears to be even an open admission of failing to take
into account all environments of usage, though I would agree that that
itself does not necessitate moving away from non-web usages (and I would
hope the dismissive "online-only" attitude would change). However, I
think there are other compelling reasons for privileged online apps (if
I can extend "online" to also include the "file://" protocol along with
http/s), with the most important being the ease of publishing and
consumption of such apps.

> One of the fundamental strengths of the web is that you can download
> an entire "page" and save it locally, and even view-source and hack it.
>
> With the advent of online web apps, it has become much more difficult
> to view-source the web, because dynamic loading and complex multi-page
> apps often don't give you a way to download and view-source the entire
> app. It also means that there is often no clear separation between the
> client logic in the app and the server logic that most apps depend on.
>
First of all, there are tools to view the generated source or
potentially view generated code so at least client-side code can be
introspected (and generally without going through other additional hoops
as with packaged apps).

Yes, preventing online distribution of apps in favor of packaged ones
may COMPEL ALL people to distribute their code in entirety (or more
likely, go to non-web solutions), but why force this? It is one thing to
want applications written in HTML or JavaScript so that one can
introspect the code that is run in one's own machine in a familiar
language (though I'm not so sure force is the best solution even there).
But if you want to see server-side-dependent code become less opaque,
why not go with the following less universally compelling and cumbersome
solutions?

1) Evangelize licenses like the AGPL if you want to promote apps
including their server-side code.

Few developers choose this license from what I can tell, however, as we
ourselves don't want to be compelled so stringently--though if you do
send us code, many will indeed bristle against our own machines being
used as black boxes (as with non-standard plug-in code).

2) Bake more server-interactive functionality into client-side languages
like HTML/JavaScript in a transparent manner so that one would, for such
apps, be clear on what is (most likely) happening on the server without
needing to introspect its code even while leaving apps free to interact
with the server.***

> Packaged apps are the most elegant "stupidly simple" solution to the
> offline problem that continues to plague those who want unpackaged
> apps to work as if they were downloadable entities.

But many of us--as users or developers--don't want apps to work as
downloadable entities! It is a hassle to package apps, it slows
development time, it is annoying to users--including to users who wish
to inspect them (e.g., all the darn XPI and JAR files I have to unzip to
introspect some Firefox extensions). Moreover, packaging, if it needs to
exist at all, should be an implementation detail--if your browser really
feels the need to save space, go ahead and do the packaging for me, but
don't require me to zip things myself, learn build scripts, etc.

The web is great in large part because anyone can start working on it
today without going through unnecessary hoops. File packaging is a
hoop--and an annoying one given especially the frequency of use we're
describing here. I want to spend time programming, not zipping or
unzipping files. I can't tell you how much faster I have been able to
develop (despite a number of personal limitations) what I think are some
pretty cool proof-of-concept demos without the hassle of file packaging
by using my AsYouWish add-on which is my interim solution for such
transparent requests of privileges from regular websites (to the SDK
API): https://github.com/brettz9/asyouwish/tree/master/demos
(experimental Firefox addon needed,
https://addons.mozilla.org/en-US/firefox/addon/as-you-wish/ along with
add-on options configured to even accept privileged requests from a
given protocol/site)

> Packaged apps are not a problem or something to be "sad" about, but
> something to rejoice in. They are a way of empowering users. We should
> be encouraging all app authors to use packaged apps, even if they
> don't need any special permissions.
>
If the criterion is to ensure code is self-contained, then one could
achieve this by allowing web authors to send a flag in their code
indicating the program was wholly self-contained (kind of a reverse
appcache---cache everything by default).

But it is hardly inevitable or pleasurable that one should:
1) be forced to take the extra steps of packaging my own app for
distribution, unpackaging someone else's app to introspect or hack on
it, or dealing with a build system. I also want the Web to be equally
easy for newcomers whose ideas may otherwise remain undeveloped because
of these seemingly small barriers to entry.
2) be restricted from developing apps which have server interactions
where the interactivity is useful

I would hope that in the search for a non-packaged privileged access
solution, the (potentially cross-browser) proposed privileged API for
mobile and non-mobile, independent apps and extensions (of desktop or
mobile browsers or systems), could be harmonized with each other:
https://bugzilla.mozilla.org/show_bug.cgi?id=848647 and even with the
likes of Node.js: https://bugzilla.mozilla.org/show_bug.cgi?id=855936

Best wishes,
Brett

*** There is a lot of lip service given to building RESTful
applications, but the fact remains that RESTful APIs are not wholly
reusable; different sites use different query parameters for the same
purpose.

If, however, certain headers were used to make secure but otherwise
arbitrary XPath or CSS Selector queries against a document (e.g.,
somewhat along the lines of the Range header, but for HTML-aware
piecemeal delivery of documents instead of byte-wise access), besides
enabling privileged clients to treat the web, including otherwise static
files, more as its own XML database, and besides allowing one to get
genuine reusability across websites of web-friendly query APIs (and thus
better introspectability), HTML itself could prescribe usage of these
headers to tie in markup for server-side interaction, thereby also
minimizing server-side coding and need for 3rd-party library inclusion
(unless so desired by the server) while ensuring powerful querying
functionality is more available by default on even simple web documents.

For example, for clients advertising support for this header feature,
even a static document could be scanned by a server before being served
to the user, and delivered, e.g., with all tables above say 50 rows to
be truncated to the first 50 rows, and with markup added to indicate to
the client (subject to its own user preferences) that _the table could
have pagination controls added browser-side_. This would provide
frequently-used functionality without need for third party libraries or
custom server-side programming.

Similar functionality could be added to allow hierarchical list
drill-downs, paragraph range selections, etc., and although schemas have
fallen out of favor, if also specified (as in a header), a browser could
do even more in partnership with the server, e.g.:

1) type-aware sorting of tables (without the ugliness of what is
apparently emerging out of the current standard with each cell needing
its own type markup)
2) type-aware searches of tables or other elements, e.g., browser
display of date controls for date-type columns (or number ranges for
numeric fields, etc.), allowing users to make queries to obtain all
records whose values lie within a specific date range.

Mechanisms could also be added to allow easy caching/offline storage of
hitherto-retrieved rows.

The above markup-header interaction could encourage introspection by:
1) Facilitating offline application development
2) Avoiding an undue generation of custom coding and thus effort in
deciphering server-side APIs and code

...while continuing to avoid coupling the HTML language to any database
or static file format (and avoiding file packaging).

I've started some initial work on this idea at
https://github.com/brettz9/httpquery though I am currently trying to
simplify the proposed header field syntax and streamline it with
standard practices.

> To solve the problems mentioned by TimBL, we should just make sure
> that servers can serve up a packaged app at a URL, and clients can
> just use it and choose to keep it on their homescreen or not, and
> their client would keep using that URL to check for updates. (Ignoring
> for the moment the special security requirements with high-privilege
> APIs, stores, signing, and all that stuff). Then "app search" is again
> the same as "web search".
>
> --BDS
>

Brett Zamir

Jul 9, 2013, 5:52:14 AM
to Brett Zamir, dev-w...@lists.mozilla.org, Ben Francis, Benjamin Smedberg, dev-b2g
On 7/9/2013 5:38 PM, Brett Zamir wrote:
> Hi,
>
> Encouraging to see this discussion as my dissatisfaction with the
> hassle of packaged apps led me to my however humble work on
> https://github.com/brettz9/asyouwish/ . Replies below...
>

I should clarify here that I mean dissatisfaction with packaged apps in
the generic sense of browser extensions as well as packaged mobile web
apps. (I have not yet made a FF OS version of AsYouWish.)

Brett

zar...@gmail.com

Jul 9, 2013, 6:24:52 AM
> Ben Francis:
> Are we happy with a packaged model for trusted apps going forward, or is
> now the time to embrace the challenge of making trusted apps genuinely part
> of the web?

Packaged apps are IMHO the best part of Firefox OS so far, being simple and solving the problem effectively.

appcache is complicated and unreliable by design: when you need the cache, it won't just be to save on download rates or response time, it'll be because the network is unreachable, and you need every resource predictably available, including those you need to report the error to the user.

* updating a single coherent zip in the background is preferable to piecemeal updates when resources are accessed, and also more reliable. Strictly speaking, you don't have to support multiple versions of the app; you can just block when the package isn't up to date (just like a non-packaged website would block, stall or require the user to refresh)
* package review and centralized revocation might be preferable to trusting the host URL "blindly", no matter how many levels of automated security are involved. This is in part why the app model has been successful: people don't trust a URL (they can't, not without significant technical baggage, and even then...); they trust a marketplace and its review protocols, they trust a company name, and they trust the marketplace to have verified and certified that name.

> Benjamin Smedberg :
> To solve the problems mentioned by TimBL, we should just make sure that
> servers can serve up a packaged app at a URL

I would argue that the web should be going the way of packaged apps, and standardize that. After all, a lot of the recent communication protocols are just dancing around the concept of downloading everything at once in consolidated form (SPDY, I'm looking at you), and then keeping a cache of it. Maybe it's time for a return to KISS principles?

All in all, the piecemeal loading of resources that is the hallmark of the web-based URL approach serves only two purposes: being able to start an app without having all the resources (but at the cost of a greater latency, and the risk of future failure should a resource be suddenly inaccessible), and aggregating content and code from multiple hosts and origins (but is that really desirable these days?).

App offline and start behaviors are still key, and anything that requires online access to download code or resources (vs just the useful dynamic data) is still going to be problematic for many years.

A mobile device is connected most of the time, but may not be when the user needs it, so background connection time should be leveraged to keep everything updated so it's ready when needed, and the packaged app model answers that, while the web model fails.

Eric

Andrew Williamson

Jul 9, 2013, 7:59:49 AM
From a marketplace perspective, the majority of packaged apps aren't
privileged, so most developers are using packaged apps because of some
combination of:
a) offline support without the horror of appcache
b) not needing to maintain a reliable, persistent server infrastructure
c) it's what they're used to from other app stores (or add-ons)

IMO, if anything, it's the use cases for un-privileged packaged apps we
should consider, rather than concentrating on a very complicated
replacement for privileged packaged apps that still satisfies Jonas'
requirements below.
> ....

Salvador de la Puente González

Jul 9, 2013, 10:58:04 AM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
Hello

I want to say I totally support the idea of removing packaged apps.
IMHO, we need a way to move from on-line to off-line applications in a
transparent way, in order to make the "installation" process almost
trivial. For this purpose some mechanism like the offline cache is
necessary. If the current implementation is buggy, let's find a better
alternative.

As for the problems raised by Jonas, I'm not an expert, but some of them
can be easily addressed:

On 08/07/13 23:31, Ben Francis wrote:
> In his email, Jonas proposed the following requirements for trusted apps:
> 1. The ability for a trusted party to review an app and indicate some level
> of trust in the app (or potentially in the app developer).
With a list of downloadable content to be installed (something like the
Offline Cache), a third party could retrieve a version of the software
and then review it.
> 2. A mechanism for signing an app to verify that the app actually contains
> the content that was reviewed.
With a digest algorithm based on the content of each file, the third
party and the User Agent could compute the same signature and check
whether it matches. I have in mind something sha1-based, like in Git.
After all, it is all about the content.
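
Salvador's Git-like digest could be sketched along these lines (a minimal
illustration; the function name and layout are mine, not Git's actual tree
format):

```python
import hashlib

# Hash each file's bytes, then hash the sorted (path, digest) listing, so
# the reviewer and the User Agent can compare one signature for the whole
# app, regardless of the order in which files are enumerated.

def digest_app(files):
    """files: dict mapping path -> bytes; returns a single hex digest."""
    entries = sorted((path, hashlib.sha1(body).hexdigest())
                     for path, body in files.items())
    listing = "\n".join(f"{digest} {path}" for path, digest in entries)
    return hashlib.sha1(listing.encode()).hexdigest()

v1 = {"index.html": b"<h1>app</h1>", "app.js": b"boot();"}
assert digest_app(v1) == digest_app(dict(reversed(list(v1.items()))))
assert digest_app(v1) != digest_app({**v1, "app.js": b"evil();"})
```

The first assertion shows the digest is order-independent; the second shows
that any change to any file changes the app-wide signature.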
> 3. Use of a minimum CSP policy for all pages of an app to ensure only the
> reviewed code runs.
Sorry but I don't understand the problem here. I can currently load an
external script from a packaged application and run its code.
> 4. A separate data jar for local data to ensure a compromised web site can
> not write to the local data of an app to alter the way it behaves.
If the UA receives an update order for the Offline Cache of a given
app, it can perform another digest and send it to the third party to
check whether the new code has been reviewed. Maybe I did not understand
what a compromised web site is in this context, or how it could be a
hazard.
> 5. A separate origin for the resources of an app so that the app can not be
> tricked into running un-reviewed code from the same origin with escalated
> privileges.
>
Same as above. With the digest mechanism you can tell whether the
version the third party reviewed differs from the version on the
device, and then ask the user.

If you think the digest process delays application execution, that is
true, but you can hide this cost by folding it into the
installation/update process.

Maybe I'm very naive about security, so if you can clarify some aspects
I did not take into account, I would be very pleased to read about them.

Best!


Kumar McMillan

Jul 9, 2013, 4:24:16 PM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
I see a strong future for the concept of packaged apps in a webby way. Instead of URLs, think of each app as a "URL" that is addressable by its unique content hash. This would allow a decentralized and distributed "web" where all resources are shared. Consider that web server you throw money at to handle traffic: It could be replaced by millions of mobile devices (perhaps even with peer to peer connections).

Thus, it might be beneficial to look at the research work being done in this field as a guide: Named Data Networking and Content Centric Networking.

https://en.wikipedia.org/wiki/Named_data_networking
http://www.parc.com/work/focus-area/content-centric-networking/

There are a lot of working NDN prototypes (https://github.com/named-data); the packaged apps problem seems very similar in nature.
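The content-addressing idea can be sketched with an invented `sha256:` address scheme (real NDN/CCN systems use richer naming, so this only shows the flavour of it):

```python
import hashlib

def content_address(package: bytes) -> str:
    # The "URL" is derived from the bytes themselves, not from a host name,
    # so any peer that has the bytes can serve them.
    return "sha256:" + hashlib.sha256(package).hexdigest()

def fetch_and_verify(address: str, blob: bytes) -> bytes:
    # Whoever served the blob (a CDN, another phone), the fetcher can
    # check it against the address before trusting it.
    if content_address(blob) != address:
        raise ValueError("content does not match its address")
    return blob

pkg = b"...zipped app bytes..."
addr = content_address(pkg)
assert fetch_and_verify(addr, pkg) == pkg
```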

Kumar


>
> Are we happy with a packaged model for trusted apps going forward, or is
> now the time to embrace the challenge of making trusted apps genuinely part
> of the web? Perhaps we don't even need to restrict our thinking to the
> "app" and "app store" model and can explore ways of exposing more
> privileged APIs to all web content in a trusted way.
>
> If you're interested in the nitty gritty of this problem, I've tried to
> summarise Jonas' original email below. I hope he forgives me if I
> mis-represent him in any way, but you can read his original email in the
> archives [1].
>
> ...
>
> In his email, Jonas proposed the following requirements for trusted apps:
> 1. The ability for a trusted party to review an app and indicate some level
> of trust in the app (or potentially in the app developer).
> 2. A mechanism for signing an app to verify that the app actually contains
> the content that was reviewed.
> 3. Use of a minimum CSP policy for all pages of an app to ensure only the
> reviewed code runs.
> 4. A separate data jar for local data to ensure a compromised web site can
> not write to the local data of an app to alter the way it behaves.
> 5. A separate origin for the resources of an app so that the app can not be
> tricked into running un-reviewed code from the same origin with escalated
> privileges.
>

Antonio Manuel Amaya Calvo

Jul 9, 2013, 5:13:10 PM
to Salvador de la Puente González, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
Hey all.

You know, at the risk of being the discordant voice here, I don't see
what the issue with packaged apps is. It's not like you're required to
use a packaged app if you want to develop an app that doesn't use
special APIs. You can just develop the app as you would normally, host
it somewhere, test it that way and then, after you've finished
developing it, if it's mostly static content (static HTML, JS and
images) you have the *option* to package it so your users can use it
without a network connection. You don't have to do it, though; it's your
decision. If you prefer not to package it, you can just distribute it as
a hosted app.

Oh, and the cool thing is that so long as you've only used relative URIs
on your app, you won't have to change a single iota to change it from a
hosted app to a packaged app.

So, for normal apps, I don't see having packaged apps as a bad thing.
It's an extra option, and having more options is good in my book. I
don't believe in forcing people into any vision of what the web should
be, or of what is webby and what isn't. The web will be whatever people
make of it. If packaged apps are useful they will prosper and grow; if
they're not, they'll shrink and die. And that's good.

Now, for privileged apps, that's another problem altogether, and
something that doesn't actually have a close relative on the existing
web. In Firefox OS we're exposing some APIs to content that are or can
be dangerous if used incorrectly or with bad intent. And to do that,
we've taken on the responsibility of telling the user: hey, it's OK
if you allow this app to access your contacts because WE have checked it
and WE have seen it will treat your data with due respect. We cannot
do that unless we ensure that the content of the app we've examined is
*static*. We could give the option for that content to be served
remotely, along with the signature of every piece of code we download
(raw hashes, as Salva said below, will just not work). But the server
cannot serve dynamic content, cause otherwise the signatures won't be of
any use. And I'm not even sure that even that would be enough, since a
developer could send more content to be signed than he actually uses...
Anyway...

In that model, every time a developer wanted to change a single comma in
his code he would have to resend all the content (and not just the code
he just changed, because of possible interactions) to be re-examined.
Then a trusted third party (cause I, as a user, love developers but don't
trust them as far as I can throw them) would sign the data again and
send the signatures back, and only then could the developer swap one set
of static content for another set of static content.

What does he win with this model, compared with the current one? Just
not having to package the app himself. What do end users win? Hmm... without
any way of locally caching content, they actually *lose*. They'll have
to download the static content over and over and over. Oh, and the
signature will have to be re-verified every time the content is loaded
(whereas currently it is only checked at install time).

Moving away from a packaged/controlled environment for privileged apps
will make it harder to actually trust them. The way I would actually like
us to go is to slowly start opening some APIs that are currently
certified-only so that they become privileged, so more and richer
third-party apps can be added to the ecosystem. And I don't see that
happening if we relax the current trust/security model.
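The model described above (static content, a store signature over every file, checked once at install/update time) could look roughly like this; the HMAC is an editor's stand-in for a real public-key store signature, and all names are illustrative:

```python
import hashlib
import hmac
import json

STORE_KEY = b"store-private-key"  # stand-in for the store's real signing key

def sign_manifest(files: dict[str, bytes]) -> dict:
    # The reviewer hashes every reviewed file, then signs the whole listing.
    hashes = {path: hashlib.sha256(files[path]).hexdigest() for path in sorted(files)}
    payload = json.dumps(hashes, sort_keys=True).encode()
    sig = hmac.new(STORE_KEY, payload, hashlib.sha256).hexdigest()
    return {"hashes": hashes, "sig": sig}

def install(files: dict[str, bytes], manifest: dict) -> bool:
    # Verified once, at install/update time -- not on every page load.
    payload = json.dumps(manifest["hashes"], sort_keys=True).encode()
    expected = hmac.new(STORE_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(manifest["sig"], expected):
        return False  # the listing itself was tampered with
    return all(
        hashlib.sha256(data).hexdigest() == manifest["hashes"].get(path)
        for path, data in files.items()
    )
```

A single changed comma in any file makes `install` fail until the store re-signs the new listing, which is exactly the re-review round-trip described above.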

Best regards,

Antonio



On 09/07/2013 16:58, Salvador de la Puente González wrote:
> Hello
>
> I want to say I totally support the ideal of removing packaged apps.
> IMHO, we need a way to move from on-line to off-line
> applications in a transparent way in order to make the "installation"
> process almost trivial. For this purpose some mechanism like offline
> cache is necessary. If the current implementation is buggy, let's find a
> better alternative.
>
> For the problems exposed by Jonas, I'm not an expert but some of them
> can be easily addressed:
>
> On 08/07/13 23:31, Ben Francis wrote:
>> In his email, Jonas proposed the following requirements for trusted
>> apps:
>> 1. The ability for a trusted party to review an app and indicate some
>> level
>> of trust in the app (or potentially in the app developer).
> With a list of downloadable content to be installed (some like the
> Offline Cache), a third party could retrieve a version of the software,
> then review it.
>> 2. A mechanism for signing an app to verify that the app actually
>> contains
>> the content that was reviewed.
> With a digest algorithm based on the content of each file the third
> party and the User Agent could compute the same signature and see if it
> matches. I have in mind some sha1-based like in Git. At least, all is
> about content.
>> 3. Use of a minimum CSP policy for all pages of an app to ensure only
>> the
>> reviewed code runs.
> Sorry but I don't understand the problem here. I can currently load an
> external script from a packaged application and run its code.
>> 4. A separate data jar for local data to ensure a compromised web
>> site can
>> not write to the local data of an app to alter the way it behaves.
> If the UA receive an update order for the Offline Cache of a determined
> App, it can perform another digest and sent to the third party in order
> to see if the new code has been reviewed. Maybe I did not understand
> what is a compromised web in this context or how it could be a hazard..
>> 5. A separate origin for the resources of an app so that the app can
>> not be
>> tricked into running un-reviewed code from the same origin with
>> escalated
>> privileges.
>>
> Same as above. With the digest mechanism you can say if the version the
> third party reviewed and the device version differs, then ask the user.
>
> If you think the digest process delays the application execution, it is
> true but you can cover this time by adding it to the
> "installation/update process".
>
> Maybe I'm very naive about security so if you can clarify me some
> aspects I did not take in count, I was very pleased to read about.
>
> Best!
>
> ________________________________
>
> Este mensaje se dirige exclusivamente a su destinatario. Puede
> consultar nuestra política de envío y recepción de correo electrónico
> en el enlace situado más abajo.
> This message is intended exclusively for its addressee. We only send
> and receive email on the basis of the terms set out at:
> http://www.tid.es/ES/PAGINAS/disclaimer.aspx
> _______________________________________________
> dev-b2g mailing list
> dev...@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-b2g

Peter Dolanjski

Jul 10, 2013, 3:36:19 AM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
Hello all,

I don't have much to add in the way of implementation suggestions, but I do want to make a few points from an overall product and end user perspective. (much of this is probably well known to this list, but it may spark some further thought)

As was already mentioned, it is important that our solution provides users with the freedom to obtain and use apps in a manner that suits them best in an environment encumbered by connectivity constraints (costs, network availability, etc.). So, to the extent that a given app's developer intended such a use for their app, the user should be able to opportunistically "download" the app as it suits them (ie. WiFi) for later use when not having local assets would otherwise render that app useless. Moreover, right or wrong, I believe the average user expectation is such that when an icon for an app persists on the device then that app can function in an offline mode. When there is a lack of connectivity, it may be worth exploring, using some user experience tests, whether or not visually distinguishing between apps that don't have such local content and apps that do provides clarity and benefit to the user. (I believe there were some proposals/contributions on this front)

The other part of the picture is users who don't tend to seek out and install third-party apps for later/repeated use. Our data for our target markets indicates that the average smartphone user does not use more than a few (< 2-3) third-party apps. It is for these users that I think pushing the boundary with respect to the use cases a hosted app is capable of satisfying (i.e. APIs available to that app) would provide product differentiation and round out the "instant app" story. These are not users who will go to an app storefront to seek out apps that they can install for later use. It is more likely that they will have a specific task in mind that they would like to accomplish. A discovery mechanism for such apps to meet the task at hand, paired with unencumbered app access to device functionality (beyond what is possible using hosted apps today and to the extent possible while still protecting the user) that can create an "instant" experience without having to download the app would go a long way to meeting the workflow needs of these types of users.

Peter


Salvador de la Puente González

Jul 10, 2013, 4:09:47 AM
to Antonio Manuel Amaya Calvo, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
Hello!

On 09/07/13 23:13, Antonio Manuel Amaya Calvo wrote:
> Hey all.
>
> You know, at the risk of being the discordant voice here, I don't see
> what the issue with packaged apps is. It's not like you're required to
> use packaged app if you want do develop a non special-api-using app.
> You can just develop the app as you would normally, host it somewhere,
> test it that way and then, after you've finished developing it, if
> it's mostly static content (static HTMLs, JSs and images) then you
> have the *option* to package it so your users can use it without
> having a network connection. You don't have to do it, though, it's
> your decision. If you prefer not to package it, you can just
> distribute it as a hosted app.
Oh, of course, but it would be great if I, as a web developer, only had
to add a file saying something like "hey, if you want this application
to work offline, take these files and pack them together". This is said
to the user agent, so I'm doing nothing more than declaring the
structure of my package, and the UA packs it "automagically".
>
> Oh, and the cool thing is that so long as you've only used relative
> URIs on your app, you won't have to change a single iota to change it
> from a hosted app to a packaged app.
With the approach above, it could be the same. :)
>
> So, for normal apps, I don't see having packaged apps as a bad thing.
> It's an extra option, and having more options is good on my book. I
> don't believe in forcing people into any vision of what the web should
> be, and what is webby and what isn't. The web will be whatever people
> make of it. If packaged apps are useful they will prosper and grow, if
> they're not, they'll shrink and die. And that's good.
I agree it is an extra option; the problem arises when the on-line web
is the extra. In the end, even though we are moving to an
always-connected world, the real Web is the Online Web.
>
> Now, for privileged app, that's another problem altogether, and
> something that doesn't actually have a close relative on the existing
> web. At Firefox OS we're exposing some APIs to content that are or can
> be dangerous if used incorrectly or with bad intent. And to do that,
> we've taken onto us the responsibility of telling the user, hey, it's
> ok if you allow this app to access your contact because WE have
> checked it and WE have seen it will treat your data with the due
> respect. We cannot do that unless we ensure that the content of the
> app we've examined is *static*. We could give the option for that
> content to be served remotely, along with the signature of every piece
> of code we download (raw hashes as Salva said below will just not work).
Why? If it is all about content and I review the content, trust it and
compute a hash for it, any attempt to modify the code will alter the
cryptographic hash, making the validation algorithm fail.
> But the server cannot serve dynamic content, cause otherwise the
> signatures won't be of any use. And I'm not even sure if even that
> would be enough, since a developer could send to be signed more
> content than he actually uses... Anyway...
>
> On that model, every time a developer wanted to change a single comma
> on his code he would have to resend all the content (and not just the
> code he just changed, because of possible interactions) to be
> reexamined. Then a trusted third party (cause I, as a user, love
> developer but don't trust them as far as I can throw them) will sign
> the data again, and send the signatures back and only then the
> developer can change one set of static content for another set of
> static content.
But that is how it's working now! If I made a syntax error in my
v1 application and it lands in the market and I need to fix it, I need
to send an update to the market: a v2 application, so it needs to be
reviewed again. How to review is the responsibility of the third party. It
should keep a code repository and see only the differences, and automatic
tools can decide whether a change needs another human review or not.
>
> What does he win with this model, compared with the current one? Just
> the actual packaging of the app. What do end users win? Hmm... without
> any way of local caching of content, they actually *lose*. They'll
> have to download the static content over and over and over. Oh, and
> the signature will have to be re-verified every time the content is
> loaded (while actually is only checked at install time).
Just the same as with the current approach. The user needs to download the
"application update" and the signature must be recalculated again.
>
> Going away from a packaged/controlled environment for privileged app
> will make harder to actually trust them. The way I would actually like
> for us to go is to, slowly, start opening some APIs that are currently
> certified only to be privileged, so more and richer third party apps
> can be added to the ecosystem. And I don't see that happening if we
> relax the current trust/security model.
I like the current security model; I only want some "auto-packing"
method that makes the "off-line site" an option, not the "on-line site",
because "on-line" sites are the Web.

Cheers!


Antonio M. Amaya

Jul 10, 2013, 4:45:59 AM
to Salvador de la Puente González, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
On 10/07/2013 10:09, Salvador de la Puente González wrote:
> Hello!
>
> On 09/07/13 23:13, Antonio Manuel Amaya Calvo wrote:
>> Hey all.
>>
>> You know, at the risk of being the discordant voice here, I don't see
>> what the issue with packaged apps is. It's not like you're required
>> to use packaged app if you want do develop a non special-api-using
>> app. You can just develop the app as you would normally, host it
>> somewhere, test it that way and then, after you've finished
>> developing it, if it's mostly static content (static HTMLs, JSs and
>> images) then you have the *option* to package it so your users can
>> use it without having a network connection. You don't have to do it,
>> though, it's your decision. If you prefer not to package it, you can
>> just distribute it as a hosted app.
> Oh, of course but it could be great if I, as web developer, only add a
> file saying something like "hey, if you want this application to work
> offline, take these files and pack them together". This is said to the
> user agent so I'm doing nothing more than declaring the structure of
> my package and the the UA pack it "automagically".
So you just want to save the time to make the zip yourself? ;)

>
>>
>> Oh, and the cool thing is that so long as you've only used relative
>> URIs on your app, you won't have to change a single iota to change it
>> from a hosted app to a packaged app.
> With the approach above, it could be the same. :)
>>
>> So, for normal apps, I don't see having packaged apps as a bad thing.
>> It's an extra option, and having more options is good on my book. I
>> don't believe in forcing people into any vision of what the web
>> should be, and what is webby and what isn't. The web will be whatever
>> people make of it. If packaged apps are useful they will prosper and
>> grow, if they're not, they'll shrink and die. And that's good.
> I agree it is an extra option, the problem raises when the on-line web
> is the extra. At the end, and despite we move to an always-connected
> world, the real Web is the Online Web.
Well, if or when we arrive at that always-connected world, then
packaged apps will quietly go away by themselves because they won't be
useful. Meanwhile, I for one have data limits on my mobile connection,
not to mention the times when a connection just isn't available. And I
would like to keep using the apps that don't actually require a
connection (games, book readers, whatever). Currently packaged apps fill
this need. Now, I don't actually mind if the developer makes the zip
himself or if he just instructs the UA somehow to download/perma-cache
the fixed resources. But as I said before, this affects mainly the
non-privileged apps workflow.

>>
>> Now, for privileged app, that's another problem altogether, and
>> something that doesn't actually have a close relative on the existing
>> web. At Firefox OS we're exposing some APIs to content that are or
>> can be dangerous if used incorrectly or with bad intent. And to do
>> that, we've taken onto us the responsibility of telling the user,
>> hey, it's ok if you allow this app to access your contact because WE
>> have checked it and WE have seen it will treat your data with the due
>> respect. We cannot do that unless we ensure that the content of the
>> app we've examined is *static*. We could give the option for that
>> content to be served remotely, along with the signature of every
>> piece of code we download (raw hashes as Salva said below will just
>> not work).
> Why? If all is about content and I review the content, trust it and
> make a hash for him, any attempt to modify the code will alter the
> cryptographic hash making the validation algorithm fail.
Because hashes by themselves give integrity, not security. Why? Because
if I can change the content and the hash is stored with the content, I
can change the hash too. And just trusting the hash by origin (I trust
this hash because I downloaded it from the store, while I downloaded the
actual content from the developer's site) could work, but it ties
your security completely to the security of the third-party (market)
download site.
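This point (an attacker who can replace the content can also recompute a bare hash, but cannot forge a keyed signature) can be shown with a toy example; the HMAC key is an editor's stand-in for the market's signing key:

```python
import hashlib
import hmac

KEY = b"market-signing-key"  # held only by the trusted third party

def bare_hash(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def signed_hash(content: bytes) -> str:
    return hmac.new(KEY, content, hashlib.sha256).hexdigest()

app = b"reviewed code"
shipped_sig = signed_hash(app)

# An attacker swaps the content AND recomputes the bare hash...
evil = b"evil code"
evil_bare = bare_hash(evil)

# ...the integrity check alone still passes:
assert bare_hash(evil) == evil_bare
# ...but the signature check fails, because the attacker lacks KEY:
assert signed_hash(evil) != shipped_sig
```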

>
>> But the server cannot serve dynamic content, cause otherwise the
>> signatures won't be of any use. And I'm not even sure if even that
>> would be enough, since a developer could send to be signed more
>> content than he actually uses... Anyway...
>>
>> On that model, every time a developer wanted to change a single comma
>> on his code he would have to resend all the content (and not just the
>> code he just changed, because of possible interactions) to be
>> reexamined. Then a trusted third party (cause I, as a user, love
>> developer but don't trust them as far as I can throw them) will sign
>> the data again, and send the signatures back and only then the
>> developer can change one set of static content for another set of
>> static content.
> But that is how it's working now! If I made a syntax error error in my
> v1 application and it lands in the market and I need to fix it, I need
> to send an update to the market. A v2 application so it need to be
> reviewed again. How to review is responsibility of the third party. It
> should keep a code repository an see only the differences and
> automatic tools can decide when a change need another human review or
> not.
Yep, that was my point exactly, as stated below: a developer won't
gain anything with this model compared to the existing one (besides not
having to create the zip file).


>>
>> What does he win with this model, compared with the current one? Just
>> the actual packaging of the app. What do end users win? Hmm...
>> without any way of local caching of content, they actually *lose*.
>> They'll have to download the static content over and over and over.
>> Oh, and the signature will have to be re-verified every time the
>> content is loaded (while actually is only checked at install time).
> Just the same with the current approach. The user need to download the
> "application update" and the sign must to be recalculated again.

No. If we download resources remotely every time we use them (or if we
can download them remotely) then we have to re-check the signature every
time a file is accessed/downloaded. With packaged apps you check the
signature only at install time (be it a first-time install or an update).

Salvador de la Puente González

Jul 10, 2013, 9:04:46 AM
to Antonio M. Amaya, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
Hi

On 10/07/13 10:45, Antonio M. Amaya wrote:
> On 10/07/2013 10:09, Salvador de la Puente González wrote:
>> Hello!
>>
>> On 09/07/13 23:13, Antonio Manuel Amaya Calvo wrote:
>>> Hey all.
>>>
>>> You know, at the risk of being the discordant voice here, I don't
>>> see what the issue with packaged apps is. It's not like you're
>>> required to use packaged app if you want do develop a non
>>> special-api-using app. You can just develop the app as you would
>>> normally, host it somewhere, test it that way and then, after you've
>>> finished developing it, if it's mostly static content (static HTMLs,
>>> JSs and images) then you have the *option* to package it so your
>>> users can use it without having a network connection. You don't have
>>> to do it, though, it's your decision. If you prefer not to package
>>> it, you can just distribute it as a hosted app.
>> Oh, of course but it could be great if I, as web developer, only add
>> a file saying something like "hey, if you want this application to
>> work offline, take these files and pack them together". This is said
>> to the user agent so I'm doing nothing more than declaring the
>> structure of my package and the the UA pack it "automagically".
> So you just want to save the time to make the zip yourself? ;)
Yes, because in this way it becomes a Web feature, not a developer concern.
>
>>
>>>
>>> Oh, and the cool thing is that so long as you've only used relative
>>> URIs on your app, you won't have to change a single iota to change
>>> it from a hosted app to a packaged app.
>> With the approach above, it could be the same. :)
>>>
>>> So, for normal apps, I don't see having packaged apps as a bad
>>> thing. It's an extra option, and having more options is good on my
>>> book. I don't believe in forcing people into any vision of what the
>>> web should be, and what is webby and what isn't. The web will be
>>> whatever people make of it. If packaged apps are useful they will
>>> prosper and grow, if they're not, they'll shrink and die. And that's
>>> good.
>> I agree it is an extra option, the problem raises when the on-line
>> web is the extra. At the end, and despite we move to an
>> always-connected world, the real Web is the Online Web.
> Well, if or when we arrive to that always-connected world then the
> packaged apps will quietly go away by themselves cause they won't be
> useful. Meanwhile, I for one have data limits on my mobile connection,
> not to mention the times when connection just isn't available. And I
> would like to keep using the apps that don't actually require
> connection (games, book readers, whatever). Currently packaged apps
> fill this need. Now, I don't actually mind if the developer makes the
> zip himself or if he just instructs the UA somehow to
> download/perma-cache the fixed resources. But as I said before, this
> affects mainly the non privileged apps workflow.
Let's discuss this later in this answer. After our off-line conversation
I see the problem now.
>
>>>
>>> Now, for privileged app, that's another problem altogether, and
>>> something that doesn't actually have a close relative on the
>>> existing web. At Firefox OS we're exposing some APIs to content that
>>> are or can be dangerous if used incorrectly or with bad intent. And
>>> to do that, we've taken onto us the responsibility of telling the
>>> user, hey, it's ok if you allow this app to access your contact
>>> because WE have checked it and WE have seen it will treat your data
>>> with the due respect. We cannot do that unless we ensure that the
>>> content of the app we've examined is *static*. We could give the
>>> option for that content to be served remotely, along with the
>>> signature of every piece of code we download (raw hashes as Salva
>>> said below will just not work).
>> Why? If all is about content and I review the content, trust it and
>> make a hash for him, any attempt to modify the code will alter the
>> cryptographic hash making the validation algorithm fail.
> Because hashes by themselves give integrity, not security. Why? Cause
> if I can change the content and the hash is stored with the content, I
> can change the hash also. And just trusting hash by origin (I trust
> this hash because I downloaded it from the store while I downloaded
> the actual content from the developer site) could work but it links
> completely your security to the security of the third party (market)
> download site.
You could sign the hash then. There are no further problems here.
>
>>
>>> But the server cannot serve dynamic content, cause otherwise the
>>> signatures won't be of any use. And I'm not even sure if even that
>>> would be enough, since a developer could send to be signed more
>>> content than he actually uses... Anyway...
>>>
>>> On that model, every time a developer wanted to change a single
>>> comma on his code he would have to resend all the content (and not
>>> just the code he just changed, because of possible interactions) to
>>> be reexamined. Then a trusted third party (cause I, as a user, love
>>> developer but don't trust them as far as I can throw them) will sign
>>> the data again, and send the signatures back and only then the
>>> developer can change one set of static content for another set of
>>> static content.
>> But that is how it works now! If I made a syntax error in my v1
>> application and it lands in the market and I need to fix it, I need
>> to send an update to the market: a v2 application, so it needs to be
>> reviewed again. How to review it is the responsibility of the third
>> party. It could keep a code repository and see only the differences,
>> and automatic tools can decide when a change needs another human
>> review or not.
> Yep, that was my point exactly, as stated below. That a developer
> won't gain anything with this model compared to the existing one
> (besides not having to create the zip file).
>
The important thing is that he does not lose anything. And we remove
the developer's concern about packaging the application. So you, as a
web developer, only worry about working on your web application, then
publish a list of files to enable off-line mode and let The Web do the
hard work (by The Web I mean user agents, servers and the HTTP
protocol).

Of course, a drawback of this approach is that you need not only a
server but a domain name too.
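As a sketch, the "published list of files" could look much like an HTML5 application cache manifest (the file names here are invented for illustration):

```text
CACHE MANIFEST
# v1 -- file names are examples only
index.html
app.js
style.css

NETWORK:
*
```

The user agent would fetch and permanently cache the listed resources, while everything under NETWORK: stays dynamic.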
>
>>>
>>> What does he win with this model, compared with the current one?
>>> Just the actual packaging of the app. What do end users win? Hmm...
>>> without any way of local caching of content, they actually *lose*.
>>> They'll have to download the static content over and over and over.
>>> Oh, and the signature will have to be re-verified every time the
>>> content is loaded (while actually is only checked at install time).
>> Just the same as with the current approach. The user needs to
>> download the "application update" and the signature must be
>> recalculated again.
>
> No. If we download resources remotely every time we use them (or if we
> can download them remotely) then we have to recheck the signature
> every time a file is accessed/downloaded. With packaged apps you check
> the signature only at install time (be it first time install, or update).
Ok, this is part of the earlier discussion. But we could simply not
allow loading resources from the Internet.

One last remark: if we allow this auto-packaging, then we are
encouraging web development while making packaging an option.

Best!

gene.v...@urbien.com

Jul 16, 2013, 12:23:48 AM
Matt, although appcache is very flaky, it is possible to minimize its problems by loading just the first page and a script using appcache, and then rolling your own appcache replacement using LocalStorage and IndexedDB (with a fallback to WebSQL) for the rest of the app assets. We did so with Urbini (see it on GitHub), and it works fine. That said, I would not recommend doing it yourself, as the level of complexity is 10x that of appcache; just use some JS framework.
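The cache-first pattern described here can be sketched in a few lines (illustrative only, not Urbini's actual code; a dict stands in for LocalStorage/IndexedDB and a callable stands in for the network):

```python
# Cache-first asset loading: serve from the local store when possible,
# otherwise fetch from the network and persist for the next launch.
def make_asset_loader(fetch_fn, cache=None):
    cache = {} if cache is None else cache

    def get_asset(url):
        if url in cache:          # served from the local store, even offline
            return cache[url]
        body = fetch_fn(url)      # otherwise go to the network...
        cache[url] = body         # ...and persist the asset locally
        return body

    return get_asset

network_hits = []
get_asset = make_asset_loader(lambda url: network_hits.append(url) or "body of " + url)
get_asset("/app.js")
get_asset("/app.js")              # second call never touches the network
assert network_hits == ["/app.js"]
```

The real complexity Gene mentions comes from versioning, invalidation, and storage quotas, which this sketch deliberately ignores.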

Anyway, off-lining app assets does not give me access to Bluetooth, to take one example of my problem at hand. I want my app to talk to the Pebble watch and various fitness and medical devices. Packaged apps supposedly give me this privilege the same way native apps do - they allow the distributor to:

1) verify that I am not a spammer/virus maker/Chinese hacker
2) inspect my code and vouch for it, to a degree
3) revoke my distribution rights if I violated the trust

Do not get me wrong, I hate that we have fallen back to the downloadable software model. Like Ben, I am sad.

So I am brewing some ideas on how to address the problems of trust and damage control. The direction of my thinking is that the rights given to me do not need to be binary; they can grow as I gain more trust. Also, networks like Facebook and LinkedIn often serve as good deterrents against behaviour that puts others at risk.

Gene Vayngrib

Matt Basta

Jul 16, 2013, 4:53:53 PM
to gene vayngrib, dev-w...@lists.mozilla.org
While using appcache may be possible, it's a poor alternative to packaged apps for offline app development. Rather than encouraging workarounds, we should be building a better, more stable (and sane) solution. If offline is a hassle, developers aren't going to do it, and that's a loss for users.

Travis Choma

Jul 16, 2013, 6:38:27 PM
to Matt Basta, dev-w...@lists.mozilla.org, Harald Kirschner, gene vayngrib
Hi Matt,

You are totally right. We are looking at building a better appcache. There are some early plans from the WebAPI team to implement Navigation Controller, a lower-level scriptable cache, in order to avoid the mistakes made in the original appcache and allow developers to roll their own behavior: https://github.com/slightlyoff/NavigationController/ .

Following that the idea would be to implement a declarative version on top that captures common use cases and would succeed the original appcache. The first pass of what that might look like is here: https://github.com/slightlyoff/NavigationController/blob/master/examples/new-manifest/spec.md

Also one of the inputs into this process has been the following collection of use cases: http://www.w3.org/wiki/Webapps/AppCacheUseCases .

If you have some specific insights from building the Marketplace app, it'd be great to capture that so we can feed that into this process as we move to an appcache that is a better fit for developers who are building offline access into their apps.

Cheers,
-Travis

----- Original Message -----
From: "Matt Basta" <mba...@mozilla.com>
To: "gene vayngrib" <gene.v...@urbien.com>
Cc: dev-w...@lists.mozilla.org
Sent: Tuesday, July 16, 2013 1:53:53 PM
Subject: Re: Can we deprecate packaged apps?

While using appcache may be possible, it's a poor alternative to packaged apps for offline app development. Rather than encouraging workarounds, we should be building a better, more stable (and sane) solution. If offline is a hassle, developers aren't going to do it, and that's a loss for users.



----- Original Message -----
From: "gene vayngrib" <gene.v...@urbien.com>
To: dev-w...@lists.mozilla.org
Sent: Monday, July 15, 2013 9:23:48 PM

Ehsan Akhgari

Jul 16, 2013, 6:40:47 PM
to Matt Basta, dev-w...@lists.mozilla.org, gene vayngrib
On 2013-07-16 4:53 PM, Matt Basta wrote:
> While using appcache may be possible, it's a poor alternative to packaged apps for offline app development. Rather than encouraging workarounds, we should be building a better, more stable (and sane) solution. If offline is a hassle, developers aren't going to do it, and that's a loss for users.

We are going to experiment with NavigationController
<https://github.com/slightlyoff/NavigationController> to see if that can
address the needs of offline applications adequately.

Cheers,
Ehsan

Mounir Lamouri

Jul 17, 2013, 8:25:03 PM
to dev-b2g, dev-w...@lists.mozilla.org, Ben Francis
Hi,

tl;dr: it is too early to deprecate packaged apps. They are a good tool
for experimentation and can already start taking over the native app world.

Our APIs are not mature enough to be moved to the Web. Most APIs that
are privileged-only will get significant changes in the future. Packaged
apps are an environment where we can actually change APIs relatively
easily. We are even thinking of adding an "api version" field in the
manifest to make such a mechanism work.

In addition to not being mature, some APIs have security issues that we
do not know how to solve yet, and solving them is not trivial. Having
those APIs in packaged apps only, behind a review system, allows us to
defer designing a security model for the Web (where we can't review
content). We will hopefully get to that at some point.

In other words, I see packaged apps as a nice playground where we can
experiment with things before they go to the Web.

Furthermore, one of the things Mozilla wanted to solve with Firefox OS
was to make things simpler for developers and users: one platform (the
Web) that no longer requires you to re-buy apps when you change phones.
We are not close to solving the second problem, but packaged apps will
help with the first. Even if packaged apps are not really the Web, they
use Web technologies and will help developers write applications that
can be ported from one platform to another far more easily. SysApps
could be a good path, and we could standardise APIs there in a way
that, at some point, an application for Chrome, Firefox or Tizen would
look exactly the same.

Having all those APIs available to the Web is something everyone on the
WebAPI team wants, but it is not going to happen overnight and we should
be realistic and try to fix our problems one at a time.

--
Mounir
> In his email, Jonas proposed the following requirements for trusted apps:
> 1. The ability for a trusted party to review an app and indicate some level
> of trust in the app (or potentially in the app developer).
> 2. A mechanism for signing an app to verify that the app actually contains
> the content that was reviewed.
> 3. Use of a minimum CSP policy for all pages of an app to ensure only the
> reviewed code runs.
> 4. A separate data jar for local data to ensure a compromised web site can
> not write to the local data of an app to alter the way it behaves.
> 5. A separate origin for the resources of an app so that the app can not be
> tricked into running un-reviewed code from the same origin with escalated
> privileges.
>

Gene Vayngrib

Jul 20, 2013, 3:05:28 PM
Hello Mounir,

So in your view the review process is the effective way to prevent security issues from flaring up in the Web APIs that have not reached maturity?

Allow me to challenge that. If you look at the amount of code in our framework, on top of which we are building Firefox OS apps ( http://github.com/urbien/urbini ), and the high volume of commits, you will see that a reviewer will quickly get lost and certainly will not be able to keep up with the speed of updates.

1. Maybe what you mean is that the review process makes it possible to remove an app from the marketplace, and thus it is a control mechanism used to weed out malicious or security-prone apps?

But isn't this 'revocation from the marketplace' control mechanism applicable to hosted apps too, provided they are allowed to be installed on the device only if they are registered (but not reviewed) by the Marketplace?

2. In addition, some companies may spring up to do virus checking and vulnerability analysis directly on the hosted app's website, as is quite common on e-commerce sites; take a look at this survey: http://baymard.com/blog/site-seal-trust

Although such a mechanism does not provide the bulletproof protection of cryptographic signatures, with the added guarantee of an uncompromisable zip downloaded directly from the marketplace, if it is run automatically and hourly it should be good enough. The Firefox OS app install process could then warn the user if the app does not have such a "trust seal". If an app is not adorned with the "trust seal", the user will make a decision based on brand name, install stats and user review scores.

3. The Marketplace can offer one more service: paid hosting. It can then serve hosted apps over SSL and have a way to verify apps in real time as they are changed by developers. Some underdog cloud companies will likely jump on this opportunity and integrate with 'trust seal' companies. This gets Mozilla off the hook and may even provide an extra revenue stream.

These moves will allow even the immature Web APIs to be used by hosted web apps, with only two small requirements: a manifest and a quick registration in a marketplace. This is not too much for the developer; it will not bog down Mozilla in supporting two app models, hosted and packaged; it will free Mozilla from hiring, training and managing reviewers (even if they are volunteers and ask for no pay); and, more importantly, it will avoid constant friction with developers around the lengthy and subjective review process. If you think this is a non-existent problem, take a look at this two-week-old post by a developer and the answer from Mozilla: https://groups.google.com/forum/#!msg/mozilla.dev.webapps/Vma71BM2up4/s_MggYloat4J

4. If this is not enough, keep packaged apps for now for some special cases, like apps that need full access to the device's file system, or device backup apps that need access to everything.

5. Meanwhile, before any changes are made, you might recommend a hybrid model to developers: a small, rarely changed packaged app that acts as a shell for an iframe with a mozbrowser attribute, in which the rest of the app runs. This takes the pressure off the review process by greatly reducing both the frequency of app updates and the amount of code reviewers need to read. This is the model we are offering to third-party apps that use our Urbini framework.
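For reference, the "manifest" half of that requirement really is small: a Firefox OS hosted app's manifest is a short JSON file along these lines (all field values here are invented for illustration):

```json
{
  "name": "Example Hosted App",
  "description": "Illustrative manifest; values are made up",
  "launch_path": "/index.html",
  "icons": { "128": "/img/icon-128.png" },
  "developer": { "name": "Example Dev", "url": "https://example.com" }
}
```

Served from the app's origin, this is all a marketplace needs to register and launch a hosted app.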

Hope this helps,

-- Gene Vayngrib

Gene Vayngrib

Aug 5, 2013, 3:17:57 PM
Google just introduced an automatic malware scanner for Chrome Web Store submissions: https://plus.google.com/+GoogleChromeDevelopers/posts/3kpAu4VcP5E

They also seem to have a policy of falling back to manual review only if automatic checks fail, or if an app is resubmitted after being suspended: https://developers.google.com/chrome/web-store/faq#faq-listing-08

Matt Basta

Aug 5, 2013, 3:51:35 PM
to Gene Vayngrib, dev-w...@lists.mozilla.org
> Google just introduced automatic malware scanner for Chrome Web Store submissions

We have similar technology in the Marketplace app-validator.

https://github.com/mozilla/app-validator

Our scans right now are very basic since there's not a whole lot you can do with the APIs that are available to privileged apps, and all apps are reviewed regardless of the validator's output (or whether the app is packaged or hosted). Personally, I prefer Google's approach since it means it's easier (and faster) for apps to get into the Marketplace, but that's a separate discussion.
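A toy version of such a static scan might look like this. It is illustrative only and far simpler than the real app-validator; the patterns are examples of what a reviewer might flag:

```python
import re

# Example patterns a validator might flag in submitted JS source.
SUSPICIOUS = [r"\beval\s*\(", r"\bnew\s+Function\s*\("]

def flag_suspicious(js_source):
    """Return the suspicious patterns matched in a JS source string."""
    return [p for p in SUSPICIOUS if re.search(p, js_source)]

assert flag_suspicious("eval(payload)") == [r"\beval\s*\("]
assert flag_suspicious("console.log('hello')") == []
```

Real validators go much further (parsing rather than pattern matching, checking the manifest, permissions, and CSP), but the flag-and-escalate flow is the same.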

----- Original Message -----
From: "Gene Vayngrib" <gene.v...@urbien.com>
To: dev-w...@lists.mozilla.org

Gene Vayngrib

Aug 16, 2013, 11:18:58 AM
Matt, Mounir, in case you did not see it: an iOS malware app that defies static analysis and calls for a constant app monitoring infrastructure (and a kill switch) on the device.

http://www.technologyreview.com/news/518096/remotely-assembled-malware-blows-past-apples-screening-process/

Matt Basta

Aug 16, 2013, 11:44:43 AM
to Gene Vayngrib, dev-w...@lists.mozilla.org
No static analysis tool is perfect, and no dynamic analysis tool is perfect. It's simply not possible. I'd argue that perfectly determining the intentions of an application is as hard as the halting problem, if not harder (will this code ever execute these other pieces of code in a way that's dangerous or insecure?).

There also needs to be a balance of invasiveness vs trust: constant app monitoring leads to significant privacy concerns, and a kill switch can be abused. Just look at when Amazon kill-switched the book 1984 from Kindles.

Bear in mind also that the Firefox Marketplace is the only app store for a mobile OS that requires developers to submit the source code for their applications (since JS is interpreted rather than compiled). Emscripten can be used to hide code, but code that does malicious things is difficult to obfuscate, since it can't be compiled to asm.js. Our reviewers should be alert to potential malware issues.
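That said, pattern-based review has a known blind spot: a call assembled at runtime can slip past a literal-string scan, which is the kind of trick the remotely-assembled malware article describes. A minimal illustration (`danger` is a hypothetical API name, and the scanner is a deliberately naive grep-style check):

```python
import re

# A naive validator's grep-style check for a literal dangerous call.
scanner = re.compile(r"\bdanger\s*\(")

# Code that assembles the same call at runtime instead of writing it literally.
evasive = "f = globals()['dan' + 'ger']; f()"

assert scanner.search("danger(x)")   # the literal call is caught...
assert not scanner.search(evasive)   # ...the runtime-assembled one is not
```

This is why static scans are a filter, not a guarantee, and why human review and revocation still matter.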

Mounir Lamouri

Aug 20, 2013, 5:38:31 AM
to dev-w...@lists.mozilla.org, Gene Vayngrib
On 16/08/13 16:18, Gene Vayngrib wrote:
> Matt, Mounir, in case you did not see it: an iOS malware app that defies static analysis and calls for a constant app monitoring infrastructure (and a kill switch) on the device.
>
> http://www.technologyreview.com/news/518096/remotely-assembled-malware-blows-past-apples-screening-process/

Hi Gene,

It is true that static analysis is not the ideal solution, but it is
for the moment the best solution we have come up with for the security
model. It is a short-term solution, and hopefully we will find a better
one for the long term.

If you have any ideas, please feel free to share.

Cheers,
--
Mounir

Jonas Sicking

Nov 17, 2013, 8:04:28 PM
to Ben Francis, Anne van Kesteren, Marcos Cáceres, dev-w...@lists.mozilla.org, dev-b2g
Time to go through old threads again...

The short answer is "no, I don't think we can deprecate packaged apps yet".

The big problem is still how to expose APIs that are security
sensitive and whose implications we can't explain well enough to the
user. This generally means all *security* sensitive APIs: we shouldn't
ask the user security questions. Privacy questions and things like
resource-use questions can be ok, but security questions should
generally be avoided.

So far the only solution we have is "review and sign by trusted 3rd
party", and the only realistic solution for signing that we have is
packaged apps. Or at least solutions that aren't "the web".

I think it will take many years to solve the packaged apps problem.
However it will take longer than that if we don't start working on it
now.

So what are we doing to try to help with the packaged apps problem?

We have started exploring creating a URL standard that would allow
linking to resources inside a .zip file. This would actually allow
having URLs to packaged apps. Not sure what the latest here is. Anne
could probably fill in the blanks.
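One way such URLs are often imagined is a path into the archive after a separator, as in Java's jar: URLs. A local sketch of the resolution step (the `!/` separator and the helper API are assumptions for illustration, not the proposed standard):

```python
import io
import zipfile

def read_from_package(package_bytes, url_path):
    """Resolve a hypothetical inside-a-zip path like 'app.zip!/index.html'.

    The part before '!/' names the archive; the part after names the
    member to extract. Here the archive bytes are passed in directly.
    """
    _archive_name, member = url_path.split("!/", 1)
    with zipfile.ZipFile(io.BytesIO(package_bytes)) as zf:
        return zf.read(member)

# Build a tiny "packaged app" in memory and resolve a path into it.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("index.html", "<h1>packaged app</h1>")

assert read_from_package(buf.getvalue(), "app.zip!/index.html") == b"<h1>packaged app</h1>"
```

The performance concern Anne raises below comes from exactly this shape: the client must fetch (or at least range-request) the archive before any member can be served.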

We tried to standardize packaged apps so that they would at least be
interoperable. Unfortunately this is going less well. There are
currently at least three different standards: Firefox OS packaged apps,
Chrome OS packaged apps, and Widget packages (which I believe is what
Tizen is using). Attempts at getting these to align have so far failed.
Basically only Mozilla has expressed willingness to actually change its
implementation and implement a standard.

My next plan here is to attack the hosted apps and get those
standardized. This is going much better and I hope that we can have a
spec draft for the bookmark-to-homepage use case in just a couple of
weeks. I've talked with multiple mobile browser vendors and there is a
lot of interest here (I don't want to speak for others which is why
I'm avoiding names).

Once we have that standardized, we have most of what we need for
hosted apps. Slap some API on top of that and you have an app store.
Though it's unclear who apart from us would be interested in
implementing such an API.

We've also had some discussions with Google around
WebIntents/WebActivities. Having standardized WebIntents/WebActivities
would enable more apps to avoid the APIs that require "privileged"
level, and so would help reduce the packaged apps problem. At this
point I think we have a better understanding of what it would take to
create a standardized WebIntents/WebActivities spec.

Unfortunately we haven't gotten further than that here. Action is on
me to put together some information about what we know so far so that
others can pick up this work.

So while there is little concrete progress, we're not doing nothing.
But I'm all for help in doing more. In particular I'd like to see us
do more work on getting WebIntents/WebActivities standardized. And do
more work on finding safe-to-expose-to-the-web variants or subsets of
our existing privileged-only APIs.

I'd personally put a safe subset of DeviceStorage API high up on the
priority list.

/ Jonas

Anne van Kesteren

Nov 18, 2013, 8:38:28 AM
to Jonas Sicking, Marcos Cáceres, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
On Mon, Nov 18, 2013 at 9:04 AM, Jonas Sicking <jo...@sicking.cc> wrote:
> We have started exploring creating a URL standard that would allow
> linking to resources inside a .zip file. This would actually allow
> having URLs to packaged apps. Not sure what the latest here is. Anne
> could probably fill in the blanks.

This does not seem to be going anywhere fast. The networking crowd is
not enthusiastic about it, given the poor performance characteristics.
Nobody seems super keen on driving it forward. When the TAG discussed
it we ended up with some alternate format that would allow MIME
headers, so a folder on a server could be more effectively packaged.
But again, nobody is really driving that.


> My next plan here is to attack the hosted apps and get those
> standardized. This is going much better and I hope that we can have a
> spec draft for the bookmark-to-homepage use case in just a couple of
> weeks. I've talked with multiple mobile browser vendors and there is a
> lot of interest here (I don't want to speak for others which is why
> I'm avoiding names).

I think there's still a large problem with these applications: we're
not giving them the same cache context as they would have inside the
browser. That makes a lot of things that are normal interactions on the
web today a lot harder to implement. If we want to offer a
no-shared-cache feature, it should be opt-in by the application, I
think.


I think anything that deviates from the current browser security model
through some kind of end-user UI is not good. Bookmarking Gmail should
not affect the interaction model. We should move towards making the
browser the OS. Inventing new security models around URLs won't get us
there.


--
http://annevankesteren.nl/

Marcos Caceres

Nov 18, 2013, 10:32:01 AM
to Anne van Kesteren, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g



On Monday, November 18, 2013 at 1:38 PM, Anne van Kesteren wrote:

> > My next plan here is to attack the hosted apps and get those
> > standardized. This is going much better and I hope that we can have a
> > spec draft for the bookmark-to-homepage use case in just a couple of
> > weeks. I've talked with multiple mobile browser vendors and there is a
> > lot of interest here (I don't want to speak for others which is why
> > I'm avoiding names).
>
>
>
> I think there's still a large problem with these applications. That
> we're not giving them the same cache context as they would have inside
> the browser. That makes a lot of things that are normal interactions
> on the web today, a lot harder to implement.

Can you please provide a few more details and use cases. Or file bugs:
https://github.com/w3c/manifest/issues

> If we want to offer a no
> shared cache feature, it should be opt-in by the application I think.


That might be good. Different runtimes handle this differently (e.g., Chrome beta on Android shares everything with the browser, while apps added to the home screen on iOS don’t share anything).

> I think anything that deviates from the current browser security model
> through some kind of end-user UI is not good.

Not sure what you mean here? Which UI are you talking about?
> Bookmarking Gmail should
> not affect the interaction model. We should move towards making the
> browser the OS. Inventing new security models around URLs won't get us
> there.

I generally agree with what you are saying above.

--
Marcos Caceres



Anne van Kesteren

Nov 18, 2013, 10:35:54 AM
to Marcos Caceres, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g
On Mon, Nov 18, 2013 at 11:32 PM, Marcos Caceres <mcac...@mozilla.com> wrote:
> On Monday, November 18, 2013 at 1:38 PM, Anne van Kesteren wrote:
>> I think there's still a large problem with these applications. That
>> we're not giving them the same cache context as they would have inside
>> the browser. That makes a lot of things that are normal interactions
>> on the web today, a lot harder to implement.
>
> Can you please provide a few more details and use cases. Or file bugs:
> https://github.com/w3c/manifest/issues

It seems the onus for that should be on those attempting to change the
model. I think we should keep the current model.


>> I think anything that deviates from the current browser security model
>> through some kind of end-user UI is not good.
>
> Not sure what you mean here? Which UI are you talking about?

Bookmarking. And in particular bookmarking to home screen should not
have dramatically different security properties from just bookmarking.
They ought to be identical (and indeed, have negligible effect).


--
http://annevankesteren.nl/

Marcos Caceres

Nov 18, 2013, 4:44:54 PM
to Anne van Kesteren, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g


On Monday, November 18, 2013 at 3:35 PM, Anne van Kesteren wrote:

> On Mon, Nov 18, 2013 at 11:32 PM, Marcos Caceres <mcac...@mozilla.com (mailto:mcac...@mozilla.com)> wrote:
> > On Monday, November 18, 2013 at 1:38 PM, Anne van Kesteren wrote:
> > > I think there's still a large problem with these applications. That
> > > we're not giving them the same cache context as they would have inside
> > > the browser. That makes a lot of things that are normal interactions
> > > on the web today, a lot harder to implement.
> > >
> >
> >
> >
> > Can you please provide a few more details and use cases. Or file bugs:
> > https://github.com/w3c/manifest/issues
> >
>
>
>
> It seems the onus for that should be on those attempting to change the
> model. I think we should keep the current model.
>


I think it's fine to put the onus on those working on this (mostly me), but I want to make sure we are aligned so I don't go off into the weeds and come back with something you and others don't like. I also don't want us to change the current browser caching model unless it's absolutely required. The spec is currently silent about this.

> > > I think anything that deviates from the current browser security model
> > > through some kind of end-user UI is not good.
> > >
> >
> >
> >
> > Not sure what you mean here? Which UI are you talking about?
>
> Bookmarking. And in particular bookmarking to home screen should not
> have dramatically different security properties from just bookmarking.
> They ought to be identical (and indeed, have negligible effect).
>


Thanks for clarifying. I think I agree with this, but I’m currently investigating what the implications are. I have a *very early* draft of this research [1], and the good thing is that at least IE on Windows 8, Firefox for Android, and Chrome for Android tend to agree with you (will push that part of the research to the document soon).

[1] http://w3c-webmob.github.io/installable-webapps/

--
Marcos Caceres



Anne van Kesteren

Nov 19, 2013, 6:28:18 AM
to Marcos Caceres, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g
On Mon, Nov 18, 2013 at 9:44 PM, Marcos Caceres <mcac...@mozilla.com> wrote:
> On Monday, November 18, 2013 at 3:35 PM, Anne van Kesteren wrote:
>> Bookmarking. And in particular bookmarking to home screen should not
>> have dramatically different security properties from just bookmarking.
>> They ought to be identical (and indeed, have negligible effect).
>
> Thanks for clarifying. I think I agree with this, but I’m currently investigating what the implications are. I have a *very early* draft of this research, and the good thing is that at least IE on Windows 8, Firefox for Android, and Chrome for Android tend to agree with you (will push that part of the research to the document soon).

So the differences I found that are not desirable are:

* If I'm logged into Facebook and then open a hosted app, that Facebook
session is distinct. That's not the case in a browser today. I can see
us adding a feature that allows Facebook to opt out of sharing its
session with other sites, but by default I think we want the existing
model. (And definitely not change it based on a bookmark.)

* When I click a link "the browser" opens. If the browser is core to
the OS everything in it should have a URL (whether visible or not,
that's an implementation detail) and navigation should happen
seamlessly between them. I don't think we need any behavioral
difference here between something that is bookmarked and something
that is not. Of course apps can opt in to having their links open in a
new window, using target=_blank.

I think part of the disconnect might be that the web as an actual OS
would be fundamentally different from what we have today. You cannot
really compare it with an OS that has a browser or an app market
(such as Mac OS X or Firefox OS). The OS is the browser and the app
market is the web.


--
http://annevankesteren.nl/

Marcos Caceres

Nov 19, 2013, 7:24:52 AM
to Anne van Kesteren, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g


> On 19 Nov 2013, at 11:28 am, Anne van Kesteren <ann...@annevk.nl> wrote:
>
>> On Mon, Nov 18, 2013 at 9:44 PM, Marcos Caceres <mcac...@mozilla.com> wrote:
>>> On Monday, November 18, 2013 at 3:35 PM, Anne van Kesteren wrote:
>>> Bookmarking. And in particular bookmarking to home screen should not
>>> have dramatically different security properties from just bookmarking.
>>> They ought to be identical (and indeed, have negligible effect).
>>
>> Thanks for clarifying. I think I agree with this, but I’m currently investigating what the implications are. I have a *very early* draft of this research, and the good thing is that at least IE on Windows 8, Firefox for Android, and Chrome for Android tend to agree with you (will push that part of the research to the document soon).
>
> So the differences I found that are n