Can we deprecate packaged apps?


Ben Francis

Jul 8, 2013, 5:31:49 PM
to dev-b2g, dev-w...@lists.mozilla.org
Sorry for the typo in the subject line. It wasn't an attempt at a clever
pun...

Hello all,

Apologies for the length of this email, I didn't have time to write a
shorter one.

A year ago Jonas sent out an email [1] outlining the requirements for a new
breed of trusted web apps. He explained some of the challenges of
fulfilling these requirements with hosted apps and set out the rationale
for starting with a packaged app solution, with a view to exploring
something more "webby" later.

Now that we have shipped v1 of Firefox OS [2] with a packaged solution for
trusted apps I would like to re-open this discussion and get your feedback
on whether and how we might make trusted apps more web-like.

In order to gain access to many of the new APIs we have created in Firefox
OS, web content creators must change their entire distribution model and
package the assets of their app into a zip file to be signed and served
from one or more app stores. These assets do not have their own URIs on the
Internet and are served over a local app:// protocol instead of HTTP. This
is similar to how native apps on other mobile platforms work, and is also
similar to the packaged apps used in Chrome & Chrome OS, as well as W3C
widgets. However, it isn't much like how the web works. There is no one
definitive version of the app at a single URI and the update process is
very different.

The System Applications Working Group [3] at the W3C have actually started
some early drafts of specifications to standardise an app manifest/package
format and the app:// URI scheme, based largely on the work done by
Mozilla. Meanwhile at Google I/O, Google showed a version of the Chrome Web
Store [4] where hosted apps are re-branded simply as "web sites", with the
term "app" being limited to packaged apps using their own .crx packaging
format. I have also heard suggestions that support for hosted apps may at
some point be deprecated in Chrome altogether, in favour of supporting only
packaged apps. The message coming from both Mozilla and Google right now is
that all trusted apps must be packaged; hosted web apps simply cannot be
trusted with access to new privileged APIs.

What's sad about this vision of the future is that many of the most
interesting apps that get written using web technologies like HTML, CSS and
JavaScript will not actually be part of the web. As Tim Berners-Lee
recently put it in an interview with the BBC about native apps [5], when
apps and their content don't have a URI on the Internet they are not part
of the "discourse" of the web and are therefore non-web. This was a topic
discussed at the "Meet the TAG" event hosted by Mozilla in London recently,
with members of the W3C's Technical Architecture Group expressing
disappointment in this trend.

Are we happy with a packaged model for trusted apps going forward, or is
now the time to embrace the challenge of making trusted apps genuinely part
of the web? Perhaps we don't even need to restrict our thinking to the
"app" and "app store" model and can explore ways of exposing more
privileged APIs to all web content in a trusted way.

If you're interested in the nitty gritty of this problem, I've tried to
summarise Jonas' original email below. I hope he forgives me if I
mis-represent him in any way, but you can read his original email in the
archives [1].

...

In his email, Jonas proposed the following requirements for trusted apps:
1. The ability for a trusted party to review an app and indicate some level
of trust in the app (or potentially in the app developer).
2. A mechanism for signing an app to verify that the app actually contains
the content that was reviewed.
3. Use of a minimum CSP policy for all pages of an app to ensure only the
reviewed code runs.
4. A separate data jar for local data to ensure a compromised web site can
not write to the local data of an app to alter the way it behaves.
5. A separate origin for the resources of an app so that the app can not be
tricked into running un-reviewed code from the same origin with escalated
privileges.

Jonas explained that the initial intention was to host trusted apps in the
same way as non-trusted apps, to retrieve the signatures for reviewed files
from an app store, but the files themselves directly from the app's own web
server.

He explained some problems with this approach:
a) HTTPS must be used to ensure proxies don't modify the headers or body of
HTTP responses, invalidating the signature. This would create an overhead
for app developers.
b) If multiple stores host signatures for the same app but review the app
at different speeds, updating of the app resources and signatures must be
synchronised between different stores and be limited to the speed of the
slowest review.
c) Signed resources have to be static because if a resource is dynamically
generated by a server side script, the signature would also need to be
dynamically generated, requiring a private key to be stored on the same
server, which largely defeats the object of the signing system.

It was argued that this would result in a system which, while hosted like
the rest of the web, is not very "webby" and is the worst of both worlds.

This led to the conclusion for us to package up trusted apps and to serve
their resources locally over a new app:// protocol.


My question is what might a hosted solution to this problem look like?

Ben

1. https://groups.google.com/forum/#!topic/mozilla.dev.webapps/hnCzm2RcX5o
2. https://wiki.mozilla.org/WebAPI
3. http://www.w3.org/2012/sysapps/
4. https://twitter.com/bfrancis/status/335728228897550336/photo/1
5. http://www.bbc.co.uk/iplayer/episode/b036zdhg/Click_06_07_2013/

Benjamin Smedberg

Jul 8, 2013, 8:36:17 PM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
On 7/8/2013 5:31 PM, Ben Francis wrote:
> What's sad about this vision of the future is that many of the most
> interesting apps that get written using web technologies like HTML, CSS and
> JavaScript will not actually be part of the web.
I don't think this is true or sad!

One of the fundamental strengths of the web is that you can download an
entire "page" and save it locally, and even view-source and hack it.

With the advent of online web apps, it has become much more difficult to
view-source the web, because dynamic loading and complex multi-page apps
often don't give you a way to download and view-source the entire app.
It also means that there is often not a clear separation between the client
logic in the app and the server logic that most apps depend on.

Packaged apps are the most elegant "stupidly simple" solution to the
offline problem that continues to plague those who want unpackaged apps
to work as if they were downloadable entities.

Packaged apps are not a problem or something to be "sad" about, but
something to rejoice in. They are a way of empowering users. We should
be encouraging all app authors to use packaged apps, even if they don't
need any special permissions.

To solve the problems mentioned by TimBL, we should just make sure that
servers can serve up a packaged app at a URL, and clients can just use
it and choose to keep it on their homescreen or not, and their client
would keep using that URL to check for updates. (Ignoring for the moment
the special security requirements with high-privilege APIs, stores,
signing, and all that stuff). Then "app search" is again the same as
"web search".

--BDS

Mark Giffin

Jul 8, 2013, 11:33:53 PM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
I can't see this outside the UK, I'm in the US of A:

5. http://www.bbc.co.uk/iplayer/episode/b036zdhg/Click_06_07_2013/

Alternate available?

Mark

Matt Basta

Jul 8, 2013, 11:46:35 PM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
Without arguing in favor of or against packaged apps and to address the closing question of "what might a hosted solution to this problem look like?", I think the first and most important issue is this: developers need something better than appcache for offline support.

Without packaged apps, the only alternative is appcache, and appcache is bad. It's hard to test, it's buggy, it oftentimes (most of the time) doesn't work as expected, it doesn't work in a deterministic way, there are no dev tools for it in Firefox (!), until very (VERY) recently it wasn't possible to clear it in the FxOS Simulator, and it's a source of general confusion and dislike. If we were to imagine an app ecosystem without packaged apps, offline support would be the most lacking component.
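For readers who haven't fought with it: appcache is driven by a plain-text manifest served alongside the page, along these minimal lines (per the HTML5 spec of the time):

```
CACHE MANIFEST
# v1 - change this comment to force clients to re-download everything

CACHE:
index.html
app.js
style.css

NETWORK:
*
```

Much of the pain comes from that comment-bump update model: nothing refreshes until the manifest bytes change, and even then users get the new version only on the second load.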

Tim Chien

Jul 9, 2013, 1:31:11 AM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
I want to echo Ben's opinion here. While some of the features are
irreplaceable for the time being (offline access, reviewability by
the marketplace, sandboxing of our experimental APIs, even portability,
etc.), packaged apps themselves are not free:

-- By distributing the app in zip packages and not simply as a URL,
the OS has to handle preloading/updating/deleting/etc.
-- While these were all implemented (in a rushed way for v1.0.1), many
of the proposals are still waiting, for example shared resource packages
and library packages. I strongly suspect we will end up
re-implementing the Debian package system in Gecko if we go down this
path.
-- By moving away from a universal URL (ironically, that's what the U
stands for), places where a valid URL is required will fail, e.g. the
OAuth auth flow.
-- Our data jar policy is not directly related to packaged apps, but
many of the surprises (for users and for developers) come from it
too (e.g. a user has to log in to Facebook at least 3 times in order
to use it in the Browser app, as a website-on-home-screen, and in the
Facebook app itself.)

It would make sense for Mozilla to have a solid roadmap toward a solution
for the Open Web itself, rather than relying on packaged apps and
spending time solving their technical (and non-technical) problems.

Fabrice Desre

Jul 9, 2013, 2:02:21 AM
to dev...@lists.mozilla.org

On 07/08/2013 10:31 PM, Tim Chien wrote:
> I want to echo Ben's opinion here. While some of the features are
> irreplaceable for the time being (offline access, reviewability by
> the marketplace, sandboxing of our experimental APIs, even portability,
> etc.), packaged apps themselves are not free:
>
> -- By distributing the app in zip packages and not simply as a URL,
> the OS has to handle preloading/updating/deleting/etc.

So, what about using a manifest-based URL scheme:
mnf:http://myapp.com/manifest.webapp!/path/to/frame.html
This would let us address resources from both hosted and packaged apps in
the same way, and let us convert apps between hosted <-> packaged.
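A rough sketch of how a client might split such a URL (assuming exactly the `!` separator written above; `parseMnfUrl` is a hypothetical helper, not an implemented Gecko API):

```javascript
// Split a manifest-based URL of the form
//   mnf:http://myapp.com/manifest.webapp!/path/to/frame.html
// into the manifest location and the in-app resource path.
function parseMnfUrl(url) {
  if (!url.startsWith('mnf:')) throw new Error('not a mnf: URL');
  const rest = url.slice('mnf:'.length);
  const bang = rest.indexOf('!');
  if (bang === -1) throw new Error('missing ! separator');
  return {
    manifestUrl: rest.slice(0, bang),   // same form for hosted and packaged apps
    resourcePath: rest.slice(bang + 1), // path resolved relative to the app
  };
}

console.log(parseMnfUrl('mnf:http://myapp.com/manifest.webapp!/path/to/frame.html'));
// { manifestUrl: 'http://myapp.com/manifest.webapp',
//   resourcePath: '/path/to/frame.html' }
```

The appeal is that the part before the `!` stays a plain, fetchable HTTP URL, so the same address works whether the app's resources arrive individually or in a package.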

> -- While these were all implemented (in a rushed way for v1.0.1), many
> of the proposals are still waiting, for example shared resource packages
> and library packages. I strongly suspect we will end up
> re-implementing the Debian package system in Gecko if we go down this
> path.

I'm not sure I understand you there. I don't think we want shared
resources - that would lead to the terrible situation of Firefox OS-only
apps.

> -- By moving away from a universal URL (ironically, that's what the U
> stands for), places where a valid URL is required will fail, e.g. the
> OAuth auth flow.

app:// URLs are URLs... they are just not recognized by OAuth providers.
That may change, or not.

> It would make sense for Mozilla to have a solid roadmap toward a solution
> for the Open Web itself, rather than relying on packaged apps and
> spending time solving their technical (and non-technical) problems.

This is what people like Jonas, Mounir, Marcos and others are doing.
That's hard and will take time.

Fabrice
--
Fabrice Desré
b2g team
Mozilla Corporation

Kan-Ru Chen (陳侃如)

Jul 9, 2013, 2:09:12 AM
to Ben Francis, dev-w...@lists.mozilla.org, dev...@lists.mozilla.org
Seconded.

We should move the web forward instead of restricting ourselves to the
packaged app approach.

Ben Francis <bfra...@mozilla.com> writes:

[...]
> Are we happy with a packaged model for trusted apps going forward, or is
> now the time to embrace the challenge of making trusted apps genuinely part
> of the web? Perhaps we don't even need to restrict our thinking to the
> "app" and "app store" model and can explore ways of exposing more
> privileged APIs to all web content in a trusted way.
>
> If you're interested in the nitty gritty of this problem, I've tried to
> summarise Jonas' original email below. I hope he forgives me if I
> mis-represent him in any way, but you can read his original email in the
> archives [1].
>
> ...
>
> In his email, Jonas proposed the following requirements for trusted apps:
> 1. The ability for a trusted party to review an app and indicate some level
> of trust in the app (or potentially in the app developer).
> 2. A mechanism for signing an app to verify that the app actually contains
> the content that was reviewed.

IMO these two are the biggest problems. The current trust model on the
web is to trust the author; how do we extend that to trust the content?

> 3. Use of a minimum CSP policy for all pages of an app to ensure only the
> reviewed code runs.
> 4. A separate data jar for local data to ensure a compromised web site can
> not write to the local data of an app to alter the way it behaves.
> 5. A separate origin for the resources of an app so that the app can not be
> tricked into running un-reviewed code from the same origin with escalated
> privileges.
>
> Jonas explained that the initial intention was to host trusted apps in the
> same way as non-trusted apps, to retrieve the signatures for reviewed files
> from an app store, but the files themselves directly from the app's own web
> server.
>
> He explained some problems with this approach:
> a) HTTPS must be used to ensure proxies don't modify the headers or body of
> HTTP responses, invalidating the signature. This would create an overhead
> for app developers.
> b) If multiple stores host signatures for the same app but review the app
> at different speeds, updating of the app resources and signatures must be
> synchronised between different stores and be limited to the speed of the
> slowest review.

Packaged apps could also suffer from slow review speed. The app has
to handle the coexistence of old and new app versions anyway. If the app
could use a versioned URL then the signing wouldn't have to be
synchronized between stores.

> c) Signed resources have to be static because if a resource is dynamically
> generated by a server side script, the signature would also need to be
> dynamically generated, requiring a private key to be stored on the same
> server, which largely defeats the object of the signing system.
>
> It was argued that this would result in a system which, while hosted like
> the rest of the web, is not very "webby" and is the worst of both worlds.
>
> This led to the conclusion for us to package up trusted apps and to serve
> their resources locally over a new app:// protocol.

Later the app:// protocol also served other purposes like faster app
loading.

Kanru

Brett Zamir

Jul 9, 2013, 5:38:27 AM
to Benjamin Smedberg, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
Hi,

Encouraging to see this discussion as my dissatisfaction with the hassle
of packaged apps led me to my however humble work on
https://github.com/brettz9/asyouwish/ . Replies below...

On 7/9/2013 8:36 AM, Benjamin Smedberg wrote:
> On 7/8/2013 5:31 PM, Ben Francis wrote:
>> What's sad about this vision of the future is that many of the most
>> interesting apps that get written using web technologies like HTML,
>> CSS and
>> JavaScript will not actually be part of the web.
> I don't think this is true or sad!
>
While I agree with some of the motivation behind your arguments (though
see below), I think the situation is indeed sad for a number of reasons.

I have personally witnessed several examples in my or others' proposals
to WhatWG which were rejected because the WhatWG apparently felt that
HTML beyond the Web was not their business or that a server environment
can be taken for granted. One example was the rejection (at least at the
time) of client-side includes, another was for a mechanism of
client-side entities, and another was the cold reassurance that adding
new globals like "Worker" (instead of scoping new APIs into some
namespace, as has been done in some cases by adding to, say, the
"navigator" global) was safe because Google's search engine had been used to
check whether a variable name was already in frequent use.

There indeed appears to be even an open admission of failing to take
into account all environments of usage, though I would agree that that
itself does not necessitate moving away from non-web usages (and I would
hope the dismissive "online-only" attitude would change). However, I
think there are other compelling reasons for privileged online apps (if
I can extend "online" to also include the "file://" protocol along with
http/s), with the most important being the ease of publishing and
consumption of such apps.

> One of the fundamental strengths of the web is that you can download
> an entire "page" and save it locally, and even view-source and hack it.
>
> With the advent of online web apps, it has become much more difficult
> to view-source the web, because dynamic loading and complex multi-page
> apps often don't give you a way to download and view-source the entire
> app. It also means that there is often not a clear separation between the
> client logic in the app and the server logic that most apps depend on.
>
First of all, there are tools to view the generated source or
potentially view generated code so at least client-side code can be
introspected (and generally without going through other additional hoops
as with packaged apps).

Yes, preventing online distribution of apps in favor of packaged ones
may COMPEL ALL people to distribute their code in entirety (or more
likely, go to non-web solutions), but why force this? It is one thing to
want applications written in HTML or JavaScript so that one can
introspect the code that is run in one's own machine in a familiar
language (though I'm not so sure force is the best solution even there).
But if you want to see server-side-dependent code become less opaque,
why not go with the following less universally compelling and cumbersome
solutions?

1) Evangelize licenses like the AGPL if you want to promote apps
including their server-side code.

Few developers choose this license from what I can tell, however, as we
ourselves don't want to be compelled so stringently--though if you do
send us code, many will indeed bristle against our own machines being
used as black boxes (as with non-standard plug-in code).

2) Bake more server-interactive functionality into client-side languages
like HTML/JavaScript in a transparent manner so that one would, for such
apps, be clear on what is (most likely) happening on the server without
needing to introspect its code even while leaving apps free to interact
with the server.***

> Packaged apps are the most elegant "stupidly simple" solution to the
> offline problem that continues to plague those who want unpackaged
> apps to work as if they were downloadable entities.

But many of us--as users or developers--don't want apps to work as
downloadable entities! It is a hassle to package apps, it slows
development time, it is annoying to users--including to users who wish
to inspect them (e.g., all the darn XPI and JAR files I have to unzip to
introspect some Firefox extensions). Moreover, packaging, if it needs to
exist at all, should be an implementation detail--if your browser really
feels the need to save space, go ahead and do the packaging for me, but
don't require me to zip things myself, learn build scripts, etc.

The web is great in large part because anyone can start working on it
today without going through unnecessary hoops. File packaging is a
hoop--and an annoying one given especially the frequency of use we're
describing here. I want to spend time programming, not zipping or
unzipping files. I can't tell you how much faster I have been able to
develop (despite a number of personal limitations) what I think are some
pretty cool proof-of-concept demos without the hassle of file packaging
by using my AsYouWish add-on which is my interim solution for such
transparent requests of privileges from regular websites (to the SDK
API): https://github.com/brettz9/asyouwish/tree/master/demos
(experimental Firefox addon needed,
https://addons.mozilla.org/en-US/firefox/addon/as-you-wish/ along with
add-on options configured to even accept privileged requests from a
given protocol/site)

> Packaged apps are not a problem or something to be "sad" about, but
> something to rejoice in. They are a way of empowering users. We should
> be encouraging all app authors to use packaged apps, even if they
> don't need any special permissions.
>
If the criterion is to ensure code is self-contained, then one could
achieve this by allowing web authors to send a flag in their code
indicating the program was wholly self-contained (kind of a reverse
appcache---cache everything by default).

But it is hardly inevitable or pleasurable that one should:
1) be forced to take the extra steps of packaging my own app for
distribution, unpackaging someone else's app to introspect or hack on
it, or deal with a build system. I also want the Web to be equally easy
for any newcomers to the Web whose ideas may otherwise remain
undeveloped because of these however seemingly small barriers to entry.
2) be restricted from developing apps which have server interactions
where the interactivity is useful

I would hope that in the search for a non-packaged privileged access
solution, the (potentially cross-browser) proposed privileged API for
mobile and non-mobile, independent apps and extensions (of desktop or
mobile browsers or systems), could be harmonized with each other:
https://bugzilla.mozilla.org/show_bug.cgi?id=848647 and even with the
likes of Node.js: https://bugzilla.mozilla.org/show_bug.cgi?id=855936

Best wishes,
Brett

*** There is a lot of lip service given to building RESTful
applications, but the fact remains that RESTful APIs are not wholly
reusable; different sites use different query parameters for the same
purpose.

If, however, certain headers were used to make secure but otherwise
arbitrary XPath or CSS Selector queries against a document (e.g.,
somewhat along the lines of the Range header, but for HTML-aware
piecemeal delivery of documents instead of byte-wise access), besides
enabling privileged clients to treat the web, including otherwise static
files, more as its own XML database, and besides allowing one to get
genuine reusability across websites of web-friendly query APIs (and thus
better introspectability), HTML itself could prescribe usage of these
headers to tie in markup for server-side interaction, thereby also
minimizing server-side coding and need for 3rd-party library inclusion
(unless so desired by the server) while ensuring powerful querying
functionality is more available by default on even simple web documents.

For example, for clients advertising support for this header feature,
even a static document could be scanned by a server before being served
to the user, and delivered, e.g., with all tables above say 50 rows to
be truncated to the first 50 rows, and with markup added to indicate to
the client (subject to its own user preferences) that _the table could
have pagination controls added browser-side_. This would provide
frequently-used functionality without need for third party libraries or
custom server-side programming.

Similar functionality could be added to allow hierarchical list
drill-downs, paragraph range selections, etc., and although schemas have
fallen out of favor, if also specified (as in a header), a browser could
do even more in partnership with the server, e.g.:

1) type-aware sorting of tables (without the ugliness of what is
apparently emerging out of the current standard with each cell needing
its own type markup)
2) type-aware searches of tables or other elements, e.g., browser
display of date controls for date-type columns (or number ranges for
numeric fields, etc.), allowing users to make queries to obtain all
records whose values lie within a specific date range.

Mechanisms could also be added to allow easy caching/offline storage of
hitherto-retrieved rows.

The above markup-header interaction could encourage introspection by:
1) Facilitating offline application development
2) Avoiding an undue generation of custom coding and thus effort in
deciphering server-side APIs and code

...while continuing to avoid coupling the HTML language to any database
or static file format (and avoiding file packaging).

I've started some initial work on this idea at
https://github.com/brettz9/httpquery though I am currently trying to
simplify the proposed header field syntax and streamline it with
standard practices.


Brett Zamir

Jul 9, 2013, 5:52:14 AM
to Brett Zamir, dev-w...@lists.mozilla.org, Ben Francis, Benjamin Smedberg, dev-b2g
On 7/9/2013 5:38 PM, Brett Zamir wrote:
> Hi,
>
> Encouraging to see this discussion as my dissatisfaction with the
> hassle of packaged apps led me to my however humble work on
> https://github.com/brettz9/asyouwish/ . Replies below...
>

I should clarify here that I mean dissatisfaction with packaged apps in
the generic sense of browser extensions as well as packaged mobile web
apps. (I have not yet made a FF OS version of AsYouWish.)

Brett

Salvador de la Puente González

Jul 9, 2013, 10:58:04 AM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
Hello

I want to say I totally support the idea of removing packaged apps.
IMHO, we need a way to move from on-line to off-line
applications in a transparent way in order to make the "installation"
process almost trivial. For this purpose some mechanism like the offline
cache is necessary. If the current implementation is buggy, let's find a
better alternative.

As for the problems exposed by Jonas, I'm not an expert but some of them
can be easily addressed:

On 08/07/13 23:31, Ben Francis wrote:
> In his email, Jonas proposed the following requirements for trusted apps:
> 1. The ability for a trusted party to review an app and indicate some level
> of trust in the app (or potentially in the app developer).
With a list of downloadable content to be installed (something like the
Offline Cache), a third party could retrieve a version of the software,
then review it.
> 2. A mechanism for signing an app to verify that the app actually contains
> the content that was reviewed.
With a digest algorithm based on the content of each file, the third
party and the User Agent could compute the same signature and see if it
matches. I have in mind something SHA-1-based, like in Git. After all, it
is all about the content.
> 3. Use of a minimum CSP policy for all pages of an app to ensure only the
> reviewed code runs.
Sorry but I don't understand the problem here. I can currently load an
external script from a packaged application and run its code.
> 4. A separate data jar for local data to ensure a compromised web site can
> not write to the local data of an app to alter the way it behaves.
If the UA receives an update order for the Offline Cache of a given
app, it can compute another digest and send it to the third party in order
to see if the new code has been reviewed. Maybe I did not understand
what a compromised web site means in this context or how it could be a
hazard.
> 5. A separate origin for the resources of an app so that the app can not be
> tricked into running un-reviewed code from the same origin with escalated
> privileges.
>
Same as above. With the digest mechanism you can tell whether the version
the third party reviewed and the version on the device differ, and then ask
the user.

If you think the digest process delays application execution, that is
true, but you can absorb this time by folding it into the
"installation/update process".

Maybe I'm very naive about security, so if you can clarify some
aspects I did not take into account, I would be very pleased to read
about them.

Best!


Kannan Vijayan

Jul 9, 2013, 11:52:46 AM
to dev...@lists.mozilla.org
The thoughts and ideas presented in this mail are very cogent. Some of
these issues
can probably be addressed with more independence than others.

Lifting the app:// protocol up to https://, leaving everything else the
same, doesn't
seem like it should be that big of a difficulty. This would resolve the
"apps don't have
a URI" issue directly.. if perhaps in a shallow way.

The other issues seem more challenging. The main challenge of packaged vs.
non-packaged apps relates to deficiencies in connectivity in the current
mobile world.
The web is designed for always-connected. Any page or resource is
implicitly assumed
to have the ability and right to send you to another location to fetch
some associated
information. The web doesn't try to force a clear notion of boundaries
between
what should be included in a single pull and what's OK to leave to
secondary pulls
induced by HREFs embedded in the first pull.

This model doesn't fit so well with the mobile world, where each access
to the
network, and each byte transferred, may be ridiculously expensive. Under
that context,
it's very important to clearly define the boundaries between actions
that induce network
traffic, and actions that stay within resources that have already been
pulled.

We need some way to transparently indicate to the OS that "this is the
set of resources
you need to pre-fetch to minimize the set of subsequent fetches when
using the
application".

Fabrice mentioned the possibility of app manifests, and I think that's a
very workable
approach to this. In fact, it would be great if we can introduce the
notion of packaged
resources to the web transparently, independent of the specific
mobile-apps use case.

Using Fabrice's suggested syntax, a URL of the form:

http://host/some/path/name!/sub/path

Would indicate to the client agent that the resource located at
'.../name' is allowed
(but not required) to be a packaged file of some set of standardized
formats (e.g. zip).

The specified client behaviour when fetching this URL would be to first
try fetching
the full path: ("http://host/some/path/name!/sub/path"), and if that
request fails
with a NOT FOUND, to try fetching "http://host/some/path/name", checking
to see if
the identified resource is an appropriately packaged file, and if so,
retrieving
"/sub/path" from within the package, and using that as a resource.

This would keep backwards compatibility, and transparently introduce a notion of locality to packaged web resources, which should help us take another step towards making packaged apps first-class citizens of the web.
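The resolution algorithm described above can be sketched as follows. This is a toy model, not a real Gecko API: the "server" is just a dict of URL -> bytes, and packages are in-memory zip archives.

```python
# Toy model of the "!" URL fallback: try the literal URL first; on a
# NOT FOUND, fetch the resource before the "!" and, if it is a zip
# package, read the sub-path from inside it.
import io
import zipfile

def split_bang_url(url):
    """Split 'http://host/some/path/name!/sub/path' at the first '!/'."""
    if "!/" not in url:
        return url, None
    package_url, sub_path = url.split("!/", 1)
    return package_url, sub_path

def fetch(server, url):
    if url in server:                       # ordinary fetch succeeded
        return server[url]
    package_url, sub_path = split_bang_url(url)
    if sub_path is None or package_url not in server:
        raise KeyError(url)                 # genuine NOT FOUND
    with zipfile.ZipFile(io.BytesIO(server[package_url])) as pkg:
        return pkg.read(sub_path)           # resource from inside the package

# Fake server hosting one zip package at '.../name'.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("sub/path.html", b"<h1>hello</h1>")
server = {"http://host/some/path/name": buf.getvalue()}

body = fetch(server, "http://host/some/path/name!/sub/path.html")
```

Note that an unpackaged site which really does serve a resource at the full literal path keeps working unchanged, which is where the backwards compatibility comes from.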

Kannan
> What's sad about this vision of the future is that many of the most
> interesting apps that get written using web technologies like HTML, CSS and
> JavaScript will not actually be part of the web. As Tim Berners-Lee
> recently put it in an interview with the BBC about native apps [5], when
> apps and their content don't have a URI on the Internet they are not part
> of the "discourse" of the web and are therefore non-web. This was a topic
> discussed at the "Meet the TAG" event hosted by Mozilla in London recently,
> with members of the W3C's Technical Architecture Group expressing
> disappointment in this trend.
>
> Are we happy with a packaged model for trusted apps going forward, or is
> now the time to embrace the challenge of making trusted apps genuinely part
> of the web? Perhaps we don't even need to restrict our thinking to the
> "app" and "app store" model and can explore ways of exposing more
> privileged APIs to all web content in a trusted way.
>
> If you're interested in the nitty gritty of this problem, I've tried to
> summarise Jonas' original email below. I hope he forgives me if I
> mis-represent him in any way, but you can read his original email in the
> archives [1].
>
> ...
>
> In his email, Jonas proposed the following requirements for trusted apps:
> 1. The ability for a trusted party to review an app and indicate some level
> of trust in the app (or potentially in the app developer).
> 2. A mechanism for signing an app to verify that the app actually contains
> the content that was reviewed.
> 3. Use of a minimum CSP policy for all pages of an app to ensure only the
> reviewed code runs.
> 4. A separate data jar for local data to ensure a compromised web site can
> not write to the local data of an app to alter the way it behaves.
> 5. A separate origin for the resources of an app so that the app can not be
> tricked into running un-reviewed code from the same origin with escalated
> privileges.
>
> Jonas explained that the initial intention was to host trusted apps in the
> same way as non-trusted apps, to retrieve the signatures for reviewed files
> from an app store, but the files themselves directly from the app's own web
> server.
>
> He explained some problems with this approach:
> a) HTTPS must be used to ensure proxies don't modify the headers or body of
> HTTP responses, invalidating the signature. This would create an overhead
> for app developers.
> b) If multiple stores host signatures for the same app but review the app
> at different speeds, updating of the app resources and signatures must be
> synchronised between different stores and be limited to the speed of the
> slowest review.
> c) Signed resources have to be static because if a resource is dynamically
> generated by a server side script, the signature would also need to be
> dynamically generated, requiring a private key to be stored on the same
> server, which largely defeats the object of the signing system.
>
> It was argued that this would result in a system which, while hosted like
> the rest of the web, is not very "webby" and is the worst of both worlds.
>
> This led to the conclusion for us to package up trusted apps and to serve
> their resources locally over a new app:// protocol.
>
>
> dev-b2g mailing list
> dev...@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-b2g

Kumar McMillan

Jul 9, 2013, 4:24:16 PM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
I see a strong future for the concept of packaged apps in a webby way. Instead of location-based URLs, think of each app as addressable by its unique content hash. This would allow a decentralized and distributed "web" where all resources are shared. Consider the web server you throw money at to handle traffic: it could be replaced by millions of mobile devices (perhaps even with peer-to-peer connections).

Thus, it might be beneficial to look at the research work being done in this field as a guide: Named Data Networking and Content Centric Networking.

https://en.wikipedia.org/wiki/Named_data_networking
http://www.parc.com/work/focus-area/content-centric-networking/

There are a lot of working NDN prototypes (https://github.com/named-data); the packaged apps problem seems very similar in nature.
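To make the idea concrete, here is a minimal sketch of content addressing, assuming an invented `ndn:`-style naming scheme (the real NDN protocols are far richer than this):

```python
# Name an app package by the hash of its bytes: then *any* peer can
# serve it, and the client can verify what it received against the
# name it asked for. The "ndn:/apps/..." prefix is invented.
import hashlib

def content_address(package_bytes):
    return "ndn:/apps/sha256/" + hashlib.sha256(package_bytes).hexdigest()

def verify(address, package_bytes):
    """Bytes from an untrusted peer are valid iff they hash back to
    the address they were requested under."""
    return content_address(package_bytes) == address

package = b"pretend-these-are-zip-bytes"
addr = content_address(package)

ok = verify(addr, package)               # authentic copy
tampered = verify(addr, package + b"!")  # modified in transit
```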

Kumar

Antonio Manuel Amaya Calvo

Jul 9, 2013, 5:13:10 PM
to Salvador de la Puente González, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
Hey all.

You know, at the risk of being the discordant voice here, I don't see
what the issue with packaged apps is. It's not like you're required to
use a packaged app if you want to develop an app that doesn't use
special APIs. You can just develop the app as you would normally, host
it somewhere, test it that way, and then, after you've finished
developing it, if it's mostly static content (static HTML, JS and
images), you have the *option* to package it so your users can use it
without a network connection. You don't have to do it, though; it's your
decision. If you prefer not to package it, you can just distribute it as
a hosted app.

Oh, and the cool thing is that so long as you've only used relative URIs
in your app, you won't have to change a single iota to change it from a
hosted app to a packaged app.

So, for normal apps, I don't see having packaged apps as a bad thing.
It's an extra option, and having more options is good in my book. I
don't believe in forcing people into any vision of what the web should
be, and what is webby and what isn't. The web will be whatever people
make of it. If packaged apps are useful they will prosper and grow; if
they're not, they'll shrink and die. And that's good.

Now, privileged apps are another problem altogether, and something that
doesn't actually have a close relative in the existing web. In Firefox
OS we're exposing some APIs to content that are or can be dangerous if
used incorrectly or with bad intent. And to do that, we've taken on the
responsibility of telling the user: hey, it's OK if you allow this app
to access your contacts, because WE have checked it and WE have seen it
will treat your data with due respect. We cannot do that unless we
ensure that the content of the app we've examined is *static*. We could
give the option for that content to be served remotely, along with the
signature of every piece of code we download (raw hashes, as Salva said
below, will just not work). But the server cannot serve dynamic content,
because otherwise the signatures won't be of any use. And I'm not even
sure whether even that would be enough, since a developer could submit
for signing more content than he actually uses... Anyway...
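To illustrate why raw hashes alone are not enough: anyone can recompute a hash over modified content, so the trusted party has to *sign* the list of reviewed hashes. In the sketch below, HMAC stands in for a real public-key signature purely to keep the example self-contained (a real store would sign with a private key and devices would verify with the public one); every name here is illustrative.

```python
# A store signs a manifest of per-file digests; the device later checks
# (a) the signature over the manifest and (b) each served file against
# its reviewed digest. HMAC is a stand-in for an asymmetric signature.
import hashlib
import hmac
import json

STORE_KEY = b"stand-in-for-the-store-signing-key"

def hash_manifest(files):
    """Digest of each reviewed file, keyed by path."""
    return {path: hashlib.sha256(body).hexdigest()
            for path, body in files.items()}

def sign_manifest(manifest):
    blob = json.dumps(manifest, sort_keys=True).encode()
    return hmac.new(STORE_KEY, blob, hashlib.sha256).hexdigest()

def verify_app(files, manifest, signature):
    if not hmac.compare_digest(sign_manifest(manifest), signature):
        return False                         # manifest not signed by the store
    return hash_manifest(files) == manifest  # served bytes match the review

reviewed = {"index.html": b"<html>v1</html>", "app.js": b"run();"}
manifest = hash_manifest(reviewed)
sig = sign_manifest(manifest)

ok = verify_app(reviewed, manifest, sig)
# Dynamic (changed) content breaks verification, as argued above:
bad = verify_app({**reviewed, "app.js": b"evil();"}, manifest, sig)
```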

In that model, every time a developer wanted to change a single comma in
his code he would have to resend all the content (and not just the code
he changed, because of possible interactions) to be re-examined. Then a
trusted third party (because I, as a user, love developers but don't
trust them as far as I can throw them) would sign the data again and
send the signatures back, and only then could the developer swap one set
of static content for another.

What does he win with this model, compared with the current one? Just
not having to actually package the app. What do end users win? Hmm...
without any way of locally caching content, they actually *lose*:
they'll have to download the static content over and over and over. Oh,
and the signature will have to be re-verified every time the content is
loaded (whereas currently it is only checked at install time).

Moving away from a packaged/controlled environment for privileged apps
will make it harder to actually trust them. The way I would actually
like us to go is to, slowly, start opening some APIs that are currently
certified-only to privileged apps, so more and richer third-party apps
can be added to the ecosystem. And I don't see that happening if we
relax the current trust/security model.

Best regards,

Antonio



On 09/07/2013 16:58, Salvador de la Puente González wrote:
> Hello
>
> I want to say I totally support the ideal of removing packaged apps.
> IMHO, I think we need a way to move from on-line to off-line
> applications in a transparent way in order to make the "installation"
> process almost trivial. For this purpose some mechanism like the
> offline cache is necessary. If the current implementation is buggy,
> let's find a better alternative.
>
> For the problems exposed by Jonas, I'm not an expert but some of them
> can be easily addressed:
>
> On 08/07/13 23:31, Ben Francis wrote:
>> In his email, Jonas proposed the following requirements for trusted
>> apps:
>> 1. The ability for a trusted party to review an app and indicate some
>> level
>> of trust in the app (or potentially in the app developer).
> With a list of downloadable content to be installed (something like the
> Offline Cache), a third party could retrieve a version of the software,
> then review it.
>> 2. A mechanism for signing an app to verify that the app actually
>> contains
>> the content that was reviewed.
> With a digest algorithm based on the content of each file, the third
> party and the User Agent could compute the same signature and see if it
> matches. I have in mind something sha1-based, like in Git. In the end,
> it's all about content.
>> 3. Use of a minimum CSP policy for all pages of an app to ensure only
>> the
>> reviewed code runs.
> Sorry but I don't understand the problem here. I can currently load an
> external script from a packaged application and run its code.
>> 4. A separate data jar for local data to ensure a compromised web
>> site can
>> not write to the local data of an app to alter the way it behaves.
> If the UA receives an update order for the Offline Cache of a given
> App, it can perform another digest and send it to the third party in
> order to see if the new code has been reviewed. Maybe I did not
> understand what a compromised web site is in this context, or how it
> could be a hazard.
>> 5. A separate origin for the resources of an app so that the app can
>> not be
>> tricked into running un-reviewed code from the same origin with
>> escalated
>> privileges.
>>
> Same as above. With the digest mechanism you can tell whether the
> version the third party reviewed and the version on the device differ,
> and then ask the user.
>
> If you think the digest process delays application execution, that is
> true, but you can hide this time by folding it into the
> "installation/update process".
>
> Maybe I'm very naive about security, so if you can clarify some
> aspects I did not take into account, I would be very pleased to read
> about them.
>
> Best!
>


Nicolas B. Pierron

Jul 9, 2013, 5:25:54 PM
to mozilla...@lists.mozilla.org
On 07/09/2013 08:52 AM, Kannan Vijayan wrote:
> Lifting the app:// protocol up to https://, leaving everything else the
> same, doesn't seem like it should be that big of a difficulty. This
> would resolve the "apps don't have a URI" issue directly... if perhaps
> in a shallow way.

Except that we might not want to penalize the loading time of applications
by doing 2 synchronous requests to the server for each file.

> The other issues seem more challenging. The main challenge of packaged vs.
> non-packaged apps relates to deficiencies in connectivity in the current
> mobile world.
> […]
>
> This model doesn't fit so well with the mobile world, where each access
> to the network, and each byte transferred, may be ridiculously
> expensive. Under that context, it's very important to clearly define
> the boundaries between actions that induce network traffic, and actions
> that stay within resources that have already been pulled.
>

I agree, and while I think packaged apps (a zip of all used files) are
good for the size of the initial download (even if they are way smaller
than Android ones), I do not think this is good for updates. Suppose I
have a web site with 10 pages and I want to add an extra page. I will
probably modify one existing page and add the new page to the zip file.
Why should we re-download all resources instead of just the 2 modified
files?

Packaged apps have the advantage of providing a way to do one network
access to see if a new version is available, where app-cache will do one
access for each page, and might even get an inconsistent update. To
solve that we can check only the manifest file for updates, and assume
that the manifest file (if present) ensures that the other resources
present in the app-cache are still valid.

Packaged apps are smaller and faster to read, as they are contained in
one monolithic zip file. But I do not see any limitation in Gecko
preventing us from building a zip file out of the app-cache manifest. On
the contrary, we can even do better by ordering the files according to
the order in which they are listed in other pages, or by how frequently
they are used.

In addition, if we are packing the resources ourselves, we can re-pack
them with additional metadata such as the pre-compiled code for asm.js
or the pre-parsed lazy-script & bytecode for non-asm.js code.

--
Nicolas B. Pierron

Fabrice Desre

Jul 9, 2013, 5:45:16 PM
to Nicolas B. Pierron, mozilla...@lists.mozilla.org
On 07/09/2013 02:25 PM, Nicolas B. Pierron wrote:

> I agree, and where I think packaged app (zip of all used files) are good
> for the size of the initial download (even if they are way smaller than
> Android ones) but I do not think this is good for updates. Suppose, I
> have a web site with 10 pages. I want to add an extra page. I will
> probably modify one existing page and add the new page into the zip
> file. Why should we re-download all resources, instead of the 2
> modified files?

True, and I'd like us to support differential updates, at least from
version N-1 to version N. Someone at version N-2 updating to version N
would need to do a full update.

> Packaged apps have the advantages of providing a way to do one network
> access to see if a new version is available where app-cache will do one
> access for each page, and might even get an inconsistent update. To
> solve that we can only check the manifest file for updates, and assume
> that the manifest file (if present) ensure that other resources present
> in the app-cache are still valid.

appcache also does a single check by downloading the manifest. Only if
this one has changed are the other resources checked.

> Packaged apps are smaller and faster to read, as they are contained in
> one monolithic zip file. But I do not see any limitations in Gecko for
> building a zip file out-of the app-cache manifest. On the contrary, we
> can even do better by sorting the files in function of the order in
> which they are listed in other pages or how frequently they are used.

Fwiw, we did zip reordering at some point and that was not helping with
startup time, so that didn't land.

> In addition, if we are packing our-self the resources, we can re-pack
> them with additional meta-data such as the pre-compiled code for asm.js
> or the pre-parse lazy-script & bytecode for non-asm.js code.

How much would we gain from doing that?

Kannan Vijayan

Jul 9, 2013, 5:50:21 PM
to dev...@lists.mozilla.org
On 13-07-09 5:25 PM, Nicolas B. Pierron wrote:
> I agree, and where I think packaged app (zip of all used files) are
> good for the size of the initial download (even if they are way
> smaller than Android ones) but I do not think this is good for
> updates. Suppose, I have a web site with 10 pages. I want to add an
> extra page. I will probably modify one existing page and add the new
> page into the zip file. Why should we re-download all resources,
> instead of the 2 modified files?
This is a good point, so maybe a self-contained zip file is a bad idea.
The key requirement is that the OS have a way to determine what
resources it needs to pre-fetch to enable a notion of local context, so
that "app install" can be separated from "app use" (and so users can do
heavyweight downloads for installs when they're on cheaper networks like
wifi).

I suppose to that end, the existing offline cache.manifest seems "almost
there". It can be expanded to include app metadata, along with update
metadata.

> Packaged apps have the advantages of providing a way to do one network
> access to see if a new version is available where app-cache will do
> one access for each page, and might even get an inconsistent update.
> To solve that we can only check the manifest file for updates, and
> assume that the manifest file (if present) ensure that other resources
> present in the app-cache are still valid.
An expanded offline manifest file spec can easily include provision for
specifying version numbers for the identified resources. In that case,
checking for updates would still be one network access (get the new
manifest file and compare the version numbers of the resources within),
but actually updating might require multiple accesses to fetch each
individual file.
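With such per-resource versions (the flat path-to-version manifest shape below is invented for illustration), the update check is one manifest fetch plus a local comparison:

```python
# Compare the installed manifest against a freshly fetched one and
# return only the resources that must be re-downloaded.

def resources_to_update(installed, latest):
    """Paths whose version changed, or that are new in the latest manifest."""
    return sorted(path for path, version in latest.items()
                  if installed.get(path) != version)

installed = {"index.html": 1, "app.js": 3, "style.css": 2}
latest    = {"index.html": 1, "app.js": 4, "style.css": 2, "extra.html": 1}

todo = resources_to_update(installed, latest)
```

Fetching each entry in `todo` is where the multiple per-file accesses come in; the check itself stays at one request.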

Kannan

Vivien

Jul 9, 2013, 6:03:11 PM
to dev...@lists.mozilla.org
On 09/07/2013 23:45, Fabrice Desre wrote:
> On 07/09/2013 02:25 PM, Nicolas B. Pierron wrote:
>
>> I agree, and where I think packaged app (zip of all used files) are good
>> for the size of the initial download (even if they are way smaller than
>> Android ones) but I do not think this is good for updates. Suppose, I
>> have a web site with 10 pages. I want to add an extra page. I will
>> probably modify one existing page and add the new page into the zip
>> file. Why should we re-download all resources, instead of the 2
>> modified files?
> True, and I'd like us to support differential updates, at least from
> version N-1 to version N. Someone at version N-2 updating to version N
> would need to do a full update.
>
>> Packaged apps have the advantages of providing a way to do one network
>> access to see if a new version is available where app-cache will do one
>> access for each page, and might even get an inconsistent update. To
>> solve that we can only check the manifest file for updates, and assume
>> that the manifest file (if present) ensure that other resources present
>> in the app-cache are still valid.
> appcache also does a single check by downloading the manifest. Only if
> this one has changed are the other resources checked.
>
>> Packaged apps are smaller and faster to read, as they are contained in
>> one monolithic zip file. But I do not see any limitations in Gecko for
>> building a zip file out-of the app-cache manifest. On the contrary, we
>> can even do better by sorting the files in function of the order in
>> which they are listed in other pages or how frequently they are used.
> Fwiw, we did zip reordering at some point and that was not helping with
> startup time, so that didn't land.
>
>> In addition, if we are packing our-self the resources, we can re-pack
>> them with additional meta-data such as the pre-compiled code for asm.js
>> or the pre-parse lazy-script & bytecode for non-asm.js code.
> How much would we gain from doing that?

Load time and memory, depending on the scripts' content. The mechanism
to store the cached version is still to be defined, though, and it needs
to be determined whether it is faster to retrieve a small cached script
or to parse the source. I think the first target is apps (offline cache
or packaged), but ultimately it can affect hosted content too, if I have
understood correctly.

Combined with sharing bytecode data, this should also be a nice memory
win if you have many tabs open using the same script.
>
> Fabrice

Nicolas B. Pierron

Jul 9, 2013, 7:50:05 PM
to mozilla...@lists.mozilla.org
On 07/09/2013 02:45 PM, Fabrice Desre wrote:
> On 07/09/2013 02:25 PM, Nicolas B. Pierron wrote:
>
>> I agree, and where I think packaged app (zip of all used files) are good
>> for the size of the initial download (even if they are way smaller than
>> Android ones) but I do not think this is good for updates. Suppose, I
>> have a web site with 10 pages. I want to add an extra page. I will
>> probably modify one existing page and add the new page into the zip
>> file. Why should we re-download all resources, instead of the 2
>> modified files?
>
> True, and I'd like us to support differential updates, at least from
> version N-1 to version N. Someone at version N-2 updating to version N
> would need to do a full update.

Versioning is also an issue for large JS applications, as small
modifications to the cross-compiled code (asm.js) should not cause the
full script to be downloaded again.

The problem with versioning is that you need some server
instrumentation. We were discussing this point with Luke & Brian in the
last few days, and a way to do it would be to instrument the server to
recognize and answer some arguments in the URL, such as:

http://example.com/foo.js?fromVersion=<md5-of-the-cached-version>

Doing it this way would benefit from HTTP proxy caches, which might be
essential for operators when popular applications such as Twitter are
updated.

The same versioning scheme can also work for app-cache applications:

http://example.com/foo.js?fromManifestVersion=<md5-of-the-manifest-file>
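Both ends of that exchange can be sketched as follows; the delta payload format and the `serve` helper are invented for illustration, and a real deployment would need the server to actually generate and store the deltas.

```python
# Client side: tell the server which version we already hold, keyed by
# the MD5 of the cached bytes. Server side: answer with a delta if one
# is known for that version, otherwise with the full resource.
import hashlib
from urllib.parse import urlencode

def versioned_url(base_url, cached_bytes):
    digest = hashlib.md5(cached_bytes).hexdigest()
    return base_url + "?" + urlencode({"fromVersion": digest})

def serve(current_bytes, deltas, from_version):
    if from_version in deltas:
        return ("delta", deltas[from_version])
    return ("full", current_bytes)        # unknown version: full download

cached  = b"function f(){return 1}"
current = b"function f(){return 2}"
old_md5 = hashlib.md5(cached).hexdigest()

url = versioned_url("http://example.com/foo.js", cached)
deltas = {old_md5: b"@@ return 1 -> return 2"}  # toy delta format

kind, payload = serve(current, deltas, old_md5)
```

Because the client's version is encoded in the URL itself, identical requests from many clients hit the same HTTP proxy cache entry, which is the operator-friendliness mentioned above.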

>> Packaged apps are smaller and faster to read, as they are contained in
>> one monolithic zip file. But I do not see any limitations in Gecko for
>> building a zip file out-of the app-cache manifest. On the contrary, we
>> can even do better by sorting the files in function of the order in
>> which they are listed in other pages or how frequently they are used.
>
> Fwiw, we did zip reordering at some point and that was not helping with
> startup time, so that didn't land.

Good to know.

>> In addition, if we are packing our-self the resources, we can re-pack
>> them with additional meta-data such as the pre-compiled code for asm.js
>> or the pre-parse lazy-script & bytecode for non-asm.js code.
>
> How much would we gain from doing that?

For asm.js, this would be a huge WIN, as the compilation time might be
significant.

That's something I have to investigate for normal scripts. But caching
at least the lazy scripts will save one pass over the file, which checks
and identifies the functions and their bindings. Caching the bytecode
might be harder, but we could avoid reading the source of all functions
which are executed during the load of the JavaScript. We hope to improve
the load time of cached/packed JS files that way.

--
Nicolas B. Pierron

Peter Dolanjski

Jul 10, 2013, 3:36:19 AM
to Ben Francis, dev-w...@lists.mozilla.org, dev-b2g
Hello all,

I don't have much to add in the way of implementation suggestions, but I do want to make a few points from an overall product and end user perspective. (much of this is probably well known to this list, but it may spark some further thought)

As was already mentioned, it is important that our solution provides users with the freedom to obtain and use apps in the manner that suits them best, in an environment encumbered by connectivity constraints (costs, network availability, etc.). So, to the extent that a given app's developer intended such a use for their app, the user should be able to opportunistically "download" the app when it suits them (i.e. over WiFi) for later use, when not having local assets would otherwise render that app useless. Moreover, right or wrong, I believe the average user expectation is that when an icon for an app persists on the device, that app can function in an offline mode. When there is a lack of connectivity, it may be worth exploring, via some user experience tests, whether visually distinguishing between apps that don't have such local content and apps that do provides clarity and benefit to the user. (I believe there were some proposals/contributions on this front.)

The other part of the picture is users who don't tend to seek out and install third-party apps for later/repeated use. Our data for our target markets indicates that the average smartphone user does not use more than a few (< 2-3) third-party apps. It is for these users that I think pushing the boundary of the use cases a hosted app is capable of satisfying (i.e. the APIs available to that app) would provide product differentiation and round out the "instant app" story. These are not users who will go to an app storefront to seek out apps that they can install for later use. It is more likely that they will have a specific task in mind that they would like to accomplish. A discovery mechanism for apps that meet the task at hand, paired with unencumbered app access to device functionality (beyond what is possible using hosted apps today, and to the extent possible while still protecting the user) to create an "instant" experience without having to download the app, would go a long way towards meeting the workflow needs of these types of users.

Peter

----- Original Message -----

From: "Ben Francis" <bfra...@mozilla.com>
To: "dev-b2g" <dev...@lists.mozilla.org>, dev-w...@lists.mozilla.org
Are we happy with a packaged model for trusted apps going forward, or is
now the time to embrace the challenge of making trusted apps genuinely part
of the web? Perhaps we don't even need to restrict our thinking to the
"app" and "app store" model and can explore ways of exposing more
privileged APIs to all web content in a trusted way.

If you're interested in the nitty gritty of this problem, I've tried to
summarise Jonas' original email below. I hope he forgives me if I
mis-represent him in any way, but you can read his original email in the
archives [1].

...

In his email, Jonas proposed the following requirements for trusted apps:
1. The ability for a trusted party to review an app and indicate some level
of trust in the app (or potentially in the app developer).
2. A mechanism for signing an app to verify that the app actually contains
the content that was reviewed.
3. Use of a minimum CSP policy for all pages of an app to ensure only the
reviewed code runs.
4. A separate data jar for local data to ensure a compromised web site can
not write to the local data of an app to alter the way it behaves.
5. A separate origin for the resources of an app so that the app can not be
tricked into running un-reviewed code from the same origin with escalated
privileges.

Salvador de la Puente González

Jul 10, 2013, 4:09:47 AM
to Antonio Manuel Amaya Calvo, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
Hello!

On 09/07/13 23:13, Antonio Manuel Amaya Calvo wrote:
> Hey all.
>
> You know, at the risk of being the discordant voice here, I don't see
> what the issue with packaged apps is. It's not like you're required to
> use packaged app if you want do develop a non special-api-using app.
> You can just develop the app as you would normally, host it somewhere,
> test it that way and then, after you've finished developing it, if
> it's mostly static content (static HTMLs, JSs and images) then you
> have the *option* to package it so your users can use it without
> having a network connection. You don't have to do it, though, it's
> your decision. If you prefer not to package it, you can just
> distribute it as a hosted app.
Oh, of course, but it would be great if I, as a web developer, only had
to add a file saying something like "hey, if you want this application
to work offline, take these files and pack them together". This is said
to the user agent, so I'm doing nothing more than declaring the
structure of my package, and the UA packs it "automagically".
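Such "auto-packing" could be as simple as the sketch below, assuming an invented plain-text manifest (one path per line, `#` for comments) and a dict standing in for the hosted site:

```python
# The developer only declares the file list; the user agent fetches
# each resource and builds the offline package itself.
import io
import zipfile

def auto_pack(manifest_text, fetch_resource):
    """Zip every path listed in the manifest, fetching each resource
    through the supplied callable (a stand-in for HTTP GET)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as pkg:
        for line in manifest_text.splitlines():
            path = line.strip()
            if path and not path.startswith("#"):
                pkg.writestr(path, fetch_resource(path))
    return buf.getvalue()

site = {"index.html": b"<html></html>", "app.js": b"run();"}
manifest = "# offline package\nindex.html\napp.js\n"

package = auto_pack(manifest, lambda path: site[path])
with zipfile.ZipFile(io.BytesIO(package)) as pkg:
    names = sorted(pkg.namelist())
```

Since the app itself only ever uses relative URIs, the same files work hosted or packed, which is exactly the property noted earlier in the thread.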
>
> Oh, and the cool thing is that so long as you've only used relative
> URIs on your app, you won't have to change a single iota to change it
> from a hosted app to a packaged app.
With the approach above, it could be the same. :)
>
> So, for normal apps, I don't see having packaged apps as a bad thing.
> It's an extra option, and having more options is good on my book. I
> don't believe in forcing people into any vision of what the web should
> be, and what is webby and what isn't. The web will be whatever people
> make of it. If packaged apps are useful they will prosper and grow, if
> they're not, they'll shrink and die. And that's good.
I agree it is an extra option; the problem arises when the on-line web
is the extra. In the end, even as we move to an always-connected world,
the real Web is the online Web.
>
> Now, for privileged app, that's another problem altogether, and
> something that doesn't actually have a close relative on the existing
> web. At Firefox OS we're exposing some APIs to content that are or can
> be dangerous if used incorrectly or with bad intent. And to do that,
> we've taken onto us the responsibility of telling the user, hey, it's
> ok if you allow this app to access your contact because WE have
> checked it and WE have seen it will treat your data with the due
> respect. We cannot do that unless we ensure that the content of the
> app we've examined is *static*. We could give the option for that
> content to be served remotely, along with the signature of every piece
> of code we download (raw hashes as Salva said below will just not work).
Why? If it's all about content and I review the content, trust it and
make a hash of it, any attempt to modify the code will alter the
cryptographic hash, making the validation algorithm fail.
> But the server cannot serve dynamic content, cause otherwise the
> signatures won't be of any use. And I'm not even sure if even that
> would be enough, since a developer could send to be signed more
> content than he actually uses... Anyway...
>
> On that model, every time a developer wanted to change a single comma
> on his code he would have to resend all the content (and not just the
> code he just changed, because of possible interactions) to be
> reexamined. Then a trusted third party (cause I, as a user, love
> developer but don't trust them as far as I can throw them) will sign
> the data again, and send the signatures back and only then the
> developer can change one set of static content for another set of
> static content.
But that is how it's working now! If I make a syntax error in my v1
application and it lands in the market and I need to fix it, I need to
send an update to the market: a v2 application, so it needs to be
reviewed again. How to review it is the responsibility of the third
party. It could keep a code repository and look only at the differences,
and automatic tools can decide whether a change needs another human
review or not.
>
> What does he win with this model, compared with the current one? Just
> the actual packaging of the app. What do end users win? Hmm... without
> any way of local caching of content, they actually *lose*. They'll
> have to download the static content over and over and over. Oh, and
> the signature will have to be re-verified every time the content is
> loaded (while actually is only checked at install time).
Just the same as with the current approach. The user needs to download
the "application update" and the signature must be recalculated again.
>
> Going away from a packaged/controlled environment for privileged app
> will make harder to actually trust them. The way I would actually like
> for us to go is to, slowly, start opening some APIs that are currently
> certified only to be privileged, so more and richer third party apps
> can be added to the ecosystem. And I don't see that happening if we
> relax the current trust/security model.
I like the current security model; I only want some "auto-packing"
method that makes the "off-line site" the option and the "on-line site"
the default, because on-line sites *are* the Web.
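For non-privileged apps, something close to this "auto-packing" already existed in the platform at the time: the HTML5 Application Cache, where a page declares a manifest listing the files the UA should fetch and keep for offline use (its well-known bugs are part of why this thread looks for alternatives). A minimal illustration:

```text
CACHE MANIFEST
# v1.0.0 -- changing this comment forces the UA to re-download everything
index.html
app.js
styles.css

NETWORK:
*
```

The page opts in with `<html manifest="app.appcache">`; the UA then downloads and permanently caches the listed files, which is essentially the "declare the structure and let the UA pack it" behaviour described above.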

Cheers!
>
> Best regards,
>
> Antonio
>
>
>
> On 09/07/2013 16:58, Salvador de la Puente González wrote:
>> Hello
>>
>> I want to say I totally support the ideal of removing packaged apps.
>> INMHO, I think we need a way to move from on-line to off-line
>> applications in a transparent way in order to make "installation"
>> process almost trivial. For this purpose some mechanism like offline
>> cache is necessary. If current implementation is bugged, let's find a
>> better alternative.
>>
>> For the problems exposed by Jonas, I'm not an expert but some of them
>> can be easily addressed:
>>
>> On 08/07/13 23:31, Ben Francis wrote:
>>> In his email, Jonas proposed the following requirements for trusted
>>> apps:
>>> 1. The ability for a trusted party to review an app and indicate
>>> some level
>>> of trust in the app (or potentially in the app developer).
>> With a list of downloadable content to be installed (some like the
>> Offline Cache), a third party could retrieve a version of the software,
>> then review it.
>>> 2. A mechanism for signing an app to verify that the app actually
>>> contains
>>> the content that was reviewed.
>> With a digest algorithm based on the content of each file the third
>> party and the User Agent could compute the same signature and see if it
>> matches. I have in mind some sha1-based like in Git. At least, all is
>> about content.
>>> 3. Use of a minimum CSP policy for all pages of an app to ensure
>>> only the
>>> reviewed code runs.
>> Sorry but I don't understand the problem here. I can currently load an
>> external script from a packaged application and run its code.
>>> 4. A separate data jar for local data to ensure a compromised web
>>> site can
>>> not write to the local data of an app to alter the way it behaves.
>> If the UA receive an update order for the Offline Cache of a determined
>> App, it can perform another digest and sent to the third party in order
>> to see if the new code has been reviewed. Maybe I did not understand
>> what is a compromised web in this context or how it could be a hazard..
>>> 5. A separate origin for the resources of an app so that the app can
>>> not be
>>> tricked into running un-reviewed code from the same origin with
>>> escalated
>>> privileges.
>>>
>> Same as above. With the digest mechanism you can say if the version the
>> third party reviewed and the device version differs, then ask the user.
>>
>> If you think the digest process delays the application execution, it is
>> true but you can cover this time by adding it to the
>> "installation/update process".
>>
>> Maybe I'm very naive about security so if you can clarify me some
>> aspects I did not take in count, I was very pleased to read about.
>>
>> Best!
>>
>
>


Antonio M. Amaya

Jul 10, 2013, 4:45:59 AM
to Salvador de la Puente González, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
On 10/07/2013 10:09, Salvador de la Puente González wrote:
> Hello!
>
> On 09/07/13 23:13, Antonio Manuel Amaya Calvo wrote:
>> Hey all.
>>
>> You know, at the risk of being the discordant voice here, I don't see
>> what the issue with packaged apps is. It's not like you're required
>> to use packaged app if you want do develop a non special-api-using
>> app. You can just develop the app as you would normally, host it
>> somewhere, test it that way and then, after you've finished
>> developing it, if it's mostly static content (static HTMLs, JSs and
>> images) then you have the *option* to package it so your users can
>> use it without having a network connection. You don't have to do it,
>> though, it's your decision. If you prefer not to package it, you can
>> just distribute it as a hosted app.
> Oh, of course but it could be great if I, as web developer, only add a
> file saying something like "hey, if you want this application to work
> offline, take these files and pack them together". This is said to the
> user agent so I'm doing nothing more than declaring the structure of
> my package and the the UA pack it "automagically".
So you just want to save the time to make the zip yourself? ;)

>
>>
>> Oh, and the cool thing is that so long as you've only used relative
>> URIs on your app, you won't have to change a single iota to change it
>> from a hosted app to a packaged app.
> With the approach above, it could be the same. :)
>>
>> So, for normal apps, I don't see having packaged apps as a bad thing.
>> It's an extra option, and having more options is good on my book. I
>> don't believe in forcing people into any vision of what the web
>> should be, and what is webby and what isn't. The web will be whatever
>> people make of it. If packaged apps are useful they will prosper and
>> grow, if they're not, they'll shrink and die. And that's good.
> I agree it is an extra option, the problem raises when the on-line web
> is the extra. At the end, and despite we move to an always-connected
> world, the real Web is the Online Web.
Well, if or when we arrive at that always-connected world, packaged
apps will quietly go away by themselves because they won't be useful.
Meanwhile, I for one have data limits on my mobile connection, not to
mention the times when a connection just isn't available. And I would
like to keep using the apps that don't actually require a connection
(games, book readers, whatever). Currently packaged apps fill this need.
Now, I don't actually mind whether the developer makes the zip himself
or just instructs the UA somehow to download/perma-cache the fixed
resources. But as I said before, this mainly affects the non-privileged
apps workflow.

>>
>> Now, for privileged app, that's another problem altogether, and
>> something that doesn't actually have a close relative on the existing
>> web. At Firefox OS we're exposing some APIs to content that are or
>> can be dangerous if used incorrectly or with bad intent. And to do
>> that, we've taken onto us the responsibility of telling the user,
>> hey, it's ok if you allow this app to access your contact because WE
>> have checked it and WE have seen it will treat your data with the due
>> respect. We cannot do that unless we ensure that the content of the
>> app we've examined is *static*. We could give the option for that
>> content to be served remotely, along with the signature of every
>> piece of code we download (raw hashes as Salva said below will just
>> not work).
> Why? If all is about content and I review the content, trust it and
> make a hash for him, any attempt to modify the code will alter the
> cryptographic hash making the validation algorithm fail.
Because hashes by themselves give integrity, not security. Why? Because
if I can change the content and the hash is stored with the content, I
can change the hash too. And trusting the hash by origin (I trust this
hash because I downloaded it from the store, while I downloaded the
actual content from the developer's site) could work, but it ties your
security completely to the security of the third-party (market)
download site.

>
>> But the server cannot serve dynamic content, cause otherwise the
>> signatures won't be of any use. And I'm not even sure if even that
>> would be enough, since a developer could send to be signed more
>> content than he actually uses... Anyway...
>>
>> On that model, every time a developer wanted to change a single comma
>> on his code he would have to resend all the content (and not just the
>> code he just changed, because of possible interactions) to be
>> reexamined. Then a trusted third party (cause I, as a user, love
>> developer but don't trust them as far as I can throw them) will sign
>> the data again, and send the signatures back and only then the
>> developer can change one set of static content for another set of
>> static content.
> But that is how it's working now! If I made a syntax error error in my
> v1 application and it lands in the market and I need to fix it, I need
> to send an update to the market. A v2 application so it need to be
> reviewed again. How to review is responsibility of the third party. It
> should keep a code repository an see only the differences and
> automatic tools can decide when a change need another human review or
> not.
Yep, that was my point exactly, as stated below. That a developer won't
gain anything with this model compared to the existing one (besides not
having to create the zip file).


>>
>> What does he win with this model, compared with the current one? Just
>> the actual packaging of the app. What do end users win? Hmm...
>> without any way of local caching of content, they actually *lose*.
>> They'll have to download the static content over and over and over.
>> Oh, and the signature will have to be re-verified every time the
>> content is loaded (while actually is only checked at install time).
> Just the same with the current approach. The user need to download the
> "application update" and the sign must to be recalculated again.

No. If we download resources remotely every time we use them (or if we
can download them remotely), then we have to recheck the signature
every time a file is accessed or downloaded. With packaged apps you
check the signature only at install time (be it a first-time install or
an update).

Salvador de la Puente González

Jul 10, 2013, 9:04:46 AM
to Antonio M. Amaya, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
Hi

On 10/07/13 10:45, Antonio M. Amaya wrote:
> On 10/07/2013 10:09, Salvador de la Puente González wrote:
>> Hello!
>>
>> On 09/07/13 23:13, Antonio Manuel Amaya Calvo wrote:
>>> Hey all.
>>>
>>> You know, at the risk of being the discordant voice here, I don't
>>> see what the issue with packaged apps is. It's not like you're
>>> required to use packaged app if you want do develop a non
>>> special-api-using app. You can just develop the app as you would
>>> normally, host it somewhere, test it that way and then, after you've
>>> finished developing it, if it's mostly static content (static HTMLs,
>>> JSs and images) then you have the *option* to package it so your
>>> users can use it without having a network connection. You don't have
>>> to do it, though, it's your decision. If you prefer not to package
>>> it, you can just distribute it as a hosted app.
>> Oh, of course but it could be great if I, as web developer, only add
>> a file saying something like "hey, if you want this application to
>> work offline, take these files and pack them together". This is said
>> to the user agent so I'm doing nothing more than declaring the
>> structure of my package and the the UA pack it "automagically".
> So you just want to save the time to make the zip yourself? ;)
Yes, because in this way it becomes a Web feature, not a developer concern.
>
>>
>>>
>>> Oh, and the cool thing is that so long as you've only used relative
>>> URIs on your app, you won't have to change a single iota to change
>>> it from a hosted app to a packaged app.
>> With the approach above, it could be the same. :)
>>>
>>> So, for normal apps, I don't see having packaged apps as a bad
>>> thing. It's an extra option, and having more options is good on my
>>> book. I don't believe in forcing people into any vision of what the
>>> web should be, and what is webby and what isn't. The web will be
>>> whatever people make of it. If packaged apps are useful they will
>>> prosper and grow, if they're not, they'll shrink and die. And that's
>>> good.
>> I agree it is an extra option, the problem raises when the on-line
>> web is the extra. At the end, and despite we move to an
>> always-connected world, the real Web is the Online Web.
> Well, if or when we arrive to that always-connected world then the
> packaged apps will quietly go away by themselves cause they won't be
> useful. Meanwhile, I for one have data limits on my mobile connection,
> not to mention the times when connection just isn't available. And I
> would like to keep using the apps that don't actually require
> connection (games, book readers, whatever). Currently packaged apps
> fill this need. Now, I don't actually mind if the developer makes the
> zip himself or if he just instructs the UA somehow to
> download/perma-cache the fixed resources. But as I said before, this
> affects mainly the non privileged apps workflow.
Let's discuss this later on this answer. After our off-line conversation
I see the problem now.
>
>>>
>>> Now, for privileged app, that's another problem altogether, and
>>> something that doesn't actually have a close relative on the
>>> existing web. At Firefox OS we're exposing some APIs to content that
>>> are or can be dangerous if used incorrectly or with bad intent. And
>>> to do that, we've taken onto us the responsibility of telling the
>>> user, hey, it's ok if you allow this app to access your contact
>>> because WE have checked it and WE have seen it will treat your data
>>> with the due respect. We cannot do that unless we ensure that the
>>> content of the app we've examined is *static*. We could give the
>>> option for that content to be served remotely, along with the
>>> signature of every piece of code we download (raw hashes as Salva
>>> said below will just not work).
>> Why? If all is about content and I review the content, trust it and
>> make a hash for him, any attempt to modify the code will alter the
>> cryptographic hash making the validation algorithm fail.
> Because hashes by themselves give integrity, not security. Why? Cause
> if I can change the content and the hash is stored with the content, I
> can change the hash also. And just trusting hash by origin (I trust
> this hash because I downloaded it from the store while I downloaded
> the actual content from the developer site) could work but it links
> completely your security to the security of the third party (market)
> download site.
You could sign the hash then. There are no further problems here.
>
>>
>>> But the server cannot serve dynamic content, cause otherwise the
>>> signatures won't be of any use. And I'm not even sure if even that
>>> would be enough, since a developer could send to be signed more
>>> content than he actually uses... Anyway...
>>>
>>> On that model, every time a developer wanted to change a single
>>> comma on his code he would have to resend all the content (and not
>>> just the code he just changed, because of possible interactions) to
>>> be reexamined. Then a trusted third party (cause I, as a user, love
>>> developer but don't trust them as far as I can throw them) will sign
>>> the data again, and send the signatures back and only then the
>>> developer can change one set of static content for another set of
>>> static content.
>> But that is how it's working now! If I made a syntax error error in
>> my v1 application and it lands in the market and I need to fix it, I
>> need to send an update to the market. A v2 application so it need to
>> be reviewed again. How to review is responsibility of the third
>> party. It should keep a code repository an see only the differences
>> and automatic tools can decide when a change need another human
>> review or not.
> Yep, that was my point exactly, as stated below. That a developer
> won't gain anything with this model compared to the existing one
> (besides not having to create the zip file).
>
The important thing is that he does not lose anything. And we remove
the developer's concern about packaging the application. So you, as a
web developer, only worry about working on your web application, then
publish a list of files to enable off-line mode and let the Web do the
hard job (by "the Web" I really mean user agents, servers and the HTTP
protocol).

Of course, a drawback of this approach is that you need not only a
server but a domain name too.
>
>>>
>>> What does he win with this model, compared with the current one?
>>> Just the actual packaging of the app. What do end users win? Hmm...
>>> without any way of local caching of content, they actually *lose*.
>>> They'll have to download the static content over and over and over.
>>> Oh, and the signature will have to be re-verified every time the
>>> content is loaded (while actually is only checked at install time).
>> Just the same with the current approach. The user need to download
>> the "application update" and the sign must to be recalculated again.
>
> No. If we download resources remotely every time we use them (or if we
> can download them remotely) then we have to recheck the signature
> every time a file is accessed/downloaded. With packaged apps you check
> the signature only at install time (be it first time install, or update).
Ok, this is part of the former discussion. But we can simply not allow
loading resources from the Internet.

Just remarking: if we allow this auto-packing, then we are encouraging
web development and adding packaging as an option.

Best!

Mounir Lamouri

Jul 17, 2013, 8:25:03 PM
to dev-b2g, dev-w...@lists.mozilla.org, Ben Francis
Hi,

tl;dr: it is too early to deprecate packaged apps. They are a good tool
to experiment with and can already start taking over the native app world.

Our APIs are not mature enough to be moved to the Web. Most APIs that
are privileged-only will get significant changes in the future. Packaged
apps are an environment where we can actually change APIs relatively
easily. We are actually thinking of adding an "api version" to the
manifest to make such a mechanism work.
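The "api version" idea was only under discussion at the time, so the field below is hypothetical; the other fields follow the Firefox OS app manifest format of the day, and the sketch just gives a feel for how an app might pin the API surface it was written against:

```json
{
  "name": "Example App",
  "description": "Illustrative only; api_version is a hypothetical field",
  "launch_path": "/index.html",
  "type": "privileged",
  "api_version": 2,
  "permissions": {
    "contacts": { "access": "readonly" }
  }
}
```

A runtime that understood such a field could keep shipping breaking API changes while still running apps declared against an older version.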

In addition to not being mature, some APIs have security issues that we
do not know how to solve yet, and solving them is not trivial. Having
those APIs for packaged apps only, with a review system, allows us to
postpone defining a security model for the open Web (where we can't
review content). We will hopefully get to that at some point.

In other words, I see packaged apps as a nice playground where we can
experiment with things before they go to the Web.

Furthermore, one of the things Mozilla wanted to solve with Firefox OS
was to make things simpler for developers and users: having one platform
(the Web) that will no longer require you to re-buy apps if you change
phone. We are not close to solving the second problem, but packaged apps
will help with the former. Even if packaged apps are not really the Web,
they use Web technologies and will help developers write applications
that can easily be ported from one platform to another. SysApps could be
a good path, and we could standardise APIs there in a way that, at some
point, an application for Chrome, Firefox or Tizen would look exactly
the same.

Having all those APIs available to the Web is something everyone in the
WebAPI team wants, but it is not going to happen overnight and we should
be realistic and fix our problems one at a time.

--
Mounir
> In his email, Jonas proposed the following requirements for trusted apps:
> 1. The ability for a trusted party to review an app and indicate some level
> of trust in the app (or potentially in the app developer).
> 2. A mechanism for signing an app to verify that the app actually contains
> the content that was reviewed.
> 3. Use of a minimum CSP policy for all pages of an app to ensure only the
> reviewed code runs.
> 4. A separate data jar for local data to ensure a compromised web site can
> not write to the local data of an app to alter the way it behaves.
> 5. A separate origin for the resources of an app so that the app can not be
> tricked into running un-reviewed code from the same origin with escalated
> privileges.
>

Jonas Sicking

Nov 17, 2013, 8:04:28 PM
to Ben Francis, Anne van Kesteren, Marcos Cáceres, dev-w...@lists.mozilla.org, dev-b2g
Time to go through old threads again...

The short answer is "no, I don't think we can deprecate packaged apps yet".

The big problem is still how to expose APIs that are security sensitive
and whose implications we can't explain well enough to the user. This
generally means all *security* sensitive APIs; we shouldn't ask the
user security questions. Privacy questions and things like resource-use
questions can be OK, but security questions should generally be avoided.

So far the only solution we have is "review and sign by trusted 3rd
party", and the only realistic solution for signing that we have is
packaged apps. Or at least solutions that aren't "the web".

I think it will take many years to solve the packaged apps problem.
However it will take longer than that if we don't start working on it
now.

So what are we doing to try to help with the packaged apps problem?

We have started exploring creating a URL standard that would allow
linking to resources inside a .zip file. This would actually allow
having URLs to packaged apps. Not sure what the latest here is. Anne
could probably fill in the blanks.
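For flavor: Gecko has long had an internal jar: scheme of roughly this shape, and the proposals discussed at the time looked similar, though no syntax was ever settled, so treat this as illustrative only:

```text
jar:https://example.com/app.zip!/scripts/main.js
```

Everything before the "!" locates the archive; the path after it addresses an entry inside it. Having to fetch and unpack part of a remote zip just to serve one such entry is where the performance objections to this kind of URL come from.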

We tried to standardize packaged apps so that they would at least be
interoperable. Unfortunately this is going less well. There are
currently at least three different standards: Firefox OS packaged apps,
Chrome OS packaged apps, and Widget packaged apps (which I believe is
what Tizen is using). Attempts at getting these to align have so far
failed. Basically only Mozilla has expressed willingness to actually
change its implementation and implement a standard.

My next plan here is to attack the hosted apps and get those
standardized. This is going much better and I hope that we can have a
spec draft for the bookmark-to-homepage use case in just a couple of
weeks. I've talked with multiple mobile browser vendors and there is a
lot of interest here (I don't want to speak for others which is why
I'm avoiding names).

Once we have that standardized, we have most of what we need for
hosted apps. Slap some API on top of that and you have an app store.
Though it's unclear who, apart from us, would be interested in
implementing such an API.

We've also had some discussions with Google around
WebIntents/WebActivities. Having WebIntents/WebActivities standardized
should enable more apps to avoid the APIs that require the "privileged"
level, and so would help reduce the packaged apps problem. At this
point I think we have a better understanding of what it would take to
create a standardized WebIntents/WebActivities spec.

Unfortunately we haven't gotten further than that here. Action is on
me to put together some information about what we know so far so that
others can pick up this work.

So while there is little concrete progress, we're not doing nothing.
But I'm all for help in doing more. In particular I'd like to see us
do more work on getting WebIntents/WebActivities standardized, and on
finding safe-to-expose-to-the-web variants or subsets of our existing
privileged-only APIs.

I'd personally put a safe subset of DeviceStorage API high up on the
priority list.

/ Jonas

Anne van Kesteren

Nov 18, 2013, 8:38:28 AM
to Jonas Sicking, Marcos Cáceres, dev-w...@lists.mozilla.org, Ben Francis, dev-b2g
On Mon, Nov 18, 2013 at 9:04 AM, Jonas Sicking <jo...@sicking.cc> wrote:
> We have started exploring creating a URL standard that would allow
> linking to resources inside a .zip file. This would actually allow
> having URLs to packaged apps. Not sure what the latest here is. Anne
> could probably fill in the blanks.

This does not seem to be going anywhere fast. The networking crowd is
not enthusiastic about it, given the poor performance characteristics.
Nobody seems super keen on driving it forward. When the TAG discussed
it we ended up with some alternate format that would allow MIME
headers, so a folder on a server could be packaged more effectively.
But again, nobody is really driving that.


> My next plan here is to attack the hosted apps and get those
> standardized. This is going much better and I hope that we can have a
> spec draft for the bookmark-to-homepage use case in just a couple of
> weeks. I've talked with multiple mobile browser vendors and there is a
> lot of interest here (I don't want to speak for others which is why
> I'm avoiding names).

I think there's still a large problem with these applications: we're
not giving them the same cache context as they would have inside the
browser. That makes a lot of interactions that are normal on the web
today much harder to implement. If we want to offer a no-shared-cache
feature, it should be opt-in by the application, I think.


I think anything that deviates from the current browser security model
through some kind of end-user UI is not good. Bookmarking Gmail should
not affect the interaction model. We should move towards making the
browser the OS. Inventing new security models around URLs won't get us
there.


--
http://annevankesteren.nl/

Marcos Caceres

Nov 18, 2013, 10:32:01 AM
to Anne van Kesteren, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g



On Monday, November 18, 2013 at 1:38 PM, Anne van Kesteren wrote:

> > My next plan here is to attack the hosted apps and get those
> > standardized. This is going much better and I hope that we can have a
> > spec draft for the bookmark-to-homepage use case in just a couple of
> > weeks. I've talked with multiple mobile browser vendors and there is a
> > lot of interest here (I don't want to speak for others which is why
> > I'm avoiding names).
>
>
>
> I think there's still a large problem with these applications. That
> we're not giving them the same cache context as they would have inside
> the browser. That makes a lot of things that are normal interactions
> on the web today, a lot harder to implement.

Can you please provide a few more details and use cases? Or file bugs:
https://github.com/w3c/manifest/issues

> If we want to offer a no
> shared cache feature, it should be opt-in by the application I think.


That might be good. Different runtimes handle this differently (e.g., Chrome beta on Android shares everything with the browser, while apps added to the home screen on iOS don’t share anything).

> I think anything that deviates from the current browser security model
> through some kind of end-user UI is not good.

Not sure what you mean here? Which UI are you talking about?
> Bookmarking Gmail should
> not affect the interaction model. We should move towards making the
> browser the OS. Inventing new security models around URLs won't get us
> there.

I generally agree with what you are saying above.

--
Marcos Caceres



Anne van Kesteren

Nov 18, 2013, 10:35:54 AM
to Marcos Caceres, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g
On Mon, Nov 18, 2013 at 11:32 PM, Marcos Caceres <mcac...@mozilla.com> wrote:
> On Monday, November 18, 2013 at 1:38 PM, Anne van Kesteren wrote:
>> I think there's still a large problem with these applications. That
>> we're not giving them the same cache context as they would have inside
>> the browser. That makes a lot of things that are normal interactions
>> on the web today, a lot harder to implement.
>
> Can you please provide a few more details and use cases. Or file bugs:
> https://github.com/w3c/manifest/issues

It seems the onus for that should be on those attempting to change the
model. I think we should keep the current model.


>> I think anything that deviates from the current browser security model
>> through some kind of end-user UI is not good.
>
> Not sure what you mean here? Which UI are you talking about?

Bookmarking. And in particular bookmarking to home screen should not
have dramatically different security properties from just bookmarking.
They ought to be identical (and indeed, have negligible effect).


--
http://annevankesteren.nl/

Marcos Caceres

Nov 18, 2013, 4:44:54 PM
to Anne van Kesteren, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g


On Monday, November 18, 2013 at 3:35 PM, Anne van Kesteren wrote:

> On Mon, Nov 18, 2013 at 11:32 PM, Marcos Caceres <mcac...@mozilla.com (mailto:mcac...@mozilla.com)> wrote:
> > On Monday, November 18, 2013 at 1:38 PM, Anne van Kesteren wrote:
> > > I think there's still a large problem with these applications. That
> > > we're not giving them the same cache context as they would have inside
> > > the browser. That makes a lot of things that are normal interactions
> > > on the web today, a lot harder to implement.
> > >
> >
> >
> >
> > Can you please provide a few more details and use cases. Or file bugs:
> > https://github.com/w3c/manifest/issues
> >
>
>
>
> It seems the onus for that should be on those attempting to change the
> model. I think we should keep the current model.
>


I think it's fine to put the onus on those working on this (mostly me), but I want to make sure we are aligned so I don't go off into the weeds and come back with something you and others don't like. I also don't want us to change the current browser caching model unless it's absolutely required. The spec is currently silent on this.

> > > I think anything that deviates from the current browser security model
> > > through some kind of end-user UI is not good.
> > >
> >
> >
> >
> > Not sure what you mean here? Which UI are you talking about?
>
> Bookmarking. And in particular bookmarking to home screen should not
> have dramatically different security properties from just bookmarking.
> They ought to be identical (and indeed, have negligible effect).
>


Thanks for clarifying. I think I agree with this, but I’m currently investigating what the implications are. I have a *very early* draft of this research [1], and the good thing is that at least IE on Windows 8, Firefox for Android, and Chrome for Android tend to agree with you (will push that part of the research to the document soon).

[1] http://w3c-webmob.github.io/installable-webapps/

--
Marcos Caceres



Anne van Kesteren

Nov 19, 2013, 6:28:18 AM
to Marcos Caceres, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g
On Mon, Nov 18, 2013 at 9:44 PM, Marcos Caceres <mcac...@mozilla.com> wrote:
> On Monday, November 18, 2013 at 3:35 PM, Anne van Kesteren wrote:
>> Bookmarking. And in particular bookmarking to home screen should not
>> have dramatically different security properties from just bookmarking.
>> They ought to be identical (and indeed, have negligible effect).
>
> Thanks for clarifying. I think I agree with this, but I’m currently investigating what the implications are. I have a *very early* draft of this research, and the good thing is that at least IE on Windows 8, Firefox for Android, and Chrome for Android tend to agree with you (will push that part of the research to the document soon).

So the differences I found that are not desirable are:

* If I'm logged into Facebook and then open a hosted app, that Facebook
session is distinct. That's not the case in a browser today. I can see
us adding a feature that allows Facebook to opt out of sharing its
session with other sites, but by default I think we want the existing
model. (And we definitely should not change it based on a bookmark.)

* When I click a link "the browser" opens. If the browser is core to
the OS everything in it should have a URL (whether visible or not,
that's an implementation detail) and navigation should happen
seamlessly between them. I don't think we need any behavioral
difference here between something that is bookmarked and something
that is not. Of course apps can opt in to having their links open in a
new window, using target=_blank.

I think part of the disconnect might be that the web as an actual OS
would be fundamentally different from what we have today. You cannot
really compare it with an OS that has a browser or an apps market
(such as Mac OS X and Firefox OS). The OS is the browser and the apps
market is the web.


--
http://annevankesteren.nl/

Marcos Caceres

Nov 19, 2013, 7:24:52 AM
to Anne van Kesteren, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g


> On 19 Nov 2013, at 11:28 am, Anne van Kesteren <ann...@annevk.nl> wrote:
>
>> On Mon, Nov 18, 2013 at 9:44 PM, Marcos Caceres <mcac...@mozilla.com> wrote:
>> On Monday, November 18, 2013 at 3:35 PM, Anne van Kesteren wrote:
>>> Bookmarking. And in particular bookmarking to home screen should not
>>> have dramatically different security properties from just bookmarking.
>>> They ought to be identical (and indeed, have negligible effect).
>>
>> Thanks for clarifying. I think I agree with this, but I’m currently investigating what the implications are. I have a *very early* draft of this research, and the good thing is that at least IE on Windows 8, Firefox for Android, and Chrome for Android tend to agree with you (will push that part of the research to the document soon).
>
> So the differences I found that are not desirable are:
>
> * If I'm logged into Facebook and then open a hosted app that Facebook
> session is distinct. That's not the case in a browser today. I can see
> us adding a feature that allows Facebook to opt out of sharing its
> session with other sites, but by default I think we want the existing
> model. (And definitely not change it around based on a bookmark.)
>
> * When I click a link "the browser" opens. If the browser is core to
> the OS everything in it should have a URL (whether visible or not,
> that's an implementation detail) and navigation should happen
> seamlessly between them. I don't think we need any behavioral
> difference here between something that is bookmarked and something
> that is not. Of course apps can opt in to having their links open in a
> new window, using target=_blank.


Great stuff! I'll make sure the above is captured as requirements (and ultimately ends up in the spec).

>
> I think part of the disconnect might be that the web as an actual OS
> would be fundamentally different from what we have today. You cannot
> really compare it with an OS that has a browser or an apps market
> (such as Mac OS X and Firefox OS). The OS is the browser and the apps
> market is the web.

It would be great if you could expand this out in a blog post or something. We have a lot of work to do to shift people to this line of thinking (which I also agree with). I can then, with your permission, add it to the doc I'm working on.

If you think of anything else, please let me know!


>
>
> --
> http://annevankesteren.nl/

Jan Jongboom

Nov 19, 2013, 8:50:46 AM
On Monday, November 18, 2013 2:38:28 PM UTC+1, Anne van Kesteren wrote:
> There's still a large problem with these applications. That
> we're not giving them the same cache context as they would have inside
> the browser. That makes a lot of things that are normal interactions
> on the web today, a lot harder to implement. If we want to offer a no
> shared cache feature, it should be opt-in by the application I think.
I want to highlight this as well. Usability-wise, the model that we currently have is really bad.

Marcos Caceres

Nov 27, 2013, 2:04:10 AM
to Anne van Kesteren, dev-w...@lists.mozilla.org, Ben Francis, Jonas Sicking, dev-b2g
Apologies for top-posting, but here is the first public working-draft of the W3C manifest spec:

http://w3c.github.io/manifest/

I expect lots to change over the coming weeks.
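For flavour, a manifest at this stage looks roughly like the sketch below. The member names follow the draft and were still subject to change; the app name, paths, and icon are invented for illustration.

```javascript
// Illustrative web app manifest (hypothetical app; member names per the
// early draft spec and liable to change as the spec evolves).
const manifestText = `{
  "name": "Example App",
  "start_url": "/index.html",
  "display": "standalone",
  "icons": [{ "src": "/icon-128.png", "sizes": "128x128" }]
}`;

// A browser would fetch this JSON and parse it to learn how to
// present the app when it is added to a home screen.
const manifest = JSON.parse(manifestText);
```

A browser would typically discover the manifest via a `<link rel="manifest" href="/manifest.json">` in the app's start page.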

If people want to take part in the discussion, a bunch of us are working on it on the W3C IRC server (irc.w3.org:6665#manifest).

Otherwise, repo is here:
https://github.com/w3c/manifest

Please file any bugs or issues there. Pull requests welcome.

Kind regards,
Marcos

--
Marcos Caceres



Benjamin Francis

Jan 30, 2015, 9:48:46 AM
to dev-b2g, dev-w...@lists.mozilla.org, dev-...@lists.mozilla.org
Hi,

It feels like a good time to bring up this topic again.

One of the main themes in suggestions for Firefox OS 3.0 has been to make the OS more "webby", moving away from packaged apps to something inherently more web-like, and even turning the built-in Gaia apps into hosted apps.

When we last spoke in this thread, the W3C "Manifest for Web Application" specification [1] was at its first public working draft. That spec has recently reached an important milestone by being declared feature complete, and it is expected to transition to a Candidate Recommendation soon, having already been implemented in Chrome.

Service Workers seem to be moving along in Gecko [2], and have also been one of the hot topics in Firefox OS 3.0 discussions, particularly in relation to offline functionality.

There have been several proposals for how to provide privileged hosted apps, including work on hosted packages [3] and discussions around a security model.

There have been several prototypes demonstrated with hosted versions of Gaia apps doing all sorts of interesting things, and proposed design concepts around new ways of thinking about web apps, like Pinned Apps [4].

There are lots of separate teams working on things in this area so I thought it might be useful to share what everyone is working on. How close are we to being able to deprecate packaged apps? How are hosted privileged apps coming along? What's the latest thinking on a security model? How are Service Workers coming along? Where is the source code of some of the prototypes people have been working on for hosted Gaia apps? What is still missing?

Please share!

Ben

1. http://w3c.github.io/manifest/
2. https://bugzilla.mozilla.org/show_bug.cgi?id=903441
3. https://bugzilla.mozilla.org/show_bug.cgi?id=1036275
4. https://wiki.mozilla.org/FirefoxOS/Pinned_Apps

Benjamin Kelly

Jan 30, 2015, 10:02:32 AM
to Benjamin Francis, dev-w...@lists.mozilla.org, dev-...@lists.mozilla.org, dev-b2g
On Fri, Jan 30, 2015 at 9:48 AM, Benjamin Francis <bfra...@mozilla.com> wrote:
> How are Service Workers coming along?

We have a Q1 goal to pref Service Workers on in mozilla-central.  If we're successful with this goal I believe that would put SWs in gecko 39.

A lot of code is in review right now, but we still have some more work to do after that.  There is also at least one thorny bug we need to solve in order to fully support b2g:

  https://bugzilla.mozilla.org/show_bug.cgi?id=1125961

Hope that helps.

Ben

Benjamin Francis

Jan 30, 2015, 11:53:43 AM
to Andrew Overholt, Andrew Williamson, mozilla-d...@lists.mozilla.org, dev-b2g, dev-...@lists.mozilla.org
On 30 January 2015 at 16:03, Andrew Overholt <over...@mozilla.com> wrote:
> Is it possible for us to get a list of the non-standard APIs that are being used, sorted by usage?  I'd like to say work is being prioritized based on need :)

Yes, here's a snapshot of permissions used: http://people.mozilla.org/~bfrancis/images/permissions-2015-01-30.png

Most of the frequently used permissions do not require the use of a packaged app, but the second most used permission is systemXHR, which does. Apparently most developers are only using this because they created a packaged app purely for its offline properties, then found they needed systemXHR to talk to their own server. Which is silly. I hope Service Workers will help with this situation.

The proportion of apps in the Marketplace which actually need to be privileged is surprisingly small. Service Workers should hopefully make it smaller.
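For context, the CORS setup those developers are avoiding amounts to the server echoing an allowed origin back in a response header. A minimal sketch, with the helper name and origin strings invented for illustration:

```javascript
// Hypothetical helper: which CORS headers a server could send so a
// packaged app's synthetic app:// origin may read the response,
// removing the need for the systemXHR permission on the client.
function corsHeaders(requestOrigin, allowedOrigins) {
  return allowedOrigins.includes(requestOrigin)
    ? { 'Access-Control-Allow-Origin': requestOrigin, Vary: 'Origin' }
    : {}; // no header: the browser blocks the cross-origin read
}

// Example: an app's synthetic origin on the allow-list gets access.
const headers = corsHeaders('app://some-app-uuid', ['app://some-app-uuid']);
```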

David Rajchenbach-Teller

Jan 30, 2015, 12:03:40 PM
to Benjamin Francis, Andrew Overholt, Andrew Williamson, mozilla-d...@lists.mozilla.org, dev-b2g, dev-...@lists.mozilla.org
I found that a number of applications cannot be written without XHR,
starting with anything that looks like a feed reader, or anything that
needs to scrape a third-party website. We need a solution for this.

Cheers,
David

On 30/01/15 17:53, Benjamin Francis wrote:
> On 30 January 2015 at 16:03, Andrew Overholt <over...@mozilla.com> wrote:
>
> Is it possible for us to get a list of the non-standard APIs that
> are being used, sorted by usage? I'd like to say work is being
> prioritized based on need :)
>
>
> Yes, here's a snapshot of permissions used:
> http://people.mozilla.org/~bfrancis/images/permissions-2015-01-30.png
>
> Most of the frequently used permissions do not require the use of a
> packaged app, but the second most used permission is systemXHR which
> does. Apparently most developers are only using this because they
> created a packaged app purely for its offline properties, then found
> they needed systemXHR to talk to their own server. Which is silly. I
> hope Service Workers will help with this situation.
>
> The proportion of apps in the Marketplace which actually need to be
> privileged is surprisingly small. Service Workers should hopefully make
> it smaller.
>


--
David Rajchenbach-Teller, PhD
Performance Team, Mozilla


Benjamin Francis

Jan 30, 2015, 1:28:22 PM
to Andrew Overholt, Andrew Williamson, mozilla-d...@lists.mozilla.org, dev-b2g, dev-...@lists.mozilla.org
On 30 January 2015 at 18:17, Andrew Overholt <over...@mozilla.com> wrote:
>> Most of the frequently used permissions do not require the use of a packaged app, but the second most used permission is systemXHR which does. Apparently most developers are only using this because they created a packaged app purely for its offline properties, then found they needed systemXHR to talk to their own server. Which is silly. I hope Service Workers will help with this situation.
>
> Can you elaborate on this a bit?  Why do developers need to use systemXHR with their packaged app?  How do you envision Service Workers helping in this case?

As I understand it...

Developers create a packaged app because it's currently the most effective way to make their app work offline. A packaged app has a synthetic origin which will always be cross-origin from the developer's own web server. They use SystemXHR to allow their packaged app to use remote resources from their web server, rather than set up CORS because that's more difficult.

If they instead used Service Workers to make their app work offline, they wouldn't have a weird synthetic origin and therefore wouldn't need to use systemXHR because their app would be same-origin with their web server.
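A rough sketch of that Service Worker alternative is below. The asset paths, cache name, and origin are hypothetical, and the worker event wiring is left as comments since it only runs inside a real Service Worker scope; the routing decision is pulled into a plain function.

```javascript
// Hypothetical asset list for an app served from its own origin,
// so all fetches are same-origin and no systemXHR is needed.
const PRECACHE = ['/', '/index.html', '/app.js', '/style.css'];

// Pure helper: should a request be answered from the offline cache?
function isPrecached(url) {
  return PRECACHE.includes(new URL(url, 'https://app.example.com').pathname);
}

// Worker wiring (only meaningful inside a Service Worker scope):
// self.addEventListener('install', (e) =>
//   e.waitUntil(caches.open('app-v1').then((c) => c.addAll(PRECACHE))));
// self.addEventListener('fetch', (e) =>
//   e.respondWith(caches.match(e.request).then((hit) => hit || fetch(e.request))));

const offline = isPrecached('https://app.example.com/app.js'); // true
```

The app stays hosted at its own URL, works offline from the cache, and talks to its server same-origin.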

James Burke

Jan 30, 2015, 1:31:16 PM
to Benjamin Francis, dev-w...@lists.mozilla.org, dev-...@lists.mozilla.org, dev-b2g
On Fri, Jan 30, 2015 at 6:48 AM, Benjamin Francis <bfra...@mozilla.com> wrote:
> There have been several proposals of how to provide privileged hosted apps,
> including work around hosted packages [3] and discussions around a security
> model.

Link [3] (bug 1036275) seems mostly about how to reference resources
inside a zip file. It also seems to imply that the packaged app could
be hosted anywhere, not via a marketplace?

I would be interested to know how validation of the privileged app is
done (this thing I agreed to access these APIs is still the same
vetted thing). If there is some sort of signature checking going on,
it would be great to see that extended to certified apps, where we
allow certain signed apps access to the certified APIs.

This sort of model would allow gaia apps to go hosted. They also
likely need a way to make sure they can provide certain app bundles
for certain gecko versions.

Without those things, it is hard to see the gaia apps going to this
model. We will continue to get new APIs that we will be expected to
use behind a certified flag. The latest is the navigator.sync API.
That API is a great one to have for battery concerns, but it needs
service workers to be fully realized and is still new, so it is a
certified API. For users getting 2.2, though, ideally our apps would
use the API in the effort to extend battery life.

I would like to see gaia apps go to a hosted model (even just
marketplace hosting), since it gives us a dogfood way to test how we
expect other apps to be made.

Benjamin Kelly

Jan 30, 2015, 1:37:45 PM
to Benjamin Francis, Andrew Overholt, Andrew Williamson, mozilla-d...@lists.mozilla.org, dev-b2g, dev-...@lists.mozilla.org
On Fri, Jan 30, 2015 at 1:28 PM, Benjamin Francis <bfra...@mozilla.com> wrote:
> Developers create a packaged app because it's currently the most effective way to make their app work offline. A packaged app has a synthetic origin which will always be cross-origin from the developer's own web server. They use SystemXHR to allow their packaged app to use remote resources from their web server, rather than set up CORS because that's more difficult.

The "synthetic origin" is due to the app:// URL scheme?

Kevin Grandon

Jan 30, 2015, 1:40:22 PM
to Benjamin Francis, Andrew Overholt, Andrew Williamson, mozilla-d...@lists.mozilla.org, dev-b2g, dev-gaia
Do we really want to fully deprecate packaged apps? I think making hosted apps and packaged apps equal is a big win, but I'm not sure if fully deprecating packaged apps is a good idea.

There are many developers who like the ability to not run and maintain a server. It's simply less work and overhead for them to throw their code up on a marketplace, and not have to worry about a monthly server cost, or a server going down.

Best,
Kevin

On Fri, Jan 30, 2015 at 10:28 AM, Benjamin Francis <bfra...@mozilla.com> wrote:
> On 30 January 2015 at 18:17, Andrew Overholt <over...@mozilla.com> wrote:
>
>>> Most of the frequently used permissions do not require the use of a
>>> packaged app, but the second most used permission is systemXHR which does.
>>> Apparently most developers are only using this because they created a
>>> packaged app purely for its offline properties, then found they needed
>>> systemXHR to talk to their own server. Which is silly. I hope Service
>>> Workers will help with this situation.
>>
>> Can you elaborate on this a bit?  Why do developers need to use systemXHR
>> with their packaged app?  How do you envision Service Workers helping in
>> this case?
>
> As I understand it...
>
> Developers create a packaged app because it's currently the most effective
> way to make their app work offline. A packaged app has a synthetic origin
> which will always be cross-origin from the developer's own web server. They
> use SystemXHR to allow their packaged app to use remote resources from
> their web server, rather than set up CORS because that's more difficult.
>
> If they instead used Service Workers to make their app work offline, they
> wouldn't have a weird synthetic origin and therefore wouldn't need to use
> systemXHR because their app would be same-origin with their web server.

Fred Wenzel

Jan 30, 2015, 1:48:04 PM
to James Burke, dev-w...@lists.mozilla.org, Benjamin Francis, dev-b2g, dev-...@lists.mozilla.org
On Fri, Jan 30, 2015 at 10:31 AM, James Burke <jrb...@gmail.com> wrote:
> On Fri, Jan 30, 2015 at 6:48 AM, Benjamin Francis <bfra...@mozilla.com> wrote:
>> There have been several proposals of how to provide privileged hosted apps,
>> including work around hosted packages [3] and discussions around a security
>> model.
>
> [...]
>
> I would like to see gaia apps go to a hosted model (even just
> marketplace hosting), since it gives us a dogfood way to test how we
> expect other apps to be made.

I'm particularly interested in this piece as well. If we can prototype
a pattern for "Serviceworkers-backed, self-updating, offline-capable
web apps" (snappy name I know, rolls right off the tongue), then I'd
be happy to put engineering resources behind the developer ergonomics
of this. (In fact, I'm working on a plan for exactly that).

These kinds of apps have the appeal that they can work anywhere™ based
on standards (essentially, wherever SWs already are, or wherever we
can make them work).

That's not to say we don't have a good reason for certified packaged
apps right now. The security model really is the pivot point here.
With verifiable (signed), hosted updates, you seem to be on to
something.

~F

Naoki Hirata

Jan 30, 2015, 1:54:54 PM
to Kevin Grandon, mozilla-d...@lists.mozilla.org, Benjamin Francis, Andrew Overholt, Andrew Williamson, dev-gaia, dev-b2g
I have to agree with Kevin.  My biggest concerns are for performance, offline cases, and issues with caching with full hosted apps.  

An example is emergency calling.  If the dialer was fully hosted and there was no wifi connection, how would you make a call?  This scenario would require you to have a SIM with a data plan in order to access the app, just to go through the RIL, when it should be able to access the RIL directly without having a SIM.

Even having partial packaged apps could help in this case.

Regards,
Naoki

Benjamin Francis

Jan 30, 2015, 2:04:20 PM
to Kevin Grandon, Andrew Overholt, Andrew Williamson, mozilla-d...@lists.mozilla.org, dev-b2g, dev-gaia
On 30 January 2015 at 18:40, Kevin Grandon <kgra...@mozilla.com> wrote:
> Do we really want to fully deprecate packaged apps? I think making hosted apps and packaged apps equal is a big win, but I'm not sure if fully deprecating packaged apps is a good idea.

Packaged apps are not web apps. They don't have a URL so are not linkable, crawlable, indexable, searchable or discoverable like the rest of the web and they are always proprietary to one OS. We tried to standardise packaged apps, and it didn't work out. Packaged apps were always intended as a temporary solution, but we never figured out the security model to allow us to get rid of them.
 

> There are many developers who like the ability to not run and maintain a server. It's simply less work and overhead for them to throw their code up on a marketplace, and not have to worry about a monthly server cost, or a server going down.

If that really is a problem then Mozilla should offer web app hosting, which is what we are essentially doing, but we're using packaged apps which force a centralised app store model. We can never possibly compete using that model because it relies on app developers submitting their apps to our app store. We have the same chicken and egg bootstrapping problem as every other proprietary OS.

The only way the web can win is if the web itself is the one source of truth, all web apps have a URL, they use a standard cross-platform manifest format and anyone can create their own curated collection of those apps. If someone creates a web app with a W3C web app manifest so that it can be installed on Android via Google Chrome, then it is automatically useful on Firefox OS too. And every OS which supports the web.

Having all web apps be hosted and cross-platform means that rather than trying to convince app developers to submit their proprietary Firefox apps to our empty Firefox OS app store, we just crawl the web looking for web manifests of apps people already created for the web.

To save me ranting further in this thread, you can read more about my take on this here :) https://slack-files.com/T033ZPYCR-F03BRU38Z-1ce3da2e74

Dale Harvey

Jan 30, 2015, 2:04:40 PM
to Naoki Hirata, mozilla-d...@lists.mozilla.org, dev-b2g, Benjamin Francis, Andrew Overholt, Andrew Williamson, dev-gaia, Kevin Grandon
I don't think it's necessarily about removing packaged apps completely; they do have their advantages, and building out the ability for web applications to run in a peer-to-peer fashion is a good thing to do.

However I think Naoki's questions illustrate the point really well


> An example is emergency calling.  If the dialer was fully hosted without a wifi connection, how do you make a call?

Web applications can and should be able to work fully offline and perform just as well as packaged applications. We currently aren't doing a lot to make the offline web a better place, since we pretty much gave up and used packaged apps. I do hope Service Workers improve the situation.

Every time we build functionality where people need to do custom Firefox packaging for it to be used, we should look at figuring out how to make it available to web content.

On 30 January 2015 at 18:54, Naoki Hirata <nhi...@mozilla.com> wrote:
> I have to agree with Kevin.  My biggest concerns are for performance, offline cases, and issues with caching with full hosted apps.
>
> An example is emergency calling.  If the dialer was fully hosted without a wifi connection, how do you make a call?  This scenario would require you to have a SIM w/ a data plan in order to access the app that is needed just to go through the ril when it should just be able to access the ril directly without having to have a SIM.
>
> Even having partial packaged apps could help in this case.
>
> Regards,
> Naoki
>
> On Jan 30, 2015, at 10:40 AM, Kevin Grandon <kgra...@mozilla.com> wrote:
>
>> Do we really want to fully deprecate packaged apps? I think making hosted apps and packaged apps equals is a big win, but I'm not sure if fully deprecating packaged apps is a good idea.
>>
>> There are many developers who like the ability to not run and maintain a server. It's simply less work and overhead for them to throw their code up on a marketplace, and not have to worry about a monthly server cost, or a server going down.
>>
>> Best,
>> Kevin

Benjamin Francis

Jan 30, 2015, 2:09:08 PM
to Naoki Hirata, mozilla-d...@lists.mozilla.org, dev-b2g, Andrew Overholt, Andrew Williamson, dev-gaia, Kevin Grandon
On 30 January 2015 at 18:54, Naoki Hirata <nhi...@mozilla.com> wrote:
> I have to agree with Kevin.  My biggest concerns are for performance, offline cases, and issues with caching with full hosted apps.

As Dale says, this is exactly why we need Service Workers. I'm not suggesting that all apps should require an Internet connection all the time; that would never work :)

Currently people are creating packaged apps so that their apps work offline. But I would argue that packaged apps miss out on most of the benefits of the web.

Andrew Overholt

Jan 30, 2015, 2:11:04 PM
to Naoki Hirata, mozilla-d...@lists.mozilla.org, dev-b2g, Benjamin Francis, Andrew Williamson, dev-gaia, Kevin Grandon
On Fri, Jan 30, 2015 at 1:54 PM, Naoki Hirata <nhi...@mozilla.com> wrote:
> An example is emergency calling.  If the dialer was fully hosted without a wifi connection, how do you make a call?  This scenario would require you to have a SIM w/ a data plan in order to access the app that is needed just to go through the ril when it should just be able to access the ril directly without having to have a SIM.

I would think we'd always want something required for certification to be packaged or built-in or something.

Samuel Foster

Jan 30, 2015, 2:11:41 PM
to Naoki Hirata, mozilla-d...@lists.mozilla.org, dev-b2g, Benjamin Francis, Andrew Overholt, Andrew Williamson, dev-gaia, Kevin Grandon
The proposal is that the act of "installing" the app fetches all indicated resources and persists them. Apps that today are packaged wouldn't need to touch the network at all except for optional, periodic update pings. You could also preload apps like the dialer and other core/certified apps, so they are functional from the first run with no need to ever see a network. Nonetheless they are tied to an origin and could be updated using similar mechanisms to any page with a serviceworker.
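Under that model, an update could be as simple as bumping a versioned cache name and dropping stale caches when a new worker activates. A sketch with hypothetical cache names; the event wiring is commented out since it only applies inside a worker scope:

```javascript
// Hypothetical versioned cache name; bumping it causes the next
// activation to refresh the persisted resources.
const CACHE_VERSION = 'app-v2';

// Pure helper: which previously persisted caches are now stale?
function staleCaches(names) {
  return names.filter((n) => n.startsWith('app-') && n !== CACHE_VERSION);
}

// Worker wiring (only meaningful inside a Service Worker scope):
// self.addEventListener('activate', (event) => {
//   event.waitUntil(caches.keys().then((names) =>
//     Promise.all(staleCaches(names).map((n) => caches.delete(n)))));
// });

const toDelete = staleCaches(['app-v1', 'app-v2', 'other-cache']);
```

The same mechanism works whether the resources arrived over HTTP, were preloaded at flash time, or were sideloaded, as long as they are keyed to the app's origin.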

I know this presents difficulties today. I would like to understand them better, as I think this stuff is *really* important. If an app doesn't have a meaningful URL it's not really a web app in any real sense, IMO.

/Sam


On Fri, Jan 30, 2015 at 10:54 AM, Naoki Hirata <nhi...@mozilla.com> wrote:
> I have to agree with Kevin.  My biggest concerns are for performance, offline cases, and issues with caching with full hosted apps.
>
> An example is emergency calling.  If the dialer was fully hosted without a wifi connection, how do you make a call?  This scenario would require you to have a SIM w/ a data plan in order to access the app that is needed just to go through the ril when it should just be able to access the ril directly without having to have a SIM.

Benjamin Francis

Jan 30, 2015, 2:22:15 PM
to Andrew Overholt, mozilla-d...@lists.mozilla.org, Kevin Grandon, Naoki Hirata, Andrew Williamson, dev-gaia, dev-b2g
On 30 January 2015 at 19:10, Andrew Overholt <over...@mozilla.com> wrote:
> I would think we'd always want something required for certification to be packaged or built-in or something.

Before Gaia apps were packaged apps, we used to pre-populate an appcache when generating a profile to flash to the device. We could do the same thing with a Service Worker cache, maybe even sign the cache [1]. Or we could still use a package, but a pre-cached copy of a signed hosted package [2], which has a real URL and can be updated without updating the whole OS.

1. https://groups.google.com/forum/#!topic/mozilla.dev.webapi/PicfHG9Figk
2. https://bugzilla.mozilla.org/show_bug.cgi?id=1036275

Samuel Foster

Jan 30, 2015, 2:22:31 PM
to Dale Harvey, mozilla-d...@lists.mozilla.org, Kevin Grandon, Naoki Hirata, Benjamin Francis, Andrew Overholt, Andrew Williamson, dev-gaia, dev-b2g
On Fri, Jan 30, 2015 at 11:04 AM, Dale Harvey <da...@arandomurl.com> wrote:
> I dont think its necessarily about removing packaged apps completely, they do have their advantages and building out the ability for web application run in a peer to peer fashion is a good thing to do.


There's a thing in here about conflating transport with delivery format and origin which I'm struggling a bit with. ISTM that we should be able to request resources with whatever mechanism and protocol is available and still have meaningful URIs for those resources. If that's P2P, or just sideloading apps or updates from a zip or sdcard, it would be nice to be able to keep an association between the origin and those resources (and the trust that implies).

/Sam

Naoki Hirata

Jan 30, 2015, 2:25:37 PM
to Benjamin Francis, mozilla-d...@lists.mozilla.org, dev-b2g, Andrew Overholt, Andrew Williamson, dev-gaia, Kevin Grandon
Interesting case in point for Service Workers.  I think I'm ignorant about the technology, and being in QA, I tend to play the devil's advocate.  Don't get me wrong, I do wish for what you're stating.  I am concerned about throwing things over the wall to Service Workers; I get the feeling we'll still run into challenges to confront offline cases.

An edge case example of what I'm worried about is:
Camping in the middle of nowhere, and then the phone dies.  You're able to charge the device via a Pan Charger ( http://www.slashgear.com/pan-charger-boils-your-iphone-battery-back-to-life-21160661/ ) or some similar device.  Will the service worker still be able to launch the dialer for emergency calls without having a net connection?  I guess my assumption here is that the service worker relies on information in cache and being active.  What happens when that gets disrupted?

I'm trying to read up on it a little here : http://www.w3.org/TR/service-workers/

Regards,
Naoki


On Jan 30, 2015, at 11:08 AM, Benjamin Francis <bfra...@mozilla.com> wrote:

> On 30 January 2015 at 18:54, Naoki Hirata <nhi...@mozilla.com> wrote:
>> I have to agree with Kevin.  My biggest concerns are for performance, offline cases, and issues with caching with full hosted apps.

Benjamin Francis

Jan 30, 2015, 2:29:04 PM1/30/15