Module sharing


Irakli Gozalishvili

Feb 27, 2012, 4:55:11 PM
to Jetpack
Hi Folks,

I wrote a proposal on how we can improve the module-sharing story for SDK add-on devs:

If you have any feedback, I'm more than happy to hear it.

P.S.: There is intentionally no word about tooling, as it can easily be built once we have the foundation in place.
--
Irakli Gozalishvili
Web: http://www.jeditoolkit.com/

Andrew Sutherland

Feb 27, 2012, 6:33:15 PM
to mozilla-la...@googlegroups.com
On 02/27/2012 01:55 PM, Irakli Gozalishvili wrote:
> I wrote a proposal on how we can improve module sharing story for SDK
> add-on devs:
> https://github.com/mozilla/addon-sdk/wiki/JEP-packageless

It might be useful to expand on the rationale section and to include a
section that describes how the changes impact interoperability with
other CommonJS-ish environments and why the changes are okay. Right
now, it seems like you are trying to shoe-horn all of the meta-data
about dependencies into the require() directive in order to avoid use of
package.json at the cost of CommonJS-ish compatibility. It's not clear
why this is a win.

Some specific thoughts:

I like name-spacing the jetpack pages under "addon-kit" and "api-utils";
it always struck me as dumb that everything by default sits at the
top-level. But the "@" prefix seems unnecessary because it's going to
make node.js and similar unhappy for no good reason.

The version specification seems too flexible and not flexible enough.
Why would I want to specify the example you provide of "panel;1.5"? Am
I manually going to need to find out what version I'm writing the code
against and put that in there and update it manually? Is Firefox really
going to include 1.5, 1.6, 1.7, 1.8, ...? I could see wanting to be able
to evolve the API interface, but why not just create a "panel2.js" when
the time comes if there are backwards-incompatible changes?

I use RequireJS for a lot of my web JS logic, so the syntax for assets
seems fine to me. The example has the "asset" helper return a URI which
seems like a good idea for that data type. RequireJS also has, for
example, a "text" loader plugin that instead will return the actual
contents of the text file. (This is especially nice for web delivery
where you can use the r.js optimizer to include the text file in the
optimized .js file.) Would you support something like the text loader
plugin, or is the goal really just to have the require statements there
for dependency-specifying side-effects?

Andrew

Ben Bucksch

Feb 27, 2012, 7:01:30 PM
to mozilla-la...@googlegroups.com
Are you proposing that modules are automatically fetched from the web
based on a require("http...")?

That sounds
1) dangerous, because you can't know what will be on that URL tomorrow.
Server hacked, game over. At the minimum, you'd need a SHA hash of the
file content included in the require, next to the URL
2) unreliable, because I start to depend not on just one download
server but on several, most of them not under my control.

If this is about sharing resources, I'd rather suggest keeping the
current mechanism of shipping the modules with each add-on, then doing
a SHA hash on the source at runtime, and if the same file (same SHA) is
already loaded, using that.

Ben

Ben Bucksch

Feb 27, 2012, 7:06:20 PM
to mozilla-la...@googlegroups.com
On 28.02.2012 01:01, Ben Bucksch wrote:
> Are you proposing that modules are automatically fetched from the web
> based on a require("http...")?

On second reading, it seems like you don't. In that case, I would
recommend dropping the "http" from the unique identifier and just using
"example.com/foo", to remove confusion and not oblige me to commit to a
certain URL on my webserver forever.

Irakli Gozalishvili

Feb 27, 2012, 10:14:46 PM
to mozilla-la...@googlegroups.com
On Monday, 2012-02-27 at 15:33 , Andrew Sutherland wrote:
On 02/27/2012 01:55 PM, Irakli Gozalishvili wrote:
I wrote a proposal on how we can improve module sharing story for SDK
add-on devs:

It might be useful to expand on the rationale section and to include a
section that describes how the changes impact interoperability with
other CommonJS-ish environments and why the changes are okay. Right
now, it seems like you are trying to shoe-horn all of the meta-data
about dependencies into the require() directive in order to avoid use of
package.json at the cost of CommonJS-ish compatibility. It's not clear
why this is a win.

While it's true that we want to get rid of packages as a code-sharing
mechanism in favor of modules, we in no way intend to make code sharing
with other CommonJS-ish environments harder. In fact, this change may
even improve it. I'll add a section about sharing with other CommonJS
environments.
 

Some specific thoughts:

I like name-spacing the jetpack pages under "addon-kit" and "api-utils";
it always struck me as dumb that everything by default sits at the
top-level. But the "@" prefix seems unnecessary because it's going to
make node.js and similar unhappy for no good reason.

I don't quite see why the `@` prefix would make node unhappy; in fact, it
will make code sharing via adapters easier. For example, one could drop
`node_modules/@panel.js` into a package to provide a node adapter
(probably something other than panel would be a better example). It's
also possible to do adapters the other way round; for example, node's
`fs` could easily be adapted using `@modules/fs.js`. Also, `@` is mainly
there to signify that the code comes from Firefox and nowhere else.
The version specification seems too flexible and not flexible enough.
Why would I want to specify the example you provide of "panel;1.5"? Am
I manually going to need to find out what version I'm writing the code
against and put that in there and update it manually? Is Firefox really
going to include 1.5, 1.6, 1.7, 1.8, ... ?I could see wanting to be able
to evolve the API interface, but why not just create a "panel2.js" when
the time comes if there are backwards-incompatible changes?
The only time you will ever use 'panel;1.5' is if panel ever makes a
backwards-incompatible change and you want some time to make a
transition. In that regard it follows XPCOM.
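
For illustration, the proposed `name;version` form splits mechanically (a hypothetical parser, not part of the proposal's tooling):

```javascript
// Hypothetical parser for the proposed "name;version" require IDs,
// e.g. 'panel;1.5'; a missing version means "latest available".
function parseId(id) {
  const [name, version = null] = id.split(';');
  return { name, version };
}
```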

I use RequireJS for a lot of my web JS logic, so the syntax for assets
seems fine to me. The example has the "asset" helper return a URI which
seems like a good idea for that data type. RequireJS also has, for
example, a "text" loader plugin that instead will return the actual
contents of the text file. (This is especially nice for web delivery
where you can use the r.js optimizer to include the text file in the
optimized .js file.) Would you support something like the text loader
plugin, or is the goal really just to have the require statements there
for dependency-specifying side-effects?

At the moment the goal is just the "dependency-specifying side effect",
but in the future we might add a 'text' loader as an enhanced
alternative to `require('self').data.load`. That being said, it would be
a special case, and it's very unlikely that we'll add dynamic loader
plugins.

Andrew


Thanks, Andrew, for the feedback. I hope my comments make sense; if not, please follow up.
 

Irakli Gozalishvili

Feb 27, 2012, 10:24:00 PM
to mozilla-la...@googlegroups.com

On Monday, 2012-02-27 at 16:06 , Ben Bucksch wrote:

On 28.02.2012 01:01, Ben Bucksch wrote:
Are you proposing that modules are automatically fetched from the web
based on a require("http...")?

On second reading, it seems like you don't.

Correct! There will be no runtime fetch; it will happen only during
development, either by running a special command or automatically when
running `cfx run/xpi` (it's still unclear, but tooling will come in a
later iteration regardless, so it's out of scope for now).
 
In that case, I would
recommend to drop the "http" from the unique identifier and just use
"example.com/foo", to remove confusion and not oblige me to commit to a
certain URL on my webserver forever.

In fact, there is a note under the example saying that the protocol is
optional, so you could easily write '!example.com/foo' instead. The `!`
prefix is required to distinguish such IDs from bundled dependencies
found under `@modules`. Also note that there are libraries like
`socket.io`, so we need a way to distinguish between them. BTW, the tool
installing dependencies will be able to handle redirects, so moving to a
different server should not be such a big issue anyway.
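
Putting the prefix conventions from this thread together, a loader could classify IDs along these lines (a sketch; the function and category names are my own, not the JEP's):

```javascript
// Sketch of the prefix conventions discussed here: '@' marks system
// modules, '!' marks URL-installed dependencies, './' and '../' are
// relative, and everything else is a bundled dependency under @modules.
function classifyId(id) {
  if (id.startsWith('@')) return 'system';
  if (id.startsWith('!')) return 'remote';
  if (id.startsWith('./') || id.startsWith('../')) return 'relative';
  return 'bundled';
}
```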

Irakli Gozalishvili

Feb 27, 2012, 10:28:16 PM
to mozilla-la...@googlegroups.com
Thanks Ben for the feedback!


Regards

--
Irakli Gozalishvili
Web: http://www.jeditoolkit.com/

James Burke

Feb 28, 2012, 2:04:24 PM
to mozilla-labs-jetpack
On Feb 27, 1:55 pm, Irakli Gozalishvili <rfo...@gmail.com> wrote:
> I wrote a proposal on how we can improve module sharing story for SDK add-on devs: https://github.com/mozilla/addon-sdk/wiki/JEP-packageless

The items in the Rationale section can be accomplished by using a
/* package.json */ comment in the JS module, instead of adding more
things to module IDs, which makes them harder to visually parse and
makes the modules less portable to other environments, in particular
AMD-based web module loading.

I also think a package.json-like structure is clearer -- there needs
to be an area where the module broadcasts both what it depends on and
also what it provides (the license terms, author info), all in a way
that can be consumed by tools.

Reusing a package.json structure lowers the cross-environment
translation burden, and for cases where more than one module is
provided via a directory of modules, the in-out membrane knowledge can
be contained in a package.json file, without having to use longer IDs
inside the js code itself.
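
For instance, a tool could read such a comment out of the module source along these lines (a rough sketch; real tooling would need a more careful grammar than this regex):

```javascript
// Extract a /*package.json ... */ comment from module source and
// parse its body as JSON; returns null when no such comment exists.
function readPackageComment(source) {
  const match = source.match(/\/\*\s*package\.json\s*([\s\S]*?)\*\//);
  return match ? JSON.parse(match[1]) : null;
}
```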

npm is open to allowing /*package.json */ type of comments:
https://github.com/isaacs/npm/issues/1772

While that integration is likely just a way to convert the js file into
a tarball, once that tarball is pulled out of npm, a jetpack tool can
look in the tarball, and if it holds only a single JS file with a
/* package.json */ comment, it will know what to do.

In fact, I do something like this for volo right now. volo is a
command line tool that fetches dependencies from github or URLs to a
zipball/single JS file:

https://github.com/volojs/volo

It pulls down zipballs from github, unpacks them, then uses some rules
to figure out what to keep from the zipball, often it is just a single
JS file:

https://github.com/volojs/volo/wiki/add-dependency-rules

The other plus: volo knows how to pull down project templates and
allows running an onCreate command inside the project template to do
further setup. Here is a video walkthrough for a "responsive appcache
webapp":

http://tagneto.blogspot.com/2012/02/template-for-responsive-mobile-offline.html

volo could be very useful for jetpack module sharing too. You get the
tooling for "free" if /*package.json */ is used. I'm sure volo would
need some work to fit fully into an ideal jetpack flow, but I'm open
to doing that work.

In any case, I can see greater value in centralizing the intake/output
meta info about a module into one structure instead of spreading it
around within the module, where the output meta is somewhere else and
the module IDs get longer and more complex-looking to encode intake
meta.

James

Irakli Gozalishvili

Feb 28, 2012, 2:26:18 PM
to mozilla-la...@googlegroups.com
Hi James,

I will look into volo once again, but I think it should be able to work
with this proposal as well. We would like to keep simple cases simple,
while for more complex use cases users are more than welcome to use
other tooling, require additional metadata, etc. Use of URLs solves any
name-spacing issues for us without any tooling, centralized registry,
or metadata. In addition, that's the way we expect modules on the web
to look once we get harmony modules. In other words, you don't
necessarily have to use URL forms for dependencies; you could just put
whatever you need into the `@modules` folder, and tools like "volo"
would be helpful for doing this. Still, we would like to support very
simple cases where you just want to use a module from a URL, similar to
script tags or harmony modules.

So if volo will just organize dependencies in a `@modules` folder, it
will just work fine, or am I missing something?


Regards

--
Irakli Gozalishvili
Web: http://www.jeditoolkit.com/

Ben Bucksch

Feb 28, 2012, 2:51:35 PM
to mozilla-la...@googlegroups.com
On 28.02.2012 20:04, James Burke wrote:
> The items in the Rationale section can be accomplished by using a
> /* package.json */ comment in the JS module, instead of adding more
> things to module IDs that make them harder to visually parse
>
> ... what it depends on and also what it provides (the license terms, author info)

I like that idea. The require() should remain unchanged, and all this
information should be inside the module. This seems clean and
straightforward to me.

With the changed require() syntax, it is difficult to comprehend how it
works and what it causes, and it doesn't allow giving all the
information. Moreover, the URL of a module can change and shouldn't be
in every file that uses the module, but only in the module itself. This
will be very important, as URLs are bound to change, often without a
redirect being possible.

Will Bamberg

Feb 28, 2012, 3:06:58 PM
to mozilla-la...@googlegroups.com
The proposal lists this as a problem to be solved by packageless modules:

> Packages require repositories for publishing them

I don't think that we need repositories because we have packages. I
think we (might decide we) want repositories in order to provide places
where people can:

* find modules providing certain APIs (...that are compatible with
particular applications and application versions)
* be given help choosing a particular module, if there are several
options (by looking at download statistics, for instance)
* have some assurance that the module is not evil, and that it does what
it claims to (by having a review system, for instance)
* browse the documentation for the module

We might or might not think this is a good idea (I do), but it seems to
me to be independent of whether we have packages.

Will

Irakli Gozalishvili

Feb 28, 2012, 5:32:29 PM
to mozilla-la...@googlegroups.com
In fact, I totally agree and would like to make them independent of packages. Unfortunately, with packages it's not independent, as central registries exist to solve naming conflicts, something URLs already have built in.
Will

Irakli Gozalishvili

Feb 28, 2012, 5:41:55 PM
to mozilla-la...@googlegroups.com
As I already mentioned, it's still possible to install your dependencies
from wherever, with whatever tool, and continue using the short syntax
as you do today (with the difference that there will be an actual place
for putting them). Arguably this brings a lot of other complexity (name
conflicts, publishing, etc.), but it's your choice. On the other hand,
if I just want to use underscore.js, then instead of manually
downloading it into my dependencies I will be able to write
require('!underscorejs.org/underscore') to do that automatically for
me. Also, there is nothing special about the !foo.com form; it's just a
convention that will be recognized to automate the manual
dependency-download task.

James Burke

Feb 28, 2012, 8:24:38 PM
to mozilla-labs-jetpack
On Feb 28, 11:26 am, Irakli Gozalishvili <rfo...@gmail.com> wrote:
> Hi James,
>
> I will look into volo once again, but I think it should be able to work with this proposal as well. We would like to
> keep simple cases simple, while for more complex use cases users are more then welcome to use other tooling,
> require additional metadata, etc. Use of urls solves any name-spacing issues for us without any tooling or centralized
> registry or metadata.

The ID syntax in the JEP-packageless proposal is effectively metadata --
the IDs are not needed at runtime, only at install time. It just happens
that you will be building that part of the tooling into jetpack, but the
same could be done with /* package.json */ scanning for dependencies.

I was just mentioning that there is other tooling available now for
that approach if you preferred to not build that installation logic
into jetpack itself. But either way, there is some tool (jetpack
itself or otherwise) processing some metadata about a dependency.

Mixing installation info in the module dependency ID is mixing up
installation concerns with runtime concerns: the require() ID should
just indicate what module is being requested, and it should operate
more like an interface ID. To expand on that:

With the underscore example, it would be bad if one module specified
'!underscorejs.org/underscore' but a different module used
'!raw.github.com/documentcloud/underscore/1.2.1/underscore'. Ideally
the developer should choose to resolve to one version of underscore and
go with that.

While Node's nested "node_modules" might work for Node, browser-based
code should avoid loading multiple versions of code. In addition, for
things like common event routers, using only one singleton for the
routing is important.

If the answer is "all modules should use '!underscorejs.org/underscore'
as the ID", that makes it hard to swap in equally capable
implementations of a dependency. For instance, Zepto can be used to
fulfill the 'jquery' dependency.

So, I believe it is best to express installation hints separately from
the runtime require IDs used in code.

> In addition that's a way we expect modules on the web will look like once we'll get a harmony modules. In other words you don't necessary have to use url forms for dependencies you could just put whatever you
> need into `@modules` folder and tools like "volo" would be helpful for doing this. Still we would like to support very simple cases where you just want to use module from the url, similar to script tags or harmony modules.

Jetpack seems to need to install modules before using them because it
wants some security guarantees that they do not change over time /
after the addon installation (although maybe I'm going on old
information). It is just a "fetch once from the network during dev
bootstrap, then use it" model. That seems like a dependency install
tool, just a built-in one.

I can see the case for wanting to specify a full, regular URL, but the
syntax in that JEP proposal is not that. I would rather see it just be
regular URLs or module IDs, but strongly encourage module IDs to get
the benefits of the shorter "interface" names they imply. Those
install hints are not needed once runtime execution begins, and those
package.json comments could even be stripped out, if that made sense.

> So if volo will just organize dependencies in a @modules folder it will just work fine, or do I miss something ?

volo will install the dependencies in the current working directory,
unless there is a scripts or js directory in the working directory
that is available.

This brings up something else: I'm also unclear on what @${name}
signifies. It just seems like a module namespace name, like a
"package" name. Not sure why @ is needed for that.

James

Irakli Gozalishvili

Feb 29, 2012, 1:56:42 AM
to mozilla-la...@googlegroups.com
I think all this boils down to convention vs. configuration, where !foo
is the suggested convention and /* package.json */ is the
configuration. I do agree that the latter is more flexible in many
ways, but at the cost of complexity. I'm trying to provide a simple
solution out of the box and make more complex solutions possible via
alternative tooling. I have updated the proposal to include more
details about this (added sections about glue / adapter modules).
 
In addition that's a way we expect modules on the web will look like once we'll get a harmony modules. In other words you don't necessary have to use url forms for dependencies you could just put whatever you
need into `@modules` folder and tools like "volo" would be helpful for doing this. Still we would like to support very simple cases where you just want to use module from the url, similar to script tags or harmony modules.

Jetpack seems to need to install modules first before using them
because it wants some security guarantees that they do not change over
time/after the addon installation. Although maybe I'm going on old
information. It is just a "fetch once from the network during dev
bootstrap, then use it". That seems like a dependency install tool,
just a built in one.

No, that's still the case for jetpack. The new version includes some
details about the differences between harmony modules and possible SDK
modules.

I can see the case for wanting to specify a full, regular URL, but the
syntax in that JEP proposal is not that. I would rather see it just be
regular URLs or module IDs, but strongly encourage module IDs to get
the benefits of the shorter "interface" names they imply. Those
install hints are not needed once runtime execution begins, and those
package.json comments could even be stripped out, if that made sense.

It's just a convention; in Java, for example, org.foo.bar does not mean
much either, it's just a convention. Same here: it's not quite a URL,
as semantically it behaves differently, but it's a simple enough
convention to translate from one to the other. That being said, I'm
still not quite sure the convention is better than an actual URL; there
are pros and cons on both sides. Also, if I have to type a URL, I don't
see why typing it in /* package.json */ is better than in require
itself.
So if volo will just organize dependencies in a @modules folder it will just work fine, or do I miss something ?

volo will install the dependencies in the current working directory,
unless there is a scripts or js directory in the working directory
that is available.

This brings up something else: I'm also unclear on what @${name}
signifies. It just seems like a module namespace name, like a
"package" name. Not sure why @ is needed for that.


We plan to move SDK into firefox. Also we expect that other teams will provide
new APIs in firefox via SDK modules. @ prefix is there to signify that code
is from standard library / platform similar to harmony @dom. 

James Burke

Feb 29, 2012, 11:55:38 PM
to mozilla-labs-jetpack
On Feb 28, 10:56 pm, Irakli Gozalishvili <rfo...@gmail.com> wrote:
> I think all this boils down to convention vs configuration, where !foo is suggested convention and /* package.json */
> is a configuration. I do agree that later one is more flexible in many ways but at the cost of complexity. I'm trying to
> provide simple solution out of the box and make more complex solutions possible via alternative tooling. I have updated proposal to include more details about it (added sections about glue / adapter modules).

For me, the convention is "place the module at a baseUrl location and
it just works". So if the dependency is "underscore" and that is at
baseUrl + underscore.js then everything works out.
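
In code, that baseUrl convention amounts to something like this (a sketch; the `resolve` helper is invented for illustration):

```javascript
// Sketch of the baseUrl convention: the short ID 'underscore'
// resolves to baseUrl + 'underscore.js'.
function resolve(baseUrl, id) {
  // Ensure exactly one trailing slash on the base before joining.
  return baseUrl.replace(/\/?$/, '/') + id + '.js';
}
```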

If the system needs more information about which particular underscore
is suggested at "install time" if there is not already an underscore
installed by the developer, then look in the configuration. For me,
the require module ID domain/version stuff or /*package.json*/, those
are equivalent ways of providing that config, just different places to
do it.

Since require is a runtime command, adding install info to that
runtime value does not seem like the right separation of concerns, and
it seems to make it less clear how to allow alternative "underscore"
providers. It seems to enforce a very specific implementation version.
Even if another one could be allowed because it is found at
baseUrl + underscore.js (so that the domain/version installation info
is not consulted), it seems confusing.

I would not base too much on how harmony modules are currently
constructed, either. They are not ready yet. In particular, they should
allow for short meaningful interface names like 'jquery' or
'underscore' instead of full URLs for the same reasons as above, and a
custom resolver should not need to be provided. Otherwise, it is not
much better than having to ask people to use an AMD module loader --
the problem is asking developers to carry a script dependency that
enforces the conventions they want, and it is best if the short names
that map to a baseUrl + shortname.js are supported as the default
resolver so a library would not have to be provided.

Anyway, I'm starting to monologue, and repeat the same points, so I'll
stop posting now.

James

Alexandre poirot

Mar 28, 2012, 6:51:40 PM
to mozilla-la...@googlegroups.com
It is not really clear to me how this will work in mozilla-central, where we already have a file layout and distinct components in place.
For example, if you take this bug: https://bugzilla.mozilla.org/show_bug.cgi?id=708984
A module is moved from devtools to toolkit.
Toolkit would most likely contain such a "standard library":
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/toolkit/content
But devtools components would expect to have another standard library, that would be at a very different place:
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/browser/devtools/shared/

Would devtools contain some extra configuration somehow to register this devtools-specific standard library? Or is toolkit going to be the only standard library folder?

Otherwise, is there an error here? Or is it expected that @tabs maps to /sdk/tabs.js? Shouldn't it be just /tabs.js?
require('@tabs')                 // require('resource://modules/sdk/tabs.js')
require('@sdk/window/events')    // require('resource://modules/sdk/window/events.js')
The standard library feature seems to require some "package" configuration, and it introduces some magic into how the search works. I'd prefer an explicit absolute path, like require("addon-kit/tabs"), require("toolkit/promise"), and so on.
It looks like it would simplify the whole thing to make no difference between external dependencies and the standard library.


Finally, what about existing packages? Are we going to keep supporting them? We still agree that we need at least one "package" per add-on, as we need a package.json for the add-on name, description, etc. Are we going to support only this special root/add-on package and drop all others? It would mean that all add-ons using dependencies as packages would break. External dependencies look like a package, except that cfx will ignore package.json and just search for CommonJS modules. What about data files? I'm pretty sure you wrote something about them around the metadata proposal. It sounds almost like a blocker to me, as you won't be able to use localization modules without such a feature (support for data files and the like).

That's a lot of questions, feel free to pick most relevant ones!

Irakli Gozalishvili

Mar 29, 2012, 12:06:27 PM
to mozilla-la...@googlegroups.com, Alexandre Poirot
See my comments below:

On Wednesday, 2012-03-28 at 15:51 , Alexandre poirot wrote:

It is not really clear to me how this will work in mozilla central, where we already have some file layout and distinct components already in place.
For example, if you take this bug: https://bugzilla.mozilla.org/show_bug.cgi?id=708984
A module is moved from devtools to toolkit.
Toolkit would most likely contains such "standard library":
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/toolkit/content
But devtools components would expect to have another standard library, that would be at a very different place:
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/browser/devtools/shared/

Yes, I have realized that "standard library" is incorrect naming and was planning to call modules that start with '@' system modules: ones that come with the system. My assumption is that we will have one folder, similar to resource:///modules, where all
CommonJS-type modules will be placed. Also, each project will be able to claim its own subfolder there, similar to:
resource:///modules/devtools/
resource:///modules/sessionstore/

Mozilla toolkit is an interesting question that may indeed be worth asking; maybe the mapping can be done in a slightly different manner, like:

@devtools/scratchpad -> resource://devtools/scratchpad.js
@sessionstore/xpath-generator -> resource://sessionstore/xpath-generator.js
@toolkit/promise ->  resource://toolkit/promise.js

Either way, I think that would be a solvable problem even after the switch to packageless or landing in Firefox. Discussing the exact
mapping with the Firefox team was on my list anyway, so I'll make sure to talk about this as well.


Would devtools contains some extra configuration somehow to register this devtools-specific standard library? Or is toolkit, going to be the only one standard library folder?

Again, this confusion is probably caused by my poor naming; that should be "system modules". Devtools modules will be part of the system modules as long as they are shipped with the system.
 

Otherwise, is there an error here? Or is this expected that @tabs maps to /sdk/tabs.js? Shouldn't it be just /tabs.js?
require('@tabs')                 // require('resource://modules/sdk/tabs.js')
require('@sdk/window/events')    // require('resource://modules/sdk/window/events.js')

No, that's intentional. As you can see above, all the mapping is straightforward, as you have group / path pairs (devtools: scratchpad, sessionstore: xpath-generator, toolkit: promise); the only exception is single-term modules, which are normalized to
carry a default group, presumably sdk, so @sdk/tabs is the exact equivalent of @tabs. But if you need to load a module like the
current `window/utils`, there will be no shortcut and you will have to type `@sdk/window/utils`. This makes high-level
sdk modules easier to type and distinguishable from low-level modules.
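
The normalization described above can be sketched as follows (illustrative only; the resource://modules base is taken from the mapping examples earlier in this message):

```javascript
// Single-term IDs get the default 'sdk' group, so '@tabs' and
// '@sdk/tabs' resolve to the same module.
function normalize(id) {
  const path = id.slice(1); // drop the leading '@'
  return path.includes('/') ? path : 'sdk/' + path;
}

function toResourceUrl(id) {
  return 'resource://modules/' + normalize(id) + '.js';
}
```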
 
Standard library feature seems to require some "package" configuration and it introduce some magic on how the search works. I'd prefer explicit absolute path. Like require("addon-kit/tabs"), require("toolkit/promise"), and so on.
It looks like it would simplify the whole thing to make no differences between external dependencies and standard library.

I hope the answers above make the normalization logic clear; it does not involve any kind of search. Also, yes, everyone will have
to type @toolkit/promise; the short form will only be available for high-level sdk modules, as they are aimed at entry-level devs. In addition, it's an easier sell to switch from panel to @panel than to @sdk/panel. If you still think such normalization is too much, please let me know.

Finally, what about existing packages?
 
There will be no packages; there will just be a lib, similar to nodejs. Also, that's only until we move those things into Firefox.

Are we going to keep supporting them?
No, not as packages, but we will keep backwards compatibility by having modules like lib/addon-kit/panel.js that look like:

console.warn('require("panel") form is deprecated please use require("@panel") instead');
module.exports = require('@panel');

We still agree that we need at least one "package" per addon ?
 
We will just call that an add-on, and we will keep package.json, which carries an unfortunate name :(

As we need a package.json for addon name, description, ... Are we going to only support this special root/addon packages and drop all others?

Not sure I get this one, but hope other answers make it clear.
 
It would mean that all addons using dependencies as package would break.

I'm afraid so, but that feature was kind of broken anyway. On the other hand, there is a pretty simple fix for them: just put things into the @modules folder and fix the require paths.
 
External dependencies look like a package, except that cfx will ignore package.json and just search for CommonJS modules.

Yes, and that's intentional; it's basically what `node_modules` is, and it leaves room for people who want to use package managers to deal with such dependencies.
 
What about data files?

The data folder of an add-on will be included in the XPI. Since there will be no other packages, there will be no other data folders. If external dependencies wish to bundle some assets, they will be able to use require('asset!./icon.png') from their modules, which would return a URL to that icon.png, bundled with the rest of the add-on.
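To make the idea concrete, here is a hypothetical sketch of how an 'asset!' require could resolve a relative asset path against the requiring module's URL. The function name, the resource:// layout, and the @modules folder are assumptions drawn from this thread, not the actual cfx implementation:

```javascript
// Hypothetical sketch: resolve an 'asset!' path relative to the
// requiring module's URL. Layout and names are assumptions.
function resolveAsset(requirerURL, assetPath) {
  // Drop the module filename, keep its directory.
  var base = requirerURL.slice(0, requirerURL.lastIndexOf('/') + 1);
  // Strip a leading './' so './icon.png' joins cleanly onto the base.
  return base + assetPath.replace(/^\.\//, '');
}
```

So a dependency module at resource://addon/@modules/menu/main.js asking for require('asset!./icon.png') would get back a URL pointing at icon.png next to it inside the XPI.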
 
I'm pretty sure you wrote something about them, around the metadata proposal.

In fact it is mentioned in the proposal.
 
It sounds almost like a blocker to me, as you won't be able to use localization modules without such a feature (support for data files and the like).


I hope the solution above solves the issues with localization; if not, we should definitely talk!
 
That's a lot of questions, feel free to pick most relevant ones!

Irakli Gozalishvili

Mar 29, 2012, 2:43:27 PM3/29/12
to Alexandre poirot, mozilla-la...@googlegroups.com

On Thursday, 2012-03-29 at 11:06 , Alexandre poirot wrote:

Thanks for this long reply. It clarifies tons of questions I had in mind!!

 
Sure thing, that's why this thread was started :)
 
I'm still wondering what is the difference between:
  require("external-dependency/sub-module")
and
  require("@internal-dependency/sub-module")
?

It looks like there is a substantial difference between the two, but it is not clear why we need a difference.


Yes, there are several motivations for that:

1. First of all, @ signifies that the implementation is provided by the runtime, and in fact it may not have any JS code behind it.
2. If system and external dependencies shared the same namespace, there would be room for surprises. What I mean is, in case of a conflict, either the system or the external dependency has to win. Both cases are problematic: if we say the external dependency wins, then installing dependency A may pull down another dependency that shadows system modules and causes surprises, or makes it impossible to require a system module. If we say the system module wins, there is a different issue, since the platform may change in the future and introduce a new foo that would change the behavior of the deployed add-on.
3. cfx has no way of knowing about modules shipped with a platform, so for example if you require('@devtools/scratchpad') we assume that your target runtime has resource://devtools/scratchpad.js or something, which means that the @ namespace is unbounded. Merging that namespace with external dependencies makes them unbounded as well, and we won't be able to complain about a require('foo/bar') that was not discovered in @modules, as you might have meant one provided by the platform, whose existence cannot be verified at build time.

While this approach solves all these issues, it comes with a trade-off. Sometimes you want to use a new panel that will be shipped at some point in the future, but you don't want to rewrite the code using that panel when you update. This case becomes a bit more complicated but is still possible: all you will have to do is put a panel.js in @modules that looks like

var panel = require('@panel')
module.exports = isNewPanelAPI(panel) ? panel : require('./panel-new')
 
and use require('panel') in the rest of the code base.
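The feature check in that shim might be sketched like this; isNewPanelAPI and the assumed shape of the panel module are illustrations, not part of the proposal:

```javascript
// Hypothetical sketch of the isNewPanelAPI check used by the shim above.
// It feature-detects the API shape instead of comparing version numbers,
// so the shim automatically prefers the platform panel once it ships
// the new API. The assumed shape (a Panel constructor export) is made up
// for illustration.
function isNewPanelAPI(panel) {
  return Boolean(panel && typeof panel.Panel === 'function');
}
```

Feature detection keeps the fallback logic working across platform updates without the shim ever needing a version table.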
About localization, we might switch to packageless without metadata or data file support, but it would mean that we won't be able to use require(self).data or localization in dependencies. I don't think we can solve localization with require('asset!...'), but we could easily make it work with some metadata specifying either the supported languages or a path to each properties file. In the end, it is quite similar to data file support.


2012/3/29 Irakli Gozalishvili <rfo...@gmail.com>

Dave Townsend

Mar 30, 2012, 1:16:01 PM3/30/12
to mozilla-la...@googlegroups.com, Alexandre Poirot
On Thursday, 29 March 2012 09:06:27 UTC-7, gozala wrote:
See my comments below:

On Wednesday, 2012-03-28 at 15:51 , Alexandre poirot wrote:

It is not really clear to me how this will work in mozilla-central, where we already have a file layout and distinct components in place.
For example, if you take this bug: https://bugzilla.mozilla.org/show_bug.cgi?id=708984
A module is moved from devtools to toolkit.
Toolkit would most likely contain such a "standard library":
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/toolkit/content
But devtools components would expect to have another standard library, that would be at a very different place:
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/browser/devtools/shared/

Yes, I have realized that "standard library" is incorrect naming; I was planning to call modules that start with '@' system modules, ones that come with the system. My assumption is that we will have one folder, similar to resource:///modules, where all CommonJS-style modules will be placed. Also, each project will be able to claim its own subfolder there, similar to:

resource:///modules/devtools/
resource:///modules/sessionstore/

etc…

The Mozilla toolkit is an interesting question that may indeed be worth asking; maybe the mapping can be done in a slightly different manner, like:
@devtools/scratchpad -> resource://devtools/scratchpad.js
@sessionstore/xpath-generator -> resource://sessionstore/xpath-generator.js
@toolkit/promise ->  resource://toolkit/promise.js

Either way, I think that would be a solvable problem, even after the switch to packageless or after landing in Firefox. Discussing the exact
mapping with the Firefox team was on my list anyway, so I'll make sure to talk about this as well.

It's totally possible for devtools, sessionstore, etc. to each register their own resource mapping like that; I'm not sure it is worth it, though. Instead I'd expect us to register resource://commonjs/ to point somewhere and then have devtools etc. just put subdirectories under wherever that place is. Easy enough (though potentially problematic if, say, an extension wanted to provide some new built-in modules).