It might be useful to expand on the rationale section and to include a
section that describes how the changes impact interoperability with
other CommonJS-ish environments and why the changes are okay. Right
now, it seems like you are trying to shoe-horn all of the meta-data
about dependencies into the require() directive in order to avoid use of
package.json at the cost of CommonJS-ish compatibility. It's not clear
why this is a win.
Some specific thoughts:
I like name-spacing the jetpack pages under "addon-kit" and "api-utils";
it always struck me as dumb that everything by default sits at the
top-level. But the "@" prefix seems unnecessary because it's going to
make node.js and similar unhappy for no good reason.
The version specification seems too flexible and not flexible enough.
Why would I want to specify the example you provide of "panel;1.5"? Am
I manually going to need to find out what version I'm writing the code
against and put that in there and update it manually? Is Firefox really
going to include 1.5, 1.6, 1.7, 1.8, ...? I could see wanting to be able
to evolve the API interface, but why not just create a "panel2.js" when
the time comes if there are backwards-incompatible changes?
I use RequireJS for a lot of my web JS logic, so the syntax for assets
seems fine to me. The example has the "asset" helper return a URI which
seems like a good idea for that data type. RequireJS also has, for
example, a "text" loader plugin that instead will return the actual
contents of the text file. (This is especially nice for web delivery
where you can use the r.js optimizer to include the text file in the
optimized .js file.) Would you support something like the text loader
plugin, or is the goal really just to have the require statements there
for dependency-specifying side-effects?
Andrew
That sounds
1) dangerous, because you can't know what will be on that URL tomorrow.
Server hacked, game over. At the minimum, you'd need a SHA hash of the
file content included in the require, next to the URL.
2) unreliable, because I start to depend not only on one download
server, but several, and most of them not under my control.
If this is about sharing resources, I'd rather suggest keeping the
current mechanism of shipping the modules with each addon, and then
doing a SHA hash on the source at runtime; if the same file (same SHA)
is already loaded, use that.
Ben
On second reading, it seems like you don't. In that case, I would
recommend to drop the "http" from the unique identifier and just use
"example.com/foo", to remove confusion and not oblige me to commit to a
certain URL on my webserver forever.
On 02/27/2012 01:55 PM, Irakli Gozalishvili wrote:
I wrote a proposal on how we can improve the module sharing story for SDK add-on devs:
On Monday, 2012-02-27 at 16:06 , Ben Bucksch wrote:
On 28.02.2012 01:01, Ben Bucksch wrote:
Are you proposing that modules are automatically fetched from the web based on a require("http...")?
I like that idea. The require() should remain unchanged and all this
information should be inside the module. This seems clean and
straightforward to me.
With the changed require() syntax, it is difficult to comprehend how it
works and what it causes, and it doesn't allow giving all the
information. Moreover, the URL of a module can change and shouldn't be
in every file that uses the module, but only in the module itself. This
will be very important, as URLs are bound to change, often without a
redirect being possible.
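One way to read this suggestion (module shape and field names entirely hypothetical): keep the require() id plain and let the module itself carry its identity and origin, so the URL lives in exactly one place.

```javascript
// Hypothetical: the module itself declares its canonical identity, so
// the URL appears in one place and callers keep using a plain id.
// Every field name here is made up for illustration.
const metadata = {
  id: "example.com/foo",              // stable unique identifier, no scheme
  url: "https://example.com/foo.js",  // may change without touching callers
  version: "1.2.0",
};

// The public API stays an ordinary export surface.
function greet(name) {
  return "hello " + name;
}
```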
> Packages require repositories for publishing them
I don't think that we need repositories because we have packages. I
think we (might decide we) want repositories in order to provide places
where people can:
* find modules providing certain APIs (...that are compatible with
particular applications and application versions)
* be given help choosing a particular module, if there are several
options (by looking at download statistics, for instance)
* have some assurance that the module is not evil, and that it does what
it claims to (by having a review system, for instance)
* browse the documentation for the module
We might or might not think this is a good idea (I do), but it seems to
me to be independent of whether we have packages.
Will
In addition, that's the way we expect modules on the web to look once we get harmony modules. In other words, you don't necessarily have to use URL forms for dependencies; you could just put whatever you need into the `@modules` folder, and tools like "volo" would be helpful for doing this. Still, we would like to support very simple cases where you just want to use a module from a URL, similar to script tags or harmony modules.

Jetpack seems to need to install modules first before using them because it wants some security guarantees that they do not change over time/after the addon installation. Although maybe I'm going on old information. It is just a "fetch once from the network during dev bootstrap, then use it". That seems like a dependency install tool, just a built-in one.
I can see the case for wanting to specify a full, regular URL, but the syntax in that JEP proposal is not that. I would rather see it just be regular URLs or module IDs, but strongly encourage module IDs to get the benefits of the shorter "interface" names they imply. Those install hints are not needed once runtime execution begins, and those package.json comments could even be stripped out, if that made sense.
So if volo will just organize dependencies in a @modules folder, it will just work fine, or do I miss something?

volo will install the dependencies in the current working directory, unless there is a scripts or js directory in the working directory that is available. This brings up something else: I'm also unclear on what @${name} signifies. It just seems like a module namespace name, like a "package" name. Not sure why @ is needed for that.
require('@tabs') // require('resource://modules/sdk/tabs.js')
require('@sdk/window/events') // require('resource://modules/sdk/window/events.js')
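One reading of the mapping in these two examples, as a sketch (the rule is inferred from the examples, and a later message questions whether the sdk/ prefix is even intended): bare '@' names resolve under sdk/, while ids that already contain a '/' are used as-is.

```javascript
// Hypothetical resolver matching the two examples above: '@' module ids
// are looked up under resource://modules/, with bare names (no "/")
// assumed to live in the sdk/ subfolder.
function resolveSystemId(id) {
  if (!id.startsWith("@")) {
    throw new Error("not a system module id: " + id);
  }
  const name = id.slice(1);
  const modulePath = name.includes("/") ? name : "sdk/" + name;
  return "resource://modules/" + modulePath + ".js";
}
```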
Standard library feature seems to require some "package" configuration, and it introduces some magic in how the search works. I'd prefer explicit absolute paths, like require("addon-kit/tabs"), require("toolkit/promise"), and so on.

On Wednesday, 2012-03-28 at 15:51 , Alexandre poirot wrote:
It is not really clear to me how this will work in mozilla central, where we already have some file layout and distinct components already in place.
For example, if you take this bug: https://bugzilla.mozilla.org/show_bug.cgi?id=708984
A module is moved from devtools to toolkit.
Toolkit would most likely contain such a "standard library":
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/toolkit/content
But devtools components would expect to have another standard library, that would be at a very different place:
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/browser/devtools/shared/
Would devtools contain some extra configuration somehow to register this devtools-specific standard library? Or is toolkit going to be the only standard library folder?
Otherwise, is there an error here? Or is this expected that @tabs maps to /sdk/tabs.js? Shouldn't it be just /tabs.js?
require('@tabs') // require('resource://modules/sdk/tabs.js')
require('@sdk/window/events') // require('resource://modules/sdk/window/events.js')
Standard library feature seems to require some "package" configuration, and it introduces some magic in how the search works. I'd prefer explicit absolute paths, like require("addon-kit/tabs"), require("toolkit/promise"), and so on.
It looks like it would simplify the whole thing to make no difference between external dependencies and the standard library.
Finally, what about existing packages?
Are we going to keep supporting them?
Do we still agree that we need at least one "package" per addon?
As we need a package.json for the addon name, description, etc., are we going to only support these special root/addon packages and drop all others?
It would mean that all addons using dependencies as packages would break.
External dependencies look like a package, except that cfx will ignore package.json and just search for CommonJS modules.
What about data files?
I'm pretty sure you wrote something about them, around the metadata proposal.
It sounds almost like a blocker to me, as you won't be able to use localization modules without such a feature (support of data files and the like).
That's a lot of questions, feel free to pick most relevant ones!
On Thursday, 2012-03-29 at 11:06 , Alexandre poirot wrote:
Thanks for this long reply. It clarifies tons of questions I had in mind!
I'm still wondering what is the difference between:
require("external-dependency/sub-module")
and
require("@internal-dependency/sub-module")
?
It looks like there is a substantial difference between both, but it is not clear why we need a difference.
About localization, we might switch to packageless without metadata or data file support, but it will mean that we won't be able to use require(self).data or localization in dependencies. I don't think we can solve localization with require(asset...), but we can easily make it work with some metadata specifying either supported languages or the path to each properties file. In the end, it is quite similar to data file support.
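Such per-module metadata might look like the following sketch (the shape is entirely made up; only the idea of declaring supported languages and property-file paths comes from the thread):

```javascript
// Hypothetical localization metadata a module could carry, so a loader
// can find the right .properties file without a data folder.
const l10nMetadata = {
  locales: ["en-US", "fr", "ja"],
  // Path to the properties file for a given locale.
  properties: function (locale) {
    return "locale/" + locale + ".properties";
  },
};

// Pick the requested locale if supported, else fall back to the first
// declared locale.
function pickLocale(requested, supported) {
  return supported.includes(requested) ? requested : supported[0];
}
```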
2012/3/29 Irakli Gozalishvili <rfo...@gmail.com>
See my comments below:

On Wednesday, 2012-03-28 at 15:51 , Alexandre poirot wrote:
It is not really clear to me how this will work in mozilla central, where we already have some file layout and distinct components already in place.
For example, if you take this bug: https://bugzilla.mozilla.org/show_bug.cgi?id=708984
A module is moved from devtools to toolkit.
Toolkit would most likely contain such a "standard library":
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/toolkit/content
But devtools components would expect to have another standard library, that would be at a very different place:
https://hg.mozilla.org/mozilla-central/file/5727a8f457c1/browser/devtools/shared/

Yes, I have realized that "standard library" is incorrect naming, and I was planning to call modules that start with '@' "system modules", ones that come with the system. My assumption is that we will have one folder, similar to resource:///modules, where all CommonJS-type modules will be placed. Also, each project will be able to claim its own subfolder there, similar to:
resource:///modules/devtools/
resource:///modules/sessionstore/
etc…

Mozilla toolkit is an interesting question that may indeed be worth asking. Maybe the mapping can be done in a slightly different manner, like:
@devtools/scratchpad -> resource://devtools/scratchpad.js
@sessionstore/xpath-generator -> resource://sessionstore/xpath-generator.js
@toolkit/promise -> resource://toolkit/promise.js

Either way, I think that would be a solvable problem even after the switch to packageless or landing in Firefox. Discussing the exact mapping with the Firefox team was on my list either way, so I'll make sure to talk about this as well.
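The alternative mapping sketched in this message can be expressed as a tiny resolver (hypothetical; it just restates the three arrows as code, with the first segment of the id becoming its own resource:// host):

```javascript
// Hypothetical resolver for the alternative mapping: the first segment
// of a '@' system-module id becomes the resource:// host, and the rest
// becomes the path.
function resolveProjectId(id) {
  const segments = id.slice(1).split("/");
  const project = segments.shift();
  return "resource://" + project + "/" + segments.join("/") + ".js";
}
```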