New CommonJS Modules/2 (draft) loader (BravoJS) with package & mappings support


Christoph Dorn

Feb 4, 2011, 11:43:25 PM2/4/11
to comm...@googlegroups.com
Hi all,

So I took Wes' BravoJS [1] CommonJS Modules/2 (draft) [2] loader and
added package and mappings support [3].

This is the third time I have added package and mappings support to a
loader, and while it was not trivial, I am pretty happy with the result.

The main take-away is that the sooner packages and mappings are added to
a loader the better, as they have consequences for many areas. I would
encourage anyone maintaining a loader to add package and mappings
support as soon as possible, before pushing adoption, as it will help the
CommonJS effort when it comes to interoperable programs and make things
easier for you (vs. adding it after you have optimized your loader to death).

The implementation now [4] is about as simple as it gets, I think
(without optimizations), and there is a comprehensive demo available here
[5] which can easily be adapted to test the package and mappings support
of other loaders.

I would love to get to a point where we can run the demo [5] on multiple
loaders without modification. Any takers?

Instead of outlining here how building CommonJS programs out of packages
work, I would rather refer everyone to the code in the demo [5]. I have
added many comments to explain things. If you have further questions,
please do not hesitate to ask. Make sure you look at all the package
directories, as the package descriptors and modules are all slightly
different in order to exercise various features.

Now, the purpose of this exercise was to see CommonJS Modules/2 (draft),
packages and mappings work together, in order to further validate the
spec and assess the impact on loader implementations. Below I am posting
a bunch of notes as a starting point for discussions. My focus was on
just getting this working vs. working perfectly in harmony with the
spec, although I made efforts in this regard as far as I could interpret
and apply the spec.

Notes follow:

Links
-----

* http://code.google.com/p/bravojs/
* http://www.page.ca/~wes/CommonJS/modules-2.0-draft8/
* http://code.tolsma.net/blog/commonjs/
* http://wiki.commonjs.org/wiki/Packages/1.1
* http://wiki.commonjs.org/wiki/Packages/Mappings/C

Overall
-------

* Nice and mostly complete implementation that is relatively easy to
follow.
* Adding package & mappings support required non-trivial changes in
more than a few places.
* Some changes could be baked into the architecture of the loader
better, to maybe simplify and speed things up.
* Module provider plugins via module.constructor.prototype are less
useful as only one plugin will work at a time.
Using hooks to trigger plugins may be more useful in various cases.
* Package & mappings support should be recommended as a default
feature for all CommonJS loaders to encourage
interoperability of non-trivial module sets combined via packages
into programs.

Questions
---------

* When are module.eventually() callbacks called and for what purpose?
* What is the difference now between the following. Are they all
valid by default? Should they be?
module.declare([], function(require, exports, module) {}) vs
module.declare(function(require, exports, module) {}) vs
No module.declare() i.e. Modules 1.1.1

Limitations
-----------

* Only catalog-based (via plugin) and relative location path-based
package mappings are supported at this time.

Changes, Additions & Comments
-----------------------------

* All internal top-level module paths for packaged modules follow
<packageID>!/<modulePath> where '!/' is used to signify the package root.
* Expanded scope of module.load(s, f) to allow a mappings object for 's'.
* Expanded scope of module.declare([], f) to allow labelled mapping
objects for '[]'.
* Changed module.load(id/mapping, function(id) { ... }) to return the
canonical ID of the loaded module that can be used with require(id).
* Added chained plugin system to service
resolvePackageMapping(packageMapping) which must return a top-level
package ID if it can resolve.
* Added module.pkgId which is set to the ID of the containing package
for a module if the module is part of a package.
* Added module.mappings which is set to a resolved map where values
are top-level package IDs for a module if the module is part of a
package with mappings.
* Map the package UID as a valid package ID (in addition to the path-based
package ID) if the package descriptor has the "uid" property set.
* Ability to resolve modules by <packageUID>!/<modulePath> if the
package descriptor has the "uid" property set.
e.g. require("http://registry/hostname/path/package1/!/lib/main")
This is not ideal as one must know the *<modulePath>* in this case
'lib/main'. A better solution may be:
require("http://registry/hostname/path/package1/!/").resolve("main");
Where '!/' is used to signify that we want to load a special object
that can resolve IDs for the specified package ID.
* Relative dependency IDs for module.declare() were resolved based on
bravojs.mainModuleDir whereas they should be resolved based on the path
of the module.
* The spec could use more wording as to whether functions accept
relative and/or absolute module identifiers only.
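
To illustrate the `<packageID>!/<modulePath>` convention from the list above, here is a minimal, hypothetical sketch (my own helper, not BravoJS code) of how a loader might split such a top-level ID into its two parts:

```javascript
// Hypothetical helper: split a top-level packaged-module ID of the form
// "<packageID>!/<modulePath>". "!/" signifies the package root, so
// everything before it identifies the package and everything after it
// is the module path relative to that root.
function splitPackagedId(id) {
    var sep = id.indexOf("!/");
    if (sep === -1)
        return null; // not a packaged-module ID
    return {
        packageId: id.slice(0, sep),   // includes any trailing "/"
        modulePath: id.slice(sep + 2)
    };
}

// Example with a UID-based package ID:
var parts = splitPackagedId("http://registry/hostname/path/package1/!/lib/main");
// parts.packageId  is "http://registry/hostname/path/package1/"
// parts.modulePath is "lib/main"
```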

TODO
----

* Loader plugins? See: http://wiki.commonjs.org/wiki/Modules/LoaderPlugin
* Sandboxes to allow for arbitrary sub-program scopes within programs
* nodejs support to provide module environment on server (like
nodules: https://github.com/kriszyp/nodules)


The TODO is what I would like to see but not necessarily all as part of
BravoJS depending on what the focus of BravoJS will be. I will be
wrapping BravoJS here [6] to come up with a versatile and complete
loader that will eventually run out of the box on all popular CommonJS
implementations [7].

In summary, I like the Modules/2 (draft) API, and having it all working
beats having perfect names for everything :) ... although I am sure I
will have more to say on this once I start using it more.

Let me know what you think. If you have any questions about the code
specifically it may be easier to use github's code review features
instead of reposting code here out of context.

Enjoy!

Christoph

[1] - http://code.google.com/p/bravojs/
[2] - http://www.page.ca/~wes/CommonJS/modules-2.0-draft8/
[3] -
https://github.com/pinf/loader-js/commit/1d6b9c25c3b186e1ba9b819171487af6ce740a0f
[4] - https://github.com/pinf/loader-js/tree/master/lib/bravojs
[5] -
https://github.com/pinf/loader-js/tree/master/lib/bravojs/demos/mappings
[6] - https://github.com/pinf/loader-js
[7] - http://wiki.commonjs.org/wiki/Implementations

Wes Garland

Feb 5, 2011, 8:00:34 AM2/5/11
to comm...@googlegroups.com
Christoph,

Thanks very much for this work.  Once I'm back 100% I'll be looking at your changes very carefully.


On Fri, Feb 4, 2011 at 11:43 PM, Christoph Dorn <christ...@christophdorn.com> wrote:
  * Module provider plugins via module.constructor.prototype are less useful as only one plugin will work at a time.
   Using hooks to trigger plugins may be more useful in various cases.

This is a valid observation that has bothered me a little in the past, but namespacing concerns have overridden the slight clumsiness of making interoperable plug-ins.  (Note that many plug-ins simply can't be interoperable by design, but of course many can be!)

The technique I'm thinking that would be useful for interoperable plug-ins is probably something like

var tmp = module.constructor.prototype.declare;
module.constructor.prototype.declare = function mcp_declare(deps, factory)
{
  do_stuff();
  mcp_declare.oldDeclare.call(this, deps, factory); /* preserve `this` (the module) */
};
module.constructor.prototype.declare.oldDeclare = tmp;


* Package & mappings support should be recommended as a default feature for all CommonJS loaders to encourage
   interoperability of non-trivial module sets combined via packages into programs.

Definitely, I think the features required to implement them should go into all platforms.  Ideally, a plug-in like the one you've written for BravoJS would be part of a test suite. The test suite we have now covers only /1.1.1 compatibility.
 

 * When are module.eventually() callbacks called and for what purpose?

They are triggered sometime later - for browser environments, this means "not the current pass of the event loop".  So, you can invoke the callback from, say, the response portion of an XHR request, a script onload handler, or setTimeout(callback, 0).

In server environments without event loops, they are invoked when the main program ends; details in /2.0-draft8 section 4.7.

In server environments with event loops, they are invoked the same way as in the browser, except using whatever mechanism the server environment provides (i.e. we cannot assume the DOM exists).

The sole reason for module.eventually() is to allow us to write callbacks that behave the same on the browser and server, even if the server does not implement an event loop environment.

(If anybody following this understands and can explain better than me, please do!).

Key:

module.eventually(function() { print("world"); });
print("hello");

output is "hello world", not "world hello".
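
As a rough illustration of that behaviour (my own sketch, not the BravoJS implementation), a platform without a native event loop could back module.eventually() with a queue that the environment flushes at "program end":

```javascript
// Sketch of a queue-based module.eventually() for an environment
// without a native event loop: callbacks are queued rather than run
// immediately, and the environment flushes the queue after the main
// module's factory returns.
var eventuallyQueue = [];

function eventually(callback) {
    eventuallyQueue.push(callback);
}

function flushEventually() {
    // run queued callbacks in order; a callback may queue more work
    while (eventuallyQueue.length > 0)
        eventuallyQueue.shift()();
}

// Mirrors the "hello world" example above:
var out = [];
eventually(function() { out.push("world"); });
out.push("hello");
flushEventually();
// out is now ["hello", "world"]
```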


 * What is the difference now between the following. Are they all valid by default? Should they be?
     module.declare([], function(require, exports, module) {}) vs

This is an explicit declaration of a module with no dependencies (or explicit dependencies, if you fill in the list).
 
     module.declare(function(require, exports, module) {}) vs

In Modules/2.0-draft8, this is a declaration of a module whose dependencies may be inferred by the CommonJS environment any way it sees fit. BravoJS currently chooses to never find any dependencies, which meets the letter of the spec but is perhaps a little hostile. ;)

The jquery plug-in demonstrates how to use require-scraping to find dependencies for modules; I think it might only be enabled for Modules/1.1.1 modules, though.
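
A require-scraper along those lines (a naive sketch of the general technique, not the jquery plug-in's actual code) typically stringifies the factory and greps it for require calls with string-literal arguments:

```javascript
// Naive require-scraping sketch: extract static require("...") calls
// from a factory function's source text. A real scraper must also cope
// with comments and strings that merely contain the word "require".
function scrapeDependencies(factory) {
    var src = factory.toString();
    var re = /require\s*\(\s*['"]([^'"]+)['"]\s*\)/g;
    var deps = [], match;
    while ((match = re.exec(src)) !== null) {
        if (deps.indexOf(match[1]) === -1) // de-duplicate
            deps.push(match[1]);
    }
    return deps;
}

var deps = scrapeDependencies(function(require, exports, module) {
    var fs = require("fs");
    var util = require("./util");
});
// deps is ["fs", "./util"]
```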

Kris Kowal has suggested describing dependency-finding more formally than "any way it sees fit".  Perhaps that should be added to draft9?  If yes, does it conflict with Modules/1.1.1 semantics in edge cases?
 
     No module.declare() i.e. Modules 1.1.1

While /2.0 does not *require* the ability to load Modules/1.1.1 modules, it is demonstrably possible to do so interoperably.  The BravoJS jquery plug-in does this: we load the module, notice that it does not contain a module.declare statement, wrap it, and evaluate it as though it were a module of the form described above.
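
The wrapping step can be pictured like this (a hypothetical sketch of the general technique, not the jquery plug-in itself): if fetched source contains no module.declare, wrap it in one before evaluating it:

```javascript
// Sketch: wrap raw Modules/1.1.1 source text as a Modules/2.0 module.
// A loader that fetched the source as text (e.g. via XHR) could then
// evaluate the wrapped result. The substring check stands in for real
// detection, which would need to be parse-aware.
function wrapLegacyModule(sourceText) {
    if (sourceText.indexOf("module.declare") !== -1)
        return sourceText; // already a /2.0 module; leave it alone
    return "module.declare([], function(require, exports, module) {\n" +
           sourceText + "\n});";
}

var wrapped = wrapLegacyModule("exports.answer = 42;");
// wrapped starts with "module.declare([], function(require, exports, module) {"
```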
 

What do you think about dropping the "http:" prefix like BravoJS does for module paths?  This allows us to inherit window.location.protocol, and avoid mixing secure and insecure content on pages.
 
The TODO is what I would like to see but not necessarily all as part of BravoJS depending on what the focus of BravoJS will be. I will be wrapping BravoJS here [6] to come up with a versatile and complete loader that will eventually run out of the box on all popular CommonJS implementations [7].

BravoJS's prime focus is to provide a reference implementation for the CommonJS environment and Modules/2.0, while performing reasonably well.  To this end, I will also be shipping it with tests, and a test running environment, and (hopefully many) plugins which are known to work.

Wes

--
Wesley W. Garland
Director, Product Development
PageMail, Inc.
+1 613 542 2787 x 102

Christoph Dorn

Feb 5, 2011, 2:31:01 PM2/5/11
to comm...@googlegroups.com
On 11-02-05 5:00 AM, Wes Garland wrote:
> On Fri, Feb 4, 2011 at 11:43 PM, Christoph Dorn
> <christ...@christophdorn.com <mailto:christ...@christophdorn.com>>

> wrote:
>
> * Module provider plugins via module.constructor.prototype are
> less useful as only one plugin will work at a time.
> Using hooks to trigger plugins may be more useful in various cases.
>
> This is a valid observation that has bothered me a little in the past,
> but namespacing concerns have overridden the slight clumsiness of making
> interoperable plug-ins. (Note that many plug-ins simply can't be
> interoperable by design, but of course many can be!)
>
> The technique I'm thinking that would be useful for interoperable
> plug-ins is probably something like
>
> var tmp = module.constructor.prototype.declare;
> module.constructor.prototype.declare = function mcp_declare(deps, factory)
> {
> do_stuff();
> mcp_declare.oldDeclare(deps, factory);
> }
> module.constructor.prototype.declare.oldDeclare = tmp;

This will work for some things but other extension points will need a
plugin system more like I added I think. I'll be porting my apps to work
with BravoJS now that packages and mappings are working and will know
which other extension points are needed more specifically soon.


> * Package & mappings support should be recommended as a default
> feature for all CommonJS loaders to encourage
> interoperability of non-trivial module sets combined via
> packages into programs.
>
> Definitely, I think the features required to implement them should go
> into all platforms. Ideally, a plug-in like the one you've written for
> BravoJS would be part of a test suite. The test suite we have now covers
> only /1.1.1 compatibility.

I was not able to relocate these additions into a plugin at this time.
Maybe it can be done, but not without several more extension points in
the loader. This functionality really needs to go into the core of the
loader, as it affects ID resolution for all plugins and must be very
performant.

The demo can easily be converted to run unit tests instead of simply
printing summary results. That way we would have a test suite for
packages and mappings, which I plan on maintaining.


> * When are module.eventually() callbacks called and for what purpose?
>
> They are triggered sometime later - for browser environments, this means
> "not the current pass of the event loop". So, you can invoke the
> callback from, say, the response portion of an XHR request, script
> onload, or setTimeout(0, callback).
>
> In server environments without event loops, they are invoked when the
> main program ends, details in /2.0-draft8 section 4.7.

<snip>

> (If anybody following this understands and can explain better than me,
> please do!).

Ok, that makes more sense now. I guess the only part I am still not clear
on is what "the program end" is, exactly.


> * What is the difference now between the following. Are they all
> valid by default? Should they be?
> module.declare([], function(require, exports, module) {}) vs
>
> This is an explicit declaration of a module with no dependencies. (or,
> explicit dependencies if you fill in the list)
>
> module.declare(function(require, exports, module) {}) vs
>
> In Modules/2.0-draft8, This is a declaration of a module with
> dependencies which may be inferred by the CommonJS environment any way
> it sees fit. BravoJS currently chooses to never find any dependencies,
> which meets the letter of the spec but is perhaps a little hostile. ;)
>
> The jquery plug-in demonstrates how to use require-scraping to find
> dependencies for modules; I think it might only be enabled for
> Modules/1.1.1 modules, though.
>
> Kris Kowal has suggested a formal way to describe dependency-finding
> more formally than "any way it sees fit". Perhaps that should be added
> to draft9? If yes, does that conflict Modules/1.1.1 semantics in edge
> cases?
>
> No module.declare() i.e. Modules 1.1.1
>
> While /2.0 does not *require* the ability to load Modules/1.1.1 modules,
> it is demonstrably possible to do so interoperably. The BravoJS jquery
> plug-in does this: we load the module, notice that it does not contain a
> module.declare statement, wrap it, and evaluate it as though it were a
> module of the form described above.

So I guess the main difference is that without the wrapper, the loader
must load modules via AJAX vs. script injection. This is a decision/flag
that must be provided to the loader prior to attempting to load the
first module. Given package support, we could now add such a flag to the
package descriptor so the loader can act accordingly when it comes to
loading modules for that package.
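
For instance (a purely hypothetical descriptor field — "scriptInjection" is an invented name, nothing specified anywhere), a loader could pick its transport per package like this:

```javascript
// Sketch: choose a module transport from a hypothetical package
// descriptor flag. "scriptInjection" is an invented property name used
// only for illustration; nothing in the specs defines it.
function chooseTransport(descriptor) {
    // Modules/2.0-wrapped code can be loaded via <script> tags; raw
    // Modules/1.1.1 code must be fetched as text (XHR) and wrapped.
    return descriptor.scriptInjection === false ? "xhr" : "script";
}

chooseTransport({});                         // "script" (the default)
chooseTransport({ scriptInjection: false }); // "xhr"
```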


>
> e.g. require("http://registry/hostname/path/package1/!/lib/main")


>
> What do think about dropping the "http:/" like BravoJS does for module
> paths? This allows us to window.location.protocol, and avoid mixing
> secure and insecure content on pages.

Possible, but let me recap what this is doing before I outline our options.

The following:

require("http://registry/hostname/path/package1/!/lib/main")

makes a require with a top-level ID that follows:

require("<packageUID>!/<modulePathId>")

where <packageUID> will match a memoized package descriptor that
includes the UID property:

package.json ~ {
    "uid": "http://registry/hostname/path/package1/"
}

and <modulePathId> is the path to the module without the extension
relative to the package root.
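
In other words (a simplified sketch with made-up names, not the actual loader code), the loader can index memoized packages under both IDs so that either form of top-level ID resolves to the same exports:

```javascript
// Sketch: memoize a package under its path-based ID and, when the
// descriptor carries a "uid" property, under that UID as well, so that
// require("<packageUID>!/<modulePathId>") finds the same package.
var packagesById = {};

function memoizePackage(pathBasedId, descriptor) {
    packagesById[pathBasedId] = descriptor;
    if (descriptor.uid)
        packagesById[descriptor.uid] = descriptor;
}

function lookupPackage(packageId) {
    return packagesById[packageId] || null;
}

// "/apps/demo/package1/" is an invented path-based ID for illustration.
memoizePackage("/apps/demo/package1/", {
    uid: "http://registry/hostname/path/package1/"
});
// Both IDs now resolve to the same descriptor object.
```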

This additional format for require is important for referencing modules
from other packages (not mapped for the calling module's package) that
were mapped via various mapping descriptors in other packages, where one
might use a relative location property and another a catalog property
(resulting in different path-based top-level IDs for the packages - I
still need to add another patch to get this working in all cases and to
take package versions into account).

See here for an example [1].

In these cases we need a common denominator to be able to get the same
module exports no matter how the package was mapped and the only way
this can be done is via such a uid property in the package descriptor.

Given all this and also my dislike of the "http://" prefix for the UID
property we are arriving at the purpose of the design for packages,
mappings and catalogs I have been pushing for a while [2].

If we have a network of mirroring authoritative package registry servers
hosted for example at:

http[s]://registry.commonjs.org/
http[s]://registry.pinf.org/
http[s]://registry.github.org/

Where the registry servers follow the established namespacing rules:

http[s]://<registry>/domain.com/
* Need to verify control of domain before one can claim

http[s]://<registry>/na...@email.com/
* Need to verify control of email address before one can claim

http[s]://<registry>/<topLevelNamespaceNoPeriods>/
* Need to formally apply to authoritative registry group to claim

We can omit the "http://" prefix as well as the "<registry>" hostname as
the namespaces are mirrored identically across all participating
registry servers.

For example I have a package with a UID of:

http://registry.pinf.org/cadorn.org/github/util/

which you can access to load a package-specific catalog for all branches
of the package [3].

The registry server implementation I have will automatically put
this package into an aggregating namespace catalog at [4]:

http://registry.pinf.org/cadorn.org/github/catalog.json

Now, rather than dropping the "http://<registry>/" prefix for the "uid"
property in the package descriptor (it should always be complete so that
a package-specific catalog can be accessed from the selected root
registry server for that package), we can drop the prefix when requiring
modules for registered namespaces, contacting an authoritative registry
server by default if a top-level package ID without a "/" prefix is used
together with "!/":

require("http://registry/hostname/path/package1/!/lib/main")

becomes

require("hostname/path/package1/!/lib/main")

i.e.

require("<registeredNamespace>!/<modulePathId>")
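
Mechanically (a hypothetical sketch; "registry.example.org" is a placeholder for whichever authoritative registry server the loader is configured with), the shortened form would be expanded back to a full UID before resolution:

```javascript
// Sketch: expand a registered-namespace ID into a full UID by prefixing
// the loader's default registry. "registry.example.org" is a placeholder
// for a configured authoritative registry server.
var DEFAULT_REGISTRY = "http://registry.example.org/";

function expandRegisteredId(id) {
    // Only expand IDs that use "!/" and are neither absolute URLs nor
    // path-based top-level IDs starting with "/".
    if (id.indexOf("!/") === -1 || id.indexOf("://") !== -1 || id.charAt(0) === "/")
        return id;
    return DEFAULT_REGISTRY + id;
}

var full = expandRegisteredId("hostname/path/package1/!/lib/main");
// full is "http://registry.example.org/hostname/path/package1/!/lib/main"
```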

Does this make sense?


> The TODO is what I would like to see but not necessarily all as part
> of BravoJS depending on what the focus of BravoJS will be. I will be
> wrapping BravoJS here [6] to come up with a versatile and complete
> loader that will eventually run out of the box on all popular
> CommonJS implementations [7].
>
> BravoJS's prime focus is to provide a reference implementation for the
> CommonJS environment and Modules/2.0, while performing reasonably well.
> To this end, I will also be shipping it with tests, and a test running
> environment, and (hopefully many) plugins which are known to work.

Sounds good and along the lines I assumed. So how should we collaborate
on this? I plan on actively contributing to this loader and syncing it
with the specs. I think we should include all necessary plugin hooks
(I'll be proposing more) and arrive at an architecture that is highly
optimizable where we can maintain a raw and optimized version in parallel.

I expect a few more patches to do with the package and mappings stuff
that should land before we revisit the architecture of the loader to
maybe refactor some things. I hope to have these in next week.

Also, can we move BravoJS to github to make collaboration easier?

Christoph

[1] -
https://github.com/pinf/loader-js/blob/master/lib/bravojs/demos/mappings/package3-dir/lib-dir/package3.js#L20
[2] - https://github.com/cadorn/pinf/blob/master/docs/Design/Foundation.md
[3] - http://registry.pinf.org/cadorn.org/github/util/
[4] - http://registry.pinf.org/cadorn.org/github/catalog.json

Wes Garland

Feb 6, 2011, 10:21:07 AM2/6/11
to comm...@googlegroups.com
On Sat, Feb 5, 2011 at 2:31 PM, Christoph Dorn <christ...@christophdorn.com> wrote:
I was not able to relocate these additions into a plugin at this time. Maybe it can be done, but not without several more extension points in the loader. This functionality really needs to go into the core of the loader, as it affects ID resolution for all plugins and must be very performant.

I think id resolution can be done via labelled dependencies using only primitives in /2.0draft8, with code *something* like this untested block:

var packageMap = { "A/a": "/modules/package/a/a", "A/b": "/modules/packages/A/b", "C": "/modules/packages/C/package" };
var tmp = module.constructor.prototype.declare;
module.constructor.prototype.declare = function mcp_declare(deps, factory)
{
  deps.push(packageMap);
  mcp_declare.oldDeclare.call(this, deps, factory);
};
module.constructor.prototype.declare.oldDeclare = tmp;

Caveats:
 - The spec is unclear on whether slashes in dependency labels are allowed -- this should be adjusted
 - This technique precludes require-scraping, as it forces the presence of a dependency array.  The spec could be adjusted to fix that as well.  For example, passing a labelled dependency object could be seen as a way to pass labels without dependencies, thus kicking off a scraping pass?


     * When are module.eventually() callbacks called and for what purpose?
Ok, makes more sense now. I guess the only part I am still not clear on is what "the program end" is exactly.

When the main module's factory finishes execution -- with no event loop and no module.eventually calls in place, the JavaScript environment will normally terminate, returning the user to the UNIX shell prompt or whatever.
 

So I guess the main difference is that without the wrapper, the loader must load modules via AJAX vs script injection.

Or via some kind of a server-side dynamic wrapping.
 
This is a decision/flag that must be provided to the loader prior to attempting to load the first module. Given package support now we could add such a flag to the package descriptor so the loader can act accordingly when it comes to loading modules for that package.

Yes -- if the CommonJS Environment is able to load Modules/1.1.1 code, it needs to know ahead of time and adjust itself to make this possible.  This is an environment-specific detail; the /2.0 specification does not require that a conformant environment be able to natively execute /1.1.1 modules: only that if it is handed a properly-wrapped /1.1.1 module (now a /2.0 module), that module will have the same semantics as it would on a conformant /1.1.1 system.
 

This additional format for require is important to be able to reference modules from other packages (not mapped for the calling module's package) that where mapped via various mapping descriptors in other packages where one might use a relative location property and another a catalog property (resulting in different path-based top-level IDs for the packages - I still need to add another patch to get this working in all cases and take package versions into account).

Right -- the presence or lack of the protocol field doesn't really impact the uniqueness of the name, provided we all promise to serve up the same modules regardless of the protocol.  Since path names starting with "/" are also not reserved by Modules/anything, we can also determine whether we are dealing with a canonical resource identifier simply by looking at the first character.


Given all this and also my dislike of the "http://" prefix for the UID property we are arriving at the purpose of the design for packages, mappings and catalogs I have been pushing for a while [2].

*nod* - I think I'm mostly on board with what you're doing, but I think it's important that
1) it be completely separate from Modules/next
2) it be completely implementable using only browser DOM + Modules/next

So the exercise you're going through is critical in helping us know if Modules/2.0draft8 exposes sufficient functionality.
 

 require("http://registry/hostname/path/package1/!/lib/main")
becomes

 require("hostname/path/package1/!/lib/main")

i.e.

 require("<registeredNamespace>!/<modulePathId>")

Does this make sense?

It does, except for the require.paths problem -- going out to the internet to check a package registry for each misspelled top-level module name is a problem. Also, how do we handle collisions between registered and local packages if they can have the same syntax?

suggestion:  require("@registeredNamespace/moduleIdentifier")  ?

(what exactly does the bang do in this proposal?)

 

Sounds good and along the lines I assumed. So how should we collaborate on this? I plan on actively contributing to this loader and syncing it with the specs. I think we should include all necessary plugin hooks (I'll be proposing more) and arrive at an architecture that is highly optimizable where we can maintain a raw and optimized version in parallel.

What do you think about forking a version on google code (or wherever) where you can develop and publish ahead of spec, and push changes to the official BravoJS repository as spec drafts get published?
 
I expect a few more patches to do with the package and mappings stuff that should land before we revisit the architecture of the loader to maybe refactor some things. I hope to have these in next week.

Also, can we move BravoJS to github to make collaboration easier?

I'd rather not -- I already use the BravoJS repository as a sub-repository in other projects. That said, https://github.com/blog/439-hg-git-mercurial-plugin is probably a fine tool. The docs seem to say we can push/pull changesets at will in either direction. OTOH, if you're interested in learning hg, I'd be happy to offer a little irc guidance. :)

Wes

Sander Tolsma

Feb 8, 2011, 10:50:27 AM2/8/11
to comm...@googlegroups.com
Wes, Christoph,

I checked the code that Christoph added to BravoJS and I like most of it.
But we still need to find a way to add loader plugins to the Core CommonJS
System (I used the defined name of that layer from the framework ;-)). When I
was describing the loading process for the framework document I also
stumbled upon the same things as Christoph: not only module loading but also
loading package descriptors and querying package registries (and probably
more in the future) need loading hooks...

I want to propose the following:
Define a function in the module API where plugins can register (for
example, module.registerPlugin) which accepts a configuration object with
defined callback properties for the different load functions.
The registerPlugin function will return the Loader API as an object, so the
plugin has private access to the specific Generic Loader Functions, like the
id, uri and memoize functions needed to define modules, packages and registry
information in the Core CommonJS System.

Example code:

var loaderAPI;

// function that loads a basic module
function loadModule(id, readyCallback) {
    // code that loads the module, then calls:
    loaderAPI.memoize(id, deps, factory);
    readyCallback();
}

// function that loads a package descriptor
function loadPackage(packageURL, readyCallback) {
    // code that loads the package descriptor, then calls:
    loaderAPI.addPackage(descriptor);
    readyCallback();
}

// function that queries a registry for location information of a package
function queryRegistry(packageInfo, readyCallback) {
    // code that queries the registry, then calls:
    loaderAPI.addPackageLocation(packageName, location);
    readyCallback();
}

loaderAPI = module.registerPlugin({
    "loadModule": loadModule,
    "loadPackage": loadPackage,
    "queryRegistry": queryRegistry
});
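
One way the loader side of this could look (my sketch of the proposal above, with invented internals and a stubbed-out loader API) is a hook registry that the core consults before falling back to its default behaviour:

```javascript
// Sketch of the loader side of module.registerPlugin(): store each
// plugin's hook callbacks and let the core dispatch a hook to the first
// plugin that registered it. All internals here are invented.
var plugins = [];

function registerPlugin(config) {
    plugins.push(config);
    // Return the loader API the plugin may call back into (stubbed here).
    return {
        memoize: function(id, deps, factory) { /* ... */ },
        addPackage: function(descriptor) { /* ... */ },
        addPackageLocation: function(name, location) { /* ... */ }
    };
}

// The core calls this for each extension point; the first plugin that
// implements the named hook handles it.
function dispatchHook(name /*, args... */) {
    var args = Array.prototype.slice.call(arguments, 1);
    for (var i = 0; i < plugins.length; i++) {
        if (typeof plugins[i][name] === "function")
            return plugins[i][name].apply(null, args);
    }
    return undefined; // no plugin handled it; core uses default behaviour
}

registerPlugin({ loadModule: function(id, ready) { ready(id + " loaded"); } });

var result;
dispatchHook("loadModule", "pkg!/lib/main", function(r) { result = r; });
// result is now "pkg!/lib/main loaded"
```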

What do you two think ??

Sander

Christoph Dorn

Feb 8, 2011, 12:43:11 PM2/8/11
to comm...@googlegroups.com
On 11-02-08 7:50 AM, Sander Tolsma wrote:
> I checked the code that Christoph added to BravoJS and I like most of it.

That is a good start :). It is going to get a bit uglier, but I think we
can clean it up nicely.

I definitely think we need such a plugin system and various hooks, but
am not sure about which ones yet. I hope to have a better idea once I
finish off implementing Mappings/C in the loader and see how we can
generalize things.

I think the loader needs to stay as dumb as possible, and while I can see
a hook for "loadPackage", a hook for "queryRegistry" may be too specific
and belongs in a plugin that attaches to "resolvePackageMapping" instead.

My hope is to have a pluggable loader that boots a CommonJS program,
where the program can specify which additional loader plugins to load
for that program. That way a program can carry its micro-environment
with it, which will make it more interoperable and provide package
resolution semantics and other base features that a package author can
choose to use for his program no matter what environment it runs in.

Christoph

Sander Tolsma

Feb 8, 2011, 1:12:49 PM2/8/11
to comm...@googlegroups.com
Christoph,

Sorry, I think I wasn't clear enough... The names I used and the hooks I
defined are NOT the ones I want; they are just an example... ;-)
What I wanted to show is the flexible registering/attaching of plugins to
function hooks. In the example I used three hooks in one plugin, but it could
be that two plugins register and one uses the loadModule hook and the other
plugin the queryRegistry hook. The other flexibility is that new hooks from
the Core CommonJS System layer can be defined in specifications (by defining
other properties to use from the configuration object) without changing the
registerPlugin entry point or creating a lot of functions in, for example,
the module or require namespace... In this way we can extend the basic
CommonJS layer with lots of functionality without having a naming discussion
every time... ;-)

>
> I think the loader needs to stay as dumb as possible and while I can see
> a hook for "loadPackage", a hook for "queryRegistry" may be too specific
> and belong into a plugin that attaches to "resolvePackageMapping" instead.
>

Agreed, I don't want every plugin to attach to every hook!! See my
explanation above!!

> My hope is to have a pluggable loader that boots a CommonJS program
> where the program can specify which additional loader plugins to load
> for that program. That way a program can carry it's micro-environment
> with it which will make it more interoperable and provide package
> resolution semantics and other base features that a package author can
> choose to use for his program no matter what environment it runs in.

Ah, something that was crossing my mind a couple of weeks ago too but I
decided to first put my time in describing the general framework... ;-)

>
> Christoph
>

Sander

Christoph Dorn

Feb 8, 2011, 2:29:37 PM2/8/11
to comm...@googlegroups.com
On 11-02-06 7:21 AM, Wes Garland wrote:
> On Sat, Feb 5, 2011 at 2:31 PM, Christoph Dorn
> <christ...@christophdorn.com> wrote:
>
> I was not able to locate these additions into a plugin at this time.
> Maybe it can be done, but not without several more extension points
> for the loader. This functionality really needs to go into the core
> of the loader as it affects ID resolution for all plugins and must
> be very performant.
>
> I think id resolution can be done via labelled dependencies using only
> primitives in /2.0draft8, with code *something* like this untested block:
>
> var packageMap = { "A/a": "/modules/packages/A/a", "A/b":
> "/modules/packages/A/b", "C": "/modules/packages/C/package" };
> var oldDeclare = module.constructor.prototype.declare;
> module.constructor.prototype.declare = function mcp_declare(deps, factory)
> {
> deps.push(packageMap);
> oldDeclare.call(this, deps, factory);
> };

This will solve some cases but not all. I'll see how far I can get with
relocating things into plugins once I have the core set of features
working together to identify common hooks. I am getting close.


> Caveats:
> - Spec is unclear if slashes in dependency labels are allowed -- this
> should be adjusted

They should not be allowed and are not necessary with the current
design. I think labels need to be one term to be consistent.

If we can implement all resolution via labelled dependencies then we
could entertain changing this, but I don't think labelled dependencies
are enough. Don't know yet.


> - This technique precludes require-scraping as it forces the presence
> of a dependency array. The spec could be adjusted to fix that, as
> well. For example, passing a labelled dependency object could be seen as a
> way to pass labels without dependencies, thus kicking off a scraping pass?

The design I have now can accommodate all cases. Cannot comment on
alternatives yet.


> * When are module.eventually() callbacks called and for what
> purpose?
> Ok, makes more sense now. I guess the only part I am still not clear
> on is what "the program end" is exactly.
>
> When the main module factory finishes execution -- with no event loop
> and no module.eventually calls in place, the javascript environment will
> normally terminate, returning the user to the UNIX shell prompt or
> whatever.

Ah, ok. So if module.eventually calls are in place the "last" call would
essentially trigger the factory function of the original module.declare
if any of the dependencies were loaded async?


> So I guess the main difference is that without the wrapper, the
> loader must load modules via AJAX vs script injection.
>
> Or via some kind of a server-side dynamic wrapping.
>
> This is a decision/flag that must be provided to the loader prior to
> attempting to load the first module. Given package support now we
> could add such a flag to the package descriptor so the loader can
> act accordingly when it comes to loading modules for that package.
>
> Yes -- if the CommonJS Environment is able to load Modules/1.1.1 code,
> it needs to know ahead of time and adjust itself to make this possible.
> This is an environment-specific detail; the /2.0 specification does not
> require that a conformant environment is able to natively execute /1.1.1
> modules: only that if it is handed a properly-wrapped /1.1.1 module (now
> a /2.0 module) that it will have the semantics as it would on a
> conformant /1.1.1 system.

Ok. I am going to add something to load Modules/1.1.1 packages as an
optional feature to my loader.


> This additional format for require is important to be able to
> reference modules from other packages (not mapped for the calling
> module's package) that were mapped via various mapping descriptors
> in other packages where one might use a relative location property
> and another a catalog property (resulting in different path-based
> top-level IDs for the packages - I still need to add another patch
> to get this working in all cases and take package versions into
> account).
>
> Right -- the presence or lack of the protocol field doesn't really
> impact the uniqueness of the name, provided we all promise to serve up
> the same modules regardless of the protocol. Since path names starting
> with "/" are also not reserved by Modules/anything, we can also
> determine if we are dealing with a canonical resource identifier
> simply by looking at the first character, as well.

Right. I have removed the protocol prefix with the following assumption:

<packageUID>/!/<resourcePath>

Where:

* <packageUID> - is the uid property from package.json without http://
prefix.
* If hostname (uid property is a URL) is a known registry server it
is dropped as a prefix as well leaving the registry namespace as the
<packageUID>.

* <resourcePath> is the UNIX path to a resource in the package from
the package root (no beginning slash).
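
As a rough sketch of these rules (the registry list and function name here
are hypothetical illustrations, not the actual loader code):

```javascript
// Build a top-level ID of the form <packageUID>/!/<resourcePath>.
// knownRegistries and makeTopLevelId are assumptions for this sketch.
var knownRegistries = ["registry.example.com"];

function makeTopLevelId(uid, resourcePath) {
  var packageUID = uid.replace(/^https?:\/\//, "");  // drop the protocol prefix
  for (var i = 0; i < knownRegistries.length; i++) {
    var prefix = knownRegistries[i] + "/";
    if (packageUID.indexOf(prefix) === 0) {
      // Known registry server: drop the hostname as well, leaving the
      // registry namespace as the <packageUID>.
      packageUID = packageUID.substring(prefix.length);
      break;
    }
  }
  return packageUID + "/!/" + resourcePath;  // resourcePath has no leading slash
}
```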


> Given all this and also my dislike of the "http://" prefix for the
> UID property we are arriving at the purpose of the design for
> packages, mappings and catalogs I have been pushing for a while [2].
>
> *nod* - I think I'm mostly on board with what you're doing, but I think
> it's important that
> 1) it be completely separate from Modules/next

I agree.

> 2) it be completely implementable using only browser DOM + Modules/next

Yes. That is my goal. I have BravoJS running in the browser (as-is) and
server (via pinf/loader-js) now both loading the mappings demo. This
works out of the box as the mappings demo only uses relative path
location-based mappings.

The next step is to build a server helper that will rewrite package
mappings that use catalogs or archives to be relative path
location-based on the server and bundle modules for transport to browser
automatically. I'll get to this once I have catalog and archive based
mappings working on the server.


> So the exercise you're going through is critical in helping us know if
> Modules/2.0draft8 exposes sufficient functionality.
>
> require("http://registry/hostname/path/package1/!/lib/main")


> becomes
>
> require("hostname/path/package1/!/lib/main")
>
> i.e.
>
> require("<registeredNamespace>!/<modulePathId>")
>
> Does this make sense?
>
> It does, except for the require.paths problem -- going out to the
> internet to check a package registry for each mis-spelled top-level
> module name is a problem.

That must not happen. The ID namespace must be completely deterministic
based on the mappings parsed from the package descriptors. You cannot
reference a namespace that has not been mapped prior.

If a non path-based location mapping is found we have the following options:

1) Cross-domain request
2) Contact own server via a fetch URL, i.e. fetch?mapping={...}
3) Have server helper re-write mapping as files are lazy fetched or
bundled for the client
4) Use package via a catalog that normalizes various mappings into own
namespace by supplying archive and mappings property (catalogs can
overwrite mappings in package descriptors for packages they hold).
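
Option 2 might look something like this minimal sketch (the fetch endpoint
name and query format are assumptions for illustration only):

```javascript
// Build a same-origin fetch URL carrying the mapping descriptor, so the
// loader's own server can resolve a non path-based mapping on its behalf.
function makeFetchUrl(mapping) {
  return "fetch?mapping=" + encodeURIComponent(JSON.stringify(mapping));
}
```

The server would decode the mapping, resolve whatever archive or catalog it
points at, and return the requested module text to the client loader.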


> Also, how do we handle collisions between
> registered and local packages, if they can have the same syntax?
>
> suggestion: require("@registeredNamespace/moduleIdentifier") ?
>
> (what exactly does the bang do in this proposal?)

This is not necessary if the top-level ID namespace uses URIs with
hostnames. The distinction between registered and local packages is
irrelevant at the point of resolution as long as the packages are
already mapped.

While mapping we can look at the UID property to establish if the
package is officially registered or not.


> Sounds good and along the lines I assumed. So how should we
> collaborate on this? I plan on actively contributing to this loader
> and syncing it with the specs. I think we should include all
> necessary plugin hooks (I'll be proposing more) and arrive at an
> architecture that is highly optimizable where we can maintain a raw
> and optimized version in parallel.
>
> What do you think about forking a version on google code (or wherever)
> where you can develop and publish ahead of spec, and push changes to the
> official BravoJS repository as spec drafts get published?

Ok. The code will reside here:

https://github.com/pinf/loader-js/tree/master/lib/bravojs


> I expect a few more patches to do with the package and mappings
> stuff that should land before we revisit the architecture of the
> loader to maybe refactor some things. I hope to have these in next week.
>
> Also, can we move BravoJS to github to make collaboration easier?
>
> I'd rather not -- I already use the BravoJS repository as a
> sub-repository in other projects. That said,
> https://github.com/blog/439-hg-git-mercurial-plugin is probably a fine
> tool. The docs seem to say we can push/pull changesets at will in either
> direction. OTOH, if you're interested in learning hg, I'd be happy to
> offer a little irc guidance. :)

I would prefer to stay away from HG. I'll work on the code at the above
link and we can manually coordinate a merge when appropriate. Ideally
changes over time will be minimal.

The only problem we have right now is if you make a bunch of changes we
are not going to be able to merge easily. If anything we should look at
what I have done at the end of this week and decide how we can
merge/refactor to get a codebase we can move forward with.

Christoph

Wes Garland

unread,
Feb 9, 2011, 8:35:26 AM2/9/11
to comm...@googlegroups.com
On Tue, Feb 8, 2011 at 1:12 PM, Sander Tolsma <goo...@tolsma.net> wrote:
The other flexibility is that new hooks from
the Core CommonJS system layer can be defined in specifications (by defining
other properties to use from the configuration object) without changing the
registerPlugin entry point or creating a lot of functions in for example the
modules or require namespace... In this way we can extend the basic CommonJS
layer with lots of functionality without every time having a naming
discussion.. ;-)


You know -- it occurs to me that we could do a layered hook object that works in the same spirit as the UNIX VFS switch through prototypal inheritance.

Say we have facilities a, b, c, and d.

var facilities = { a: aNative, b: bNative, c: cNative, d: dNative };

function registerHook(facilityObject)
{
  var f = Object.create(facilities);  // new layer delegates to the previous one
  for (var k in facilityObject)
    f[k] = facilityObject[k];
  facilities = f;
}

Will "this" have the right value to let us write something like this? :

function aHook(param)
{
  if (condition)
    magic();
  else  // fall back to the previous layer's implementation
    return Object.getPrototypeOf(this).a(param);
}
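
One runnable way to answer that question, assuming an ES5 environment with
Object.create() and Object.getPrototypeOf() (everything here is a sketch,
not a spec proposal): when facilities.a() is called, `this` is bound to the
top layer, so a hook can reach the previous layer through the prototype
chain.

```javascript
// Base layer with a single native facility.
var facilities = {
  a: function (x) { return "aNative:" + x; }
};

function registerHook(facilityObject) {
  var f = Object.create(facilities);  // new layer inherits the old one
  for (var k in facilityObject)
    if (facilityObject.hasOwnProperty(k))
      f[k] = facilityObject[k];
  facilities = f;
}

registerHook({
  a: function (x) {
    if (x === "special")
      return "hooked:" + x;
    // Fall back to the previous layer. Note: with three or more layers of
    // the same hook, the fallback would need to track its own layer rather
    // than rely on `this`, which always points at the top layer.
    return Object.getPrototypeOf(this).a.call(this, x);
  }
});
```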

Wes

Christoph Dorn

unread,
Feb 11, 2011, 10:45:12 AM2/11/11
to comm...@googlegroups.com
On 11-02-08 11:29 AM, Christoph Dorn wrote:
> The next step is to build a server helper that will rewrite package
> mappings that use catalogs or archives to be relative path
> location-based on the server and bundle modules for transport to browser
> automatically. I'll get to this once I have catalog and archive based
> mappings working on the server.

Quick update. I just pushed a bunch of changes to pinf/loader-js:

https://github.com/pinf/loader-js

There are a few demos now:

git clone git://github.com/pinf/loader-js.git
cd loader-js

node ./pinf-loader -v ./demos/HelloWorld
node ./pinf-loader -v ./demos/Mappings
node ./pinf-loader -v ./demos/CommonJSModules1
node ./pinf-loader -v ./demos/CommonJSModules2
node ./pinf-loader -v ./demos/LoadExtraCode
node ./pinf-loader -v ./demos/GithubArchiveDependency

Working on getting some of these demos working in the browser now. After
that on versioning and catalogs.

When this stuff is prototyped I'll revisit BravoJS and try to locate as
much as I can into a plugin.

Let me know what you think!

Christoph

Wes Garland

unread,
Feb 11, 2011, 10:50:30 AM2/11/11
to comm...@googlegroups.com
When this stuff is prototyped I'll revisit BravoJS and try to locate as much as I can into a plugin.

Let me know if you want me to extend the BravoJS base-layer in a fork to include a labeling object that is not part of the dependency declaration.

From your experience thus far, I think that such a change should probably make its way into /2.0d9 - it nicely decouples labels and dependencies.

This would make declaration syntax

module.declare([labeling object], [dependency array], exports factory)

dependency array would still support labels, and labeling object would have the same syntax as labeled dependencies.
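
A loader supporting this might normalize the arguments roughly like so
(parseDeclareArguments is a hypothetical helper; only the three-argument
shape comes from the proposal above, and the optional-argument handling is
my own assumption):

```javascript
// Accept the proposed (labeling object, dependency array, factory) form
// while staying compatible with (deps, factory) and (factory).
function parseDeclareArguments(args) {
  var factory = args[args.length - 1];  // factory is always last
  var labels = {}, dependencies = [];
  for (var i = 0; i < args.length - 1; i++) {
    if (Array.isArray(args[i]))
      dependencies = args[i];           // dependency array (may contain labels)
    else
      labels = args[i];                 // labeling object
  }
  return { labels: labels, dependencies: dependencies, factory: factory };
}
```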

Christoph Dorn

unread,
Feb 11, 2011, 11:24:42 AM2/11/11
to comm...@googlegroups.com
On 11-02-11 7:50 AM, Wes Garland wrote:
> > When this stuff is prototyped I'll revisit BravoJS and try to locate
> as much as I can into a plugin.
>
> Let me know if you want me to extend the BravoJS base-layer in a fork to
> include a labeling object that is not part of the dependency declaration.
>
> From your experience thus far, I think that such a change should
> probably make its way into /2.0d9 - it nicely decouples labels and
> dependencies.
>
> This would make declaration syntax
>
> module.declare([labeling object], [dependency array], exports factory)
>
> dependency array would still support labels, and labeling object would
> have the same syntax as labeled dependencies.

What do I need this for and how would that work?

Christoph

Wes Garland

unread,
Feb 11, 2011, 12:54:08 PM2/11/11
to comm...@googlegroups.com
> What do I need this for and how would that work?

The idea would be that you could describe the package hierarchy to the module system somehow.

So, if you had the labeling object

{
  "a": "/path/to/package/a"
}

require("a")   and require("/path/to/package/a")

would be interchangeable

Hmm - maybe we also need

{
  "b/": "/path/to/package/b/"
}

which would make

require("b/c")  and require("/path/to/package/b/c")

interchangeable

Order of precedence for top-level resolution:
 - labeled dependency
 - labeling object
 - environment (e.g. require.paths)
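
Resolution against the labeling object, including the "b/" prefix form,
might be sketched like this (resolveLabel is a hypothetical helper;
returning null defers to the rest of the precedence chain above):

```javascript
// Resolve an ID against a labeling object: exact labels first, then
// prefix labels ending in "/".
function resolveLabel(id, labelingObject) {
  if (labelingObject.hasOwnProperty(id))
    return labelingObject[id];  // exact label: "a" -> "/path/to/package/a"
  for (var label in labelingObject) {
    if (label.charAt(label.length - 1) === "/" && id.indexOf(label) === 0)
      return labelingObject[label] + id.substring(label.length);  // prefix label
  }
  return null;  // not labeled; fall through to the environment
}
```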