Getting Serious about JavaScript Tooling


Gregory Szorc

Feb 16, 2015, 4:26:21 PM
to firefox-dev-owner
Firefox Developers,

You write a lot of JavaScript. (I know - I was on your team once.) Unfortunately, a lot of the JavaScript going into Firefox is what I call "Gecko flavored JavaScript." This is JavaScript that doesn't conform to any standard (like ES6). Instead, it's JavaScript that takes advantage of non-standard, SpiderMonkey/Gecko-only extensions like Components.utils.import().

I think the prevalence of all this "non-standard" JavaScript poses a major problem for the productivity of Firefox developers and hinders our ability to ship high-quality features quickly, which undermines Mozilla's ability to achieve its mission.

In my capacity as a Developer Productivity Engineer, I'd love to build and deploy tools for you so you can do your job better and more efficiently. Unfortunately, the lack of standard JavaScript in the Firefox code base makes that significantly more difficult than it could be. This is because pretty much all the existing JavaScript tools out there barf when processing Firefox source code because it contains non-standard JavaScript.

There has been a wave of advancements in JavaScript tools these past few years. Unfortunately, many of them can't be leveraged by Firefox developers. The rest of the world is reaping the rewards of better tooling. But for Firefox development at Mozilla, we're still stuck in the past. Others have increased their development velocity while we have stayed the same. We're losing ground. And the gap is only getting wider. We're already playing around with automatic linting and code rewriting for Python and C++ developers at Mozilla. Unfortunately, those advancements can't easily come to JavaScript until the tools can understand the Firefox code.

I think it should be an organizational priority to address the "JavaScript tooling gap" for Firefox development.

Here is where I need your help.

I'd like to start a dialog within the Firefox team about 1) adopting coding standards that facilitate tool usage 2) adopting a plan to convert existing source code to be "standards compliant" so tools can be deployed with reasonable success.

I understand there are valid reasons for diverging from specified language behavior from time to time. However, the tooling gap is widening and the drawbacks from deviating from what tools support are increasing, and this only hurts Firefox and Mozilla more as time goes by. Yes, it might be possible to patch 3rd party tools and teach them about SpiderMonkey extensions. It has been done before. But, I don't think we want to be in the position of maintaining 3rd party tools if it can be avoided. And, we can't expect tools to accept these changes in the first place (this would be like asking Gecko to implement a non-standard, Chrome-only feature - it's definitely not very Mozilla-y if nothing else).

So, I ask a question: what today is preventing JavaScript in Firefox from conforming to the specified ECMAScript language and what can we do to minimize that gap going forward so productivity and quality enhancing tools and services may be utilized?

Gregory

Dave Townsend

Feb 16, 2015, 4:47:13 PM
to Gregory Szorc, Firefox Dev

Can you actually enumerate the things we are using that aren't standard (or soon-to-be-standard) ES6? I can't think of many. My experience is that generally when a nice new ES6 feature comes along, we rush excitedly to start using it, sometimes before the JS team thinks it is ready to ship, unfortunately.

I know we have old-style generator functions that we can and should just switch to function*. For...each can probably be ditched in favour of for...of easily too.
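To sketch those two mechanical conversions (the legacy forms are shown only as comments, since they were SpiderMonkey-only and never standard syntax):

```javascript
// Legacy SpiderMonkey generator (nonstandard; shown as a comment
// because only SpiderMonkey ever parsed it):
//   function range(n) { for (let i = 0; i < n; i++) yield i; }
// ES6 equivalent using function*:
function* range(n) {
  for (let i = 0; i < n; i++) {
    yield i;
  }
}

// Legacy: for each (let x in someArray) { ... }
// ES6 replacement iterates values directly with for...of:
const values = [];
for (const x of range(3)) {
  values.push(x); // collects 0, 1, 2
}
```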

Cu.import will be harder to remove, but I don't understand what the cost of it is there; it's certainly well-formed ES6 syntax, and until modules become a real thing there isn't really a replacement.

_______________________________________________
firefox-dev mailing list
firef...@mozilla.org
https://mail.mozilla.org/listinfo/firefox-dev

Bill McCloskey

Feb 16, 2015, 4:55:56 PM
to Dave Townsend, Firefox Dev, Gregory Szorc
Use of the preprocessor would be pretty high on my list of things to get rid of.

I suspect that a lot of the older non-ES6 JS features probably don't get used much in new code (old-style generators, for each, braceless functions), so a style guide won't help much. We just need to remove them from old code.

-Bill

Sebastian Hengst

Feb 16, 2015, 4:57:46 PM
to firef...@mozilla.org
Removal of non-standard features is tracked in the meta bug
https://bugzilla.mozilla.org/show_bug.cgi?id=1103158
Dependency tree:
https://bugzilla.mozilla.org/showdependencytree.cgi?id=1103158&hide_resolved=1

Furthermore, there are also the files that need to be preprocessed.

Archaeopteryx


Gijs Kruitbosch

Feb 16, 2015, 5:03:42 PM
to Bill McCloskey, Firefox Dev
The preprocessor shouldn't be an issue; we should just run whatever tooling we have on preprocessed source code. There are standardized things for this type of stuff, like source maps, and we can use them.

Ditching the preprocessor would also mean ungodly-long in-tree JS files because extra script references add overhead. It's no coincidence web devs unify their JS into only a few files, too. Obviously we could replace it with other magical "put the things together" solutions, but that's unlikely to solve the same tooling problem, ie, multiple files that apply to a single global. If anything, I wish we split up browser.js even more ways.

Depending on the kind of analysis, like Dave I would expect that our modules not having window globals, and both them and the window globals having "odd" APIs (document.getBindingParent, document.getAnonymousElementByAttribute, XUL/XBL elements + syntax, ...), is more problematic. I'd also be loath to give up any kind of useful syntactic sugar that's on a standards track just because the tools aren't there yet, and write more verbose and error-prone code instead.

I would settle for our internal tools actually working, like the browser debugger, which is too unreliable to use (with XBL, with content scripts, ...) without debugger; statements... I don't think we need any syntax changes for that!

~ Gijs

Bill McCloskey

Feb 16, 2015, 5:24:09 PM
to Gijs Kruitbosch, Firefox Dev
On Mon, Feb 16, 2015 at 2:03 PM, Gijs Kruitbosch <gijskru...@gmail.com> wrote:
> Ditching the preprocessor would also mean ungodly-long in-tree JS files because extra script references add overhead. It's no coincidence web devs unify their JS into only a few files, too. Obviously we could replace it with other magical "put the things together" solutions, but that's unlikely to solve the same tooling problem, ie, multiple files that apply to a single global. If anything, I wish we split up browser.js even more ways.

I didn't mean to suggest that we should just expand out all the preprocessor invocations. I count 795 uses of #ifdef in our JS code (some from b2g). All of these could easily be replaced by something like |if (Constant.XP_MACOSX)|. The new code would only be a tiny bit slower.
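A sketch of what that conversion might look like; the constants object here is a stub for illustration (real code would import something like AppConstants.jsm, and the property name is an assumption):

```javascript
// Before (build-time preprocessing; shown in comments because #ifdef
// is not valid JavaScript):
//   #ifdef XP_MACOSX
//   enableMacGestures();
//   #endif

// After: an ordinary runtime branch on a constants object. This stub
// stands in for a real constants module; the property name is illustrative.
const AppConstants = { platform: "macosx" };

let gestures = "default";
if (AppConstants.platform === "macosx") {
  gestures = "mac";
}
```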

#include usage is harder to fix but also much less common--only 88 uses. Some of these were originally JSMs but got converted to #includes when compartments had much higher overhead than they do now. The stuff in browser.js would probably require more refactoring, but I think it's worth doing.
 
-Bill

Gregory Szorc

Feb 16, 2015, 6:36:27 PM
to Gijs Kruitbosch, Bill McCloskey, Firefox Dev
While preprocessed code could likely ultimately be tolerated, it is my preference to see it go away *unless* we adopt preprocessing conventions utilized by the larger JavaScript community and are well supported by 3rd party JavaScript tools. (Currently, the preprocessor is part of the build system and I'm pretty sure any overlap with existing JS preprocessors is coincidental.) Much like the larger Java world doesn't preprocess Java (like we do on Fennec, unfortunately), we shouldn't be doing it in Firefox JavaScript unless we have a very good reason. And if we must preprocess code, I think we should take steps to limit its impact (e.g. isolate all preprocessed code to standalone files so its effects don't creep into most code).

Preprocessing is an additional step to statically analyze JavaScript code. With our current preprocessor, we must have a build system context (i.e. run configure) in order to preprocess. And this means potentially different output on Windows, Linux, Mac, etc. This means you have to perform static analysis on different platforms. Or, it involves low-level workarounds to trick the build system. This raises the barrier to writing and deploying new tools, which undermines developer productivity efforts. (We ideally want to deploy once and run everywhere.)

I think eliminating or marginalizing preprocessing of JavaScript code would be a noble goal.



Nicholas Alexander

Feb 16, 2015, 7:10:45 PM
to Gregory Szorc, Bill McCloskey, Firefox Dev, Gijs Kruitbosch
Hey everybody,

On Mon, Feb 16, 2015 at 3:36 PM, Gregory Szorc <g...@mozilla.com> wrote:
> While preprocessed code could likely ultimately be tolerated, it is my preference to see it go away *unless* we adopt preprocessing conventions utilized by the larger JavaScript community and are well supported by 3rd party JavaScript tools. (Currently, the preprocessor is part of the build system and I'm pretty sure any overlap with existing JS preprocessors is coincidental.) Much like the larger Java world doesn't preprocess Java (like we do on Fennec, unfortunately), we shouldn't be doing it in Firefox JavaScript unless we have a very good reason. And if we must preprocess code, I think we should take steps to limit its impact (e.g. isolate all preprocessed code to standalone files so its effects don't creep into most code).

I was going to support billm's push for less preprocessed JS by citing Fennec, but let's clarify here: Fennec is a model preprocessing citizen!

Before elaborating on what we preprocess, I will state that reducing our preprocessed Java has made development simpler (much!) and IDE use possible.  (At one time we preprocessed all of our test files and a large chunk of our Java sources.  It was awful.)

We preprocess:

* the AndroidManifest.xml -- we pretty much have to;
* a few resources files, including strings.xml -- again, we're forced;
* a single .java.in file -- AppConstants.java;
* a single .jsm file -- AppConstants.jsm;
* one or two .xhtml files that use the pre-processing * annotation during packaging and are hard to move away from preprocessing.

I have vague hopes that we can move the AndroidManifest and AppConstants.java.in preprocessing into an alternate system, but I'm stumped as to what to do with AppConstants.jsm.

Yours,
Nick

Gregory Szorc

Feb 16, 2015, 7:37:47 PM
to Nicholas Alexander, Bill McCloskey, Gijs Kruitbosch, Gregory Szorc, Firefox Dev
Apologies, Nick. I wasn't aware that Fennec has successfully weaned off the preprocessor as much as it has. I agree that Fennec is a model for how the preprocessor should be used: sparingly. Well done.

Richard Newman

Feb 16, 2015, 8:35:50 PM
to bi...@mozilla.com, Gregory Szorc, Firefox Dev, Dave Townsend
> Use of the preprocessor would be pretty high on my list of things to get rid of.
 
We've done a bunch of work in the past year in Fennec to dramatically reduce the use of preprocessing for both Java and JS. It really screws up IDEs and tooling. E.g., Bug 1093358, where we centralized a lot of constants so that browser.js doesn't need to be preprocessed.

Partly this is because we can rely on ProGuard to strip out unreachable code branches, or simply not build whole files for certain targets.

Thumbs up on having this trend continue.

Chris Peterson

Feb 17, 2015, 2:46:17 AM
to firef...@mozilla.org

On 2/16/15 1:55 PM, Bill McCloskey wrote:
> Use of the preprocessor would be pretty high on my list of things to get rid of.
>
> I suspect that a lot of the older non-ES6 JS features probably don't get used much in new code (old-style generators, for each, braceless functions), so a style guide won't help much. We just need to remove them from old code.

Here is a longer list of deprecated SpiderMonkey features:

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Deprecated_and_obsolete_features

I added telemetry probes for some of SpiderMonkey's nonstandard language extensions in web content (not add-on or chrome JS). It looks like for-each, legacy generators, and let blocks/expressions are rarely used in web content:

http://is.gd/VKxQkQ

In my spare time, I have been slowly removing their use in chrome JS. I would like to hide some of these language extensions from web content in Firefox 39 and then finish removing them from chrome JS. Firefox 38 will be the next ESR, so it would be nice to keep these language extensions there in case some ESR users' add-ons or intranet sites rely on them. :)


chris

Mike Ratcliffe

Feb 17, 2015, 5:51:37 AM
to firef...@mozilla.org
When something is not possible using our current JS implementation we tend to implement it in e.g. nsIDOMUtils.cpp and use it in our code. If we had to wait for every command to be standardized there would be very significant roadblocks to innovation.

At the same time, of course, if something can be done with our current JS implementation without too much extra overhead and complexity then it should be.

/Mike

Mike de Boer

Feb 17, 2015, 6:14:28 AM
to Mike Ratcliffe, firef...@mozilla.org
I’m really looking forward to having your awesomeness, gps, and that of others focused on this! There’s some overlap here with ideas that exist within the Loop/Hello team: we are trying to get NodeJS available in the build system so that we can run linters and other static-analysis goodies on our infra per-push, so that we can track the state of things over time and, in the future, back out changesets that yield errors from these tools.
Having NodeJS in our toolchain would greatly improve things, I believe, because there’s much good stuff that we can use on NPM.

And don’t forget about CSS! I’m one of those weirdos that think CSS is harder than JS ;-)

I also think that we need to be less strict about landing big search-and-replace patches that update our codebase in one go to, for example, remove our usage of deprecated SpiderMonkey features. The only real reason to reject this practice I’ve heard is to preserve blame. I think that’s an outdated point of view and is holding us back unnecessarily.

Onward to something that has become more troubling over the course of my two years at Moz: printf-debugging.
I cheered out loud when the browser toolbox popped up last year, but now I’m just sad. It has not made me drop DOM Inspector; in fact, I still use only that piece of arcane software, because it _just works_. But that’s the least of my trouble (I still have a tool to use instead); the debugger is unusable. I’ve had crashes, missed breakpoints, hangs and more. So the reality is that dump() was, is and will be my best friend.
I’m getting more worried now, though, because e10s introduces more message passing and indirect code flows that take more time to debug using this old tradition. We will need a real debugger rather soon.
To be clear: I’m talking about the product, not the team. I know there’s a massive amount of technical debt they are working hard to resolve. I’m trying to say that the team needs more resources. A lot more.

Thanks for starting this thread!

Boris Zbarsky

Feb 17, 2015, 9:41:58 AM
to firef...@mozilla.org
On 2/17/15 5:51 AM, Mike Ratcliffe wrote:
> When something is not possible using our current JS implementation we
> tend to implement it in e.g. nsIDOMUtils.cpp and use it in our code.

This is not a problem, because use of XPCOM bits like that is totally
valid ES6 (and ES5, and ES3).

The problems for code-munging tools are with use of nonstandard
syntactic features, not with nonstandard bits of the "standard library".

-Boris

Gijs Kruitbosch

Feb 17, 2015, 10:36:45 AM
to Boris Zbarsky, firef...@mozilla.org
On 17/02/2015 15:41, Boris Zbarsky wrote:
> On 2/17/15 5:51 AM, Mike Ratcliffe wrote:
>> When something is not possible using our current JS implementation we
>> tend to implement it in e.g. nsIDOMUtils.cpp and use it in our code.
>
> This is not a problem, because use of XPCOM bits like that is totally
> valid ES6 (and ES5, and ES3).
>
> The problems for code-munging tools are with use of nonstandard
> syntactic features, not with nonstandard bits of the "standard library".
That really depends on what kind of code-munging you're doing. It'll not
be a problem for parsing, sure.

However, you would be surprised how much JS tooling assumes that
references to |window| or |document| (or window's other properties!) are
always valid, or that local variables with those names are by definition
shadowing the global one, or where (in order to get good/reasonable
results for inferences and so on) we would need to provide descriptions
of (web/xp)idl interfaces in some way that they can grok, as we can't
always point to a JS implementation instead...

Maybe we have very different expectations of what "tooling" means here -
we're already testing whether our own JS implementation can parse all
the JS we're shipping, albeit at mochitest rather than at
compile/check-in time.

~ Gijs

Boris Zbarsky

Feb 17, 2015, 10:41:01 AM
to Gijs Kruitbosch, firef...@mozilla.org
On 2/17/15 10:36 AM, Gijs Kruitbosch wrote:
> However, you would be surprised how much JS tooling assumes that
> references to |window| or |document| (or window's other properties!) are
> always valid

This tooling is already broken for workers, node.js, etc, right? Can we
not avoid such broken tooling and use tooling that doesn't make broken
assumptions about the host environment?

> or where (in order to get good/reasonable
> results for inferences and so on) we would need to provide descriptions
> of (web/xp)idl interfaces in some way that they can grok, as we can't
> always point to a JS implementation instead...

That's fair.

> Maybe we have very different expectations of what "tooling" means here -

That's possible. Gregory's post was very light on specifics...

-Boris

Gregory Szorc

Feb 17, 2015, 11:35:20 AM
to Boris Zbarsky, firefox-dev-owner, Gijs Kruitbosch
On Tue, Feb 17, 2015 at 7:40 AM, Boris Zbarsky <bzba...@mit.edu> wrote:
> On 2/17/15 10:36 AM, Gijs Kruitbosch wrote:
>> However, you would be surprised how much JS tooling assumes that
>> references to |window| or |document| (or window's other properties!) are
>> always valid
>
> This tooling is already broken for workers, node.js, etc, right? Can we not avoid such broken tooling and use tooling that doesn't make broken assumptions about the host environment?
>
>> or where (in order to get good/reasonable
>> results for inferences and so on) we would need to provide descriptions
>> of (web/xp)idl interfaces in some way that they can grok, as we can't
>> always point to a JS implementation instead...
>
> That's fair.
>
>> Maybe we have very different expectations of what "tooling" means here -
>
> That's possible. Gregory's post was very light on specifics...

Here's a partial list:

* Machines should tell you when code doesn't conform to style guidelines
* Machines should be able to automatically reformat code to conform to style guidelines
* We should be able to mass-rewrite code by writing scripts that transform existing code (this is how Google and Facebook perform large refactors)
* Developers should be able to see code coverage for tests, etc (I view this as blocked on SpiderMonkey lacking a built-in high-performance tracing/profiling mechanism)
* Code editors and IDEs with JavaScript support should "just work"
* We should be able to write custom "linter" rules looking for common issues with Mozilla code (e.g. usage of nsIFile over OS.File)

A common theme in these is that machines should do more work for humans. I don't want people wasting time looking at things like code formatting during code review when they could be looking at something more important, such as what the code actually does.
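As a rough illustration of the last point, a custom lint rule can be very little code. This sketch scans source text with a plain regex; a real rule would walk an AST via a pluggable linter, and the function name and message here are made up for illustration:

```javascript
// Minimal illustrative lint check: flag Ci.nsIFile usage and suggest
// OS.File instead. Real tooling would operate on an AST rather than
// on raw lines; this just sketches the idea with a regex.
function lintNsIFile(source) {
  const findings = [];
  source.split("\n").forEach((line, i) => {
    if (/\bCi\.nsIFile\b/.test(line)) {
      findings.push({ line: i + 1, message: "Prefer OS.File over nsIFile." });
    }
  });
  return findings;
}

const sample =
  'let file = Cc["@mozilla.org/file/local;1"].createInstance(Ci.nsIFile);';
const findings = lintNsIFile(sample);
```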

Boris Zbarsky

Feb 17, 2015, 11:45:20 AM
to Gregory Szorc, firefox-dev-owner, Gijs Kruitbosch
On 2/17/15 11:35 AM, Gregory Szorc wrote:
> * Machines should tell you when code doesn't conform to style guidelines
> * Machines should be able to automatically reformat code to conform to
> style guidelines

OK, this seems like it should not depend on anything other than parsing,
right?

> * We should be able to mass-rewrite code by writing scripts that
> transform existing code (this is how Google and Facebook perform large
> refactors)

Likewise, though I guess depending on the transformations you want to do
you may need to make assumptions about variable names and whatnot...

> * Code editors and IDEs with JavaScript support should "just work"

Still seems like this should only depend on parsing.

> * We should be able to write custom "linter" rules looking for common
> issues with Mozilla code (e.g. usage of nsIFile over OS.File)

And here.

Gijs Kruitbosch

Feb 17, 2015, 11:58:18 AM
to Gregory Szorc, Boris Zbarsky, Firefox Dev
On 17/02/2015 17:35, Gregory Szorc wrote:
> Here's a partial list:
>
> * Machines should tell you when code doesn't conform to style guidelines
> * Machines should be able to automatically reformat code to conform to style guidelines

There aren't clear style guidelines in browser JS. This is also not really ES6/"custom SpiderMonkey stuff"-related, though. Unlike Mike, I don't think losing most/all blame by reformatting all our source is an "outdated" objection to changing that.

I also don't think that purely a reformat will be sufficient to unify style; we'd have to rename variables and s/var/let/ and such, which is even more invasive (and will need significant amounts of logic in the reformatter so as not to break stuff because of the semantics changes). Or leave that out of the style guidelines, but I don't see how that would then gain us much compared to the status quo, if all that's "standardized" is whitespace, wrapping and brace placement preferences.

Either way, I think blame is a serious issue here.

> * We should be able to mass-rewrite code by writing scripts that transform existing code (this is how Google and Facebook perform large refactors)

I can't think of much where I would really want to use this besides... getting us off the old SpiderMonkey-specific patterns? Which turns this into a bootstrap problem...

I also suspect that our patterns of Cu.import and XPCOM usage (both with string params) will not easily lend themselves to automatic refactoring even if you solved the parsing problem.

> * Developers should be able to see code coverage for tests, etc (I view this as blocked on SpiderMonkey lacking a built-in high-performance tracing/profiling mechanism)
> * Code editors and IDEs with JavaScript support should "just work"

As noted, this needs interface descriptions for XPCOM and some understanding of how Cu.import works (which will be tricky, considering it relies on contents-of-string-arguments) if you want anything in terms of completion. Not saying that wouldn't be worth it, but it'd be tricky, even if we were ES3/5/6-compliant in terms of syntax tomorrow.

> * We should be able to write custom "linter" rules looking for common issues with Mozilla code (e.g. usage of nsIFile over OS.File)

Right. So as Boris said, most of this just needs parsing and "glue" to process the AST and hook up to our custom processing scripts/specifications (for refactors/reformatting/linting).

Do you have evidence that a significant number of the above problems have ready-made "glue" that will work with ES6 once we get rid of the SM-specific stuff? Because we already have a working parser and AST generator: Reflect.parse. Its output format, last I checked, is largely compatible with the existing toolset. The main problem was ES6 stuff like "let", "const", "yield", generators, for...of, and destructuring expressions. Ditching SM-specific things wouldn't bridge that gap. Maybe I've been looking at the wrong tools, though?

Separately, if we do need to do this, I guess I would expect that writing such glue to refactor us out of SM-specific land into vanilla-ES6-heaven would be more cost-effective (considering the parser already exists!) than manually trying to do the same.

~ Gijs

Dan Mosedale

Feb 17, 2015, 12:59:32 PM
to Gijs Kruitbosch, Boris Zbarsky, Firefox Dev, Gregory Szorc
>> * We should be able to write custom "linter" rules looking for common issues with Mozilla code (e.g. usage of nsIFile over OS.File)
>
> Right. So as Boris said, most of this just needs parsing and "glue" to process the AST and hook up to our custom processing scripts/specifications (for refactors/reformatting/linting).
>
> Do you have evidence that a significant number of the above problems have ready-made "glue" that will work with ES6 once we get rid of the SM-specific stuff? Because we already have a working parser and AST generator: Reflect.parse. Its output format, last I checked, is largely compatible with the existing toolset. The main problem was ES6 stuff like "let", "const", "yield", generators, for...of, and destructuring expressions. Ditching SM-specific things wouldn't bridge that gap. Maybe I've been looking at the wrong tools, though?

Have a look at ESLint <http://eslint.org/docs/about/>. It's moving very fast to support ES6 stuff on an opt-in basis; <http://eslint.org/docs/configuring/> has details. It already supports a very large chunk of the ES6 features we use. It's also designed to be pluggable, so that we can add rules to match our own prevailing style usages.
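For flavor, a minimal opt-in configuration might look something like this (the exact option names are an assumption about 2015-vintage ESLint and should be checked against its configuration docs):

```json
{
  "env": { "es6": true },
  "rules": {
    "semi": 2,
    "no-unused-vars": 1
  }
}
```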

Dan

Jim Blandy

Feb 17, 2015, 2:04:56 PM
to Dan Mosedale, Gregory Szorc, Boris Zbarsky, Firefox Dev, Gijs Kruitbosch
Mechanical reformatting makes "hg annotate" output more difficult to use, but it's not fatal. When the line you're interested in was affected by a reformatting changeset, you can note the changeset number <N> and then look at "hg annotate -r <N-1>". (Emacs, at least, has an annotation viewing mode that makes "annotate the changeset prior to the one that touched this line" a single keystroke.)

It seems like it should be possible to change 'hg annotate' to simply skip certain changesets. This would cause 'hg annotate' output to no longer correspond to the text of any actual revision - it would be a mix of different revisions - but if one is specifically skipping changesets that have no semantic effect (like reformatting), the mixed-revision source should be equivalent to the original.

Perhaps this is something we could add to Mercurial. But it doesn't seem like an adequate reason to never fix up the sources.

Gavin Sharp

Feb 17, 2015, 2:51:43 PM
to Gregory Szorc, Bill McCloskey, Firefox Dev, Gijs Kruitbosch
I agree with pretty much everything in this post - we should wean
ourselves off the preprocessor for Firefox JS. Who wants to drive
this?

Gavin

Gavin Sharp

Feb 17, 2015, 2:53:10 PM
to Mike de Boer, Firefox Dev, Mike Ratcliffe
> The only real reason
> to reject this practice I’ve heard is to preserve blame. I think that’s an
> outdated point of view and is holding us back unnecessarily.

Absolutely. If this particular argument is holding up patches landing
in practice, that's a problem. Flag it and I will help fix it.

Gavin

Gavin Sharp

Feb 17, 2015, 3:01:49 PM
to Gijs Kruitbosch, Boris Zbarsky, Firefox Dev, Gregory Szorc
On Tue, Feb 17, 2015 at 11:58 AM, Gijs Kruitbosch
<gijskru...@gmail.com> wrote:
> There aren't clear style guidelines in browser JS.

Not entirely true - but it is inconsistently applied. We can change
that. There is a lot of value in not wasting new-contributor
patch-writing time and review cycles on style nits.

> Either way, I think blame is a serious issue here.

I disagree, and am surprised to see you say that - we should perhaps
discuss this further!

Gavin

Gavin Sharp

Feb 17, 2015, 3:08:57 PM
to Gregory Szorc, firefox-dev
Thanks for this, Gregory. At a high level I agree with the sentiment
of wanting to rid us of our Mozilla-baggage where it is causing
friction. I'm pretty committed to helping with that where I can.

I think we need to dive into specifics, though, because these things
are just distracting to argue about at a high level (a lot of
potential for misunderstanding and arguing past each other). I think
use of non-standard JS language features is probably not our biggest
pain point (though we should continue to reduce it). I think our
biggest pain point is the overhead (technical, political) to being
able to use other common JS tools with Firefox code, and integrating
them with the build system/continuous integration/etc..

Getting rid of our custom JS preprocessor and ugly #ifdefs is a good
first thing to tackle. Bug 1124503 is a step in that direction; it
would be great if someone picked that up and extended nalexander and
the Fennec team's work to Firefox.

What are some other examples?

Gavin

Gijs Kruitbosch

Feb 17, 2015, 3:32:45 PM
to Gavin Sharp, Firefox Dev
On 17/02/2015 21:01, Gavin Sharp wrote:
> On Tue, Feb 17, 2015 at 11:58 AM, Gijs Kruitbosch
> <gijskru...@gmail.com> wrote:
>> There aren't clear style guidelines in browser JS.
> Not entirely true - but it is inconsistently applied. We can change
> that.
Of course we can change that, but how valuable is doing so?
I'm also surprised to see you say this, after just last week saying I
should "embrace" the inconsistencies when I asked about a specific point
of style...
> There is a lot of value in not wasting new-contributor
> patch-writing time and review cycles on style nits.
I definitely agree on this point, but it seems like that can be fixed
without unifying and/or mass-reformatting the entire codebase. We could
have a tool to make things "fit in" with the style of that file. My
editor knows how to indent stuff, so it seems like that shouldn't be
/that/ hard.

>> Either way, I think blame is a serious issue here.
> I disagree, and am surprised to see you say that - we should perhaps
> discuss this further!
I believe mass-changing blame like this has 2 serious issues:

1) it is no longer possible to easily see which lines got introduced in
the same commit without being changed later (because N% of them will be
auto-reformatted and then have that blame instead)
2) it adds unnecessary extra steps to archaeology of "why does code X do
things in way Y" aka blame-hunting. The history of our C++ story here
(inasmuch as I'm aware) is that things get done on a per-file basis,
which further makes it difficult to exclude the relevant changesets. As
every step of this archaeology process on hgweb already takes multiple
clicks, I think this should be avoided (I'm sorry, but I don't think
I'll ever be using emacs to make this "less bad" as it would have the
opposite effect for me). I also fully expect part of the style guide to
evolve and/or be applied in several rounds (whitespace first,
argument/constant/global names second/third/nth), which worsens this issue.

~ Gijs

Gavin Sharp

Feb 17, 2015, 3:55:15 PM
to Gijs Kruitbosch, Firefox Dev
I have a complicated and nuanced (and evolving) position on code style
guidelines, so I don't blame you for being surprised :)

I think that avoiding nit-reviews for new contributors is the biggest
style-related issue we have today, and so I am most interested in
solutions that address that problem specifically. For example,
automated scanning of submitted patches to ensure they don't introduce
"bad style" (definition of "bad style" not to be explored in this
thread :). Solutions to that problem tend to immediately run into
other problems, though: if your codebase is already inconsistent,
building that tool is more difficult. So I am somewhat sympathetic to
the desire to solve that problem first, even though I don't think it
is most pressing on its own. I would love it if the world of
developers were able to set aside style OCD and just learn to embrace
minor styling differences, but that is not the world we live in :)

Regarding "blame problems", I see it this way: finding relevant blame
for a given line typically involves this algorithm:
1- load annotated file for revision N, find relevant line
2- look at annotation, determine whether it is relevant
3- if so, you're done!
4- if not, go to step 1 with revision N's parent

Given Mozilla's long history and high check-in churn, this process
pretty much always involves multiple iterations. The difference
between, for example, 3 iterations and 4 iterations of that algorithm
is insignificant today, with existing tools (custom blame lookup
scripts or hgweb). The difference could even be further reduced if
hgweb was actually built with this process in mind, or if we had
better local tools built into mach.
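For concreteness, here is Gavin's loop sketched in plain JS. The `annotate` and `parent` lookups and the tiny fake history are invented for illustration; real tooling would shell out to `hg annotate` or query hgweb.

```javascript
// Walk a line's annotations back through parent revisions until the
// annotating commit is "relevant" (here: not in a set of known
// cosmetic/reformat commits).
function findRelevantBlame(annotate, parent, rev, line, cosmetic) {
  while (rev !== null) {
    const touched = annotate(rev, line); // steps 1-2: who last touched it?
    if (!cosmetic.has(touched)) {        // step 3: relevant -> done
      return touched;
    }
    rev = parent(touched);               // step 4: retry from the parent
  }
  return null;
}

// Fake history: rev 3 is a mass-reformat commit, rev 1 the real change.
const annot = { "3:l10": 3, "2:l10": 1 };
const parents = { 3: 2, 2: 1, 1: null };
console.log(findRelevantBlame(
  (r, l) => annot[`${r}:${l}`],
  r => parents[r],
  3, "l10", new Set([3])
)); // → 1
```

The point being: a reformat commit adds exactly one extra turn of this loop, which good tooling can make cheap.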

Gavin

Mike de Boer

Feb 17, 2015, 4:20:39 PM
to Gavin Sharp, Bill McCloskey, Firefox Dev, Gregory Szorc, Gijs Kruitbosch

> On 17 Feb 2015, at 20:51, Gavin Sharp <ga...@gavinsharp.com> wrote:
>
> I agree with pretty much everything in this post - we should wean
> ourselves off the preprocessor for Firefox JS. Who wants to drive
> this?

I’d be happy to jot this down as a secondary Q2 goal for me!

Mike.

Blair McBride

Feb 17, 2015, 5:15:39 PM
to Firefox Dev
Gijs Kruitbosch wrote:
> Because we already have a working parser and AST generator -
> Reflect.parse. Its output format, last I checked, is largely
> compatible with the existing toolset.

Reflect.parse's output is the standard that pretty much all other tools
are based on. However, it's also inadequate for direct use by those
tools. It's missing key things such as support for comments[1] -
which many tools rely on. And of course there's the issue of nodejs
compatibility. Other parsers are roughly the same, until you get down to
certain details.

There's been some activity lately in being more intentional and
organised around parsing and ASTs, and developing a more formal
spec, with https://github.com/estree/estree
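To make "down to certain details" concrete, this is roughly the ESTree shape for `var x = 1;`. It's hand-written against the estree spec rather than produced by any parser; real output also carries `loc`/`range` info, and comment attachment is exactly the part that isn't standardized.

```javascript
// Hand-written ESTree-style AST for the source `var x = 1;`.
const ast = {
  type: "Program",
  body: [{
    type: "VariableDeclaration",
    kind: "var",
    declarations: [{
      type: "VariableDeclarator",
      id: { type: "Identifier", name: "x" },
      init: { type: "Literal", value: 1 },
    }],
  }],
};

// A tool walking this tree just dispatches on `type`:
console.log(ast.body[0].type, ast.body[0].declarations[0].id.name);
// → VariableDeclaration x
```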


[1] A while ago I started working on a patch to add comment support to
Reflect.parse, but never got it finished.

- Blair

Gregory Szorc

Feb 17, 2015, 5:55:41 PM
to Blair McBride, Firefox Dev
On Tue, Feb 17, 2015 at 2:15 PM, Blair McBride <bmcb...@mozilla.com> wrote:
> Gijs Kruitbosch wrote:
>> Because we already have a working parser and AST generator - Reflect.parse. Its output format, last I checked, is largely compatible with the existing toolset.
>
> Reflect.parse's output is the standard that pretty much all other tools are based on. However, it's also inadequate for usage in pretty much all other tools. It's missing key things such as support for comments[1] - which many tools rely on. And of course there's the issue of nodejs compatibility. Other parsers are roughly the same, until you get down to certain details.
>
> There's been some activity lately in being more intentional and organised around parsing and AST trees, and developing a more formal spec, with https://github.com/estree/estree

That reminds me of another area for improvement: automatic documentation generation. I think it's wasteful of developers' time to update inline source/API docs *and* MDN. We should generate API docs from source code and mark up MDN docs with additional comments, like how MSDN, php.net, etc. do their docs. Extraction of documentation from source also enables IDEs and editors to have cool features like autocomplete and pop-ups containing usage docs. This does poke the hornet's nest of how to format docs. But I'd like to think the end result outweighs the pain of transitioning to it.

Blair McBride

Feb 17, 2015, 6:35:32 PM
to Gregory Szorc, Firefox Dev
Gregory Szorc wrote:
> That reminds me of another area for improvement: automatic documentation generation. I think its wasteful of developers' time to update inline source/API docs *and* MDN. We should generate API docs from source code and markup MDN docs with additional comments, like how MSDN, php.net, etc do their docs. Extraction of documentation from source also enables IDEs and editors to have cool features like autocomplete and pop-ups containing usage docs. This does poke the hornets nest of how to format docs. But I'd like to think the end result outweighs the pain to transition to it.

Yep. I've noticed a general trend towards JSDoc-compliant formatting - and all my new code is documented following the JSDoc spec.
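As an illustration of the format (the function and its parameters are made up; only the comment syntax is the point):

```javascript
/**
 * Open a URL in a new tab. Hypothetical helper, shown only to
 * illustrate JSDoc-style comments.
 *
 * @param {string} url - The address to load.
 * @param {boolean} [inBackground=false] - Don't focus the new tab.
 * @returns {boolean} Whether the new tab was focused.
 */
function openTab(url, inBackground = false) {
  // Toy body so the example runs; real code would talk to tabbrowser.
  return !inBackground;
}

console.log(openTab("https://example.com")); // → true
```

Doc generators and editors can read the `@param`/`@returns` tags for API pages, autocomplete, and hover docs.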

- Blair

Gregory Szorc

Feb 25, 2015, 6:02:13 PM
to Gregory Szorc, firefox-dev-owner
I wanted to clarify what I meant by "Gecko flavored JavaScript," since some of you pinged me about it. As many of you know, Mozilla frequently experiments with new JavaScript features before they become standardized. This is done as part of evolving the JavaScript language, which Mozilla plays an active role in. These forays into experimental language features are what I mean by "Gecko flavored JavaScript." I think this experimentation is healthy. I just wanted to raise awareness of its potential drawback: it sacrifices our ability to use more tools.

These are my thoughts and I'm sure there will be others who will want to chime in.

Gregory

Mike de Boer

Apr 3, 2015, 8:22:38 AM
to Gregory Szorc, firefox-dev-owner
I filed https://bugzilla.mozilla.org/show_bug.cgi?id=1150859 - 'Remove preprocessor directives from browser code' to track work related to some of the things mentioned in this thread.

Mike.

Bill McCloskey

Apr 3, 2015, 10:45:01 AM
to Mike de Boer, firefox-dev-owner, Gregory Szorc
This is probably a good time to mention that we've already removed the preprocessor from some important frontend files:
- tabbrowser.xml
- urlbarBindings.xml
- browser/components/sessionstore/*
- browser/modules/*

If you're editing one of these files, you don't need to run make anymore. You can also trust the line numbers in error messages. There's a lot of value that comes just from removing the preprocessor from the most commonly used files. (Although there's of course more value in getting rid of it altogether.)

If you're working on a JS/JSM file and it uses the preprocessor, please consider changing those uses to AppConstants.jsm while you're there. And definitely don't add any more preprocessor usage!
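A sketch of what such a conversion looks like. The #ifdef shown is illustrative, and AppConstants is stubbed here so the snippet is self-contained; real chrome code imports it from resource://gre/modules/AppConstants.jsm.

```javascript
// Before (preprocessed source -- not valid JS, kept as a comment):
//   #ifdef XP_MACOSX
//   gMenu.hidden = true;
//   #endif

// After: an ordinary runtime check via AppConstants.
// Stubbed for illustration; "macosx" is the value AppConstants.platform
// has on Mac.
const AppConstants = { platform: "macosx" };

const gMenu = { hidden: false };
if (AppConstants.platform == "macosx") {
  gMenu.hidden = true;
}
console.log(gMenu.hidden); // → true
```

The win is that the file is now the same bytes on every platform, so parsers, linters, and blame all work on it directly.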

-Bill

Gijs Kruitbosch

Apr 3, 2015, 11:02:43 AM
to Mike de Boer, firefox-dev-owner
What's the plan for #include, which is implicated here?

Aeons ago, I filed https://bugzilla.mozilla.org/show_bug.cgi?id=381210 about the ridonkulous size of browser.js (still pretty big, btw!) which much later got duped to https://bugzilla.mozilla.org/show_bug.cgi?id=758812 where dolske eventually did split up a lot using #include directives.

While I recognize I have no veto rights or whatever, I would be... seriously displeased if we went back to just using a single file.

AIUI, jsms are our next-best alternative, but using a lot of them has serious performance downsides (and will also require a bunch of refactoring for the included scripts).

Perhaps we can introduce a valid JS shorthand to include extra files by simply loading them into the global scope with the scriptloader, and then preprocess them for dist and/or pgo builds?

For CSS (which has this problem but worse because no scriptability) we can presumably do this with the pre-existing @import, and change %defines to use CSS variables?

~ Gijs

Bill McCloskey

Apr 3, 2015, 11:17:33 AM
to Gijs Kruitbosch, Mike de Boer, firefox-dev-owner
On Fri, Apr 3, 2015 at 8:02 AM, Gijs Kruitbosch <gijskru...@gmail.com> wrote:
> What's the plan for #include, which is implicated here?

Converting everything to JSMs seems like the best option.
 
> Aeons ago, I filed https://bugzilla.mozilla.org/show_bug.cgi?id=381210 about the ridonkulous size of browser.js (still pretty big, btw!) which much later got duped to https://bugzilla.mozilla.org/show_bug.cgi?id=758812 where dolske eventually did split up a lot using #include directives.
>
> While I recognize I have no veto rights or whatever, I would be... seriously displeased if we went back to just using a single file.

I'm sure you wouldn't be the only person :-).
 
> AIUI, jsms are our next-best alternative, but using a lot of them has serious performance downsides (and will also require a bunch of refactoring for the included scripts).

We already import ~50 JSMs into browser.js. Converting 21 #include files to JSMs will be annoying, but I doubt it will have any significant effect on performance. We'll use a little more memory, but the code will be loaded more lazily in most cases. It's worth measuring though.
 
> Perhaps we can introduce a valid JS shorthand to include extra files by simply loading them into the global scope with the scriptloader, and then preprocess them for dist and/or pgo builds?

Well, the point of removing the preprocessor is to use only standard JavaScript, so I don't think we should do that.
 
> For CSS (which has this problem but worse because no scriptability) we can presumably do this with the pre-existing @import, and change %defines to use CSS variables?

That would be cool. I'm also worried about .xul files. It would be great to get rid of the preprocessor in those, but I don't know of any replacement for #ifdef in XUL.

-Bill

Dirkjan Ochtman

Apr 3, 2015, 11:20:29 AM
to Gijs Kruitbosch, Mike de Boer, firefox-dev-owner
On Fri, Apr 3, 2015 at 5:02 PM, Gijs Kruitbosch
<gijskru...@gmail.com> wrote:
> What's the plan for #include, which is implicated here?

The plan in bug 1150859 is to just move everything to separate JSMs
(as a bunch of things already are).

Cheers,

Dirkjan

Gijs Kruitbosch

Apr 3, 2015, 11:29:22 AM
to bi...@mozilla.com, Mike de Boer, firefox-dev-owner
On 03/04/2015 16:17, Bill McCloskey wrote:
> On Fri, Apr 3, 2015 at 8:02 AM, Gijs Kruitbosch <gijskru...@gmail.com> wrote:
>> AIUI, jsms are our next-best alternative, but using a lot of them has serious performance downsides (and will also require a bunch of refactoring for the included scripts).
>
> We already import ~50 JSMs into browser.js. Converting 21 #include files to JSMs will be annoying, but I doubt it will have any significant effect on performance. We'll use a little more memory, but the code will be loaded more lazily in most cases. It's worth measuring though.
There are bugs on file to save even using 1 extra JSM (https://bugzilla.mozilla.org/show_bug.cgi?id=986503), so 21 extra JSMs do not sound nice.


>> Perhaps we can introduce a valid JS shorthand to include extra files by simply loading them into the global scope with the scriptloader, and then preprocess them for dist and/or pgo builds?
>
> Well, the point of removing the preprocessor is to use only standard JavaScript, so I don't think we should do that.
I think you misunderstood my suggestion. I'm saying we should write a function (e.g. in utilityOverlay.js) that looks something like:

function import_script(url) {
  var sl = Services.scriptloader;
  sl.loadSubScript(url, this); // note the capital S in loadSubScript
} // not actually tried this, but the idea should be clear!

and then use import_script("chrome://browser/content/browser-whatever.js");

Using a separate function makes it easy to apply build-time grep magic to inline the file in optimized builds.

As an extra (opaque) function call is valid JS, it wouldn't break preprocessors and avoids the refactoring and memory overhead of JSMs. We could always switch to using JSMs as/where appropriate.


 
>> For CSS (which has this problem but worse because no scriptability) we can presumably do this with the pre-existing @import, and change %defines to use CSS variables?
>
> That would be cool. I'm also worried about .xul files. It would be great to get rid of the preprocessor in those, but I don't know of any replacement for #ifdef in XUL.
XUL overlays would be one. Don't know what the runtime/memory overheads of that would be.

JS adjustments and hidden="true" on the markup you usually don't want would be another option. Could probably decide on a case-by-case basis.

~ Gijs

Dirkjan Ochtman

Apr 3, 2015, 11:35:50 AM
to Gijs Kruitbosch, Mike de Boer, bi...@mozilla.com, firefox-dev-owner
On Fri, Apr 3, 2015 at 5:29 PM, Gijs Kruitbosch
<gijskru...@gmail.com> wrote:
>> AIUI, jsms are our next-best alternative, but using a lot of them has
>> serious performance downsides (and will also require a bunch of refactoring
>> for the included scripts).
>
> We already import ~50 JSMs into browser.js. Converting 21 #include files to
> JSMs will be annoying, but I doubt it will have any significant effect on
> performance. We'll use a little more memory, but the code will be loaded
> more lazily in most cases. It's worth measuring though.
>
> There are bugs on file to save even using 1 extra JSM (
> https://bugzilla.mozilla.org/show_bug.cgi?id=986503 ) so 21 extra JSMs does
> not sound nice.

That bug is a year old. Is the performance hit/memory consumption
still unchanged?

Cheers,

Dirkjan

Gavin Sharp

Apr 3, 2015, 11:52:06 AM
to Gijs Kruitbosch, Mike de Boer, firefox-dev-owner
On Fri, Apr 3, 2015 at 11:02 AM, Gijs Kruitbosch
<gijskru...@gmail.com> wrote:
> AIUI, jsms are our next-best alternative, but using a lot of them has
> serious performance downsides (and will also require a bunch of refactoring
> for the included scripts).
>
> Perhaps we can introduce a valid JS shorthand to include extra files by
> simply loading them into the global scope with the scriptloader, and then
> preprocess them for dist and/or pgo builds?

Why can't we just replace the <script> that loads browser.js in
global-scripts.inc (and web-panels.xul) with N <script>s that load
multiple files? There might be some ordering/dependencies to refactor
but those seem like they should be surmountable.

Gavin

Gijs Kruitbosch

Apr 3, 2015, 11:56:48 AM
to Gavin Sharp, Mike de Boer, firefox-dev-owner
On 03/04/2015 16:52, Gavin Sharp wrote:
> On Fri, Apr 3, 2015 at 11:02 AM, Gijs Kruitbosch
> <gijskru...@gmail.com> wrote:
>> AIUI, jsms are our next-best alternative, but using a lot of them has
>> serious performance downsides (and will also require a bunch of refactoring
>> for the included scripts).
>>
>> Perhaps we can introduce a valid JS shorthand to include extra files by
>> simply loading them into the global scope with the scriptloader, and then
>> preprocess them for dist and/or pgo builds?
> Why can't we just replace the <script> that loads browser.js in
> global-scripts.inc (and web-panels.xul) with N <script>s that load
> multiple files? There might be some ordering/dependencies to refactor
> but those seem like they should be surmountable.
>
> Gavin
I would assume that, too, would impact (IO) performance on startup / new
window creation and such, in much the same way that it is impacted for
web pages, which unify scripts for similar reasons (though not quite the
same, because I assume we don't apply HTTP pipelining restrictions to
chrome:// URLs?). I don't know how hard it'd be to fix that with a
preprocessing step for dist builds.

I don't know how bad the ordering dependencies would be, though I would
hope that they would be very minimal.

~ Gijs

Gijs Kruitbosch

Apr 3, 2015, 12:00:40 PM
to Dirkjan Ochtman, Nicholas Nethercote, Mike de Boer, bi...@mozilla.com, Bobby Holley, firefox-dev-owner
On 03/04/2015 16:35, Dirkjan Ochtman wrote:
> On Fri, Apr 3, 2015 at 5:29 PM, Gijs Kruitbosch
> <gijskru...@gmail.com> wrote:
>>> AIUI, jsms are our next-best alternative, but using a lot of them has
>>> serious performance downsides (and will also require a bunch of refactoring
>>> for the included scripts).
>> We already import ~50 JSMs into browser.js. Converting 21 #include files to
>> JSMs will be annoying, but I doubt it will have any significant effect on
>> performance. We'll use a little more memory, but the code will be loaded
>> more lazily in most cases. It's worth measuring though.
>>
>> There are bugs on file to save even using 1 extra JSM (
>> https://bugzilla.mozilla.org/show_bug.cgi?id=986503 ) so 21 extra JSMs does
>> not sound nice.
> That bug is a year old. Is the performance hit/memory consumption
> still unchanged?

I don't know, but I would be surprised if it changed a lot. Bobby Holley
or Nicholas Nethercote (who's on PTO, IIRC) might know more?

I'll also (again) point out that the refactoring required for moving
per-window code into a JSM will be non-trivial. I'm not sure why we
would choose it over either Gavin's suggestion or mine.

~ Gijs

Bobby Holley

Apr 3, 2015, 12:12:24 PM
to Gijs Kruitbosch, Dirkjan Ochtman, Nicholas Nethercote, Mike de Boer, William McCloskey, firefox-dev-owner, Bobby Holley
The bug to watch is bug 989373. We currently merge all JSMs into a single compartment on b2g (something I'd love to change), and turning that behavior off gives us a hit of about 11MiB. We'd still like to figure out how to fix that, but it's a ways off.

Bill McCloskey

Apr 3, 2015, 12:15:56 PM
to Dirkjan Ochtman, Mike de Boer, firefox-dev-owner, Gijs Kruitbosch
On Fri, Apr 3, 2015 at 8:35 AM, Dirkjan Ochtman <dir...@ochtman.nl> wrote:
>> There are bugs on file to save even using 1 extra JSM (
>> https://bugzilla.mozilla.org/show_bug.cgi?id=986503 ) so 21 extra JSMs does
>> not sound nice.
>
> That bug is a year old. Is the performance hit/memory consumption
> still unchanged?

Bug 989373 comment 21 [1] has the most current data as far as I know. When Nick tried to remove the compartment merging hack on b2g, memory went up by ~11MB. Number of compartments went up by ~150. So each compartment costs roughly 73KB overall. I have about 450 compartments in FF right now, so the total compartment overhead is ~33MB.
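Spelled out, the arithmetic behind those figures is:

```javascript
// ~11MB spread over ~150 extra compartments, then scaled to the
// ~450 compartments in a running Firefox session.
const perCompartmentKB = (11 * 1000) / 150;      // ≈ 73 KB each
const totalMB = (450 * perCompartmentKB) / 1000; // ≈ 33 MB overall
console.log(Math.round(perCompartmentKB), Math.round(totalMB)); // → 73 33
```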

In my opinion, if we can load compartments more lazily or merge compartments that have some affinity, we should do that. But if there's code that basically looks like a module, then we should use a JSM.

-Bill

Mike de Boer

Apr 3, 2015, 12:16:17 PM
to Gavin Sharp, firefox-dev-owner, Gijs Kruitbosch
I’d like to avoid creating more compartments (looks like we’re talking about 20-ish) on browser startup, especially since it’s been a target to further reduce this number. The more we can lazy load, the better.

Mike.

Gijs Kruitbosch

Apr 3, 2015, 12:17:58 PM
to Mike de Boer, Gavin Sharp, firefox-dev-owner
To be clear, <script> tags don't create compartments, JSMs do. So
Gavin's suggestion wouldn't create more compartments.

~ Gijs

Robert Kaiser

Apr 3, 2015, 2:49:21 PM
to Firefox Dev
Gijs Kruitbosch wrote:
> For CSS (which has this problem but worse because no scriptability) we
> can presumably do this with the pre-existing @import , and change
> %defines to use CSS variables?

If we could get rid of all preprocessing in CSS, that would be extremely
nice for 3rd-party theme work (Like my LCARStrek).

We probably would need CSS variables and
https://bugzilla.mozilla.org/show_bug.cgi?id=1060941 in addition to
@import - but it would be fantastic from my POV.

KaiRo