W3C Proposed Recommendation: HTML5


L. David Baron

Sep 19, 2014, 8:23:36 PM
to dev-pl...@lists.mozilla.org
W3C recently published the following proposed recommendation (the
stage before W3C's final stage, Recommendation):

http://www.w3.org/TR/html5/
HTML5

There's a call for review to W3C member companies (of which Mozilla
is one) open until October 14.

This specification is largely based on a snapshot of the WHATWG's
HTML specification, written by Ian Hickson.

If there are comments you think Mozilla should send as part of the
review, or if you think Mozilla should voice support or opposition
to the specification, please say so in this thread. (I'd note,
however, that there have been many previous opportunities to make
comments, so it's somewhat bad form to bring up fundamental issues
for the first time at this stage.)

One of the open issues being raised in this review is the status of
the spec's normative reference to the URL specification. The
specification currently references http://www.w3.org/TR/url/ ; it
might be possible for us to suggest that it instead reference either
http://url.spec.whatwg.org/ or
https://whatwg.org/specs/url/2014-07-30/ (although if we did so, it
would probably be best for somebody to raise the issue elsewhere in
addition to it just being part of our review).

I expect the finalization of this specification at W3C to be a big
event, compared to other specifications.

-David

--
𝄞 L. David Baron http://dbaron.org/ 𝄂
𝄢 Mozilla https://www.mozilla.org/ 𝄂
Before I built a wall I'd ask to know
What I was walling in or walling out,
And to whom I was like to give offense.
- Robert Frost, Mending Wall (1914)

Boris Zbarsky

Sep 19, 2014, 10:46:21 PM
On 9/19/14, 8:23 PM, L. David Baron wrote:
> W3C recently published the following proposed recommendation (the
> stage before W3C's final stage, Recommendation):

The biggest issue I have with this is exiting CR without anything
resembling a comprehensive enough test suite to ensure anything like
interop on some of the core hard pieces (they left out the navigation
algorithm, smart, but still have the bogus WindowProxy spec in this
supposed PR, for example).

My second biggest issue is that I don't have a concrete proposal for
addressing the first issue.

Maybe it all doesn't matter too much as long as implementors keep
reading the whatwg spec instead.

-Boris

Karl Dubost

Sep 20, 2014, 5:03:37 AM
to Boris Zbarsky, L. David Baron, dev-pl...@lists.mozilla.org
Boris, David,

On Sep 20, 2014, at 11:46, Boris Zbarsky <bzba...@mit.edu> wrote:
> The biggest issue I have with this is exiting CR without anything resembling a comprehensive enough test suite

* What is a comprehensive enough test suite?
* How far is the current test suite from the comprehensive test suite you would have wished for?
* Does Mozilla have a comprehensive test suite on the same set of features?

> to ensure anything like interop on some of the core hard pieces (they left out the navigation algorithm, smart, but still have the bogus WindowProxy spec in this supposed PR, for example).

s/they/we/
The first rule of a group in which we (Mozilla) participate is to include yourself in the discussion. It helps a lot to change the attitude with regards to the issues.


> My second biggest issue is that I don't have a concrete proposal for addressing the first issue.

The test suite? My biggest issue with the HTML5 spec is that it is too big to be meaningfully implementable and/or testable. Having a comprehensive test suite for something that big is close to insane. It is not necessarily solvable for this round, but it could teach us how to develop future Web features with more testing upfront and a more modular approach. Basically, we can learn from our mistakes. Not everything is lost ^_^


> Maybe it all doesn't matter too much as long as implementors keep reading the whatwg spec instead.

It's here where I have a disconnect with the first comment. Whether it's the whatwg spec or the w3c spec, if we deem that a comprehensive test suite is important, then there should be one, whatever the stamp on the text. If we think it's not that important, it doesn't matter whether it's w3c or not.



--
Karl Dubost, Mozilla
http://www.la-grange.net/karl/moz

Anne van Kesteren

Sep 20, 2014, 5:20:52 AM
to Karl Dubost, Boris Zbarsky, L. David Baron, dev-pl...@lists.mozilla.org
On Sat, Sep 20, 2014 at 11:03 AM, Karl Dubost <kdu...@mozilla.com> wrote:
> My biggest issue with HTML5 spec is that it is too big to be meaningfully implementable and/or testable.

Yeah the W3C crowd keeps saying that, yet hasn't invested any
meaningful effort into creating modules.


> It's here where I have a disconnect with the first comment. Whether it's the whatwg spec or the w3c spec, if we deem that a comprehensive test suite is important, then there should be one, whatever the stamp on the text. If we think it's not that important, it doesn't matter whether it's w3c or not.

The problem is that the W3C publishes something that is 500 commits
behind what they copied from and claims it's interoperable while the
test coverage is mediocre. That may be fine for PP purposes and
getting your logo in the press, but if you want convergence across
implementations you need a specification that is developed in tandem
with implementations.


--
https://annevankesteren.nl/

Karl Dubost

Sep 20, 2014, 5:41:11 AM
to Anne van Kesteren, Boris Zbarsky, L. David Baron, dev-pl...@lists.mozilla.org
Anne,

On Sep 20, 2014, at 18:20, Anne van Kesteren <ann...@annevk.nl> wrote:
> Yeah the W3C crowd keeps saying that

Here is the W3C crowd. We (Mozilla) have a conflict ;)
http://www.w3.org/2000/09/dbwg/details?group=40318&public=1&order=org#_MozillaFoundation

This apart, I would love to have this discussion during the work week in December. I think F2F removes a lot of the imaginary tensions conveyed by emails. :) You know me, I know you. I don't know Boris that much, though, apart from online. Unfortunately. So discussions in December, please.


> The problem is that the W3C publishes something that is 500 commits
> behind what they copied from and claims it's interoperable while the
> test coverage is mediocre.

Is the whatwg spec interoperable? Will it ever be?
So I guess the answer will be "no". Which makes for an interesting issue, and it's why the discussion currently happening about the future of HTML is cool.

Let's see http://dev.w3.org/html5/decision-policy/public-permissive-exit-criteria.html

> Interoperable
>
> Qualitatively interoperable at a judgment level, not necessarily for every spec assertion. A test suite may be used as guidance for the qualitative decision.

Does it meet this criterion? If not, on which sections does it fall short?

Also, there is a link about the features at risk:
http://www.w3.org/html/wg/wiki/HTML5.0AtRiskFeatures
Should they be removed?


That doesn't help David Baron in his job as an AC rep though.

> If there are comments you think Mozilla should send as part of the
> review, or if you think Mozilla should voice support or opposition
> to the specification, please say so in this thread. (I'd note,
> however, that there have been many previous opportunities to make
> comments, so it's somewhat bad form to bring up fundamental issues
> for the first time at this stage.)


So Boris said incomplete test suite. That's one comment.

My take is that we should support its publication, along with a record of the parts we think didn't work, what we would love to see for the next generation of HTML, and how it should be developed with us participating.

Kyle Huey

Sep 20, 2014, 11:26:37 AM
to Karl Dubost, L. David Baron, dev-pl...@lists.mozilla.org, Boris Zbarsky
On Sat, Sep 20, 2014 at 2:41 AM, Karl Dubost <kdu...@mozilla.com> wrote:
> Anne,
>
> On Sep 20, 2014, at 18:20, Anne van Kesteren <ann...@annevk.nl> wrote:
>> Yeah the W3C crowd keeps saying that
>
> Here is the W3C crowd. We (Mozilla) have a conflict ;)
> http://www.w3.org/2000/09/dbwg/details?group=40318&public=1&order=org#_MozillaFoundation

I categorically reject this idea that all W3C and/or WG members have
equal responsibility for any action the W3C and/or WG takes.

- Kyle

Boris Zbarsky

Sep 20, 2014, 2:23:12 PM
On 9/20/14, 5:03 AM, Karl Dubost wrote:
>> The biggest issue I have with this is exiting CR without anything resembling a comprehensive enough test suite
>
> * What is a comprehensive enough test suite?

Ideally, one that has a test for every normative requirement in the
specification. This means at least one test per sentence of normative
text, basically.

In practice, this is a very high bar, because that includes testing
various interactions between features, which can get pretty hairy.

A good start, though, would be direct testing of at least all the
obvious conformance requirements explicitly listed in the specification,
if not of their non-obvious interactions.

> * How far is the current test suite from the comprehensive test suite you would have wished for?

I haven't looked into this in detail, honestly.

Given that I know there are parts of the specification that don't match
browsers and that no one has brought up, clearly "somewhat"....

> * Does Mozilla have a comprehensive test suite on the same set of features?

Probably not.

>> to ensure anything like interop on some of the core hard pieces (they left out the navigation algorithm, smart, but still have the bogus WindowProxy spec in this supposed PR, for example).
>
> s/they/we/
> The first rule of a group in which we (Mozilla) participate is to include yourself in the discussion. It helps a lot to change the attitude with regards to the issues.

I don't think Mozilla meaningfully participates in this working group.
We've tried, but the environment was hostile, and our participation
seemed generally unwelcome, so we gave up for all but process purposes.

If a group explicitly chooses to exclude me from the discussion, I feel
no particular need to consider myself part of that group, so I am
sticking by my "they". Nor do I feel any particular responsibility for
their actions, for what it's worth.

>> My second biggest issue is that I don't have a concrete proposal for addressing the first issue.
>
> The test suite?

Yes. I have no concrete proposal for scrounging up the resources to
evaluate which aspects of the test suite are lacking, much less for
writing tests to remedy that lack.

> My biggest issue with the HTML5 spec is that it is too big to be meaningfully implementable and/or testable.

We have a slight problem, don't we? It's not like the plan is to lose
any of these features, and browsers _are_ expected to implement them in
non-buggy ways.

> It is not necessarily solvable for this round, but it could teach us how to develop future Web features with more testing upfront and a more modular approach.

A more modular approach doesn't necessarily help, since you have to test
interactions between the modules (though it sure makes it easier to
ignore this need). In the end, whatever amount of interacting stuff you
have will require testing en masse.

Now maybe a modular approach will mean that there won't be interactions.
Or maybe it'll mean the interactions are less obvious and easier to
overlook and get wrong. We'll see.

100% agreed on more testing up front.

> Basically we can learn from our mistakes. Not everything is lost ^_^

Again, agreed.

> It's here where I have a disconnect with the first comment. Whether it's the whatwg spec or the w3c spec, if we deem that a comprehensive test suite is important, then there should be one, whatever the stamp on the text.

Yes, agreed. I should have been clearer.

The important part to me about implementations is that implementations
shouldn't follow the known-bogus parts of the HTML5 REC once said
bogosity is fixed in the WHATWG spec and HTML5.1 (with the former more
likely to happen sooner).

-Boris

Boris Zbarsky

Sep 20, 2014, 2:25:38 PM
On 9/20/14, 5:41 AM, Karl Dubost wrote:
> Is the whatwg spec interoperable?

No.

> Will it ever be?

That's the goal. Whether we manage to get there, we'll see.

> So Boris said incomplete test suite. That's one comment.

Note that I didn't say we should bring the comment back to the AC, since
again I have nothing actionable to say here...

-Boris

Karl Dubost

Sep 20, 2014, 6:29:01 PM
to Boris Zbarsky, dev-pl...@lists.mozilla.org

On Sep 21, 2014, at 03:23, Boris Zbarsky <bzba...@mit.edu> wrote:
> The important part to me about implementations is that implementations shouldn't follow the known-bogus parts of the HTML5 REC once said bogosity is fixed in the WHATWG spec and HTML5.1 (with the former more likely to happen sooner).

Maybe it's actionable feedback:
propose a "notes for implementers" section saying something along these lines
(text can be improved, better suggestions welcome, etc.):


"This published recommendation has switched
to a non maintenance mode. It may contain
mistakes or things may have changed since
the publication. Please make sure to check
the most up to date document BLAH [with
link to the whatwg spec] before implementing
any features."


Would that partly solve your concerns?

Boris Zbarsky

Sep 20, 2014, 6:53:10 PM
On 9/20/14, 6:29 PM, Karl Dubost wrote:
> "This published recommendation has switched
> to a non maintenance mode. It may contain
> mistakes or things may have changed since
> the publication. Please make sure to check
> the most up to date document BLAH [with
> link to the whatwg spec] before implementing
> any features."
>
> Would that partly solve your concerns?

That would be fairly useful, but I personally am not willing to spend
effort, much less political capital, fighting to get something like that
added.

-Boris

James Graham

Sep 21, 2014, 9:00:08 AM
to dev-pl...@lists.mozilla.org
On 20/09/14 03:46, Boris Zbarsky wrote:
> On 9/19/14, 8:23 PM, L. David Baron wrote:
>> W3C recently published the following proposed recommendation (the
>> stage before W3C's final stage, Recommendation):
>
> The biggest issue I have with this is exiting CR without anything
> resembling a comprehensive enough test suite to ensure anything like
> interop on some of the core hard pieces (they left out the navigation
> algorithm, smart, but still have the bogus WindowProxy spec in this
> supposed PR, for example).

Obviously I agree that good testing of implementations is key to
interoperability. I also agree that we should encourage vendors to
create and run shared tests for the web technologies that we implement.
I am substantially less convinced that tying these tests to the spec
lifecycle makes sense. The W3C Process encourages people to think of
interoperability as a binary condition; either implementations are
interoperable or they are not. This leads to ideas like the CSS WG's
rule that two implementations must pass every test. On the face of it
this may appear eminently sensible. However the incentives for testing
under such a regime are not well aligned with the goal of finding bugs
in implementations; in essence people are encouraged to write as many
tests as are needed to satisfy the letter of the rules but to make them
all as shallow and unlikely to find bugs as possible to avoid causing
unwanted holdups in the specification process. I would much prefer to
have a testsuite written by people making a genuine effort to find
errors in implementations even if the result is that no one passes every
single test.

Of course it's possible that going through the spec and making sure
there is at least one test for every conformance criterion will make the
testsuite good even if people are determined to produce a poor
testsuite. However I strongly doubt this to be the case. Indeed I'm only
aware of a handful of examples of someone setting out to write a test
for every conformance criterion in a specification and ending up with a
useful result. The canvas / 2d context tests are one of those cases, and
that benefits from being a rather self-contained set of operations
without much interaction with the rest of the platform. Even if someone
took the same approach to, say, document loading tests, it is unlikely
that the result would be as good because the features are much more
likely to interact in unpleasant ways so that, for example, the load
event and document.open might work independently but using the two in
combination might break in an unexpected way.

I'm also dubious that requiring a test for every assertion in the spec
is a standard that we are prepared to hold ourselves to when we ship
code. Since shipping code is, in the grand scheme of things,
substantially more important than shipping a spec (the former affecting
all our users and the latter only lawyers), it doesn't seem at all
reasonable to demand that the specification be held to a higher
standard.

> My second biggest issue is that I don't have a concrete proposal for
> addressing this the first issue.

My concrete suggestion is that we, as an organisation, work to achieve
parity between the tests we require to ship our own code and those we
release in ways that can be used to support a spec and, more
importantly, those that can be used to verify that different
implementations match up. In practice this means not writing tests in
Mozilla-specific formats, and making sure that we have a way to upstream
tests that we've written. Making this process as low-overhead as
possible is something that I'm working on.
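
In practice, "shared tests in a non-Mozilla-specific format" means the
testharness.js style used by web-platform-tests. A minimal sketch of
such a test follows, with tiny stand-ins for the harness functions so
it runs outside a browser; in an actual test file these come from
testharness.js itself:

```javascript
// Minimal stand-ins for the testharness.js API, defined here only so
// this sketch is self-contained outside a browser.
function assert_equals(actual, expected, msg) {
  if (actual !== expected) throw new Error(msg + ": got " + actual);
}
function test(fn, name) {
  try { fn(); console.log("PASS " + name); }
  catch (e) { console.log("FAIL " + name + " (" + e.message + ")"); }
}

// The test body itself is written against the spec, not against any
// one engine's behaviour, so every vendor can run it unchanged.
test(() => {
  const u = new URL("http://example.org:80/");
  assert_equals(u.port, "", "default port serialises as empty");
}, "URL port getter omits the scheme's default port");
```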

Obviously this isn't going to make a short-term difference for old
features like WindowProxy. I'm not sure what to suggest for those cases
given that we have de-facto shown an unwillingness to invest even
relatively small amounts of effort into reviewing existing tests that
could be part of the HTML testsuite for that feature [1].

In the longer term, one might hope that bugfixes will produce new
testcases that could be upstreamed, and Servo might need a proper
testsuite to achieve interoperability. Having said that, Servo has so
far not produced a significant number of tests, which has been a little
surprising as they have been implementing some of the core pieces of the
platform which are indeed under-tested. I suspect this is because the
skills, interests and goals of the team are around producing code rather
than tests. For people making small contributions it would also be
rather off-putting to be told "no you can't land this fix that makes
Wikipedia look better without a comprehensive testsuite for the relevant
feature". However if we as an organisation really care about testing
core platform features which already have an implementation in gecko,
one way to achieve that would be to give someone working on Servo the
explicit job of creating testsuites for the big-ticket features they
implement as they implement them.

> Maybe it all doesn't matter too much as long as implementors keep
> reading the whatwg spec instead.

In terms of HTML at the W3C it's pretty clear that they've dropped the
ball, and haven't even realised it yet. There was a thread started five
days ago about the future of HTML after this release and so far it has
gained exactly no responses from implementors. The WG turned into an
unproductive, bureaucratic nightmare and succeeded in driving out their
core constituency leaving the remaining participants to debate topics of
little consequence.

If we were to complain about the lack of testing for HTML, we would be
told in no uncertain terms that we (or at least the WG) had agreed to
the "Plan 2014" which explicitly chose to forego testing in certain
areas, and specification accuracy, in favour of shipping at a defined
time. This idea of shipping on a date-based schedule isn't actually
obviously bad, as long as you set the expectations correctly, which is
something the W3C will get wrong. It would indeed be nice if W3C would
embrace this fully and move to a WHATWG-style model with no errata but a
spec kept continually up-to-date in the areas where there is a need for
change, and absolute rigidity in the areas where reality dictates that
change is impossible. However moving them there is a longer term goal
that seems to be met with extreme resistance from people who haven't
fully grasped how shipping a web browser works.

So yes, I think that, at the moment, "everybody knows" looking at HTML
under w3.org/TR/ is a mistake. Even if they don't it's probably not
*that* important because HTML isn't the most interesting spec right now;
in terms of new features most of the action is happening in the somewhat
less dysfunctional WebApps group, and elsewhere. I guess there is some
chance that when parts of the fundamental model are moved to HTML, e.g.
adding things required for rAF or Web Components, people won't realise
where those bits are defined. However that's a rather fundamental problem
with people looking at any kind of dated snapshot and I doubt we will
argue against any further publication of snapshots.

Boris Zbarsky

Sep 21, 2014, 5:19:18 PM
On 9/21/14, 9:00 AM, James Graham wrote:
> I am substantially less convinced that tying these tests to the spec
> lifecycle makes sense.

Agreed. The only reason it's an issue for me is the lack of
errata-issuance by the W3C and hence the tendency to attempt to enshrine
obviously-wrong things in specifications forever.

> The W3C Process encourages people to think of
> interoperability as a binary condition; either implementations are
> interoperable or they are not.

More interestingly, either the specification is implementable or not.
Again, because once the REC is published everyone goes home and never
touches that document again.

The two implementations condition was there to make sure you didn't end
up with specs that basically reflected one UA's internals and that no
one else could implement....

> However the incentives for testing
> under such a regime are not well aligned

Yes, absolutely agreed.

> I would much prefer to
> have a testsuite written by people making a genuine effort to find
> errors in implementations even if the result is that no one passes every
> single test.

This would be much more helpful, absolutely.

Of course then you need to ask yourself whether the test failures are
just bugs in implementations or fundamental problems with the spec. A
question spec writers are often loath to ask. :(

> I'm also dubious that requiring a test for every assertion in the spec
> is a standard that we are prepared to hold ourselves to when we ship
> code.

Indeed.

> Since shipping code is, in the grand scheme of things,
> substantially more important than shipping a spec (the former affecting
> all our users and the latter only lawyers)

I wish specs only affected lawyers, especially given how they are often
created/maintained.

Sadly, they do affect web developers and browser developers, not to
mention other specs.

> it doesn't seem at all
> reasonable to demand that the specification is held to a higher standard.

Note that I made no such demand, precisely because I think it's unrealistic.

> My concrete suggestion is that we, as an organisation, work to achieve
> parity between the tests we require to ship our own code and those we
> release in ways that can be used to support a spec and, more
> importantly, those that can be used to verify that different
> implementations match up.

This is a good idea, but not terribly relevant to what dbaron should say
in his AC rep capacity, right?

> Making this process as low-overhead as
> possible is something that I'm working on.

And it's much appreciated!

Note that actually sanely testing something like navigation in
non-browser-specific ways is ... hard. Basic things like "open a
cross-origin window and wait for it to load" aren't really possible. :(

> Obviously this isn't going to make a short-term difference for old
> features like WindowProxy. I'm not sure what to suggest for those cases
> given that we have de-facto shown an unwillingness to invest even
> relatively small amounts of effort into reviewing existing tests that
> could be part of the HTML testsuite for that feature [1].

Missing reference?

> In the longer term, one might hope that bugfixes will produce new
> testcases that could be upstreamed, and Servo might need a proper
> testsuite to achieve interoperability. Having said that, Servo has so
> far not produced a significant number of tests, which has been a little
> surprising as they have been implementing some of the core pieces of the
> platform which are indeed under-tested. I suspect this is because the
> skills, interests and goals of the team are around producing code rather
> than tests.

Yep. And because they really haven't been aiming for full web compat
yet, I'd hope, but rather prototyping out some of the parallelism ideas.

> The WG turned into an
> unproductive, bureaucratic nightmare and succeeded in driving out their
> core constituency leaving the remaining participants to debate topics of
> little consequence.

Yup.

> If we were to complain about the lack of testing for HTML

Again, note that I don't think we have any realistic way to complain
about it, for precisely the reasons you list.

> This idea of shipping on a date-based schedule isn't actually
> obviously bad, as long as you set the expectations correctly, which is
> something the W3C will get wrong.

I hope you're wrong (e.g. that the W3C will actually continue fairly
frequent date-based updates to HTML), but I fear you're right.

> So yes, I think that, at the moment, "everybody knows" looking at HTML
> under w3.org/TR/ is a mistake. Even if they don't it's probably not
> *that* important because HTML isn't the most interesting spec right now;

Unless you're Servo, yeah.

-Boris

Ms2ger

Sep 22, 2014, 3:13:20 AM

Hi David,

On 09/20/2014 02:23 AM, L. David Baron wrote:
> W3C recently published the following proposed recommendation (the
> stage before W3C's final stage, Recommendation):
>
> http://www.w3.org/TR/html5/ HTML5
>
> There's a call for review to W3C member companies (of which
> Mozilla is one) open until October 14.
>

Many things could be said (and have been said in this thread) about
the HTMLWG and whether this work is useful. However, given that I
expect any objections from our side to be ignored at best and lead to
you being pressured to retract them at worst, I believe the answers
that make the most sense are

* Regarding the HTML5 specification, my organization: abstains from
this review.
* My organization: does not expect to produce or use products or
content addressed by this specification

HTH
Ms2ger

Robin Berjon

Sep 22, 2014, 7:41:01 AM
to L. David Baron, dev-pl...@lists.mozilla.org
Hi David,

On 20/09/2014 02:23, L. David Baron wrote:
> One of the open issues being raised in this review is the status of
> the spec's normative reference to the URL specification. The
> specification currently references http://www.w3.org/TR/url/ ; it
> might be possible for us to suggest that it instead reference either
> http://url.spec.whatwg.org/ or
> https://whatwg.org/specs/url/2014-07-30/ (although if we did so, it
> would probably be best for somebody to raise the issue elsewhere in
> addition to it just being part of our review).

I think the world would be a better place if we could pacify the
WHATWG/W3C relationship. Of course, I realise that this can be a
relatively ironic statement to make in the context of a vote on
publishing W3C HTML, but I am making it nevertheless because I believe
that baby steps can help get us there.

I was hoping that we could simply reference WHATWG URL as a (small)
token of good faith and normalisation, adding a small cobblestone to
pave the way to cooperation. Sadly, this has proven contentious. But as
with all polarised discussions, it is hard to tell if there is genuine
opposition or just a small group of angry agitators.

Therefore, if you believe that making such a reference would be a good
step forward, I would encourage you to make a comment in that direction
(note that we can reference both the latest version and the snapshot).
You don't need to raise the issue elsewhere, it has already been raised,
burnt, buried, and zombified a few times over. Mentioning it in your
review (as several others have done already) would already carry weight
(and wouldn't cost you the trouble that entering the fray otherwise might).

Thanks!

--
Robin Berjon - http://berjon.com/ - @robinberjon

Henri Sivonen

Sep 22, 2014, 7:43:48 AM
to dev-platform
On Sun, Sep 21, 2014 at 4:00 PM, James Graham <ja...@hoppipolla.co.uk> wrote:
> On 20/09/14 03:46, Boris Zbarsky wrote:
>> On 9/19/14, 8:23 PM, L. David Baron wrote:
>>> W3C recently published the following proposed recommendation (the
>>> stage before W3C's final stage, Recommendation):
>>
>> The biggest issue I have with this is exiting CR without anything
>> resembling a comprehensive enough test suite to ensure anything like
>> interop on some of the core hard pieces (they left out the navigation
>> algorithm, smart, but still have the bogus WindowProxy spec in this
>> supposed PR, for example).
>
> Obviously I agree that good testing of implementations is key to
> interoperability.

Talking about testing in the context of the snapshot reads like
pretending that we should treat the snapshot as something other than a
lawyer point of reference for the purpose of the PP. Of course, if
something in the spec is wrong and we implement something else, maybe
the PP isn't that strong on that point, but it still seems worthwhile
to get PP commitment for the stuff that happens to be right or close
enough to right for lawyer purposes.

> I also agree that we should encourage vendors to
> create and run shared tests for the web technologies that we implement.
> I am substantially less convinced that tying these tests to the spec
> lifecycle makes sense. The W3C Process encourages people to think of
> interoperability as a binary condition; either implementations are
> interoperable or they are not. This leads to ideas like the CSS WG's
> rule that two implementations must pass every test. On the face of it
> this may appear eminently sensible. However the incentives for testing
> under such a regime are not well aligned with the goal of finding bugs
> in implementations; in essence people are encouraged to write as many
> tests as are needed to satisfy the letter of the rules but to make them
> all as shallow and unlikely to find bugs as possible to avoid causing
> unwanted holdups in the specification process. I would much prefer to
> have a testsuite written by people making a genuine effort to find
> errors in implementations even if the result is that no one passes every
> single test.

These effects of tying tests to snapshots are clearly harmful. We
should track EDs for testing and implementation instead of targeting
known-stale snapshots.

> My concrete suggestion is that we, as an organisation, work to achieve
> parity between the tests we require to ship our own code and those we
> release in ways that can be used to support a spec and, more
> importantly, those that can be used to verify that different
> implementations match up. In practice this means not writing tests in
> Mozilla-specific formats, and making sure that we have a way to upstream
> tests that we've written. Making this process as low-overhead as
> possible is something that I'm working on.

Yes. Furthermore, the test damage from snapshots lessens if the people
who do the work to write the tests contribute tests written to EDs
instead of snapshots. Our tip of the tree of code should follow the
tip of the tree for specs and the tests we contribute should make
sense for our tip of the tree.

> However if we as an organisation really care about testing
> core platform features which already have an implementation in gecko,
> one way to achieve that would be to give someone working on Servo the
> explicit job of creating testsuites for the big-ticket features they
> implement as they implement them.

I agree.

>> Maybe it all doesn't matter too much as long as implementors keep
>> reading the whatwg spec instead.
>
> In terms of HTML at the W3C it's pretty clear that they've dropped the
> ball, and haven't even realised it yet. There was a thread started five
> days ago about the future of HTML after this release and so far it has
> gained exactly no responses from implementors. The WG turned into an
> unproductive, bureaucratic nightmare and succeeded in driving out their
> core constituency

Sadly, yes.

> leaving the remaining participants to debate topics of
> little consequence.

FWIW, this bit is being spun into Twitter propaganda about Mozilla not
caring about accessibility, so it might be worthwhile to be careful
with the phrasing.

For the people tweeting:

It's not like a11y was the only thing left. Let's not forget Polyglot.
See https://twitter.com/mattur/status/510818045749899264 and
https://twitter.com/mattur/status/461240899503407105 if you still
think Polyglot is a worthy WG deliverable.

And to the extent that the remaining debates fit under the topic of
a11y--and in that light what James said may seem offensive--it doesn't
do a11y any favors to hold up the permathreads about longdesc as a symbol
of pro-a11y activity. (Note that e.g.
http://lists.w3.org/Archives/Public/public-html-admin/2014Aug/0028.html
is not an anti-a11y or not-caring-about-a11y statement!)

But both Polyglot and the longdesc stuff are in extension specs, so
while they are relevant to the WG bureaucracy and unproductivity, they
are less relevant to the HTML5 snapshot at hand. Even if one believes
all of http://www.w3.org/wiki/HTML/W3C-WHATWG-Differences to be actual
a11y improvements over the WHATWG version that we should implement and
it would be wrong for James' characterization to extend to them,
making those improvements through a framework that drove browser
implementors away from public-html isn't really a success worth
bragging about (or, from the Mozilla perspective, on the whole a good
tradeoff). Also, if our a11y team is implementing the W3C spec on
those points, it would be more productive to say so here than to fuel
the flames on Twitter.

> This idea of shipping on a date-based schedule isn't actually
> obviously bad, as long as you set the expectations correctly, which is
> something the W3C will get wrong.

I think our comments to the W3C or otherwise should not support the
meme that snapshots are something that browser developers, Web
developers, ebook platforms or government procurements should target.
I think we should not object to the snapshot publication though, since
if our objection was successful (it wouldn't be), we'd fail to get a
PP snapshot out of all the trouble that trying to work with the HTML
WG has caused. To get the snapshot for the purpose of the PP but to
pre-empt the use of Mozilla's "support" for promoting the snapshot for
purposes where referring to stale specs is harmful, explicitly
abstaining may be the way to achieve that.

> It would indeed be nice if W3C would
> embrace this fully and move to a WHATWG-style model with no errata but a
> spec kept continually up-to-date in the areas where there is a need for
> change, and absolute rigidity in the areas where reality dictates that
> change is impossible. However moving them there is a longer term goal
> that seems to be met with extreme resistance from people who haven't
> fully grasped how shipping a web browser works.

Yes.

--
Henri Sivonen
hsiv...@hsivonen.fi
https://hsivonen.fi/

Robin Berjon

Sep 22, 2014, 7:45:06 AM
to Anne van Kesteren, Karl Dubost, Boris Zbarsky, L. David Baron, dev-pl...@lists.mozilla.org
On 20/09/2014 11:20 , Anne van Kesteren wrote:
> On Sat, Sep 20, 2014 at 11:03 AM, Karl Dubost <kdu...@mozilla.com> wrote:
>> My biggest issue with HTML5 spec is that it is too big to be
>> meaningfully implementable and/or testable.
>
> Yeah the W3C crowd keeps saying that, yet hasn't invested any
> meaningful effort into creating modules.

I'm not sure who the "W3C crowd" are (it sounds like an arbitrary
moniker designed to encourage "us vs them" thinking) but the only
meaningful investment into creating modules that I know of is starting
pretty much now. So I'm relatively certain that we don't have the
hindsight to evaluate it much.

Unless you're thinking of XHTML Modularisation. But that would be like
blaming Mozilla for <LAYER>: not entirely fair :)

Robin Berjon

Sep 22, 2014, 7:55:05 AM
to Kyle Huey, Karl Dubost, Boris Zbarsky, L. David Baron, dev-pl...@lists.mozilla.org
Hi Kyle,

On 20/09/2014 17:26 , Kyle Huey wrote:
> On Sat, Sep 20, 2014 at 2:41 AM, Karl Dubost <kdu...@mozilla.com> wrote:
>> On Sep 20, 2014, at 18:20, Anne van Kesteren <ann...@annevk.nl> wrote:
>>> Yeah the W3C crowd keeps saying that
>>
>> Here the W3C crowd. We (Mozilla) have a conflict ;)
>> http://www.w3.org/2000/09/dbwg/details?group=40318&public=1&order=org#_MozillaFoundation
>
> I categorically reject this idea that all W3C and/or WG members have
> equal responsibility for any action the W3C and/or WG takes.

I agree with your general sentiment but I would qualify it. If you are
participating *and* have made a bona fide attempt at fixing the issues
you see with the group then you can certainly distance yourself from the
group's actions.

But if you haven't tried to change things constructively, complaining
about it elsewhere doesn't seem all that helpful.

Robin Berjon

Sep 22, 2014, 8:16:43 AM
to James Graham, dev-pl...@lists.mozilla.org
Hi James,

On 21/09/2014 15:00 , James Graham wrote:
> Obviously I agree that good testing of implementations is key to
> interoperability. I also agree that we should encourage vendors to
> create and run shared tests for the web technologies that we implement.
> I am substantially less convinced that tying these tests to the spec
> lifecycle makes sense. The W3C Process encourages people to think of
> interoperability as a binary condition; either implementations are
> interoperable or they are not. This leads to ideas like the CSS WG's
> rule that two implementations must pass every test. On the face of it
> this may appear eminently sensible. However the incentives for testing
> under such a regime are not well aligned with the goal of finding bugs
> in implementations; in essence people are encouraged to write as many
> tests as are needed to satisfy the letter of the rules but to make them
> all as shallow and unlikely to find bugs as possible to avoid causing
> unwanted holdups in the specification process. I would much prefer to
> have a testsuite written by people making a genuine effort to find
> errors in implementations even if the result is that no one passes every
> single test.

I couldn't agree more. In fact, part of my hope when we were setting up
Web Platform Tests was that having a continuously updated test suite
instead of a bunch of hasty rushes to get enough coverage of a spec for
it to "ship" would help people realise that specs should be handled in a
similar manner too.

I can't say it has brought about a revolution yet, but it has certainly
helped change minds. It's hard to argue against a continuously updated
test suite. It's hard to imagine that such an animal wouldn't find spec
bugs in addition to implementation bugs. It's hard to justify knowing
about bugs and not shipping an update. Things tend to make their own way
from there.

> My concrete suggestion is that we, as an organisation, work to achieve
> parity between the tests we require to ship our own code and those we
> release in ways that can be used to support a spec and, more
> importantly, those that can be used to verify that different
> implementations match up. In practice this means not writing tests in
> Mozilla-specific formats, and making sure that we have a way to upstream
> tests that we've written. Making this process as low-overhead as
> possible is something that I'm working on.

A very strong +1.

> However if we as an organisation really care about testing
> core platform features which already have an implementation in gecko,
> one way to achieve that would be to give someone working on Servo the
> explicit job of creating testsuites for the big-ticket features they
> implement as they implement them.

That would certainly be helpful.

In addition, I would note that while a shared test suite benefits
everyone, at this point Mozilla has proven to be a huge contributor to
the WPT project (with Opera's massive release of tests another notable
help) but we have not yet seen comparable commitments from the other
browser vendors. So any help you can provide in convincing people to
contribute is very welcome.

>> Maybe it all doesn't matter too much as long as implementors keep
>> reading the whatwg spec instead.
>
> In terms of HTML at the W3C it's pretty clear that they've dropped the
> ball, and haven't even realised it yet. There was a thread started five
> days ago about the future of HTML after this release and so far it has
> gained exactly no responses from implementors. The WG turned into an
> unproductive, bureaucratic nightmare and succeeded in driving out their
> core constituency leaving the remaining participants to debate topics of
> little consequence.
>
> If we were to complain about the lack of testing for HTML, we would be
> told in no uncertain terms that we (or at least the WG) had agreed to
> the "Plan 2014" which explicitly chose to forego testing in certain
> areas, and specification accuracy, in favour of shipping at a defined
> time. This idea of shipping on a date-based schedule isn't actually
> obviously bad, as long as you set the expectations correctly, which is
> something the W3C will get wrong. It would indeed be nice if W3C would
> embrace this fully and move to a WHATWG-style model with no errata but a
> spec kept continually up-to-date in the areas where there is a need for
> change, and absolute rigidity in the areas where reality dictates that
> change is impossible. However moving them there is a longer term goal
> that seems to be met with extreme resistance from people who haven't
> fully grasped how shipping a web browser works.

Well, my plan is to move pretty much there in a matter of months. For me
that's the biggest value in putting HTML5 out of the door: it frees up a
lot of flexibility (and energy) in how things are done from now on.

Constructive input is certainly welcome :)

Robin Berjon

Sep 22, 2014, 8:35:06 AM
to Karl Dubost, Boris Zbarsky, dev-pl...@lists.mozilla.org
On 21/09/2014 00:29 , Karl Dubost wrote:
> On Sep 21, 2014, at 03:23, Boris Zbarsky <bzba...@mit.edu> wrote:
>> The important part to me about implementations is that
>> implementations shouldn't follow the known-bogus parts of the HTML5
>> REC once said bogosity is fixed in the WHATWG spec and HTML5.1
>> (with the former more likely to happen sooner).
>
> Maybe it's an actionable feedback. To propose a "notes for
> implementers" section saying something along: (text can be improved,
> better suggestions, etc.)
>
> "This published recommendation has switched to a non-maintenance
> mode. It may contain mistakes or things may have changed since the
> publication. Please make sure to check the most up to date document
> BLAH [with link to the whatwg spec] before implementing any
> features."
>
> Would that partly solve your concerns?

I am currently thinking about some text that would include something
like the above and the goal of which is to indicate how a given document
ought to be used.

At the heart of the idea is the notion that different constituencies may
have different needs, some needing a (relatively) stable snapshot while
others need a (relatively) up to date document. Additionally, we can't
presume to know who would need which when. The usual distinction made
between lawyers and implementers is overly simplistic (e.g. a lawyer
could need either, implementers of some specific tools might need a
snapshot for some reason). Instead we can try something crazy new and
trust readers not to be radically dumb by providing them with the
information they need to make up their minds.

This is a quick draft of the idea, it's very much open to evolution.

"""
## Usage of this Document

Web standards are available in two flavours: snapshots, which are
immutable versions made at a point in time and guaranteed never to
change, and live versions which capture as much as possible the latest
state of the technology and are intended to be continuously maintained
and kept up to date.

Which version you should rely on and reference depends on your needs. If
your specific situation demands an unchanging document, understanding
that it is likely to contain defects that have been addressed elsewhere,
then you will want to use the snapshot. If however you require a
document that is as up-to-date as possible, then the live version is for
you. If in doubt, we recommend you rely on the live document.

This document is a snapshot document. For the live version, please visit
[link].
"""

One suggestion in this space is to also drop the style sheet from
snapshots to make it clear that they're not nice things to use (as in
https://whatwg.org/specs/url/2014-07-30/). I understand the thinking but
I am concerned that it could introduce issues (in some cases losing
information). I do agree however that the styling of snapshots and live
documents are altogether too often too close to one another. Workable
suggestions in this space are very welcome (I'll be trying some stuff in
my corner, which I'll make available soon).

Henri Sivonen

Sep 22, 2014, 8:40:52 AM
to Robin Berjon, L. David Baron, dev-platform
On Mon, Sep 22, 2014 at 2:41 PM, Robin Berjon <ro...@w3.org> wrote:
> I was hoping that we could simply reference WHATWG URL as a (small) token of
> good faith and normalisation, adding a small cobblestone to pave the way to
> cooperation.

If that was the goal, changing the "Goals" section of the spec to cast
doubts about whether the direction the W3C envisions for the spec is
consistent with the goals that are the actual reason for the spec's
existence was a rather bad way to go about it.

As for whether it's a small-group concern, I wish there was less
confrontational rhetoric, so I don't want to show up to make a "group
of angry agitators" larger, but I think there should be a spec that
defines how URLs work in a way that's well-defined and realistically
implementable in browser engines (and in other software that wishes to
work with content that's written mainly to be consumed by browser
engines). Considering how long the IETF has had to deliver such a spec
but hasn't delivered and how practically infeasible it seems to get
the kind of work that Anne is doing done within the framework of an
IETF WG, I think Anne's spec should be given a chance without casting
doubts from the start about it getting changed over motivations other
than Web compatibility in a later revision.

Robin Berjon

Sep 22, 2014, 8:48:49 AM
to Henri Sivonen, L. David Baron, dev-platform
On 22/09/2014 14:40 , Henri Sivonen wrote:
> On Mon, Sep 22, 2014 at 2:41 PM, Robin Berjon <ro...@w3.org> wrote:
>> I was hoping that we could simply reference WHATWG URL as a (small) token of
>> good faith and normalisation, adding a small cobblestone to pave the way to
>> cooperation.
>
> If that was the goal, changing the "Goals" section of the spec to cast
> doubts about whether the direction the W3C envisions for the spec is
> consistent with the goals that are the actual reason for the spec's
> existence was a rather bad way to go about it.

For context, you are talking about changing the "Goals" section of the
URL spec, right? That part is largely out of my hands, but it is
certainly something that referencing the WHATWG specification directly
would have solved.

> As for whether it's a small-group concern, I wish there was less
> confrontational rhetoric, so I don't want to show up to make a "group
> of angry agitators" larger

Actually I was talking about the small group of people who *object* to
referencing a WHATWG specification.

> but I think there should be a spec that
> defines how URLs work in a way that's well-defined and realistically
> implementable in browser engines (and in other software that wishes to
> work with content that's written mainly to be consumed by browser
> engines). Considering how long the IETF has had to deliver such a spec
> but hasn't delivered and how practically infeasible it seems to get
> the kind of work that Anne is doing done within the framework of an
> IETF WG, I think Anne's spec should be given a chance without casting
> doubts from the start about it getting changed over motivations other
> than Web compatibility in a later revision.

Well yes, that's pretty much my point.

Henri Sivonen

Sep 22, 2014, 9:04:45 AM
to Robin Berjon, L. David Baron, dev-platform
On Mon, Sep 22, 2014 at 3:48 PM, Robin Berjon <ro...@w3.org> wrote:
> On 22/09/2014 14:40 , Henri Sivonen wrote:
>>
>> On Mon, Sep 22, 2014 at 2:41 PM, Robin Berjon <ro...@w3.org> wrote:
>>>
>>> I was hoping that we could simply reference WHATWG URL as a (small) token
>>> of
>>> good faith and normalisation, adding a small cobblestone to pave the way
>>> to
>>> cooperation.
>>
>>
>> If that was the goal, changing the "Goals" section of the spec to cast
>> doubts about whether the direction the W3C envisions for the spec is
>> consistent with the goals that are the actual reason for the spec's
>> existence was a rather bad way to go about it.
>
>
> For context, you are talking about changing the "Goals" section of the URL
> spec, right?

Yes.

For reference for those following along who don't want to bother to
look it up, the WHATWG URL Goals section reads:

> The URL standard standardizes URLs, aiming to make them fully interoperable.
> It does so as follows:
>
> * Align RFC 3986 and RFC 3987 with contemporary implementations and
> obsolete them in the process. (E.g. spaces, other "illegal" code points,
> query encoding, equality, canonicalization, are all concepts not entirely
> shared, or defined.) URL parsing needs to become as solid as HTML
> parsing. [URI] [IRI]
>
> * Standardize on the term URL. URI and IRI are just confusing. In practice
> a single algorithm is used for both so keeping them distinct is not helping
> anyone. URL also easily wins the search result popularity contest.
>
> * Supplanting Origin of a URI [sic]. [ORIGIN]
>
> * Define URL's existing JavaScript API in full detail and add
> enhancements to make it easier to work with. Add a new URL object
> as well for URL manipulation without usage of HTML elements.
> (Useful for Web Workers.)
>
> Note: As the editor learns more about the subject matter the goals might
> increase in scope somewhat.

The W3C Goals section replaces the last "Note" paragraph with:

> W3C-specific note: This specification documents current RFC 3986 and
> RFC 3987 handling in contemporary Web browser implementations. As a
> consequence, this specification is not compatible with those RFCs. It is
> published for the purpose of providing a stable reference for the HTML5
> specification and reflecting current Web browser HTML5 implementations.
> The W3C Technical Architecture Group expects to continue the work on
> the URL specification and produce a future version that will attempt to
> re-align the URL specification with an updated version of RFC 3986
> while preserving interoperability.

James Graham

Sep 22, 2014, 9:22:32 AM
to dev-pl...@lists.mozilla.org
On 22/09/14 12:43, Henri Sivonen wrote:
> On Sun, Sep 21, 2014 at 4:00 PM, James Graham <ja...@hoppipolla.co.uk> wrote:

>> leaving the remaining participants to debate topics of
>> little consequence.
>
> FWIW, this bit is being spun into Twitter propaganda about Mozilla not
> caring about accessibility, so it might be worthwhile to be careful
> with the phrasing.

Maybe I should address the question of accessibility directly.

I think that accessibility is very important, and that of all the things
that have suffered from the HTML specification schism it is by far the
most significant. A situation where web technologies are largely being
developed in one location, before being slightly rewritten to add or
change a11y features by a different set of people in a different
location seems likely to result in a poor outcome for users that depend
on these features. Therefore it seems like we should be actively working
to fix this divide.

In the short term, if we are using the W3C version of the spec when we
implement a11y support, we should ensure that any substantive
differences that affect our implementation are backported to the WHATWG
version. I understand that Hixie is not held in high esteem by everyone
working in a11y, but he has committed to defer to implementations if
they agree on a common behaviour, whatever his personal feelings, and I
see no reason he should renege on that promise.

In the long term, I don't know what the solution is. There is clearly a
cultural divide that we have failed to bridge. It's unclear how to
examine that history without stirring bad feelings on both sides. At the
least I suggest not trying to hold this discussion via Twitter; it is
not a medium known for subtlety or nuance. Without these it is easy
to preach to the choir but hard to win over doubters.

I don't think this discussion has much direct bearing on the HTML5 PR,
although it does indicate that we are actively implementing technologies
from the W3C specification.

Robin Berjon

Sep 22, 2014, 9:31:47 AM
to Henri Sivonen, L. David Baron, dev-platform
On 22/09/2014 15:04 , Henri Sivonen wrote:
> On Mon, Sep 22, 2014 at 3:48 PM, Robin Berjon <ro...@w3.org> wrote:
>> On 22/09/2014 14:40 , Henri Sivonen wrote:
>>> If that was the goal, changing the "Goals" section of the spec to cast
>>> doubts about whether the direction the W3C envisions for the spec is
>>> consistent with the goals that are the actual reason for the spec's
>>> existence was a rather bad way to go about it.
>>
>> For context, you are talking about changing the "Goals" section of the URL
>> spec, right?
>
> Yes.

Right. So I can't speak for the people who are working on that, but I
can vouch that they are open to feedback and have no foul intention
whatsoever. The formulation of the note may come across as unfortunate,
but I know that their intent was always to operate through pull requests
made against the upstream spec.

Overall, Anne's URL spec puts us all in a much better situation than we
were when we only had the RFCs. However, there are (likely mostly
non-Web) implementations and domains that stick more strictly to the
RFCs. If we could keep those worlds separate, we'd all be fine, but of
course these things have a tendency to leak. As a result, some form of
unified URL spec that can work across the board makes sense to me
(though it's not on my personal high priority list). If there are people
interested in the work and it can be done through non-disruptive PRs I'm
very much fine with it.

Anne van Kesteren

Sep 22, 2014, 10:06:57 AM
to Robin Berjon, Henri Sivonen, dev-platform, L. David Baron
On Mon, Sep 22, 2014 at 3:31 PM, Robin Berjon <ro...@w3.org> wrote:
> Right. So I can't speak for the people who are working on that, but I can
> vouch that they are open to feedback and have no foul intention whatsoever.

I've yet to receive replies to the feedback I gave when it was announced.


> Overall, Anne's URL spec puts us all in a much better situation than we were
> when we only had the RFCs. However, there are (likely mostly non-Web)
> implementations and domains that stick more strictly to the RFCs. If we
> could keep those worlds separate, we'd all be fine, but of course these
> things have a tendency to leak. As a result, some form of unified URL spec
> that can work across the board makes sense to me (though it's not on my
> personal high priority list). If there are people interested in the work and
> it can be done through non-disruptive PRs I'm very much fine with it.

It makes sense. However, so far we haven't even tested whether
browsers can migrate from their current (somewhat broken) URL strategy
to something that is slightly saner. Let alone whether they can
migrate to something they never conformed with in the first place and
which was written while simply ignoring important deployments.


--
https://annevankesteren.nl/

James Graham

Sep 22, 2014, 12:23:49 PM
to Robin Berjon, dev-pl...@lists.mozilla.org
On 22/09/14 13:16, Robin Berjon wrote:

> I can't say it has brought about a revolution yet, but it has certainly
> helped change minds. It's hard to argue against a continuously updated
> test suite. It's hard to imagine that such an animal wouldn't find spec
> bugs in addition to implementation bugs. It's hard to justify knowing
> about bugs and not shipping an update. Things tend to make their own way
> from there.

That sounds like a plausible story; let's hope the truth is something
close to it.

>> However if we as an organisation really care about testing
>> core platform features which already have an implementation in gecko,
>> one way to achieve that would be to give someone working on Servo the
>> explicit job of creating testsuites for the big-ticket features they
>> implement as they implement them.
>
> That would certainly be helpful.
>
> In addition, I would note that while a shared test suite benefits
> everyone. At this point Mozilla has proven to be a huge contributor to
> the WPT project (with Opera's massive release of tests another notable
> help) but we have not yet seen comparable commitments from the other
> browser vendors. So any help you can provide in convincing people to
> contribute is very welcome.

Yes, getting more contributions from all vendors would be a big
improvement. However I think it's worth noting that others have made
contributions; for example Google have contributed a number of
testsuites for new technologies they are implementing.

> Well, my plan is to move pretty much there in a matter of months. For me
> that's the biggest value in putting HTML5 out of the door: it frees up a
> lot of flexibility (and energy) in how things are done from now on.

If this happens it will be great progress. I fear there is still a lot
of stop energy for large changes, but I guess trying to push them
through is the only way to find out if that's actually the case.

Kyle Huey

Sep 22, 2014, 12:26:49 PM
to Robin Berjon, Karl Dubost, Boris Zbarsky, L. David Baron, dev-pl...@lists.mozilla.org
That's fair. I think a lot of people have put a lot of time into
trying to change things constructively though.

- Kyle

L. David Baron

Sep 22, 2014, 1:04:36 PM
to Robin Berjon, Karl Dubost, Kyle Huey, dev-pl...@lists.mozilla.org, Boris Zbarsky
On Monday 2014-09-22 13:55 +0200, Robin Berjon wrote:
> I agree with your general sentiment but I would qualify it. If you
> are participating *and* have made a bona fide attempt at fixing the
> issues you see with the group then you can certainly distance
> yourself from the group's actions.
>
> But if you haven't tried to change things constructively,
> complaining about it elsewhere doesn't seem all that helpful.

I don't think you want to require every individual to try; this
would lead to lots of "+1" or duplicative posts to lists, and
similar duplicative effort, wasting the time of both those making
the effort and those reading their output.

-David

--
𝄞 L. David Baron http://dbaron.org/ 𝄂
𝄢 Mozilla https://www.mozilla.org/ 𝄂
Before I built a wall I'd ask to know
What I was walling in or walling out,
And to whom I was like to give offense.
- Robert Frost, Mending Wall (1914)

James Graham

Sep 22, 2014, 1:18:07 PM
to dev-pl...@lists.mozilla.org
On 21/09/14 22:19, Boris Zbarsky wrote:
> On 9/21/14, 9:00 AM, James Graham wrote:
> More interestingly, either the specification is implementable or not.
> Again, because once the REC is published everyone goes home and never
> touches that document again.
>
> The two implementations condition was there to make sure you didn't end
> up with specs that basically reflected one UA's internals and that no
> one else could implement....

Yeah, I understand the reasoning. But I think it's an example of
replacing a difficult question with a simpler, but less accurate, one. I
think you'd get a better result by asking for agreement from all the
relevant implementors that they felt that the spec was implementable.
Obviously answering this question with any accuracy requires the
implementor to actually put some effort into understanding the design in
the context of their code, probably by implementing it. This would allow
you to be less conservative (you don't have to wait for every
low-priority bug to be fixed), without having to sabotage the testsuite
in order to keep up the pretense that those bugs don't exist at all.

Of course in reality, if someone ships something that exposes their
internals and content comes to depend on it, everyone else is forced to
copy that behaviour anyway, with as much fidelity as they can. Specs are
only helpful here in the face of good actors. Even well-intentioned
actors without the right level of technical insight, and incentives to
ship quickly, can be harmful. As we both know, it's not unheard of for
people to follow enough process to get their stuff to Rec. with two
"interoperable" implementations, and gaping holes in the spec.

>> it doesn't seem at all
>> reasonable to demand that the specification is held to a higher standard.
>
> Note that I made no such demand, precisely because I think it's
> unrealistic.

Right, I didn't intend to accuse you personally of making unrealistic
demands. In general my reply to you wasn't intended to be taken as
disagreement, just as a useful jumping off point into the discussion.

>> My concrete suggestion is that we, as an organisation, work to achieve
>> parity between the tests we require to ship our own code and those we
>> release in ways that can be used to support a spec and, more
>> importantly, those that can be used to verify that different
>> implementations match up.
>
> This is a good idea, but not terribly relevant to what dbaron should say
> in his AC rep capacity, right?

No, this was intended to be a broader point aimed at ensuring that, as
far as possible, the kind of problems we're experiencing with HTML don't
repeat in the future.

In terms of what dbaron should say, it seems like we should report the
publication, but note that we are unhappy with the overall process that
has led to this specification for all the reasons that have been
identified elsewhere on this thread.

>
>> Making this process as low-overhead as
>> possible is something that I'm working on.
>
> And it's much appreciated!
>
> Note that actually sanely testing something like navigation in
> non-browser-specific ways is ... hard. Basic things like "open a
> cross-origin window and wait for it to load" aren't really possible. :(

Using window.open("http://some.cross.origin.url"), you mean? Couldn't
you put a postMessage() in the load event of the opened document? It
requires your test to go async and depends on how precise your needs are
of course.
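
For readers following along, a minimal sketch of that approach in
testharness.js style (the child URL and page are hypothetical; the child
is assumed to call postMessage from its own load handler):

```html
<!-- opener page: sketch only; child.html and its origin are hypothetical -->
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script>
async_test(function(t) {
  // The cross-origin child is assumed to run, in its load handler:
  //   window.opener.postMessage("loaded", "*");
  window.addEventListener("message", t.step_func_done(function(e) {
    assert_equals(e.data, "loaded");
  }));
  window.open("http://some.cross.origin.url/child.html");
}, "cross-origin window signals load via postMessage");
</script>
```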

There is a hope that over time we can address more use cases that need
access to privileged APIs. There is work in progress that will allow
using WebDriver from test cases, for example. It's not clear that this
will allow us to meet all needs, but it should make a difference in some
cases.

>> Obviously this isn't going to make a short-term difference for old
>> features like WindowProxy. I'm not sure what to suggest for those cases
>> given that we have de-facto shown an unwillingness to invest even
>> relatively small amounts of effort into reviewing existing tests that
>> could be part of the HTML testsuite for that feature [1].
>
> Missing reference?

https://critic.hoppipolla.co.uk/r/282

Boris Zbarsky

Sep 22, 2014, 1:53:01 PM9/22/14
to
On 9/22/14, 1:18 PM, James Graham wrote:
> I think you'd get a better result by asking for agreement from all the
> relevant implementors that they felt that the spec was implementable.

The problem was that in some cases this was more or less a non-goal (in
some cases an anti-goal) for the spec editor. Hence the process bit to
somewhat force them to deal with the issue. :(

> Of course in reality, if someone ships something that exposes their
> internals and content comes to depend on it, everyone else is forced to
> copy that behaviour anyway, with as much fidelity as they can.

Yes, true.

> Specs are only helpful here in the face of good actors.

And in making it clear when people are being bad actors, yes.

> As we both know, it's not unheard of for
> people to follow enough process to get their stuff to Rec. with two
> "interoperable" implementations, and gaping holes in the spec.

Indeed.

>> Note that actually sanely testing something like navigation in
>> non-browser-specific ways is ... hard. Basic things like "open a
>> cross-origin window and wait for it to load" aren't really possible. :(
>
> Using window.open("http://some.cross.origin.url"), you mean? Couldn't
> you put a postMessage() in the load event of the opened document?

You can in some cases. In other cases (like when you're testing the
nulling out of window.opener by callers to disconnect the callee from
them, or testing opening of sandboxed things that shouldn't be allowed
to run script) this is not an option.

> It requires your test to go async

You need that anyway for window.open, so that's not an issue.

> and depends on how precise your needs are of course.

Right.

> There is a hope that over time we can address more use cases that need
> access to privileged APIs. There is a work in progress that will allow
> using WebDriver from test cases, for example. It's not clear that this
> will allow us to meet all needs, but it should make a difference in some
> cases.

Yeah, I'm very much looking forward to this.

> https://critic.hoppipolla.co.uk/r/282

Thank you. This is definitely something we should find time to review....

-Boris

Tantek Çelik

Sep 22, 2014, 4:10:37 PM9/22/14
to Boris Zbarsky, dev-pl...@lists.mozilla.org
Specifically on the subject of what URL spec to reference, I think it
should be Mozilla's position (which I'm willing to represent) that the
W3C HTML5 spec reference the dated URL spec[1] instead of the
copy/paste/modified (even if informatively) W3C WebApps URL spec.

[1] https://whatwg.org/specs/url/2014-07-30/

I'd like to make this proposal on public-html (since I'm still at
least somewhat participating in the W3C HTMLWG) by Wednesday unless
there are objections from Mozilla folks on this thread (which I expect
two days to be sufficient to resolve).

Usually I'd just go ahead with this proposal, but given the diversity
of opinions on this thread, figured I'd see what folks here thought
first.

Regardless, I support providing that same feedback in our response to
the W3C HTML5 PR.

Thanks,

Tantek
> _______________________________________________
> dev-platform mailing list
> dev-pl...@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform

Patrick Walton

Sep 30, 2014, 11:56:41 AM9/30/14
to dev-pl...@lists.mozilla.org
On 9/21/14 6:00 AM, James Graham wrote:
> In the longer term, one might hope that bugfixes will produce new
> testcases that could be upstreamed, and Servo might need a proper
> testsuite to achieve interoperability. Having said that, Servo has so
> far not produced a significant number of tests, which has been a little
> surprising as they have been implementing some of the core pieces of the
> platform which are indeed under-tested. I suspect this is because the
> skills, interests and goals of the team are around producing code rather
> than tests. For people making small contributions it would also be
> rather off-putting to be told "no you can't land this fix that makes
> Wikipedia look better without a comprehensive testsuite for the relevant
> feature". However if we as an organisation really care about testing
> core platform features which already have an implementation in gecko,
> one way to achieve that would be to give someone working on Servo the
> explicit job of creating testsuites for the big-ticket features they
> implement as they implement them.

We land simple reftests whenever we fix bugs in Servo in order to
prevent regressions. Our big-ticket items tend to be tested by
comprehensive reftests that test many things at the same time (e.g. the
border test).
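
For context, a reftest is just a pair of pages asserted to render
pixel-identically; a minimal sketch in the web-platform-tests style
(file names and styles hypothetical):

```html
<!-- border-green.html (test): declares its reference via rel="match" -->
<!DOCTYPE html>
<link rel="match" href="border-green-ref.html">
<div style="border: 10px solid green; width: 80px; height: 80px"></div>

<!-- border-green-ref.html (reference): same rendering, built differently -->
<!DOCTYPE html>
<div style="background: green; padding: 10px; width: 80px; height: 80px">
  <div style="background: white; width: 80px; height: 80px"></div>
</div>
```

The harness screenshots both pages and fails the test if the pixels
differ, which is why the reference must construct the same rendering by
a different route.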

On the Servo team we don't have the manpower to dedicate one member to
writing comprehensive test suites. Our research goals are oriented
toward proving that parallel layout works on the Web, which means, at
this time, showing that real, popular Web sites look correct and
demonstrate parallel speedups. Standards work, including writing test
suites, is useful, but has to be secondary to proving the viability of
the project.

Patrick

James Graham

Oct 1, 2014, 9:12:47 AM10/1/14
to dev-pl...@lists.mozilla.org
On 30/09/14 16:56, Patrick Walton wrote:
> On 9/21/14 6:00 AM, James Graham wrote:
>> In the longer term, one might hope that bugfixes will produce new
>> testcases that could be upstreamed, and Servo might need a proper
>> testsuite to achieve interoperability. Having said that, Servo has so
>> far not produced a significant number of tests, which has been a little
>> surprising as they have been implementing some of the core pieces of the
>> platform which are indeed under-tested. I suspect this is because the
>> skills, interests and goals of the team are around producing code rather
>> than tests. For people making small contributions it would also be
>> rather off-putting to be told "no you can't land this fix that makes
>> Wikipedia look better without a comprehensive testsuite for the relevant
>> feature". However if we as an organisation really care about testing
>> core platform features which already have an implementation in gecko,
>> one way to achieve that would be to give someone working on Servo the
>> explicit job of creating testsuites for the big-ticket features they
>> implement as they implement them.

[...]

> On the Servo team we don't have the manpower to dedicate one member to
> writing comprehensive test suites. Our research goals are oriented
> toward proving that parallel layout works on the Web, which means, at
> this time, showing that real, popular Web sites look correct and
> demonstrate parallel speedups. Standards work, including writing test
> suites, is useful, but has to be secondary to proving the viability of
> the project.

Right, I understand that dedicated work on testing doesn't match the
current priorities of the project. However I think that it's worth
considering the benefits of doing more of this work under the Servo
umbrella when reassessing those priorities in the future.

Producing tests seems clearly in line with the Mozilla mission, and
provides a way for Servo to have a near-term impact on other Mozilla
products such as Gecko. This is particularly the case when Servo is
implementing parts of the platform that have poor interop in existing
browsers.

It should also allow Servo itself to move faster by making it more
likely that the implementations you make are actually web-compatible on
the first try, rather than needing many cycles of redesigns and fixups.
The new HTML parser is an example of a complex feature that has a large
existing testsuite and which I am therefore confident will work in a
web-compatible way (at least for static documents) as soon as it lands.
Without the tests I am sure this would not have been the case.

L. David Baron

Oct 14, 2014, 1:29:16 AM10/14/14
to dev-pl...@lists.mozilla.org, James Graham, Henri Sivonen, Boris Zbarsky, Tantek Çelik
On Friday 2014-09-19 17:23 -0700, L. David Baron wrote:
> W3C recently published the following proposed recommendation (the
> stage before W3C's final stage, Recommendation):
>
> http://www.w3.org/TR/html5/
> HTML5
>
> There's a call for review to W3C member companies (of which Mozilla
> is one) open until October 14.

Here is my current draft of the comments I plan to submit in about 12
hours (cc:ing the whole AC, I think). Sorry for not getting this out
for people to have a look at sooner.

-David

Regarding the HTML5 specification, my organization:

(X) suggests changes, but supports publication as a W3C Recommendation
whether or not the changes are adopted (your details below).

Comments:

General comments:

We support publication as a Recommendation although there are surely
many details in the specification that are wrong, either because the
specification disagrees with itself or because it disagrees with what is
needed to make an implementation that can succeed in the market. The
level of coverage in the test suite is not enough to avoid that. These
errors will be found over time.

The harm these errors cause will be determined by how the W3C community
handles the HTML specification in the future:

* Statements in the specification that are inconsistent or incompatible
with what Web content requires should keep being fixed in the future,
just as they have been while developing HTML5 up to this point. That
those statements are part of a W3C Recommendation should not increase
the burden of proof.

* Development of tests that test this specification should continue.
Being declared "interoperable enough" for Recommendation should not
stop future increase in interoperability. And this development of
tests should focus on the latest specification, not on the
Recommendation snapshot.

To put this another way: while we support the publication of this
specification as a W3C Recommendation, we do not likewise support the
promotion of W3C Recommendation status as a major milestone. The
process of continuous improvement, which should continue, is far more
important than the snapshot.


Specific actionable change proposals:

(1) We would like to see the reference for the URL specification point
to the CG snapshot, as proposed in
http://lists.w3.org/Archives/Public/public-html/2014Sep/0061.html

(2) While it would be helpful to have the recommendation contain
pointers to current and future work (e.g., have a more useful
"Latest Editor's Draft" link that's likely to point to the editor's
draft for future HTML specification development), and a useful
explanation in the status section of the differences between the
recommendation and the editor's draft.


Usage:

[X] produces products addressed by this specification
[X] expects to produce products conforming to this specification
[X] expects to produce content conforming to this specification
[X] expects to use products conforming to this specification

When the W3C's and WHATWG's HTML specifications differ, we tend to
follow the WHATWG one.


Other comments:

Boris Zbarsky

Oct 14, 2014, 8:31:13 AM10/14/14
to
On 10/14/14, 1:29 AM, L. David Baron wrote:
> (2) While it would be helpful to have the recommendation contain

The "While" seems extraneous.

The rest looks great!

-Boris

Karl Dubost

Oct 17, 2014, 1:28:09 PM10/17/14
to L. David Baron, James Graham, Henri Sivonen, Boris Zbarsky, dev-pl...@lists.mozilla.org, Tantek Çelik
David,

> Le 14 oct. 2014 à 07:29, L. David Baron <dba...@dbaron.org> a écrit :
> Here is my current draft of the comments I plan to submit in about 12
> hours (cc:ing the whole AC, I think). Sorry for not getting this out
> for people to have a look at sooner.

Good summary of our discussions. Thanks.

--
Karl Dubost, Mozilla
http://www.la-grange.net/karl/moz
