TL;DR - The KumaScript macro is a temporary solution until the rendering
service is ready. The primary use of the rendering service will be to
render the tables on MDN, to solve the problem of stale data. Use by other
services or third parties is secondary.
The long version...
Occasionally, we really want to refresh a large number of pages on MDN, but
we just can't. For example, when HTML5 switched from a Living Standard to a
W3C Recommendation, a lot of pages continued to show Living Standard for
months. This is because page refreshes are expensive, and a full refresh
of the site is measured in days. We currently refresh pages when they are
saved, or when someone manually force-refreshes the page. Even that is
enough to strain the site and to require IT involvement. A better
refresh story for MDN is still a goal, but there are other goals we're
working toward first, like reducing page load time to an average of 2
seconds, and moving the infrastructure.
The refresh problem gets worse with BrowserCompat, because there are many
ways to update the compatibility data without updating the MDN page. Some
look like current KumaScript updates (updating a specification from a
Candidate Recommendation to a Recommendation, adding a localization for
"Firefox for Desktop"), but even "traditional" edits, like adding a
supported version for a feature, will happen against the API, either
through the contribution and moderation site or via a script talking
directly to the API. Updating a single bit of data, like promoting a
Firefox version from
beta to the current released version, can impact hundreds to thousands of
pages on MDN. MDN refresh is not a viable solution at this time.
We considered several solutions in Q3 2015 [1], but settled on a
heavily cached site that serves up HTML fragments for MDN [2]. The workflow
will be:
1) On MDN page save, KumaScript will ask the render service for a
localized HTML fragment for the current page, and inject this HTML into
the rendered MDN page. This looks a lot like the EmbedCompatTable macro
[3], but the table construction code lives in the rendering service, and
the KumaScript portion is reduced to talking to the service, returning
the results, and handling errors (a sketch of this slimmed-down macro
appears below).
2) When a user views the MDN page, a bit of JS will ask the render
service for new localized HTML (sketched just after this list).
3) If nothing has changed since the page was saved, the render service
will return a "204 No Content" response, which, as implied, has no body.
To be more precise, a cache server in front of the render service will
return the 204, since the JS sends headers the cache can use to confirm
that the MDN page is up to date.
4) When compatibility data changes, the caches will be invalidated, making
fresh content available. In the case of feature data, the cache may be
force-invalidated, so that the MDN page gets the updated data almost
immediately. In the case of changes with a wide impact, like browser
name localizations, it may take 30-60 minutes for caches to expire and
pick up the new value.
5) When the in-page JS asks the render service for new, localized HTML, the
cache server will recognize that new data is available, and ask the render
service to generate the new HTML. The MDN user briefly sees the "old" data,
which is then replaced with the latest compatibility data. The newly
rendered fragment is stored in the renderer's cache server for future
requests from the same MDN page.
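To make steps 2, 3, and 5 concrete, here is a minimal sketch of the
in-page script, written in TypeScript. The data attributes, the
ETag-style validator, and the header names are my assumptions; the real
contract between the page and the render service is still to be decided.

    // Sketch of the in-page refresh script (hypothetical data
    // attributes and header names; the real contract is TBD).
    async function refreshCompatTable(table: HTMLElement): Promise<void> {
      const url = table.dataset.renderUrl;    // render service fragment URL
      const etag = table.dataset.renderEtag;  // validator baked in at save time
      if (!url || !etag) return;
      const response = await fetch(url, {
        headers: { "If-None-Match": etag },   // lets the cache answer with a 204
      });
      if (response.status === 204) {
        return;  // the copy baked in at save time is still current
      }
      if (response.ok) {
        // New data: swap the stale table for the freshly rendered fragment.
        table.innerHTML = await response.text();
      }
      // On any error, keep the fragment that was rendered at save time.
    }

    // One call per compatibility table on the page.
    document.querySelectorAll<HTMLElement>("[data-render-url]")
      .forEach((table) => { void refreshCompatTable(table); });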
If something goes wrong (the render service is down, the user has
disabled JS, the visitor is a search engine crawler that doesn't execute
JS), the reader will get the data as of the last time the page was saved
or refreshed. Users who know the trick can still use force-refresh.
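For completeness, here is roughly what the slimmed-down KumaScript side
of step 1 might reduce to. The helper, host, and URL shape below are
hypothetical stand-ins for whatever HTTP utilities KumaScript actually
exposes; the point is that the macro shrinks to a fetch plus error
handling, and its output is exactly the baked-in fallback described
above.

    // Sketch of the step-1 macro logic (hypothetical helper and URL;
    // the table-building code itself lives in the render service).
    async function fetchFragment(url: string): Promise<string> {
      const response = await fetch(url);
      if (!response.ok) {
        throw new Error(`render service returned ${response.status}`);
      }
      return response.text();
    }

    async function embedCompatTable(feature: string, locale: string): Promise<string> {
      try {
        // Fetched at page save time; this becomes the content that
        // crawlers and no-JS readers see.
        return await fetchFragment(
          `https://render.example/fragments/${feature}?locale=${locale}`);
      } catch (err) {
        // Render service down: emit a placeholder rather than breaking
        // the page save.
        return '<p class="compat-error">Compatibility data is temporarily unavailable.</p>';
      }
    }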
The table rendering service may also be useful in the contribution and
moderation interface. The UI for adding and updating information will be
different from the table display, but we could show a preview of how the
table will look with the new data. It may be useful in moderation as
well, for showing how a proposed modification would change the table
displayed on MDN. These features could call the rendering service, but
are more likely to directly use the JavaScript that the rendering
service itself uses (sketched below).
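As a sketch of that direct reuse (the module and function names are
hypothetical), the contribution UI could apply a proposed edit locally
and run the same table-building code the render service uses:

    // Hypothetical shared module; both the render service and the
    // contribution/moderation UI would import the same renderer.
    import { renderCompatTable } from "./compat-table-renderer";

    type CompatData = Record<string, unknown>;  // placeholder for the API payload

    // Preview a proposed edit without a round trip to the render service.
    function previewEdit(current: CompatData, edit: CompatData, locale: string): string {
      const proposed = { ...current, ...edit };  // apply the edit locally
      return renderCompatTable(proposed, locale);
    }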
While I think we have a good infrastructure planned for wide, 3rd-party
use of the API, our first goal is to get the one site we control using
it. This includes the rendering service, which will provide MDN-specific
HTML fragments that may or may not be usable on other websites. We'll
recommend that 3rd parties build their own presentations by working
against the API (a sketch follows below). We can plan a more general
rendering service when there is demand.
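For example (the host, endpoint, and payload shape below are invented
for illustration; the real API routes will differ), a 3rd-party site
might pull raw support data and build its own widget:

    // Sketch of a 3rd party working directly against the API
    // (hypothetical host, path, and payload shape).
    async function supportedVersions(featureId: string): Promise<string[]> {
      const response = await fetch(
        `https://browsercompat.example/api/v1/features/${featureId}`);
      if (!response.ok) {
        throw new Error(`API returned ${response.status}`);
      }
      const body = await response.json();
      // Build whatever presentation fits the consuming site.
      return body.supports.map((s: { version: string }) => s.version);
    }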
John Whitlock
jwhitlock on #mdndev
[1] https://old.etherpad-mozilla.org/mdn-compat-rebuild-solutions
[2] https://groups.google.com/forum/#!searchin/mozilla.dev.mdn/refresh/mozilla.dev.mdn/CkBIuYNwgU8/9gXLEeRaAQAJ
[3] https://developer.mozilla.org/en-US/docs/Template:EmbedCompatTable