Intent to Implement and Ship: forcing flattening for elements with opacity < 1


Chris Harrelson

Jun 6, 2016, 8:41:31 PM
to blink-dev, pain...@chromium.org

Contact emails

chri...@chromium.org, trc...@chromium.org


Spec

Editor’s draft (includes desired future changes): https://drafts.csswg.org/css-transforms/#grouping-property-values

Current spec: https://www.w3.org/TR/css-transforms-1/


Summary

“Flattening” is the process of drawing 3D-positioned children into the parent's rendering plane. HTML elements flatten by default, but can prevent this by applying the transform-style: preserve-3d CSS property. This only has an effect if children are actually positioned in 3D. Flattening is required (or strongly desired), however, if an element has styles that require filter-like rendering or clipping. We already force flattening for a number of the situations listed in the editor’s draft as “grouping properties” (*).

(In the language of that spec, a “grouping property” forces flattening.)


We will change the code so that opacity < 1 forces flattening.


This behavior is already part of the Editor’s Draft spec (along with a number of other changes), though no browser implements it yet.


This particular change is motivated by the simplifications it yields for how we composite 3D-positioned elements with opacity. Currently, opacity is the only post-processing effect which does not force flattening. Without this change, we are forced to apply opacity individually to elements positioned in 3D space rather than to everything as a group. See here for even more background on implementation impact.


(*) In particular: filters, overflow clip, and mix-blend-mode. Clip-path, mask, CSS clip and isolation currently do not force flattening. (This latter behavior is also undesirable, but not as much of an implementation problem as opacity.)
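
For illustration, here is a minimal sketch of the kind of content affected (the markup and class names are hypothetical, not taken from the demo):

    <style>
      .scene { perspective: 600px; }
      .stack {
        transform-style: preserve-3d;  /* children keep their 3D positions */
        opacity: 0.5;                  /* with this change, this also forces flattening */
      }
      .front { transform: translateZ(60px) rotateY(30deg); }
      .back  { transform: translateZ(-60px); }
    </style>
    <div class="scene">
      <div class="stack">
        <div class="front">front</div>
        <div class="back">back</div>
      </div>
    </div>

Today Chrome keeps .front and .back positioned in 3D and applies the 0.5 opacity to each of them separately; after this change the .stack subtree is flattened into a single plane and the opacity is applied once to the group.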


Is this feature supported on all six Blink platforms (Windows, Mac, Linux, Chrome OS, Android, and Android WebView)?


Yes.


Demo link


Before (applies opacity to each plane individually)

After (applies opacity after flattening; this demo simulates the result by forcing flattening on the element with opacity)


Interoperability and Compatibility Risk


At most 0.006% of pages are affected, and likely much less due to over-counting. We expect that this will have very little compatibility risk.


Firefox, Safari and IE implement Chrome’s current behavior. Firefox recently regressed and then shipped a fix to recover Chrome’s current behavior.


Signals


Firefox: cautiously positive (see www-style thread). They seem likely to adopt our new behavior if we succeed with this Intent.

Other browsers: none

Developers: none


OWP launch tracking bug

crbug.com/617798


Entry on the feature dashboard

TBD


Rick Byers

Jun 6, 2016, 9:08:24 PM
to Chris Harrelson, blink-dev, pain...@chromium.org
On Mon, Jun 6, 2016 at 8:41 PM, Chris Harrelson <chri...@chromium.org> wrote:

Interoperability and Compatibility Risk


At most 0.006% of pages are affected, and likely much less due to over-counting. We expect that this will have very little compatibility risk.


Are the sites that were broken in Firefox all similarly broken in Chrome with this change?  I'm a little surprised that a change with such low predicted compat risk would nonetheless trigger multiple bug reports in Firefox testing channels.  Is their compat testing just that good, or might we be underestimating the risk somehow? Looks like your UseCounter is indeed triggering on at least this site.

Normally breaking changes with non-zero usage go through at least one milestone with a deprecation warning and link to guidance to avoid the issue (especially when we know of substantial breakage in some site in practice).  Any reason not to do that here?

Any point in doing outreach to known affected sites?  Eg. any chance there's an open source library in use in multiple places we could submit a PR against?

Chris Harrelson

Jun 7, 2016, 2:00:22 PM
to Rick Byers, blink-dev, pain...@chromium.org
On Mon, Jun 6, 2016 at 6:08 PM Rick Byers <rby...@chromium.org> wrote:


Are the sites that were broken in Firefox all similarly broken in Chrome with this change?  I'm a little surprised that a change with such low predicted compat risk would nonetheless trigger multiple bug reports in Firefox testing channels.  Is their compat testing just that good, or might we be underestimating the risk somehow? Looks like your UseCounter is indeed triggering on at least this site.

Short answer: I'm not sure. The site you linked to is for a restaurant in Valencia, Spain, so a very small-traffic website.

Also, I think Firefox was broken in bigger ways than just when to apply opacity. In Chrome, I hacked the opacity element which contains the animating menu to have transform-style: flat and couldn't observe a significant difference in this page. In this case the opacity of 0.95 would be applied to the whole menu post-flattening rather than each menu page.
 

Normally breaking changes with non-zero usage go through at least one milestone with a deprecation warning and link to guidance to avoid the issue (especially when we know of substantial breakage in some site in practice).  Any reason not to do that here?

In this case there is no web api changing, it's that they have a very subtle issue with opacity. Given the percentage of sites and the difficulty of explaining the change, I'm not so sure it's worth it.
 

Any point in doing outreach to known affected sites?  Eg. any chance there's an open source library in use in multiple places we could submit a PR against?

In this case the website theme comes from http://themeforest.net/user/avathemes. I'm happy to reach out to them.

Rick Byers

Jun 7, 2016, 3:30:43 PM
to Chris Harrelson, blink-dev, pain...@chromium.org
Ok, if Firefox was broken in bigger ways then that lowers my concern dramatically.  Once you have a change prepared, can you spot-check the scenarios that were reported as broken (from a quick skim of the bugs I think it's the ones here, here and here) and verify you don't see obvious breakage with your change?  
 
In Chrome, I hacked the opacity element which contains the animating menu to have transform-style: flat and couldn't observe a significant difference in this page. In this case the opacity of 0.95 would be applied to the whole menu post-flattening rather than each menu page.

Ah cool, so pretty subtle (perhaps non-observable if it's on a white background anyway).  In the Firefox case 2 of the 3 menu pages were reported to become completely white, which means the site was badly broken in practice.

Normally breaking changes with non-zero usage go through at least one milestone with a deprecation warning and link to guidance to avoid the issue (especially when we know of substantial breakage in some site in practice).  Any reason not to do that here?

In this case there is no web api changing, it's that they have a very subtle issue with opacity. Given the percentage of sites and the difficulty of explaining the change, I'm not so sure it's worth it.

I think whether there's an "API change" or not is irrelevant - what's relevant is just the risk that some website will become broken as a result of our change (potentially costing businesses money and forcing developers to scramble - although our beta channel generally does a decent job of mitigating the risk here).  We're breaking our contract with developers (especially since this isn't just a bug-fix but a spec change), so we need to be thoughtful about keeping disruption to an absolute minimum and giving developers every chance to transition smoothly.

That said, if it's indeed so subtle as to be unlikely to be a problem in practice in most of those 0.006% cases, then I agree we could waste more time and cause more pain than we'd save by trying to warn about it.

Any point in doing outreach to known affected sites?  Eg. any chance there's an open source library in use in multiple places we could submit a PR against?

In this case the website theme comes from http://themeforest.net/user/avathemes. I'm happy to reach out to them.

Cool, that would be great! 

Entry on the feature dashboard

TBD


Please add an entry we can point to with a brief summary of the observable behavior change and how developers can fix issues, just in case some site does end up being impacted.

LGTM1 to ship assuming we don't come across a real site that's badly broken as a result.  If there's at least one (or maybe two) found (eg. during beta), that suggests to me there are likely many more lurking and that we should probably do a deprecation period instead to mitigate the compat risk.  Sound reasonable?

Chris Harrelson

Jun 8, 2016, 2:08:42 PM
to Rick Byers, blink-dev, pain...@chromium.org
Yes, for sure. Thanks for pointing these out, I might have missed all the details otherwise.
 
 
In Chrome, I hacked the opacity element which contains the animating menu to have transform-style: flat and couldn't observe a significant difference in this page. In this case the opacity of 0.95 would be applied to the whole menu post-flattening rather than each menu page.

Ah cool, so pretty subtle (perhaps non-observable if it's on a white background anyway).  In the Firefox case 2 of the 3 menu pages were reported to become completely white, which means the site was badly broken in practice.

Yes, quite subtle.
 

Normally breaking changes with non-zero usage go through at least one milestone with a deprecation warning and link to guidance to avoid the issue (especially when we know of substantial breakage in some site in practice).  Any reason not to do that here?

In this case there is no web api changing, it's that they have a very subtle issue with opacity. Given the percentage of sites and the difficulty of explaining the change, I'm not so sure it's worth it.

I think whether there's an "API change" or not is irrelevant - what's relevant is just the risk that some website will become broken as a result of our change (potentially costing businesses money and forcing developers to scramble - although our beta channel generally does a decent job of mitigating the risk here).  We're breaking our contract with developers (especially since this isn't just a bug-fix but a spec change), so we need to be thoughtful about keeping disruption to an absolute minimum and giving developers every chance to transition smoothly.

That said, if it's indeed so subtle as to be unlikely to be a problem in practice in most of those 0.006% cases, then I agree we could waste more time and cause more pain than we'd save by trying to warn about it.

Agreed, fair points.
 
   

Any point in doing outreach to known affected sites?  Eg. any chance there's an open source library in use in multiple places we could submit a PR against?

In this case the website theme comes from http://themeforest.net/user/avathemes. I'm happy to reach out to them.

Cool, that would be great!

I'll do that in parallel with shipping the change. By the way, I neglected to mention in the original email, but there is generally an easy workaround for the developer: to push the opacity CSS down to individual 3d-positioned elements when that is the desired rendering.
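
For example, a rough sketch of that workaround (the selectors here are made up; the real ones depend on the page):

    /* Opacity on the preserve-3d container now forces flattening: */
    .menu {
      transform-style: preserve-3d;
      opacity: 0.95;
    }

    /* Workaround: keep the container fully opaque and push the opacity
       down onto each 3D-positioned child instead: */
    .menu {
      transform-style: preserve-3d;
    }
    .menu > .page {
      opacity: 0.95;  /* each .page already carries its own 3D transform */
    }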

 



Please add an entry we can point to with a brief summary of the observable behavior change and how developers can fix issues, just in case some site does end up being impacted.

Yes I'll do that. Similar to other folks' recent feedback, I was holding off on making such an entry until I had received more feedback on whether this Intent would succeed.
 

LGTM1 to ship assuming we don't come across a real site that's badly broken as a result.  If there's at least one (or maybe two) found (eg. during beta), that suggests to me there are likely many more lurking and that we should probably do a deprecation period instead to mitigate the compat risk.  Sound reasonable?

Fair enough. I will come back for more feedback if I see significant broken-sites problems.

Other owners, what say you? (I will of course not vote on this Intent.)
 

Alex Russell

Jun 8, 2016, 3:24:12 PM
to Chris Harrelson, Rick Byers, blink-dev, pain...@chromium.org
LGTM. 

chris....@gmail.com

Oct 14, 2016, 8:44:18 AM
to paint-dev, chri...@chromium.org, rby...@chromium.org, blin...@chromium.org
If I understand this correctly, this means that you cannot fade in a transformed object at all anymore? I found this issue while searching for the cause of a pretty ugly regression in a presentation I wrote in 2013. I made a quick YouTube video because direct linking to my slide isn't possible.

Youtube: https://www.youtube.com/watch?v=DyHq20oXnOQ
Presentation: https://rupl.github.io/unfold/ (found on slide 7, and you must advance using left/right arrow keys to observe issue)

If this was done for some performance reason I get it. Y'all gotta do what you do. But it still seems quite unfriendly to an unknowing CSS author.


> there is generally an easy workaround for the developer: to push the opacity CSS down to individual 3d-positioned elements when that is the desired rendering.

I get it, but that's not how opacity works on the other 99.994% of elements in the wild. It's not easy to discover the issue, even if the solution is simple.

Alex Nicolaou

Oct 14, 2016, 10:01:00 AM
to chris....@gmail.com, paint-dev, chri...@chromium.org, rby...@chromium.org, blin...@chromium.org
It's been a while since I did any CSS hacking but we used to use opacity instead of display: to be able to smoothly make elements appear and disappear without triggering layout. Typically opacity: 0.0001 was used to make an element vanish, and opacity: 1 to make it reappear. The elements were usually also transformed via a 3D transform to put them in position. Elements of the same opacity were not related.
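
(For concreteness, a sketch of that technique; class names are made up:)

    .panel {
      transform: translate3d(120px, 0, 0);  /* positioned via a 3D transform */
      transition: opacity 0.3s;
    }
    .panel.hidden  { opacity: 0.0001; }  /* effectively invisible, no layout change */
    .panel.visible { opacity: 1; }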

Does this flattening change impact tricks like that? Are there well established alternatives now for avoiding the problems with display: none?

alex


Christian Biesinger

Oct 14, 2016, 11:17:51 AM
to Alex Nicolaou, blink-dev, Chris Harrelson, chris....@gmail.com, paint-dev, rby...@chromium.org

Out of curiosity, why use opacity instead of visibility: hidden?

-Christian



Chris Ruppel

Oct 14, 2016, 11:26:40 AM
to Christian Biesinger, Alex Nicolaou, blink-dev, Chris Harrelson, paint-dev, rby...@chromium.org
Opacity can be transitioned or animated, like in my example. Visibility and display cannot.
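
(A common sketch of that pattern, combining opacity with visibility so the hidden element also stops receiving clicks; class names are made up:)

    .overlay {
      opacity: 0;
      visibility: hidden;
      /* delay the visibility flip until the opacity fade-out finishes */
      transition: opacity 0.3s, visibility 0s 0.3s;
    }
    .overlay.open {
      opacity: 1;
      visibility: visible;
      transition: opacity 0.3s;
    }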

Chris Harrelson

Oct 14, 2016, 12:47:56 PM
to chris....@gmail.com, paint-dev, Rick Byers, blink-dev
Hi,

On Fri, Oct 14, 2016 at 5:44 AM, <chris....@gmail.com> wrote:
If I understand this correctly, this means that you cannot fade in a transformed object at all anymore? I found this issue while searching for the cause of a pretty ugly regression in a presentation I wrote in 2013. I made a quick YouTube video because direct linking to my slide isn't possible.

Youtube: https://www.youtube.com/watch?v=DyHq20oXnOQ
Presentation: https://rupl.github.io/unfold/ (found on slide 7, and you must advance using left/right arrow keys to observe issue)

If this was done for some performance reason I get it. Y'all gotta do what you do. But it still seems quite unfriendly to an unknowing CSS author.

I'm sorry you were hit by this change. We do our best to minimize such impact, but unfortunately all changes affect someone. :(
 


> there is generally an easy workaround for the developer: to push the opacity CSS down to individual 3d-positioned elements when that is the desired rendering.

I get it, but that's not how opacity works on the other 99.994% of elements in the wild. It's not easy to discover the issue, even if the solution is simple.

It actually changes opacity to behave the same in 3D as it does in other contexts. Now we draw all children into a single texture and then apply opacity to that. In regular "2D" layout, this is exactly how opacity behaves. Previously we had special code to push the opacity down to the "leaves" of the 3D context, which is very confusing to explain and (partly as a result) was implemented inconsistently between browsers. The spec agrees with our current implementation, and I think other browsers are likely to follow suit soon.
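
(A sketch of the observable difference, with made-up markup: two overlapping 3D-positioned children under a half-transparent preserve-3d parent.)

    <style>
      .group { transform-style: preserve-3d; opacity: 0.5; }
      .a, .b { position: absolute; width: 100px; height: 100px; }
      .a { background: red;  transform: translateZ(10px); }
      .b { background: blue; transform: translateZ(20px) translateX(50px); }
    </style>
    <div class="group">
      <div class="a"></div>
      <div class="b"></div>
    </div>

With the old per-leaf behavior each square is 50% transparent on its own, so red shows through blue where they overlap. With group opacity the two squares are first drawn fully opaque into one texture (blue covers red in the overlap) and then that texture is composited at 50%, which is how the same markup renders when no 3D transforms are involved.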

Chris
 


Rik Cabanier

Oct 14, 2016, 1:12:45 PM
to Alex Nicolaou, chris....@gmail.com, paint-dev, Chris Harrelson, rby...@chromium.org, blink-dev
On Fri, Oct 14, 2016 at 7:00 AM, 'Alex Nicolaou' via blink-dev <blin...@chromium.org> wrote:
It's been a while since I did any CSS hacking but we used to use opacity instead of display: to be able to smoothly make elements appear and disappear without triggering layout. Typically opacity: 0.0001 was used to make an element vanish, and opacity: 1 to make it reappear. The elements were usually also transformed via a 3D transform to put them in position. Elements of the same opacity were not related.

Changing opacity always triggered layout since a value < 1 creates a new stacking context. See https://www.w3.org/TR/css3-color/#transparency
 
Does this flattening change impact tricks like that?

No. This change removes the hack that distributes alpha for 3d transforms.
 
Are there well established alternatives now for avoiding the problems with display: none?

I'm unsure what problem you're trying to avoid with display: none? Is it that you can't animate it?
 


/#!/JoePea

Oct 14, 2016, 2:55:51 PM
to Rik Cabanier, Alex Nicolaou, chris....@gmail.com, paint-dev, Chris Harrelson, rby...@chromium.org, blink-dev



On Fri, Oct 14, 2016 at 10:12 AM, Rik Cabanier <caba...@gmail.com> wrote:


On Fri, Oct 14, 2016 at 7:00 AM, 'Alex Nicolaou' via blink-dev <blin...@chromium.org> wrote:
It's been a while since I did any CSS hacking but we used to use opacity instead of display: to be able to smoothly make elements appear and disappear without triggering layout. Typically opacity: 0.0001 was used to make an element vanish, and opacity: 1 to make it reappear. The elements were usually also transformed via a 3D transform to put them in position. Elements of the same opacity were not related.

Changing opacity always triggered layout since a value < 1 creates a new stacking context. See https://www.w3.org/TR/css3-color/#transparency
 
Does this flattening change impact tricks like that?

No. This change removes the hack that distributes alpha for 3d transforms.

That's not necessarily a hack. It is valid behavior in many 3D engines for people who desire that behavior. For reference, here is a list of people I've asked for opinions from. Of all who've responded with an opinion, so far all agree that the non-flattening 3D behavior is preferable over the flattening behavior:
I've only asked for opinions from people developing web-based libraries, but I'm sure we'll get more support for the 3D non-flattening behavior if we ask "native" developers. I also haven't included my library in that list, but that's obviously another +1 in support for 3D non-flattening behavior over flattening behavior.

Perhaps this is the type of research that we should do before we make absolute decisions in specs that will impact the future of arguably the most widely-used APIs known to mankind (web APIs).

The spec may say that opacity creates new stacking contexts (or whatever), but that doesn't mean that the existing feature should dictate the future of 3D in the web. Let's consider how we can modify the spec so it can make sense for both existing 2D APIs and brand-new 3D APIs in an intuitive manner from the perspective of the end-web-developer (flattening is simply not intuitive).


Elliott Sprehn

Oct 14, 2016, 4:18:21 PM
to Rik Cabanier, Alex Nicolaou, chris....@gmail.com, paint-dev, Chris Harrelson, Rick Byers, blink-dev
On Fri, Oct 14, 2016 at 10:12 AM, Rik Cabanier <caba...@gmail.com> wrote:


Changing opacity always triggered layout since a value < 1 creates a new stacking context. See https://www.w3.org/TR/css3-color/#transparency
 

fwiw will-change: opacity removes this layout by always triggering a stacking context. Authors also used something very subtly less than 1, like 0.999, as another hack to avoid the layout. :)
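
(Sketch, with a made-up class name:)

    .fade-target {
      will-change: opacity;   /* creates a stacking context up front */
      transition: opacity 0.2s;
    }
    /* Older hack with the same effect: start imperceptibly below 1 */
    .fade-target-legacy {
      opacity: 0.999;
    }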

- E