CSS element blending in the compositor


Ion Rosca

Aug 21, 2013, 11:48:11 AM
to graphics-dev, chromi...@chromium.org

Hi everyone,

 

I would like to start working on the CSS element blending (mix-blend-mode) implementation according to the Compositing and Blending spec [1]. There is already some work done in Blink: the mix-blend-mode CSS property [2] is exposed behind the experimental Web Platform features runtime flag, but it has no visual effect on elements with a valid blend mode specified.

 

The software path uses Skia, which already has full support for blending. It’s already been used for implementing background-blend-mode [5] and canvas blending [6] in Blink.

Using blending with accelerated layers requires support in the compositor, so that accelerated layers can blend with the underlying content within the parent rendering surface. The OpenGL blending [3] equations are not sufficient to implement all the blend modes required by the spec [1].

 

As far as I know, there are two options for supporting blending in the compositor: pixel programs (shaders) implementing the blending formulas, or Skia with the hardware backend.

 

Skia with the hardware backend already supports all the required blend modes [4] and is currently used for blending within accelerated canvases [6]. We can take advantage of Skia's blend modes by using the offscreen context in the compositor: wrap compositor resources in Skia objects, composite the content with the backdrop using the specified mix-blend-mode, and hand the result texture back to the compositor. Using Skia in the compositor would also ensure consistency between canvas blending and element blending (both hardware accelerated).

 

Is it ok to take advantage of the Skia hardware blending implementation as described above? Otherwise, should we implement it with pixel programs in the compositor, or are there other alternatives you can suggest?

 

For both of the above solutions we need the backdrop in a separate texture, either to use it in Skia or to pass it to the pixel programs as input (as far as I know, there is no reliable way to read the backdrop pixels directly from a GPU program). Blending requires both the color and alpha information of the backdrop surface, so the backdrop should be stored temporarily in a GL_RGBA texture. The compositor resource provider creates GL_RGBA textures using texStorage2DEXT, which makes them immutable, so the copyTexImage2D call from GetFramebufferTexture will certainly fail. Is there another way to copy pixels into an RGBA temporary resource, or is it acceptable to create this texture outside the resource provider?

 

Thanks,

Ion.

 

[1] http://dev.w3.org/fxtf/compositing-1/#csskeywords

[2] http://dev.w3.org/fxtf/compositing-1/#mix-blend-mode

[3] http://www.opengl.org/wiki/Blending

[4] http://dev.w3.org/fxtf/compositing-1/#blending

[5] http://dev.w3.org/fxtf/compositing-1/#background-blend-mode

[6] http://dev.w3.org/fxtf/compositing-1/#canvascompositingandblending

 

Antoine Labour

Aug 21, 2013, 12:48:38 PM
to Ion Rosca, graphics-dev, chromi...@chromium.org
On Wed, Aug 21, 2013 at 8:48 AM, Ion Rosca <ro...@adobe.com> wrote:



You may want to look at the "background filter" implementation in the compositor which should be close to what you want to achieve, assuming a Skia filter that does the blending.

Your point about copyTexImage2D on immutable textures is correct. I suppose it currently works because in the background filters implementation we create a RGB (instead of RGBA) resource which doesn't go through texStorage2DEXT. But that's fragile... It'd be easy to replace it with a copyTexSubImage2D if the texture may be immutable (comes from the RP), or maybe not go through the RP to allocate it (I'm not sure it provides value).

Antoine

ro...@adobe.com

Aug 21, 2013, 5:38:14 PM
to graphi...@chromium.org, Antoine Labour, Ion Rosca, chromi...@chromium.org, epe...@google.com


On Wednesday, August 21, 2013 8:57:38 PM UTC+3, Eric Penner wrote:

On Wed, Aug 21, 2013 at 10:49 AM, Antoine Labour <pi...@chromium.org> wrote:



On Wed, Aug 21, 2013 at 10:13 AM, Eric Penner <epe...@google.com> wrote:




On Wed, Aug 21, 2013 at 9:48 AM, Antoine Labour <pi...@chromium.org> wrote:





Hold on, texStorage2D just makes the texture definition immutable right?

Right. copyTexImage2D modifies the dimensions and internal format, hence why it's disallowed on immutable textures.
 
You can still modify the texture contents, right?

Yes, hence why copyTexSubImage2D works.
 
Or am I missing what you are trying to do?

One other note: you may want to prefer an FBO blit

If you're thinking about glBlitFramebuffer, it requires an extension, hence not guaranteed.
 
over copyTexSubImage. I believe copyTexSubImage is often done on the CPU (slow path).

Do you have data for that? We use glCopyTexSubImage2D in many places, e.g. to handle damage for partial swaps and other things. I've never heard this being a performance problem. I'd be surprised that glCopyTexImage2D would be accelerated but not glCopyTexSubImage2D. Also, I don't see why a driver would support glBlitFramebuffer but not glCopyTexSubImage2D which is a simpler subset.

Antoine
 

+Others again as I dropped them on a separate email.

Ahh I was missing glCopyTexSubImage/glCopyTexImage distinction.

Regarding blit, I just meant binding it as an FBO and blitting into it with an empty shader to do the copy.  However I don't have good numbers to show copyTex is slow, so let's forget that one for now.
 
I tried to use copyTexSubImage2D with an RGBA texture created through the RP and it seems to work well, so I will start uploading some patches. Thanks for your help.

Sami Kyostila

Aug 22, 2013, 8:38:23 AM
to ro...@adobe.com, graphics-dev, Antoine Labour, chromi...@chromium.org, Eric Penner
FWIW, here's an instance of where glCopyTexSubImage2D was 2x slower
than an equivalent blit:
https://bugs.webkit.org/show_bug.cgi?id=80870.

- Sami

Dana Jansens

Aug 22, 2013, 12:54:19 PM
to Sami Kyostila, ro...@adobe.com, graphics-dev, Antoine Labour, chromi...@chromium.org, Eric Penner
On Thu, Aug 22, 2013 at 8:38 AM, Sami Kyostila <skyo...@google.com> wrote:
FWIW, here's an instance of where glCopyTexSubImage2D was 2x slower
than an equivalent blit:
https://bugs.webkit.org/show_bug.cgi?id=80870.

Should we be banning this method then with a PRESUBMIT or something? Our readbacks for tab casting use it.

Sami Kyostila

Aug 23, 2013, 2:16:35 PM
to Dana Jansens, ro...@adobe.com, graphics-dev, Antoine Labour, chromi...@chromium.org, Eric Penner
There are some legitimate uses -- for example if the texture format
isn't renderable for some reason -- so maybe just a presubmit warning
would be more appropriate.

- Sami

Stephen White

Aug 29, 2013, 2:31:06 PM
to Antoine Labour, Ion Rosca, graphics-dev, chromi...@chromium.org
On Wed, Aug 21, 2013 at 12:48 PM, Antoine Labour <pi...@chromium.org> wrote:





You may want to look at the "background filter" implementation in the compositor which should be close to what you want to achieve, assuming a Skia filter that does the blending.

There is a SkXfermodeImageFilter which encapsulates all the Skia blending modes (including the non-GL-expressible ones) as an image filter. It does require that both its inputs are already in textures (texture-backed SkBitmaps), but saves on a texture draw in that case, and applies in a single pass.

However, there is currently no path that applies an SkImageFilter as a background filter. It would be easy enough to add one -- basically implement setBackgroundFilter(), where that is to setBackgroundFilters() as setFilter() is to setFilters(). (I had hoped to get rid of the setFilter() path entirely, but it looks like ajuma@'s work to move filter animations to the impl thread will be making setFilters() the preferred path, deprecating setFilter() in the long term, so you might want to coordinate with him before embarking on this work.)

All that said, it might just be easier to use the background filters code as a model of how to get the backdrop, and implement a new path for the non-GL-expressible blend modes, using SkXfermode but not image filters, if that makes sense.

Stephen

ro...@adobe.com

Sep 2, 2013, 8:20:58 AM
to graphi...@chromium.org, Antoine Labour, Ion Rosca, chromi...@chromium.org


On Thursday, August 29, 2013 9:31:06 PM UTC+3, Stephen White wrote:

That approach makes sense; it is what I'm working on now (getting the backdrop as the background filters do and using SkXfermode for the blend).

Regarding filters, I looked into them and I can't see how I would use setBackgroundFilter() with an SkImageFilter (SkXfermodeImageFilter) to make the foreground blend with the background. Background filters change the background pixels and leave the foreground pass unchanged, whereas with the blending feature we want to change the foreground pixels and leave the background unchanged. Maybe I misunderstood something...
Though, if you want to avoid adding a new setBlendMode() method to the Layer interface, I think we could set the blend mode using a FilterEffect (FEBlend) through setFilters() and translate it to SkXfermodeImageFilter (or directly to SkXfermode) in the compositor (RenderSurfaceFilters::Apply). What do you think of this?

Ion.
