Intent to Implement and Ship: imageSmoothingQuality attribute for CanvasRenderingContext2D


Owen Min

Oct 27, 2015, 4:25:48 PM
to blin...@chromium.org, ju...@chromium.org

Contact emails

zm...@chromium.org, ju...@chromium.org


Spec

https://html.spec.whatwg.org/#imagesmoothingquality


Summary

Add support for the imageSmoothingQuality attribute on CanvasRenderingContext2D. It lets web developers choose the quality/performance trade-off used when scaling images. There are three options: low, medium, and high.
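
For example (illustrative sketch; canvas and bigImage are placeholder variables):

  const ctx = canvas.getContext('2d');
  ctx.imageSmoothingEnabled = true;        // smoothing must be enabled for the hint to apply
  ctx.imageSmoothingQuality = 'high';      // 'low' | 'medium' | 'high'
  ctx.drawImage(bigImage, 0, 0, 200, 150); // the scaled draw uses the requested quality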


Motivation

It gives developers more choices for handling different situations. Currently, developers have only two choices, on and off, through the imageSmoothingEnabled attribute.


Compatibility Risk

It’s a brand-new attribute. Given that it’s just a rendering-quality hint, the compatibility risk is very low.


Ongoing technical constraints

Skia’s mipmap implementation is broken on Chrome OS, which prevents the use of the ‘medium’ and ‘high’ qualities. The solution will be either to fix the issue or to ignore this attribute on Chrome OS.


Will this feature be supported on all six Blink platforms (Windows, Mac, Linux, Chrome OS, Android, and Android WebView)?

Yes.


OWP launch tracking bug

https://code.google.com/p/chromium/issues/detail?id=540761


Link to entry on the feature dashboard

https://www.chromestatus.com/features/4717415682277376


Requesting approval to ship?

Yes.


Demo link

Screenshot: (attached)


Source code of demo:

Chris Harrelson

Oct 27, 2015, 4:41:42 PM
to Owen Min, blin...@chromium.org, ju...@chromium.org
I like this feature.

What is the feedback from other vendors and developers?


Justin Novosad

Oct 27, 2015, 4:59:23 PM
to Chris Harrelson, Owen Min, blink-dev
On Tue, Oct 27, 2015 at 4:41 PM, Chris Harrelson <chri...@chromium.org> wrote:
I like this feature.

What is the feedback from other vendors and developers?

It is already implemented in WebKit; no news from other vendors. The addition to the spec was written by Edward O'Connor from Apple.
Requests by web developers for this feature have come up on the WhatWG mailing list a few times over the years.
In this thread the feature is strongly requested by K. Gadd, a prominent web developer who has given us insightful feedback on canvas features on many occasions, as well as by Simon Sarris (author of HTML5 Unleashed).

Tom Hudson

Oct 27, 2015, 5:01:41 PM
to Owen Min, blink-dev, Justin Novosad
On Tue, Oct 27, 2015 at 4:25 PM, Owen Min <zm...@chromium.org> wrote:


Ongoing technical constraints

Skia’s mipmap implementation is broken on Chrome OS, which prevents the use of the ‘medium’ and ‘high’ qualities. The solution will be either to fix the issue or to ignore this attribute on Chrome OS.



Do you have a pointer to the bug filed for this? As a Skia team member, this is a surprise to me.

Justin Novosad

Oct 27, 2015, 5:13:03 PM
to Tom Hudson, Owen Min, blink-dev
A few months ago, before this feature was in the spec, I attempted to switch CanvasRenderingContext2D.drawImage over to medium filtering quality, and it broke stuff on ChromeOS and got reverted. That saga is documented here: https://code.google.com/p/chromium/issues/detail?id=498356
It is not clear where this should be fixed: in the Chrome OS Mesa driver, with a workaround in the GPU command buffer, or with a workaround in Skia? All I know is that the way Skia implements GPU-accelerated medium filtering quality does not work on several Chrome OS devices.

Rick Byers

Oct 27, 2015, 7:23:44 PM
to Justin Novosad, Tom Hudson, Owen Min, blink-dev
Given the small surface area of this API (and the relative ease of changing it later if necessary) and the precedent in WebKit, I agree it's low risk. I see AnneVk approved the pull request, which I guess counts as some sign of support from Mozilla. LGTM1 to ship.

Chris Harrelson

Oct 27, 2015, 8:04:59 PM
to Rick Byers, Justin Novosad, Tom Hudson, Owen Min, blink-dev
LGTM2

Not being able to control image resampling quality is a significant performance limitation that will be good to remove.

Rick Byers

Oct 27, 2015, 8:52:00 PM
to Chris Harrelson, Justin Novosad, Tom Hudson, Owen Min, blink-dev
Note this comment from roc@ (Mozilla):

> If we use generic quality levels, will authors get upset when they find that a given quality level looks worse/different on one browser than another?
...
I'm still concerned but not enough to try to block shipping it.

I've updated the chromestatus entry to say "public skepticism" for Mozilla, but I still support shipping as-is. Worst case, if we discover this is a real interop issue in practice, it seems we have options for evolving the API or better aligning implementations. Alignment with WebKit outweighs the potential concern here, IMHO.

Dongseong Hwang

Oct 28, 2015, 3:29:15 AM
to blink-dev, chri...@chromium.org, ju...@chromium.org, tomh...@google.com, zm...@chromium.org
LGTM3

The spec was improved with the Skia implementation in mind :) https://github.com/whatwg/html/issues/98

Justin Novosad

Oct 28, 2015, 10:50:22 AM
to Dongseong Hwang, blink-dev, Chris Harrelson, Tom Hudson, Owen Min
Something I forgot to mention: there is also support for this feature from Google Apps. Both Google Photos and Google Slides have independently implemented workarounds that involve constructing downsampled image pyramids in JavaScript to compensate for 2D canvas's lack of a high-quality downsizing option. Their approaches are quite inefficient compared to native GPU mipmaps. This feature is also very useful for presenting high-quality thumbnails in gallery-style apps.
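
For reference, a minimal sketch of the kind of workaround being described (illustrative only, not actual Photos/Slides code): repeatedly halve the source through intermediate canvases so each bilinear step stays close to a 2:1 ratio, then draw the last step at the target size.

  function drawDownscaled(ctx, img, dw, dh) {
    let src = img, sw = img.width, sh = img.height;
    while (sw / 2 >= dw && sh / 2 >= dh) {
      const step = document.createElement('canvas');
      step.width = Math.ceil(sw / 2);
      step.height = Math.ceil(sh / 2);
      step.getContext('2d').drawImage(src, 0, 0, sw, sh, 0, 0, step.width, step.height);
      src = step; sw = step.width; sh = step.height;
    }
    ctx.drawImage(src, 0, 0, sw, sh, 0, 0, dw, dh); // final resize to the target size
  }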

On Wed, Oct 28, 2015 at 3:29 AM, Dongseong Hwang <dongseo...@intel.com> wrote:
LGTM3

Thanks for the support, but you are not an API owner (https://code.google.com/p/chromium/codesearch#chromium/src/third_party/WebKit/OWNERS). To avoid confusion, please write "non-owner lgtm" in the future.
Still 2 official LGTMs so far.

Mike Lawther

Oct 28, 2015, 8:45:56 PM
to Justin Novosad, Dongseong Hwang, blink-dev, Chris Harrelson, Tom Hudson, Owen Min
I agree that this sounds good, but I've got some implementation questions.

The spec defines 'low', 'medium' and 'high' for image smoothing, leaving it up to the UA to decide what those mean. As an aside, it bugs me a bit that we've got the barnacle of a boolean to switch smoothing on/off, and now a separate enum for quality. I'd prefer a single enum - off/low/medium/high. C'est la vie web.

What do they mean for Blink? For example, will low = bilinear, medium = bicubic and high = lanczos? Will they be the same for scaling up / down? Are they fixed or will we reserve the right to vary them depending on the power/abilities of the device?

From a performance perspective - developers who want to maximise performance at the cost of quality can already set imageSmoothingEnabled to false, and the spec mandates nearest neighbour interpolation. Implementing this will allow a developer to choose to use more cycles for high quality, so I don't really see it as removing a performance limitation.

This also sounds similar to https://drafts.csswg.org/css-images-3/#the-image-rendering - which applies to the whole canvas element. Implementation wise they aren't likely to intersect much?

Owen Min

Oct 29, 2015, 10:41:13 AM
to Mike Lawther, Justin Novosad, Dongseong Hwang, blink-dev, Chris Harrelson, Tom Hudson
Replies inline.

On Wed, Oct 28, 2015 at 8:45 PM, Mike Lawther <mikel...@chromium.org> wrote:
I agree that this sounds good - I've got some implemention questions though.

The spec defines 'low', 'medium' and 'high' for image smoothing, leaving it up to the UA to decide what those mean. As an aside, it bugs me a bit that we've got the barnacle of a boolean to switch smoothing on/off, and now a separate enum for quality. I'd prefer a single enum - off/low/medium/high. C'est la vie web.
 
When imageSmoothingEnabled is set to false, the filter we use is nearest neighbor. But we still need to retain the value of the imageSmoothingQuality attribute, which takes effect again when imageSmoothingEnabled is set back to true.
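
In other words (sketch; canvas is a placeholder):

  const ctx = canvas.getContext('2d');
  ctx.imageSmoothingQuality = 'high';
  ctx.imageSmoothingEnabled = false; // nearest neighbor is used while disabled
  // ctx.imageSmoothingQuality still reports 'high'
  ctx.imageSmoothingEnabled = true;  // the 'high' preference takes effect again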

 
What do they mean for Blink? For example, will low = bilinear, medium = bicubic and high = lanczos? Will they be the same for scaling up / down? Are they fixed or will we reserve the right to vary them depending on the power/abilities of the device?
 
Low means bilinear for both. Medium means mipmaps for down-scaling and bilinear for up-scaling. High means bicubic for up-scaling and mipmaps for down-scaling. Only the algorithm for ‘off’ is fixed, which is nearest neighbor.

 
From a performance perspective - developers who want to maximise performance at the cost of quality can already set imageSmoothingEnabled to false, and the spec mandates nearest neighbour interpolation. Implementing this will allow a developer to choose to use more cycles for high quality, so I don't really see it as removing a performance limitation.
 
Setting imageSmoothingEnabled to false doesn't change performance much, especially on the GPU, where we get bilinear for free. Also, developers sometimes set imageSmoothingEnabled to false just to get retro-style blocky pixels.
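
For example, the retro use case looks roughly like this (sketch; sprite is a placeholder 16x16 image):

  ctx.imageSmoothingEnabled = false;                   // nearest neighbor
  ctx.drawImage(sprite, 0, 0, 16, 16, 0, 0, 160, 160); // crisp 10x pixel-art upscale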
 

This also sounds similar to https://drafts.csswg.org/css-images-3/#the-image-rendering - which applies to the whole canvas element. Implementation wise they aren't likely to intersect much?

No, drawing a picture to the canvas is a code path independent of drawing the canvas to the page.

Mike Lawther

Oct 29, 2015, 7:28:32 PM
to Owen Min, Justin Novosad, Dongseong Hwang, blink-dev, Chris Harrelson, Tom Hudson
Thanks for the extra info.

On 30 October 2015 at 01:40, Owen Min <zm...@google.com> wrote:
On Wed, Oct 28, 2015 at 8:45 PM, Mike Lawther <mikel...@chromium.org> wrote:
The spec defines 'low', 'medium' and 'high' for image smoothing, leaving it up to the UA to decide what those mean. As an aside, it bugs me a bit that we've got the barnacle of a boolean to switch smoothing on/off, and now a separate enum for quality. I'd prefer a single enum - off/low/medium/high. C'est la vie web.
 
When imageSmoothingEnabled is set to false, the filter we use is nearest neighbor. But we still need to retain the value of the imageSmoothingQuality attribute, which takes effect again when imageSmoothingEnabled is set back to true.

But only because the spec requires you to :) I still think it's a barnacle, but not a big deal.
 
What do they mean for Blink? For example, will low = bilinear, medium = bicubic and high = lanczos? Will they be the same for scaling up / down? Are they fixed or will we reserve the right to vary them depending on the power/abilities of the device?
 
Low means bilinear for both. Medium means mipmaps for down-scaling and bilinear for up-scaling. High means bicubic for up-scaling and mipmaps for down-scaling. Only the algorithm for ‘off’ is fixed, which is nearest neighbor.

Cool, thanks. In our communication about this to developers, will we guarantee that these mappings of low/med/high x down/upscale to algorithms will be fixed forever? Do you think we'll vary them depending on device capabilities or even in response to something like a low power mode?
 
From a performance perspective - developers who want to maximise performance at the cost of quality can already set imageSmoothingEnabled to false, and the spec mandates nearest neighbour interpolation. Implementing this will allow a developer to choose to use more cycles for high quality, so I don't really see it as removing a performance limitation.
 
Setting imageSmoothingEnabled to false doesn't change performance much, especially on the GPU, where we get bilinear for free. Also, developers sometimes set imageSmoothingEnabled to false just to get retro-style blocky pixels.

It sounds like you're agreeing that this feature does not really remove a performance limitation?
 
This also sounds similar to https://drafts.csswg.org/css-images-3/#the-image-rendering - which applies to the whole canvas element. Implementation wise they aren't likely to intersect much?

No, drawing a picture to the canvas is a code path independent of drawing the canvas to the page.

Understood.

Philip Jägenstedt

Oct 30, 2015, 5:50:22 AM
to Rick Byers, Robert O'Callahan, Chris Harrelson, Justin Novosad, Tom Hudson, Owen Min, blink-dev
I share roc's doubts, and would not be surprised to see UA sniffing to pick the mode once testing reveals that one browser's "medium" is about as fast as another's "high". In the best case reverse engineering will sort it out, but it only takes one popular library making assumptions to lock all browsers into their initial, incompatible, interpretations.

However, this is already implemented in WebKit, and both SkFilterQuality and CGInterpolationQuality use none/low/medium/high.

pet...@opera.com tells me that the OpenGL quality levels are GL_LINEAR and friends, and it looks like SkFilterQuality values are mapped to these via GrTextureParams::FilterMode. However, GrSkFilterQualityToGrFilterMode isn't a direct mapping for the medium and high levels.

As far as I can tell, the only way to make this fully interoperable would be to assume that 2D canvas is backed by GL and to have two settings corresponding to GL_TEXTURE_MIN_FILTER and GL_TEXTURE_MAG_FILTER. That would be weird, and you can already do that using WebGL.
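
For comparison, this is the explicit per-texture control WebGL already exposes (standard WebGL 1 calls; webglCanvas and image are placeholders):

  const gl = webglCanvas.getContext('webgl');
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  gl.generateMipmap(gl.TEXTURE_2D); // requires power-of-two dimensions in WebGL 1
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);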

The API is very useful, so unless another API proposal emerges, I'm inclined to say that we should accept the risk. However, I'd like to wait for at least a few more days in case of objections.

Philip

Philip Jägenstedt

Oct 30, 2015, 5:58:06 AM
to Mike Lawther, Owen Min, Justin Novosad, Dongseong Hwang, blink-dev, Chris Harrelson, Tom Hudson
On Fri, Oct 30, 2015 at 12:28 AM, Mike Lawther <mikel...@chromium.org> wrote:
Thanks for the extra info.

On 30 October 2015 at 01:40, Owen Min <zm...@google.com> wrote:
On Wed, Oct 28, 2015 at 8:45 PM, Mike Lawther <mikel...@chromium.org> wrote:
The spec defines 'low', 'medium' and 'high' for image smoothing, leaving it up to the UA to decide what those mean. As an aside, it bugs me a bit that we've got the barnacle of a boolean to switch smoothing on/off, and now a separate enum for quality. I'd prefer a single enum - off/low/medium/high. C'est la vie web.
 
When imageSmoothingEnabled is set to false, the filter we use is nearest neighbor. But we still need to retain the value of the imageSmoothingQuality attribute, which takes effect again when imageSmoothingEnabled is set back to true.

But only because the spec requires you to :) I still think it's a barnacle, but not a big deal.

We can change the spec if you have some idea of how to define imageSmoothingEnabled in terms of imageSmoothingQuality. If a "none" state were added, it would be pretty easy, I think.
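
Roughly, a single-enum view could behave like this wrapper sketch (hypothetical helper; 'none' is not a value in the current spec):

  // Hypothetical: treat 'none' as the off state of a single quality enum.
  function setSmoothing(ctx, quality) { // quality: 'none' | 'low' | 'medium' | 'high'
    ctx.imageSmoothingEnabled = quality !== 'none';
    if (quality !== 'none') ctx.imageSmoothingQuality = quality;
  }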

Philip

Justin Novosad

Oct 30, 2015, 11:39:59 AM
to Philip Jägenstedt, Rick Byers, Robert O'Callahan, Chris Harrelson, Tom Hudson, Owen Min, blink-dev
On Fri, Oct 30, 2015 at 5:50 AM, Philip Jägenstedt <phi...@opera.com> wrote:
I share roc's doubts, and would not be surprised to see UA sniffing to pick the mode once testing reveals that one browser's "medium" is about as fast as another's "high". In the best case reverse engineering will sort it out, but it only takes one popular library making assumptions to lock all browsers into their initial, incompatible, interpretations.

The intent of ImageSmoothingQuality is to provide a hint --just a hint-- as to whether the implementation should prioritize speed or visual quality. To be honest, I think we could have done without 'medium', but I don't mind it. For years we've been stuck between the requirements of game devs, who want us to prioritize speed, and most other kinds of apps, which want the best image quality. Due to an obsession with benchmarks, we have been leaning more toward making game devs happy, to the detriment of others. In previous debates it was argued that filtering quality was a 'quality of implementation' issue that is up to browser implementors to figure out. That line of thought has led us to a situation where we had to make choices for the whole web, which will inevitably make one group of devs unhappy. Trying to auto-detect whether an app should prefer quality over speed (as has been suggested in the past) is likely to be perceived by web developers as unpredictable, idiosyncratic behavior. ImageSmoothingQuality is an answer to this problem. I share your concerns about this attribute being abused by people who will over-analyze it, but I still think it does a lot more good than harm to the platform. Do you have a better idea?

ImageSmoothingEnabled, on the other hand, is a style choice that is loaded with editorial intent. This is why it passed the 'should this be added to the standard' litmus test with much less debate.

The fact that one is a style choice and the other is a quality *hint* is a good reason to keep these two attributes separate IMHO.

Philip Jägenstedt

Oct 30, 2015, 3:14:26 PM
to Justin Novosad, Rick Byers, Robert O'Callahan, Chris Harrelson, Tom Hudson, Owen Min, blink-dev
On Fri, Oct 30, 2015 at 4:39 PM, Justin Novosad <ju...@chromium.org> wrote:


On Fri, Oct 30, 2015 at 5:50 AM, Philip Jägenstedt <phi...@opera.com> wrote:
I share roc's doubts, and would not be surprised to see UA sniffing to pick the mode once testing reveals that one browser's "medium" is about as fast as another's "high". In the best case reverse engineering will sort it out, but it only takes one popular library making assumptions to lock all browsers into their initial, incompatible, interpretations.

The intent of ImageSmoothingQuality is to provide a hint --just a hint-- as to whether the implementation should prioritize speed or visual quality. To be honest, I think we could have done without 'medium', but I don't mind it. For years we've been stuck between the requirements of game devs, who want us to prioritize speed, and most other kinds of apps, which want the best image quality. Due to an obsession with benchmarks, we have been leaning more toward making game devs happy, to the detriment of others. In previous debates it was argued that filtering quality was a 'quality of implementation' issue that is up to browser implementors to figure out. That line of thought has led us to a situation where we had to make choices for the whole web, which will inevitably make one group of devs unhappy. Trying to auto-detect whether an app should prefer quality over speed (as has been suggested in the past) is likely to be perceived by web developers as unpredictable, idiosyncratic behavior. ImageSmoothingQuality is an answer to this problem. I share your concerns about this attribute being abused by people who will over-analyze it, but I still think it does a lot more good than harm to the platform. Do you have a better idea?

That imageSmoothingQuality is just a hint is precisely the source of the compatibility risk. If the underlying frameworks supported it, my idea would be an enum of concrete algorithms, like "nearest", "linear", "bilinear", "bicubic", etc., and a way to test which ones are actually supported. Unfortunately, that doesn't seem like a realistic hope, as per my previous email.
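
A purely hypothetical sketch of that model, for illustration only (none of these names exist in any spec or engine):

  // Hypothetical API: authors ask for an exact algorithm rather than a hint.
  const preferred = ['lanczos', 'bicubic', 'bilinear'];
  const supported = preferred.filter(a => ctx.supportedSmoothingAlgorithms.includes(a)); // hypothetical
  ctx.imageSmoothingAlgorithm = supported[0] || 'nearest'; // hypothetical attribute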

So, unless a better idea comes along real soon now, I will support shipping this.

ImageSmoothingEnabled, on the other hand, is a style choice that is loaded with editorial intent. This is why it passed the 'should this be added to the standard' litmus test with much less debate.

The fact that one is a style choice and the other is a quality *hint* is a good reason to keep these two attributes separate IMHO.

I'm not so sure; the 4 states are in the same enum in both Skia and CoreGraphics, and having 6 combinations corresponding to 4 states is not great. But it doesn't matter, because imageSmoothingEnabled is not going away.

Philip 

Justin Novosad

Oct 30, 2015, 4:33:07 PM
to Philip Jägenstedt, Rick Byers, Robert O'Callahan, Chris Harrelson, Tom Hudson, Owen Min, blink-dev
On Fri, Oct 30, 2015 at 3:14 PM, Philip Jägenstedt <phi...@opera.com> wrote:
On Fri, Oct 30, 2015 at 4:39 PM, Justin Novosad <ju...@chromium.org> wrote:


On Fri, Oct 30, 2015 at 5:50 AM, Philip Jägenstedt <phi...@opera.com> wrote:
I share roc's doubts, and would not be surprised to see UA sniffing to pick the mode once testing reveals that one browser's "medium" is about as fast as another's "high". In the best case reverse engineering will sort it out, but it only takes one popular library making assumptions to lock all browsers into their initial, incompatible, interpretations.

The intent of ImageSmoothingQuality is to provide a hint --just a hint-- as to whether the implementation should prioritize speed or visual quality. To be honest, I think we could have done without 'medium', but I don't mind it. For years we've been stuck between the requirements of game devs, who want us to prioritize speed, and most other kinds of apps, which want the best image quality. Due to an obsession with benchmarks, we have been leaning more toward making game devs happy, to the detriment of others. In previous debates it was argued that filtering quality was a 'quality of implementation' issue that is up to browser implementors to figure out. That line of thought has led us to a situation where we had to make choices for the whole web, which will inevitably make one group of devs unhappy. Trying to auto-detect whether an app should prefer quality over speed (as has been suggested in the past) is likely to be perceived by web developers as unpredictable, idiosyncratic behavior. ImageSmoothingQuality is an answer to this problem. I share your concerns about this attribute being abused by people who will over-analyze it, but I still think it does a lot more good than harm to the platform. Do you have a better idea?

That imageSmoothingQuality is just a hint is precisely the source of the compatibility risk. If the underlying frameworks supported it, my idea would be an enum of concrete algorithms, like "nearest", "linear", "bilinear", "bicubic", etc., and a way to test which ones are actually supported. Unfortunately, that doesn't seem like a realistic hope, as per my previous email.

So, unless a better idea comes along real soon now, I will support shipping this.

IMHO, the enum value names should perhaps reflect the intended use and be as non-explicit as possible, to reflect the fact that this is but a hint. Something like: {fastest, medium, nicest};

Philip Jägenstedt

Oct 31, 2015, 2:28:57 PM
to Justin Novosad, Rick Byers, Robert O'Callahan, Chris Harrelson, Tom Hudson, Owen Min, blink-dev
On Fri, Oct 30, 2015 at 9:33 PM, Justin Novosad <ju...@chromium.org> wrote:


On Fri, Oct 30, 2015 at 3:14 PM, Philip Jägenstedt <phi...@opera.com> wrote:
On Fri, Oct 30, 2015 at 4:39 PM, Justin Novosad <ju...@chromium.org> wrote:


On Fri, Oct 30, 2015 at 5:50 AM, Philip Jägenstedt <phi...@opera.com> wrote:
I share roc's doubts, and would not be surprised to see UA sniffing to pick the mode once testing reveals that one browser's "medium" is about as fast as another's "high". In the best case reverse engineering will sort it out, but it only takes one popular library making assumptions to lock all browsers into their initial, incompatible, interpretations.

The intent of ImageSmoothingQuality is to provide a hint --just a hint-- as to whether the implementation should prioritize speed or visual quality. To be honest, I think we could have done without 'medium', but I don't mind it. For years we've been stuck between the requirements of game devs, who want us to prioritize speed, and most other kinds of apps, which want the best image quality. Due to an obsession with benchmarks, we have been leaning more toward making game devs happy, to the detriment of others. In previous debates it was argued that filtering quality was a 'quality of implementation' issue that is up to browser implementors to figure out. That line of thought has led us to a situation where we had to make choices for the whole web, which will inevitably make one group of devs unhappy. Trying to auto-detect whether an app should prefer quality over speed (as has been suggested in the past) is likely to be perceived by web developers as unpredictable, idiosyncratic behavior. ImageSmoothingQuality is an answer to this problem. I share your concerns about this attribute being abused by people who will over-analyze it, but I still think it does a lot more good than harm to the platform. Do you have a better idea?

That imageSmoothingQuality is just a hint is precisely the source of the compatibility risk. If the underlying frameworks supported it, my idea would be an enum of concrete algorithms, like "nearest", "linear", "bilinear", "bicubic", etc., and a way to test which ones are actually supported. Unfortunately, that doesn't seem like a realistic hope, as per my previous email.

So, unless a better idea comes along real soon now, I will support shipping this.

IMHO, the enum value names should perhaps reflect the intended use and be as non-explicit as possible, to reflect the fact that this is but a hint. Something like: {fastest, medium, nicest};

To be clear, in the "concrete algorithms" model it wouldn't be a hint: you would have a way to find out which algorithms are supported, and then you would get exactly what you ask for. If that were actually possible to support in all browsers, I doubt that anyone would ask for non-explicit aliases. However, that doesn't seem like a realistic hope.

LGTM3 to implement and ship imageSmoothingQuality low/medium/high.

Philip

Mike Lawther

Nov 1, 2015, 5:29:08 PM
to Philip Jägenstedt, Justin Novosad, Rick Byers, Robert O'Callahan, Chris Harrelson, Tom Hudson, Owen Min, blink-dev
This is why I was asking above about our communications to developers - what are we telling them about our implementation of this? Will we guarantee that what we do for low/med/high will never change? Or could the algorithm used even change dynamically in response to the device's environment?

I didn't get an answer to those questions.

Sorry if it sounds like I'm harping on this, but the communications are important because there will be a population of devs who want to control the algorithm used. If we make it explicit and upfront that this is a 'hint' only and that they cannot reliably control the algorithm under any circumstances, then at least we're being upfront and predictably unpredictable :)

Philip Jägenstedt

Nov 2, 2015, 3:53:45 AM
to Mike Lawther, Justin Novosad, Rick Byers, Robert O'Callahan, Chris Harrelson, Tom Hudson, Owen Min, blink-dev
I assume that this could be documented similarly to SkFilterQuality. Documenting the limitations of Chrome OS would also be a good idea.

Philip

Justin Novosad

Nov 2, 2015, 3:11:31 PM
to Philip Jägenstedt, Mike Lawther, Rick Byers, Robert O'Callahan, Chris Harrelson, Tom Hudson, Owen Min, blink-dev
I agree. I find the Skia docs concise; they strike an interesting balance between being explicit yet non-committal. The wording that was pulled into the WHATWG spec is similar, though a bit more verbose:

+  <p>The <code>ImageSmoothingQuality</code> enumeration is used to express a preference for the
+  interpolation quality to use when smoothing images.</p>
+
+  <p>The "<dfn><code data-x="dom-context-2d-imageSmoothingQuality-low">low</code></dfn>" value
+  indicates a preference for a low level of image interpolation quality. Low-quality image
+  interpolation may be more computationally efficient than higher settings.</p>
+
+  <p>The "<dfn><code data-x="dom-context-2d-imageSmoothingQuality-medium">medium</code></dfn>" value
+  indicates a preference for a medium level of image interpolation quality.</p>
+
+  <p>The "<dfn><code data-x="dom-context-2d-imageSmoothingQuality-high">high</code></dfn>" value
+  indicates a preference for a high level of image interpolation quality. High-quality image
+  interpolation may be more computationally expensive than lower settings.</p>
+
+  <p class="note">Bilinear scaling is an example of a relatively fast, lower-quality image-smoothing
+  algorithm. Bicubic or Lanczos scaling are examples of image-smoothing algorithms that produce
+  higher-quality output. This specification does not mandate that specific interpolation algorithms
+  be used.</p>

The full merge request is here: https://github.com/whatwg/html/pull/136/files
My only criticism is that the last paragraph could have mentioned mipmapping as an example of a medium/high-quality filter for downscaling.

  -Justin


