wxStyledTextCtrl::SetUseAntiAliasing() is broken: it does either nothing or the opposite of what it is supposed to do.
I ran into this while working on #26277.
I have done some code archeology to figure out how we got to this state. Here is what I found:
e0e5663 ("Allow an extra flag to be given to Font::Create", 2004-02-11) added extraFontFlag to Scintilla as a dummy boolean field that platform implementations can use how they want. This change was later merged into Scintilla upstream in wxWidgets/scintilla@e635868 and wxWidgets/scintilla@f8c49b2 ("Patch from Robin Dunn to pass an extra flag down to font rendering to allow a platform-specific tweak.", 2004-03-29).
d1558f3 ("Use extraFontFlag to specify if anti-aliased fonts should be used. Use GetPartialTextExtents.", 2004-02-11) added an option to disable anti-aliasing. This was apparently motivated by performance problems with anti-aliasing on Mac. The extraFontFlag field was used to store the setting, with the following meaning: true = AA enabled, false = AA disabled. GetUseAntiAliasing() and SetUseAntiAliasing() methods were added to read and modify the field.
8462a84 ("Removed wxFont::Set/GetNoAntiAliasing() implementations.", 2009-10-09) removed wxFont::SetNoAntiAliasing() and with that, the functionality of SetUseAntiAliasing was lost. After this change, extraFontFlag does nothing in the WX Scintilla platform, although GetUseAntiAliasing() and SetUseAntiAliasing() still read and write the value.
In Scintilla upstream, wxWidgets/scintilla@edd3053 ("Added setting for font quality to allow application to choose anti-aliased, non-anti-aliased or lcd-optimized text.", 2009-11-02) implemented the ability to configure anti-aliasing. It made use of the existing extraFontFlag to store the setting, but in a way incompatible with the previous implementation in WX platform. The field was changed from bool to int, with the following values:
```c
#define SC_EFF_QUALITY_DEFAULT         0
#define SC_EFF_QUALITY_NON_ANTIALIASED 1
#define SC_EFF_QUALITY_ANTIALIASED     2
#define SC_EFF_QUALITY_LCD_OPTIMIZED   3
```
The SCI_SETFONTQUALITY and SCI_GETFONTQUALITY messages were added for setting and getting the value of the field.
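To make the upstream semantics concrete, here is a minimal standalone sketch (plain C++, no wx dependency; the SC_EFF_QUALITY_* values are the real Scintilla constants, while the FontQualityStore struct is purely illustrative): the two messages simply write and read the int-valued extraFontFlag field.

```cpp
#include <cassert>

// Font-quality values introduced upstream (values from Scintilla.h):
const int SC_EFF_QUALITY_DEFAULT         = 0;
const int SC_EFF_QUALITY_NON_ANTIALIASED = 1;
const int SC_EFF_QUALITY_ANTIALIASED     = 2;
const int SC_EFF_QUALITY_LCD_OPTIMIZED   = 3;

// Illustrative model of the message pair: SCI_SETFONTQUALITY writes
// the (now int-valued) extraFontFlag, SCI_GETFONTQUALITY reads it back.
struct FontQualityStore {
    int extraFontFlag = SC_EFF_QUALITY_DEFAULT;
    void SetFontQuality(int q) { extraFontFlag = q; }
    int  GetFontQuality() const { return extraFontFlag; }
};
```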
9e96e16 ("Apply patch (plus some additional changes) upgrading Scintilla to version 2.03.", 2010-03-30) updated Scintilla, importing the upstream change. The WX platform's use of extraFontFlag, which now conflicted with the Scintilla upstream code, was not updated.
The methods SetFontQuality() and GetFontQuality(), generated from Scintilla's interface definitions for the new window messages, were added by this commit.
b936bfe ("Add Direct2D support to wxSTC", 2018-01-25) imported some drawing code from the Scintilla win32 platform into the Scintilla WX platform. This new code actually makes use of extraFontFlag (which had had no effect since the removal of wxFont::SetNoAntiAliasing()), interpreting it as one of the EFF_QUALITY_* constants.
As of now, there are two pairs of methods for controlling anti-aliasing:
- SetUseAntiAliasing/GetUseAntiAliasing - legacy wxWidgets-specific interface; only supports a simple yes/no toggle; broken
- SetFontQuality/GetFontQuality - new interface based on the upstream SCI_SETFONTQUALITY/SCI_GETFONTQUALITY messages

The legacy interface has been broken since 2009 (see history above) and currently does the following:

- SetUseAntiAliasing(false) sets font quality to EFF_QUALITY_DEFAULT
- SetUseAntiAliasing(true) sets font quality to EFF_QUALITY_NON_ANTIALIASED
- GetUseAntiAliasing() returns true if font quality is set to anything other than EFF_QUALITY_DEFAULT

On most platforms, neither SetUseAntiAliasing nor SetFontQuality has any effect, because the implementation in the Scintilla WX platform is missing, but they do have an effect on wxMSW when DirectWrite is enabled.
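The broken behavior can be modeled in a few lines. This is a standalone sketch of the effect described above, not the actual wx platform code; it is consistent with the legacy bool being stored directly into the int-valued field, so true (1) lands on NON_ANTIALIASED and false (0) on DEFAULT.

```cpp
#include <cassert>

// Scintilla constants (values from Scintilla.h):
const int SC_EFF_QUALITY_DEFAULT         = 0;
const int SC_EFF_QUALITY_NON_ANTIALIASED = 1;

// Sketch of the broken mapping: asking for anti-aliasing (true)
// actually selects the non-anti-aliased quality, and vice versa.
int fontQuality = SC_EFF_QUALITY_DEFAULT;

void SetUseAntiAliasing(bool on) { fontQuality = on ? 1 : 0; }
bool GetUseAntiAliasing()        { return fontQuality != SC_EFF_QUALITY_DEFAULT; }
```

Note the perverse round trip: after SetUseAntiAliasing(true) the control is set to non-anti-aliased rendering, yet GetUseAntiAliasing() reports true.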
I believe we should deprecate SetUseAntiAliasing/GetUseAntiAliasing: it is a wxWidgets-specific interface, it has been broken for more than 16 years, and SetFontQuality/GetFontQuality is a straightforward and less broken replacement.
In addition, the implementation has to be changed. However, it is unclear how to map the true/false value onto the EFF_QUALITY_* constants. We could define some mapping, but there is also another alternative: we could just say that the methods have been broken for so long that there is no point in fixing them now, and make them do nothing (and document that).
After we agree on what the solution should be, I'll do a PR.
—
Thanks for looking into this, I had no idea about all the history, so, if nothing else, I learned something new about wx today!
Unfortunately, after a quick search, it looks like SetUseAntiAliasing() is used in a lot of code, so it wouldn't be ideal to deprecate it, and if we can make it work by forwarding it to SetFontQuality(useAA ? EFF_QUALITY_ANTIALIASED : EFF_QUALITY_NON_ANTIALIASED), I'd prefer to do that.
—
Okay, and what should GetUseAntiAliasing() return when font quality is SC_EFF_QUALITY_DEFAULT?
—
IIUC, whether SC_EFF_QUALITY_DEFAULT is anti-aliased depends on OS settings, but usually it will be anti-aliased. Would returning true from GetUseAntiAliasing() be OK in this case, even though anti-aliasing isn't guaranteed?
—
Yes, I think this is fine. If we want to be maximally helpful, maybe we should document that it returns true until SetUseAA() is called, and that afterwards it returns the value passed to the last call to SetUseAA().
—
I've created #26342.
I'm still not sure what the behavior of SetUseAntiAliasing(true) should be. You proposed EFF_QUALITY_ANTIALIASED, but I don't think this is quite right: if the OS uses subpixel AA by default, this would revert it to grayscale AA. I think the right choice is probably either LCD_OPTIMIZED or DEFAULT.
Could you point me to some of the uses of SetUseAntiAliasing() you found? I would like to know why the code is calling this method and what expectations it has about the behavior.
(side note: My original analysis incorrectly stated that SetFontQuality() was added by the commit that upgraded the Scintilla version. I've edited the description to fix this error.)
—
I'd be fine with using DEFAULT for useAA = true too, and this would be consistent with returning true from GetUseAA() by default.
I've just had a quick look at https://github.com/search?q=wx+SetUseAntiAliasing&type=code and, considering that not all code is on GitHub, this seems to indicate that this is being used...
—
It is indeed used, but almost all of the uses I found are SetUseAntiAliasing(true). The reason for some of these uses is clearly that the default value on wxMac was false for some time (due to performance problems), which some people probably found ugly. Another fraction of these uses could be just some developers vaguely thinking that enabling anti-aliasing might be a good idea. In light of this, my preference is for mapping true to DEFAULT.
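Putting the thread's conclusions together, the repaired methods could be sketched as follows. This is a standalone model under the assumptions discussed above (true maps to DEFAULT, false maps to NON_ANTIALIASED, and the getter returns true by default), not an actual patch; the StcModel struct merely stands in for the control's font-quality state.

```cpp
#include <cassert>

// Scintilla font-quality constants (values from Scintilla.h):
const int SC_EFF_QUALITY_DEFAULT         = 0;
const int SC_EFF_QUALITY_NON_ANTIALIASED = 1;

// Hypothetical forwarding implementation of the legacy interface,
// assuming the mapping discussed in this thread.
struct StcModel {
    int quality = SC_EFF_QUALITY_DEFAULT;   // stands in for the control's state

    void SetFontQuality(int q) { quality = q; }
    int  GetFontQuality() const { return quality; }

    void SetUseAntiAliasing(bool useAA) {
        SetFontQuality(useAA ? SC_EFF_QUALITY_DEFAULT
                             : SC_EFF_QUALITY_NON_ANTIALIASED);
    }
    // Returns true by default and after SetUseAntiAliasing(true);
    // returns false only after SetUseAntiAliasing(false).
    bool GetUseAntiAliasing() const {
        return GetFontQuality() != SC_EFF_QUALITY_NON_ANTIALIASED;
    }
};
```

With this mapping, GetUseAntiAliasing() also satisfies the documentation proposal above: it returns true until SetUseAntiAliasing() is called, and afterwards it returns the value passed to the last call.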
—