(This might require redeclaring string16 as being built on top of char16_t rather than uint16 as it is today, with lossless conversions to/from wstring on Windows where today it is a typedef for wstring.)
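For illustration, a minimal sketch of what that redeclaration plus the Windows-side conversions could look like (ToWString/FromWString are placeholder names, not existing base/ helpers):

#include <string>

using string16 = std::u16string;  // instead of a typedef for std::wstring

#if defined(OS_WIN)
// On Windows both wchar_t and char16_t are 16-bit UTF-16 code units, so the
// element-by-element copy below is lossless; only the C++ type changes.
inline std::wstring ToWString(const string16& s) {
  return std::wstring(s.begin(), s.end());
}
inline string16 FromWString(const std::wstring& s) {
  return string16(s.begin(), s.end());
}
#endif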
If u"" doesn't even compile on Windows today then this is a non-starter. I thought that was a language feature.
On Thu, Sep 25, 2014 at 7:54 AM, Avi Drissman <a...@chromium.org> wrote:
> If u"" doesn't even compile on Windows today then this is a non-starter. I thought that was a language feature.

Let's not read that too formally, shall we? The goal of this suggestion was not to enable a particular feature but to make people's lives easier through its use. Yes, "raw" Unicode literals are not supported by MSVC and should obviously be banned. But that does not mean they are useless for us. MSVC has always supported UTF-16 strings; it just called them wstrings. And it provided a way to specify them in code; it just used L"blah-blah-blah" syntax. And it uses UTF-16 in its API quite extensively. Thus we could switch char16 to char16_t on non-Windows platforms and leave it as wchar_t on Windows. Then we could provide something like _T to use with constants (we probably should not use _T itself, since it's in a reserved namespace, but something like UTF16CONST would work).
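Roughly along these lines, as a sketch (OS_WIN from build/build_config.h; UTF16CONST and char16 here are strawman names, not an actual proposal):

#if defined(OS_WIN)
// MSVC of that era has no u"" literals, but its wchar_t literals are UTF-16.
#define UTF16CONST(x) L##x
typedef wchar_t char16;
#else
#define UTF16CONST(x) u##x
typedef char16_t char16;
#endif

// Usage:
// const char16* kGreeting = UTF16CONST("hello");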
Since VS2015 does support this now, can we revive this?
On Thu, Sep 8, 2016 at 4:02 PM, Rachel Blum <gr...@chromium.org> wrote:
> Since VS2015 does support this now, can we revive this?

Is this different from https://groups.google.com/a/chromium.org/d/topic/chromium-dev/2kWQHbbuMHI/discussion?
x.cc
x.cc(6): error C2664: 'HANDLE CreateFileW(LPCWSTR,DWORD,DWORD,LPSECURITY_ATTRIBUTES,DWORD,DWORD,HANDLE)': cannot convert argument 1 from 'const char16_t [8]' to 'LPCWSTR'
x.cc(6): note: Types pointed to are unrelated; conversion requires reinterpret_cast, C-style cast or function-style cast
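That diagnostic is what you get from something along these lines (hypothetical x.cc; the file name is made up):

#include <windows.h>

int main() {
  // 'const char16_t [8]' is the u"" literal: seven characters plus the NUL.
  HANDLE h = ::CreateFileW(u"foo.txt", GENERIC_READ, 0, nullptr,
                           OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
  if (h != INVALID_HANDLE_VALUE)
    ::CloseHandle(h);
  return 0;
}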
Can't we just stop using UTF-16?
Doesn't that (char16_t/wchar_t interchangeability) change depending on if UNICODE is defined or not? (Or some similar arcane incantations?)
On Mon, Sep 12, 2016 at 10:55 PM, Rachel Blum <gr...@chromium.org> wrote:
> Doesn't that (char16_t/wchar_t interchangeability) change depending on if UNICODE is defined or not? (Or some similar arcane incantations?)

I don't think /DUNICODE changes anything (but maybe someone else knows better?). AFAIK, they are both 16-bit unsigned values. In practice Win32 *W() APIs have been defined as taking UTF-16 for a long time. Maybe they were kept as different for UCS-2 legacy reasons, or UTF-16LE vs. BE?
But anyway, I guess from a language pov, wchar_t is __wchar_t and char16_t is ... whatever, and they just ain't the same.
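In other words, on MSVC something like this holds:

#include <type_traits>

static_assert(sizeof(wchar_t) == 2 && sizeof(char16_t) == 2,
              "same 16-bit code unit size on Windows");
static_assert(!std::is_same<wchar_t, char16_t>::value,
              "but distinct types, so no implicit pointer conversion");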
I think the reason for UTF-16 is that Windows and ICU work natively with UTF-16.
On Tue, Sep 13, 2016 at 9:54 AM, Scott Graham <sco...@chromium.org> wrote:
> On Mon, Sep 12, 2016 at 10:55 PM, Rachel Blum <gr...@chromium.org> wrote:
>> Doesn't that (char16_t/wchar_t interchangeability) change depending on if UNICODE is defined or not? (Or some similar arcane incantations?)
>
> I don't think /DUNICODE changes anything (but maybe someone else knows better?). AFAIK, they are both 16-bit unsigned values. In practice Win32 *W() APIs have been defined as taking UTF-16 for a long time. Maybe they were kept as different for UCS-2 legacy reasons, or UTF-16LE vs. BE?

AFAIK, the win32 APIs really deal in UCS-2 and not UTF-16, but I could be misremembering.
On Tue, Sep 13, 2016 at 12:09 PM, 'Peter Kasting' via Chromium-dev <chromi...@chromium.org> wrote:
> On Tue, Sep 13, 2016 at 9:54 AM, Scott Graham <sco...@chromium.org> wrote:
>> On Mon, Sep 12, 2016 at 10:55 PM, Rachel Blum <gr...@chromium.org> wrote:
>>> Doesn't that (char16_t/wchar_t interchangeability) change depending on if UNICODE is defined or not? (Or some similar arcane incantations?)
>>
>> I don't think /DUNICODE changes anything (but maybe someone else knows better?). AFAIK, they are both 16-bit unsigned values. In practice Win32 *W() APIs have been defined as taking UTF-16 for a long time. Maybe they were kept as different for UCS-2 legacy reasons, or UTF-16LE vs. BE?
>
> AFAIK, the win32 APIs really deal in UCS-2 and not UTF-16, but I could be misremembering.

Windows should be UTF-16. It was UCS-2 in the Win9x era and early NT4; it now supports the UTF-16 surrogate pairs needed to handle code points outside the Basic Multilingual Plane (BMP).
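Concretely, on Windows (16-bit wchar_t) a code point outside the BMP occupies two code units:

// U+1F600 (GRINNING FACE) is outside the BMP; UTF-16 encodes it as the
// surrogate pair 0xD83D 0xDE00 -- two code units for one code point.
const wchar_t kGrinningFace[] = L"\U0001F600";
static_assert(sizeof(kGrinningFace) / sizeof(wchar_t) == 3,
              "two surrogate code units plus the terminating NUL");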
I wonder if there would be a way to rapid-prototype switching string16 to be u16string
On Tue, Sep 13, 2016 at 12:09 PM, Peter Kasting <pkas...@google.com> wrote:
> I wonder if there would be a way to rapid-prototype switching string16 to be u16string
Don't we just need to replace

typedef std::wstring string16;

with

typedef std::u16string string16;
to get a basic idea? (And I suspect the basic idea is "this will be painful")
I'd suspect we can ease some of the pain via dcheng's rewrite tools - most of the issues are likely to be "takes LPCWSTR, passing in char16_t*". Anybody with a Windows machine willing to try? :)
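On Windows the mechanical fix at those call sites is essentially a cast wrapper, since the two types have the same size and representation there; something like the following sketch (as_wide is a made-up name, not an existing base/ helper):

#if defined(OS_WIN)
// Reinterpreting char16_t* as wchar_t* is fine in practice on Windows,
// even though the language treats them as unrelated pointer types.
inline const wchar_t* as_wide(const char16_t* s) {
  return reinterpret_cast<const wchar_t*>(s);
}
#endif

// e.g. CreateFileW(as_wide(u"foo.txt"), ...);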
The question is what the solution is.