char32_t c = U'\U0001F600';  // 8-digit \U form; the value must be a valid code point (at most 0x10FFFF)
char32_t c_short = U'\u1234';  // 4-digit \u form
The plan was to #define _U+ as just U+, so maybe I should update the OP? Oh, and + isn't available through iso646 or digraphs/trigraphs either, so it's pretty much impossible currently.
It's an unreadable mess at best.
I literally just explained the problem with a macro to you. The + symbol cannot be part of a macro's name; the preprocessor simply doesn't allow it.
That's true, but U+ is THE standard way to represent Unicode code points in every single language except C and C++.
Except Swift, Go, and most of the other languages on the list are based on or inspired by C++. That's like saying we shouldn't do X in Fortran because C doesn't support it. Who cares? How is it relevant?