I found that wchar_t on some systems is:
typedef unsigned char wchar_t;
while on others it is:
typedef unsigned int wchar_t;
and according to the (supposedly) ANSI C dictionary in the library:
example:
typedef wchar_t int;
which is obviously an incorrect typedef (it should read 'typedef int wchar_t').
As I don't have a copy of the standard I first ask:
Which one is standard?
And I then ask:
Can you forgive me for not waiting for a copy of the standard...;-)
The reason for a typedef is precisely to allow different implementations
to use different types. I would think that was obvious.
There is no single standard type--that's why it's specified as wchar_t. I'd say
either unsigned short or unsigned long would be your best option, depending on
whether you need support for two-byte or four-byte glyphs.
>
> And I then ask:
>
> Can you forgive me for not waiting for a copy of the standard...;-)
At US$65 per copy, yes, I can forgive you. B^J
Crispy
--
"It is a question of cubic capacity; a man with so large a brain must have
something inside it."--Sherlock Holmes, _The Adventure of the Blue Carbuncle_
Christian Carey (size 8 hat (USA)) uunet!cucstud!xcarey
Apart from the obviously syntactically-incorrect one, the answer is "all
of them". The definition of wchar_t is implementation-specific.
--
"But this *is* the simplified version | Henry Spencer @ U of Toronto Zoology
for the general public." -S. Harris | he...@zoo.toronto.edu utzoo!henry