Is there any way to force the compiler to store an enum as 1 byte - at least
for a specific enum? My problem is that I have a persistent structure,
written by another compiler, and everything works fine except the enums.
I cannot change the persistent data layout. Of course, I could read the
structure element by element and read a char instead of the enum, ... but I
would prefer to tell the compiler to store this enum as a byte.
Sample:
#include <stdio.h>

// cast to unsigned: sizeof yields size_t, which %d cannot print portably
#define CHK(a) printf("sizeof(%s)=%u\n", #a, (unsigned)sizeof(a))

enum TestEnum
{
    en1,
    en2
};

struct TestStruct
{
    int i1;
    TestEnum t1;
    TestEnum t2;
};

int main(int argc, char *argv[])
{
    CHK(TestEnum);
    CHK(TestStruct);
    return 0;
}
VC++ :
sizeof(TestEnum)=4
sizeof(TestStruct)=12
OtherCompiler :
sizeof(TestEnum)=1
sizeof(TestStruct)=8
--
kind regards
mario semo
Or:

enum Test : <any integral type you want>
{
};

It's a VC++ extension.
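For instance, a minimal sketch (names made up):

enum Test : unsigned char { a, b };   // sizeof(Test) == 1 under VC++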
QbProg
> is there any way to force the compiler to take an enum as 1 byte - at
> least for a specific enum?
I think it is not possible in standard C++...
> struct TestStruct
> {
> int i1;
> TestEnum t1;
> TestEnum t2;
> };
You may consider using a typedef and #defines to "simulate" the 1-byte
enum, e.g.
// 1-byte enum
typedef BYTE TestEnum;
// Values
#define en1 0
#define en2 1
BTW: I would "protect" the scope of enum constants, using a proper prefix
for them, e.g.
(#define <enum name>_<enum constant> <value>)
#define TestEnum_en1 0
#define TestEnum_en2 1
...
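Applied to the original struct, a minimal sketch would be (assuming BYTE is
a typedef for unsigned char, as in <windows.h>):

typedef unsigned char BYTE;   // assumption; normally comes from <windows.h>

// 1-byte "enum"
typedef BYTE TestEnum;
#define TestEnum_en1 0
#define TestEnum_en2 1

struct TestStruct
{
    int i1;        // 4 bytes
    TestEnum t1;   // 1 byte
    TestEnum t2;   // 1 byte
};                 // sizeof == 8 with the usual 4-byte int alignment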
HTH,
Giovanni
There's no standard way to do that. The compiler is free to choose any
integer type that's large enough to hold the range of values.
VC++ 2005 and later support a non-standard extension that lets you specify
the storage used for the enum:
enum [tag] [: type] {enum-list} [declarator];
type is the underlying type of the identifiers. This can be any scalar type,
such as signed or unsigned versions of int, short, or long. bool or char is
also allowed.
http://msdn.microsoft.com/en-us/library/2dzy4k6e.aspx
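Applied to the original sample (VC++ 2005 or later only):

enum TestEnum : unsigned char   // explicit 1-byte underlying type
{
    en1,
    en2
};

// now sizeof(TestEnum) == 1 and sizeof(TestStruct) == 8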
-cd
/// enum stored using any type instead of default unsigned int
template <class Enum, class Type = unsigned char>
class SizedEnum
{
    Type _data;   // actual storage - 1 byte with the default Type
public:
    operator Enum () const { return (Enum)_data; }   // read back as the enum
    SizedEnum(Enum val) : _data((Type)val) {}        // implicit conversion from the enum
    SizedEnum() {}                                   // uninitialized, like a plain enum
};
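Usage in the original struct would look something like this (a sketch):

struct TestStruct
{
    int i1;
    SizedEnum<TestEnum> t1;   // occupies 1 byte instead of 4
    SizedEnum<TestEnum> t2;
};                            // sizeof == 8, matching the on-disk layout

TestStruct s;
s.t1 = en1;          // converts via SizedEnum(Enum)
TestEnum e = s.t1;   // converts back via operator Enum()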
Cheers
Ondrej
mario semo wrote:
What would that #define buy you over
const TestEnum en1 = 0;
const TestEnum en2 = 1;
?
> BTW: I would "protect" the scope of enum constants, using a proper prefix
> for them, e.g.
> (#define <enum name>_<enum constant> <value>)
>
> #define TestEnum_en1 0
> #define TestEnum_en2 1
Or avoid #defines entirely, which makes it possible to use real namespaces.
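For example (a sketch; the namespace name is made up):

namespace TestEnumScope
{
    typedef unsigned char TestEnum;
    const TestEnum en1 = 0;
    const TestEnum en2 = 1;
}

TestEnumScope::TestEnum t = TestEnumScope::en1;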
> HTH,
> Giovanni
Schobi
--
Spam...@gmx.de is never read
I'm HSchober at gmx dot de
"I guess at some point idealism meets human nature and
explodes." Daniel Orner
It becomes standard in C++0x. What version of Visual C++ has this for
native enums? I thought it could only be specified for enum class.
>
> QbProg
VC2005+ has this.
I thought it might be in C++0x - hadn't had a chance to look it up yet
though.
-cd
The changes are described here:
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2347.pdf
You can see the base type for enums in the text of the proposed new standard
in section 7.2 here:
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2521.pdf
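For reference, a minimal sketch of the C++0x syntax proposed there (both
forms take an explicit underlying type):

enum TestEnum : unsigned char { en1, en2 };        // unscoped, 1 byte
enum class TestScoped : unsigned char { v1, v2 };  // scoped "enum class", 1 byte

static_assert(sizeof(TestEnum) == 1, "1-byte enum");   // static_assert is also C++0x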
>
> -cd
Thx for your tip! I solved my problem this way. I was not aware of this
language enhancement.
mario.
"Ondrej Spanel" <OndrejSp...@microsoft.com> schrieb im Newsbeitrag
news:O5tmh3nr...@TK2MSFTNGP02.phx.gbl...
> We use a template like this (real code is somewhat longer and more
> complicated to maintain GNU C compatibility):
>
> /// enum stored using any type instead of default unsigned int
> template <class Enum, class Type=unsigned char>
> class SizedEnum
> {
> Type _data;
>
What a nice idea!
But I solved my problem with the ": basetype" enhancement.
Thx for your tip anyway!
mario.