
sizeof(enum) ? force to be 1 ?


mario semo

May 4, 2008, 2:35:20 PM
Hello,

Is there any way to force the compiler to store an enum in 1 byte, at least
for a specific enum? My problem is that I have a persistent structure
written by another compiler, and everything works fine except the enums.
I cannot change the persistent data layout. Of course, I could read the
structure element by element and read a char instead of the enum, but I
would prefer to tell the compiler to treat this enum as a byte.

Sample:

#include <stdio.h>

#define CHK(a) printf("sizeof(%s)=%d\n", #a, (int)sizeof(a));

enum TestEnum
{
en1,
en2
};

struct TestStruct
{
int i1;
TestEnum t1;
TestEnum t2;
};

int main(int argc,char *argv[])
{

CHK(TestEnum);
CHK(TestStruct);

return 0;
}

VC++ :
sizeof(TestEnum)=4
sizeof(TestStruct)=12

OtherCompiler :
sizeof(TestEnum)=1
sizeof(TestStruct)=8
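
A sketch of the element-by-element workaround mentioned above (the helper
name and the assumed on-disk field order, an int followed by two single-byte
enum values, are illustrative and not part of the original post):

// Read the persistent layout field by field: the int as-is,
// each enum as a single byte, cast back afterwards.
bool ReadTestStruct(FILE *f, TestStruct &s)
{
    unsigned char t1, t2;
    if (fread(&s.i1, sizeof(s.i1), 1, f) != 1) return false;
    if (fread(&t1, 1, 1, f) != 1) return false;
    if (fread(&t2, 1, 1, f) != 1) return false;
    s.t1 = (TestEnum)t1;
    s.t2 = (TestEnum)t2;
    return true;
}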


--
Kind regards,

mario semo

QbProg

May 4, 2008, 4:02:18 PM
enum Test : unsigned char
{
A = 1,....
...
};

or
enum Test : <any ordinal type you want>
{

};


It's a VC++ extension.
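
Applied to the TestEnum from the original post, a minimal sketch (assumes
VC++ 2005 or later, or any compiler that accepts this syntax):

enum TestEnum : unsigned char
{
    en1,
    en2
};

// sizeof(TestEnum) is now 1, and sizeof(TestStruct) typically becomes 8
// (4 for the int, 1 + 1 for the two enums, 2 bytes of tail padding),
// matching the OtherCompiler layout.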

QbProg

Giovanni Dicanio

May 4, 2008, 4:00:07 PM

"mario semo" <mario...@hotmail.com> wrote in message
news:E82DB684-05C5-4C00...@microsoft.com...

> Is there any way to force the compiler to store an enum in 1 byte, at
> least for a specific enum?

I think it is not possible in standard C++...


> struct TestStruct
> {
> int i1;
> TestEnum t1;
> TestEnum t2;
> };

You may consider using a typedef and #define's to "simulate" the 1-byte
enum, e.g.

// 1-byte enum
typedef BYTE TestEnum;

// Values
#define en1 0
#define en2 1

BTW: I would "protect" the scope of enum constants, using a proper prefix
for them, e.g.
(#define <enum name>_<enum constant> <value>)

#define TestEnum_en1 0
#define TestEnum_en2 1
...
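
A sketch of this approach applied to the original struct, using a plain
unsigned char in place of the Windows BYTE typedef:

// 1-byte "enum" simulated with a typedef
typedef unsigned char TestEnum;

#define TestEnum_en1 0
#define TestEnum_en2 1

struct TestStruct
{
    int i1;
    TestEnum t1;
    TestEnum t2;
};

// sizeof(TestStruct) is typically 8: 4 for i1, 1 + 1 for t1/t2,
// plus 2 bytes of tail padding for int alignment.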


HTH,
Giovanni


Carl Daniel [VC++ MVP]

May 5, 2008, 1:15:18 AM
mario semo wrote:
> Hello,
>
> Is there any way to force the compiler to store an enum in 1 byte, at
> least for a specific enum? My problem is that I have a persistent
> structure written by another compiler, and everything works fine except
> the enums. I cannot change the persistent data layout. Of course, I
> could read the structure element by element and read a char instead of
> the enum, but I would prefer to tell the compiler to treat this enum as
> a byte.

There's no standard way to do that. The compiler is free to choose any
integer type that's large enough to hold the range of values.

VC++ 2005 and later support a non-standard extension that lets you specify
the storage used for the enum:

enum [tag] [: type] {enum-list} [declarator];

type is the underlying type of the identifiers. This can be any scalar type,
such as signed or unsigned versions of int, short, or long. bool or char is
also allowed.

http://msdn.microsoft.com/en-us/library/2dzy4k6e.aspx

-cd


Ondrej Spanel

May 5, 2008, 3:01:43 AM
We use a template like this (real code is somewhat longer and more
complicated to maintain GNU C compatibility):

/// enum stored using any type instead of default unsigned int
template <class Enum, class Type=unsigned char>
class SizedEnum
{
Type _data;

public:
operator Enum () const {return (Enum)_data;}
SizedEnum( Enum val ):_data(val){}
SizedEnum(){}
};
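
A usage sketch, assuming the TestEnum from the original post (this is not
taken from Ondrej's real code):

struct TestStruct
{
    int i1;
    SizedEnum<TestEnum> t1;  // stored in one byte, converts to/from TestEnum
    SizedEnum<TestEnum> t2;
};

// sizeof(TestStruct) is then typically 8, matching the persistent layout.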

Cheers
Ondrej

mario semo wrote:

Hendrik Schober

May 5, 2008, 8:15:24 AM

Giovanni Dicanio <giovanni...@invalid.com> wrote:
> "mario semo" <mario...@hotmail.com> wrote in message
> news:E82DB684-05C5-4C00...@microsoft.com...
>
>> Is there any way to force the compiler to store an enum in 1 byte, at
>> least for a specific enum?
>
> I think it is not possible in C++...
>
>
>> struct TestStruct
>> {
>> int i1;
>> TestEnum t1;
>> TestEnum t2;
>> };
>
> You may consider using a typedef and #define's to "simulate" the 1-byte
> enum, e.g.
>
> // 1-byte enum
> typedef BYTE TestEnum;
>
> // Values
> #define en1 0
> #define en2 1

What would that #define buy you over
const TestEnum en1 = 0;
const TestEnum en2 = 1;
?

> BTW: I would "protect" the scope of enum constants, using a proper prefix
> for them, e.g.
> (#define <enum name>_<enum constant> <value>)
>
> #define TestEnum_en1 0
> #define TestEnum_en2 1

Or don't use #defines, which makes using real namespaces possible.
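
For instance, a sketch of that alternative (the namespace name is made up):

namespace TestEnumNS
{
    typedef unsigned char TestEnum;
    const TestEnum en1 = 0;
    const TestEnum en2 = 1;
}

// Usage: TestEnumNS::en1 - no macro pollution, still one byte per value.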

> HTH,
> Giovanni

Schobi

--
Spam...@gmx.de is never read
I'm HSchober at gmx dot de
"I guess at some point idealism meets human nature and
explodes." Daniel Orner


Ben Voigt [C++ MVP]

May 5, 2008, 10:23:12 AM

"QbProg" <tho...@gmail.com> wrote in message
news:34a9d3f0-29a9-4931...@8g2000hse.googlegroups.com...

It becomes standard in C++0x. What version of Visual C++ has this for
native enums? I thought it could only be specified for enum class.

>
> QbProg


Carl Daniel [VC++ MVP]

May 5, 2008, 11:23:07 AM

VC2005+ has this.

I thought it might be in C++0x; I hadn't had a chance to look it up yet,
though.

-cd


Ben Voigt [C++ MVP]

May 5, 2008, 11:43:38 AM

"Carl Daniel [VC++ MVP]" <cpdaniel_remove...@mvps.org.nospam>
wrote in message news:%23hxtsPs...@TK2MSFTNGP05.phx.gbl...

The changes are described here:
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2347.pdf

You can see the base type for enums in the text of the proposed new standard
in section 7.2 here:
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2521.pdf
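
For reference, a sketch of the form described there: C++0x allows a fixed
underlying type on both plain enums and the new scoped enums (needs a
C++0x-capable compiler):

enum TestEnum : unsigned char { en1, en2 };        // unscoped, 1 byte
enum class Color : unsigned char { Red, Green };   // scoped enum, also 1 byte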

>
> -cd
>
>


mario semo

May 5, 2008, 5:27:02 PM
> enum Test : <any ordinal type you want>

Thanks for your tip! I solved my problem this way. I was not aware of this
language extension.

mario.

mario semo

May 5, 2008, 5:25:35 PM
Ondrej,

"Ondrej Spanel" <OndrejSp...@microsoft.com> wrote in message
news:O5tmh3nr...@TK2MSFTNGP02.phx.gbl...


> We use a template like this (real code is somewhat longer and more
> complicated to maintain GNU C compatibility):
>
> /// enum stored using any type instead of default unsigned int
> template <class Enum, class Type=unsigned char>
> class SizedEnum
> {
> Type _data;
>

What a nice idea!
But I solved my problem with the ": basetype" extension.

Thanks for your tip anyway!

mario.
