
HELP: fixing the GCC enum storage size


kindsol

Apr 28, 2004, 9:12:29 PM
I have a problem with GCC 3.2 that I don't have with my MSVC 6.0 compiler.

My library has over 450 enumerations in it, almost all of which start with an
enumerator of -1. These enumerations can be members of structures that get
written to symmetrical memory structures on a hardware device. My library
(incorrectly) assumes that all enumerations will be 32 bits in size.

I have found that if an enum contains a -1 and a positive integer (in my
case a bit mask) with bit 31 set, then GCC promotes my enumeration's
storage size from a 32-bit integer to a 64-bit integer.

For example with a -1 and bit31 enumerators in my enum...
typedef enum {
    TESTValueONE = -1,
    TESTValueTWO,
    TESTValueTHREE = 0x80000000,
} TESTValue;

...I get 64-bit values...

bytes:64 value:0xffffffff
bytes:64 value:0x0
bytes:64 value:0x80000000

However, if either the -1 or the value with bit 31 set is removed... (here I
remove the bit-31 value)

typedef enum {
    TESTValueONE = -1,
    TESTValueTWO,
    TESTValueTHREE,
} TESTValue;

I get 32-bit values...
bytes:32 value:0xffffffff
bytes:32 value:0x0
bytes:32 value:0x1

I have found plenty of information about why this happens, and I think I
understand it.

My problem is that this throws off my host structures' alignment, so they no
longer match the hardware's memory layout.
I'm hoping to find a solution that is more elegant than changing 450
enumerations in my library (not a simple search-and-replace).

Is there any way to force GCC to treat all enums as 32-bit integers? (i.e. a
compiler switch, a GCC build option, a GCC source change?)

Any insight would be much appreciated!!

Thank you for your time,
Kindsol

FYI: gcc version string: gcc (GCC) 3.2 20020903 (Red Hat Linux 8.0 3.2-7)

Here is my test code:

#include <stdio.h>    /* needed for printf */
#include <stdlib.h>

typedef enum {
    TESTValueONE = -1,
    TESTValueTWO,
    TESTValueTHREE,
} TESTValue;

int main(void) {
    printf("bytes:%d\tvalue:0x%lx\n",
           (int)(sizeof(TESTValueONE) * 8), (unsigned long)TESTValueONE);
    printf("bytes:%d\tvalue:0x%lx\n",
           (int)(sizeof(TESTValueTWO) * 8), (unsigned long)TESTValueTWO);
    printf("bytes:%d\tvalue:0x%lx\n",
           (int)(sizeof(TESTValueTHREE) * 8), (unsigned long)TESTValueTHREE);

    return 0;
}


Ulrich Weigand

Apr 28, 2004, 11:02:23 PM
"kindsol" <kin...@hotmail.com> writes:

> typedef enum {
> TESTValueONE = -1,
> TESTValueTWO,
> TESTValueTHREE = 0x80000000,
> } TESTValue;

Since 0x80000000 is not in the range of 'int', this code does
not conform to the ISO C standard.

Changing the code to (int)0x80000000 makes the application
behave as you expect.

>Is there anyway to force GCC to treat all enums as a 32bit integer? (i.e.
>compiler switch, gcc build option, gcc src code change?)

The -pedantic switch emits a warning for each instance of an
enumerator that does not fit into an int.

Interestingly enough, using -pedantic has the (maybe unintended?)
effect of actually treating the too-large values as 'int' ...

--
Dr. Ulrich Weigand
wei...@informatik.uni-erlangen.de

Ulrich Eckhardt

Apr 29, 2004, 12:54:32 AM
kindsol wrote:
> I have found that if an enum contains a -1 and a positive integer (in my
> case a bit mask) with bit 31 set, then the GCC compiler promotes my
> enumeration storage size from a 32bit integer to a 64bit integer.

Yes, and I think the compiler is free to do so, so there is nothing to fix.
Anyhow, there are other means to get what you want, especially if you
really mean a set of bits.

> For example with a -1 and bit31 enumerators in my enum...
> typedef enum {
> TESTValueONE = -1,
> TESTValueTWO,
> TESTValueTHREE = 0x80000000,
> } TESTValue;

typedef uint32_t TESTValue;   /* from <stdint.h> */
TESTValue const TESTValueOne = (TESTValue)-1;   /* 0xffffffff */
TESTValue const TESTValueTwo = 0u;
TESTValue const TESTValueThree = 0x80000000u;

or even this:

struct bits
{
    unsigned bit_one:1;
    unsigned bit_two:1;
    unsigned padding:29;
    unsigned bit_last:1;
};
assert(sizeof (struct bits) == sizeof (uint32_t));

union register_test
{
    uint32_t raw;
    struct bits bits;
};
assert(sizeof (union register_test) == sizeof (uint32_t));

> printf("bytes:%d\tvalue:0x%lx\n", sizeof(TESTValueONE)*8,TESTValueONE);

Bad. Firstly, it's 'bits' and not 'bytes'. Secondly, it's '* CHAR_BIT' (from
<limits.h>) and not '* 8', for the sake of portability.

Uli

kindsol

Apr 29, 2004, 5:56:22 PM
Thank you very much!

"Ulrich Weigand" <wei...@informatik.uni-erlangen.de> wrote in message
news:c6pr7v$eotog$1...@uni-berlin.de...


> "kindsol" <kin...@hotmail.com> writes:
>
> Changing the code to (int)0x80000000 makes the application
> behave as you expect.
>

That was one cast that I did not try. Thank you :)

> The -pedantic switch emits a warning for each instance of an
> enumerator that does not fit into an int.
>
> Interestingly enough, using -pedantic has the (maybe unintended?)
> effect of actually treating the too-large values as 'int' ...
>

That *is* an interesting side effect -- I might have never found this!

Looks like I am just going to have to deal with the warnings for a while so
I can get this library working.

Another solution might be for me to compile with the -pedantic flag to locate
any enum violations as part of our test suite, then cast any offending
enums/defines.

thanks again for your response!

