I've had problems leaving bits undefined and then passing them to
Windows API functions. The Windows API has many "flag" arguments that
are easily modeled as Ada records. However, the C definitions use
constants with a 1 in the appropriate bit positions. Apparently, the
actual API functions do not always mask out the unused bits.
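A minimal sketch of what I mean (the type and field names here are
invented for illustration, not actual Win32 definitions): every bit of
the flags word is declared, and the unused bits default to zero, so
nothing stale gets passed through to the C side.

```ada
--  Hypothetical 16-bit "flags" argument, with every bit accounted for.
--  Field names are made up; only the technique is the point.
type Window_Flags is record
   Visible   : Boolean := False;
   Resizable : Boolean := False;
   Modal     : Boolean := False;
   Unused    : Natural range 0 .. 2**13 - 1 := 0;  --  remaining bits zeroed
end record;

for Window_Flags use record
   Visible   at 0 range 0 .. 0;
   Resizable at 0 range 1 .. 1;
   Modal     at 0 range 2 .. 2;
   Unused    at 0 range 3 .. 15;
end record;
for Window_Flags'Size use 16;
```

With the defaults in place, a default-initialized object has every bit
defined, so the value you hand to the API (and the value you print in a
test log) is the same from run to run.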
I've also had problems with comparing test outputs. If you leave bits
undefined, then you get whatever was left in the register from the
previous operation. If you then print this value as hex (for debugging
stream output, say), you may see different values from run to run. I
usually only see different values when changing compilers (GNAT to
ObjectAda), but it could happen when you change optimization levels. The
problem is not that the behavior is undefined, just that the test output
changes, which makes comparing the test results more difficult.
So I've adopted the general policy that all bits must be defined.
--
- Stephe
However, I've shifted my thinking on this. It is certainly less
"cluttered" to avoid declaring unused bits. Also, it is less
misleading. I have encountered code where the programmer declared
all unused bits as "Spare" (or similar) only to discover later
that there really were definitions for the bits in question - just
that they were unused by this particular piece of code. One might
have been tempted to appropriate the bits for something else. So
if they are left completely undefined, someone else coming along
later is not likely to make assumptions about the bits other than
that they are not used by this application.
One thing I do favor is this: Where you are using the rep clause
to model something coming out of hardware or some external source,
I find it a good idea to define all of the fields that are
documented and do so with names as close to what appears in the
document as is possible. This way it is explicit what bits the
hardware is supposed to be setting/using, what bits the hardware
does not care about (if it's labeled "reserved" you probably want
to declare it to make clear that the hardware *might* use those
bits) and there is a clear mapping between what someone reads in
the document and what they read in the code. The code may not use
all the bits that are defined - and that's O.K. - but it is clear,
when reading the definition, why those bits exist.
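To make the idea concrete, here is a sketch of that style for an
imagined 8-bit device status register (the device, field names, and bit
layout are all hypothetical): every documented field is declared,
including the reserved bits, with names matching what a datasheet would
say.

```ada
--  Hypothetical status register for an imagined device.  All fields
--  from the (imagined) datasheet are declared, even ones this
--  application never reads.
type Status_Register is record
   Data_Ready     : Boolean;                --  bit 0
   Overrun_Error  : Boolean;                --  bit 1
   Framing_Error  : Boolean;                --  bit 2
   Reserved       : Natural range 0 .. 15;  --  bits 3-6: hardware might use
   Interrupt_Flag : Boolean;                --  bit 7
end record;

for Status_Register use record
   Data_Ready     at 0 range 0 .. 0;
   Overrun_Error  at 0 range 1 .. 1;
   Framing_Error  at 0 range 2 .. 2;
   Reserved       at 0 range 3 .. 6;
   Interrupt_Flag at 0 range 7 .. 7;
end record;
for Status_Register'Size use 8;
```

A reader can put this declaration next to the datasheet and check the
bit layout field by field, and the name "Reserved" makes it plain that
those bits belong to the hardware, not to the next programmer who needs
some spare storage.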
It would be nice to have a convention on this one way or the
other.
MDC
Marin David Condic, Senior Computer Engineer Voice: 561.796.8997
Pratt & Whitney GESP, M/S 731-95, P.O.B. 109600 Fax: 561.796.4669
West Palm Beach, FL, 33410-9600 Internet: COND...@PWFL.COM
=============================================================================
"Because that's where they keep the money."
-- Willie Sutton when asked why he robbed banks.
=============================================================================