There are certainly reasons for wanting to cast a pointer to an integer
type, or the reverse. Sometimes there are even /good/ reasons for doing
so. But /if/ you are going to do that, in portable code, it is best to
use uintptr_t (or an OS-specific type, if there is one) rather than
using unsigned int.
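Roughly like this (a minimal sketch - the function name and the printing
are just for illustration, and it assumes the implementation provides
the optional uintptr_t type, as essentially all hosted ones do):

    #include <stdint.h>      /* uintptr_t */
    #include <inttypes.h>    /* PRIxPTR */
    #include <stdio.h>

    static void show_address(const void *p)
    {
        /* uintptr_t can hold any object pointer converted to it,
           unlike unsigned int (or unsigned long on some ABIs). */
        uintptr_t addr = (uintptr_t) p;
        printf("pointer as integer: 0x%" PRIxPTR "\n", addr);
    }

Converting back should go through the same type - (void *)(uintptr_t)
- if the original pointer value needs to be recovered.
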
>> However, the ABIs for Linux always require "unsigned long" to match the
>> size of a pointer, AFAIK. I believe that applies to all 64-bit Linux
>> systems. Be slightly wary of unusual ABIs like x32, which provide for
>> full 64-bit arithmetic but have 32-bit pointers and 32-bit long int.
>
> OK, thanks, that's good to know. In any amendments I succeed in making,
> I certainly won't be introducing any new "unsigned long"s which are
> really pointers.
>
>> For systems other than Linux, details may vary. 64-bit Windows has
>> 32-bit long, for example. This is one of the reasons the gcc manual
>> doesn't give details here (though I would prefer it if it did) - the
>> sizes depend on the target ABI, not just the processor.
>
> OK, <sigh>. It's just one of these historical things, I suppose, where
> it's a lot easier to criticise in hindsight than to do the right thing at
> the right time.
>
Yes. In particular, there was no "right" type for converting pointers
to an arithmetic type until C99 introduced uintptr_t. Prior to that, the
"right" thing would have been for the OS to provide a suitable type in
its compatibility layers (along with things like the endianness of the
system, and anything else that needs to be adjusted for particular
targets). But it was certainly common to use "unsigned long" for the job.
The disadvantages of using "unsigned long" here became clear when moving
to 64-bit systems. For Linux (and indeed *nix in general, AFAIK), the
size of long was picked to be 64-bit on the basis that the most common
unwarranted assumption about "long" is that it is the same size as a
pointer. In 64-bit Windows, "long" was set at 32-bit on the basis of
the common unwarranted assumption that "long" is always 32 bits.
Neither assumption was always correct, of course, and some code was
written assuming "long" was /both/ 32-bit /and/ the size of a pointer.
As you say, it is a historical thing - all we can do is try to avoid
repeating the same mistakes.