Hi all,
For python-flint I am trying to work out the correct way to typedef
two C types that are integers but whose size/typedef is not known
until the C compiler runs. Someone just spent a long time debugging a
problem that could possibly be considered a Cython bug, but either way
I want to establish the best way to typedef these types in the Cython
code.
The underlying C library, Flint, defines the types ulong and slong,
which represent unsigned and signed integer types. The general idea is
that they will usually be 32-bit on a 32-bit system and 64-bit on a
64-bit system, i.e. they represent unsigned/signed integers of the
system word size (although it is possible to configure them
differently when building...). Either way, their actual typedefs and
sizes are not known until the C compiler reaches #include
<flint/flint.h>, so they are not known to Cython apart from our
ctypedefs.
We currently have:
cdef extern from "flint/flint.h":
    ctypedef unsigned long ulong
    ctypedef long slong
This is correct for supported platforms except for 64-bit Windows,
where unsigned long and long are 32 bits but ulong and slong should be
64 bits. There it would be more correct to say:
cdef extern from "flint/flint.h":
    ctypedef unsigned long long ulong
    ctypedef long long slong
This is correct for supported 64-bit systems but would be incorrect on
all 32-bit systems.
So far it has not been a problem that Cython thinks ulong is an
unsigned long on 64-bit Windows, because for the most part Cython
manipulates these types textually and does not use their actual size
for anything. When needed, we can use C-level #defines to decide
whether long or long long is being used for C code:
cdef extern from *:
    """
    /* FLINT_BITS is not known until C compile time. We need to check if long
     * or long long matches FLINT_BITS to know which CPython function to call.
     */
    #if FLINT_BITS == 32 && LONG_MAX == 2147483647
    #define pylong_as_slong PyLong_AsLongAndOverflow
    #elif FLINT_BITS == 64 && LLONG_MAX == 9223372036854775807
    #define pylong_as_slong PyLong_AsLongLongAndOverflow
    #else
    #error FLINT_BITS does not match width of long or long long.
    #endif
    """
    slong pylong_as_slong(PyObject *pylong, int *overflow)
These preprocessor checks can probably be improved to handle more
platforms, but they work on all tested platforms and fail at build
time otherwise.
However, someone has just spent a while debugging a Windows-specific
failure and found that it boils down to this line not working correctly:
ulong *V = <ulong *>libc.stdlib.malloc(len(args) * sizeof(ulong))
Apparently this is because Cython expands ulong to unsigned long
inside sizeof(ulong), meaning that on 64-bit Windows malloc allocates
half the needed amount of memory:
/* "flint/types/nmod_mpoly.pyx":695
 *         cdef:
 *             ulong res
 *             ulong *V = <ulong *>libc.stdlib.malloc(len(args) * sizeof(ulong))             # <<<<<<<<<<<<<<
 *         if V is NULL:
 *             raise MemoryError("malloc returned a null pointer")             # pragma: no cover
 */
  __pyx_t_6 = __Pyx_PyTuple_GET_SIZE(__pyx_v_args); if (unlikely(__pyx_t_6 == ((Py_ssize_t)-1))) __PYX_ERR(0, 695, __pyx_L1_error)
  __pyx_v_V = ((ulong *)malloc((__pyx_t_6 * (sizeof(unsigned long)))));
Should that be considered a bug in Cython?
Does anyone have a good suggestion for how to handle the typedefs here?
We have considered marking ulong and slong as opaque struct types, but
that breaks a lot because we need to do arithmetic, comparisons,
conversion to Python etc. with them.
--
Oscar