Best Build Practices for Cython

pdx_s...@yahoo.com

Oct 28, 2023, 1:18:35 AM
to cython-users
I'm trying to adopt best practices in the build of my Cython-based project as well as I can determine what those are.  :-)  I have a pyproject.toml, a vestigial setup.py, and a src/_custom_build.py.  I have docs written in Sphinx/ReadTheDocs and some tests based on unittest.  Running `python3 -m build` from the top level directory creates a source distribution and wheel.

But I'm probably not setting up the run time and optional dependencies correctly.  I don't seem to be linking the build with binary extensions correctly to a python version.  I have a dependency on an external C library -- is there a way to specify that dependency within the Python build environment?


I would appreciate specific, substantive suggestions.  I'm using setuptools because I think they're the most available / best supported.  If you suggest alternate build tools, please explain why they're better (as opposed to developer taste) and how specifically to use them.  Please help with the run time and optional dependencies, how to link builds with python versions, and how to integrate with something like cibuildwheel to build for many versions.  Thanks to Marcel Martin for pointing to a cibuildwheel example that tests across many versions.

Scott

rainy liu

Oct 28, 2023, 9:05:44 AM
to cython...@googlegroups.com
Maybe you can try meson and meson-python, which are also used by numpy and scipy; that way, you can specify the linked libraries or dependencies.

'pdx_s...@yahoo.com' via cython-users <cython...@googlegroups.com> wrote on Sat, Oct 28, 2023 at 13:18:

Jérôme Kieffer

Oct 28, 2023, 10:06:11 AM
to cython...@googlegroups.com
On Sat, 28 Oct 2023 19:54:59 +0800
rainy liu <rainy...@gmail.com> wrote:

> Maybe you can try meson and meson-python, which are also used by numpy and
> scipy; that way, you can specify the linked libraries or dependencies.

+1: I did it for a couple of projects with a substantial amount of cython in them and it is:
- much faster to build/compile (native parallel build)
- much easier to handle OpenMP in multi-platform (linux, macos,
windows, x86 x86-64, ppc64le, arm, arm64, ...)
- very good support from the authors

A few annoying things:
- Still limited documentation
- one needs to declare every file, even python files. Apparently this
is not a bug but a feature.

Just have a look at this comment about the compilation time and the
energy needed in each case:

https://github.com/silx-kit/pyFAI/issues/1952#issuecomment-1727248202

It would be 5x greener and 8x faster to use meson instead of setuptools.

Cheers,
--
Jerome

Daniele Nicolodi

Oct 28, 2023, 10:16:42 AM
to cython...@googlegroups.com
On 28/10/23 16:06, Jérôme Kieffer wrote:
> On Sat, 28 Oct 2023 19:54:59 +0800
> rainy liu <rainy...@gmail.com> wrote:
>
>> Maybe you can try meson and meson-python, which are also used by numpy and

> Few anoying things:
> - Still limited documentation

What do you think is not well documented?

> - one needs to declare every file, even python files. Apparently this
> is not a bug but a feature.

If you don't like listing all the Python modules that need to be
installed, you don't need to do it: just use install_subdir(),
passing py.get_install_dir() to the install_dir keyword argument.

https://mesonbuild.com/Reference-manual_functions.html#install_subdir
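
As a sketch (the package name `mypkg` is a placeholder, not from this thread), that looks like:

```meson
# meson.build sketch: install an entire Python package directory
# without listing every module. "mypkg" is a placeholder name.
py = import('python').find_installation(pure: false)

install_subdir(
  'mypkg',
  install_dir: py.get_install_dir(),
)
```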

Cheers,
Dan

Daniele Nicolodi

Oct 28, 2023, 11:12:33 AM
to cython...@googlegroups.com
On 28/10/23 03:21, 'pdx_s...@yahoo.com' via cython-users wrote:
> I have a
> pyproject.toml, a vestigial setup.py,

Why do you need setup.py at all?

> and a src/_custom_build.py.

This should not be necessary to build extension modules.

> But I'm probably not setting up the run time and optional dependencies
> correctly.  I don't seem to be linking the build with binary extensions
> correctly to a python version.

I don't know what you mean by this statement. If the extension modules
you build do not work, evidently there is something wrong with how you
set up the build, but if you use a custom setuptools command to do the
build there is very little to suggest without debugging the
implementation of your custom build command, and that is something
most people do not have time for. It would be much easier if you used
the standard infrastructure.

> I have a dependency on an external C
> library -- is there a way to specify that dependency within the Python
> build environment?

What do you mean by "the Python build environment"? You use setuptools
to build your package. setuptools does not offer facilities to discover
which compilation flags you need to link to an external library on a
given platform.

> I would appreciate specific, substantive suggestions.  I'm using
> setuptools because I think they're the most available / best supported.

setuptools is probably the most widely used build system for Python
packages. Among the ones available, it is one of the few that supports
building extension modules. However, it lacks advanced features like the
ability to discover the compilation flags necessary to link to an
external library.

> If you suggest alternate build tools, please explain why they're better
> (as opposed to developer taste) and how specifically to use them.

I think you are asking too much here. I can point you to other
solutions, but you would need to do some reading on your side to decide
which solution solves your use case in the way you like the most.

I'm biased, but as others have suggested, I think that meson and
meson-python, its integration with the Python package build framework,
strike a very nice balance between flexibility, features, and ease of
use. Most packages in the scientific Python stack switched or are
switching to meson-python as their build system (in fact the development
of meson-python was initially driven by the needs of scipy).

> Please help with the run time and optional dependencies,

This is a bit too vague of a formulation to be able to offer concrete
advice. It depends on what the run-time and optional dependencies are.
In the case of libraries, it depends on whether you expect the library
to be provided by the platform or if it should be part of your package.

> how to link
> builds with python versions,

I don't understand what you mean here. Any Python build tool should
produce extension modules that work with the Python version that a
specific build invocation is targeting (except for very fundamental
bugs, that is).

> and how to integrate with something like
> cibuildwheel to build for many versions.

cibuildwheel uses pip or build as Python package build front-ends. In
turn pip and build invoke the build back-end using a standard interface.
Therefore cibuildwheel works with any (PEP517 compliant) build back-end.
setuptools is one such back-end.
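
For concreteness, the back-end is declared in pyproject.toml; a minimal sketch for setuptools with Cython (the version pins are illustrative) might be:

```toml
# Minimal PEP 517 build-system table; pins are illustrative.
[build-system]
requires = ["setuptools>=64", "wheel", "Cython"]
build-backend = "setuptools.build_meta"
```

cibuildwheel (via pip or build) reads this table and invokes the declared back-end.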

Cheers,
Dan

Jérôme Kieffer

Oct 28, 2023, 2:28:28 PM
to cython...@googlegroups.com
Hi Dan,

Don't feel attacked; I was just giving feedback based on my experience.

On Sat, 28 Oct 2023 16:16:34 +0200
Daniele Nicolodi <dan...@grinta.net> wrote:

> On 28/10/23 16:06, Jérôme Kieffer wrote:
> > On Sat, 28 Oct 2023 19:54:59 +0800
> > rainy liu <rainy...@gmail.com> wrote:
> >
> >> Maybe you can try meson and meson-python, which also used by numpy and
>
> > Few anoying things:
> > - Still limited documentation
>
> What do you think is not well documented?

When I migrated FabIO and pyFAI from numpy.distutils to meson-python
(~Dec 2022) there was very little documentation available on
meson-python and few projects to take inspiration from.
On the meson side, the documentation is great: I got the source code
compiled in an hour or so just by reading the documentation.
It was the binding to PEP 517 (meson-python) that was lacking
documentation, especially how to pass options and so on. But on the
other hand, I got plenty of help from various developers.

>
> > - one needs to declare every file, even python files. Apparently this
> > is not a bug but a feature.
>
> If you don't like listing all the Python modules that need to be
> installed, you don't need to do it: just use install_subdir(),
> passing py.get_install_dir() to the install_dir keyword argument.
>
> https://mesonbuild.com/Reference-manual_functions.html#install_subdir

This is marked as deprecated ... (since 0.45.0, deprecated since 0.60.0)

Cheers,

--
Jérôme

Daniele Nicolodi

Oct 28, 2023, 2:48:07 PM
to cython...@googlegroups.com
On 28/10/23 20:28, Jérôme Kieffer wrote:
> Hi Dan,
>
> Don't feel attacked; I was just giving feedback based on my experience.

I just asked for clarification about your messages.

>> What do you think is not well documented?
>
> When I migrated FabIO and pyFAI from numpy.distutils to meson-python
> (~dec2022) there was very few documentation available on the
> meson-python and little projects to get inspiration from.
> On the meson side, the documentation is great, I got the source code
> compiled in an hour or so, reading the documentation.
> It is the binding to PEP517 (meson-python) which was lacking
> documentation, especially how to pass options and so.

As with any other project, early adopters have it a bit harder. The
documentation of meson-python has been greatly improved since then. It
is still far from perfect, but it should be enough to get off the
ground. The main reason I was asking which parts you feel are
under-documented is to prioritize writing the missing pieces.

>>> - one needs to declare every file, even python files. Apparently this
>>> is not a bug but a feature.
>>
>> If you don't like listing all the Python modules that need to be
>> installed, you don't need to do it: just use install_subdir(),
>> passing py.get_install_dir() to the install_dir keyword argument.
>>
>> https://mesonbuild.com/Reference-manual_functions.html#install_subdir
>
> This is marked as deprecated ... (since 0.45.0, deprecated since 0.60.0)

The deprecation note refers only to creation of empty directories in the
destination when the source directory does not exist.

Cheers,
Dan

pdx_s...@yahoo.com

Oct 29, 2023, 12:45:24 AM
to cython-users
Thank you everyone.  I will read up on meson and meson-python.  I know many of my questions were vague/open-ended, but that's mostly because I don't know the right questions to ask.  I have a few more specific questions, though.

First, D Woods pointed out that the wheel I uploaded to PyPI is missing a version field.  He writes:
    D Woods> Your current download is "libvna-0.0.3-py3-none-any.whl" which suggests you haven't correctly linked it to a version.
How can I fix that?

What I uploaded to PyPI works only on Python 3.11, Linux, x86_64.  In another thread, I was asking how to support many versions.    I will try to use cibuildwheel to improve version coverage.
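
As a hedged sketch, cibuildwheel can also be configured from pyproject.toml; the selectors and test command below are illustrative, not from this thread:

```toml
[tool.cibuildwheel]
# Build CPython 3.8-3.12 wheels; skip PyPy and musl-based Linux.
build = "cp38-* cp39-* cp310-* cp311-* cp312-*"
skip = "pp* *-musllinux_*"
test-command = "python -m unittest discover -s {project}/tests"
```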

D Woods also said:
    You can also distribute a generic fallback wheel that compiles itself. Cython itself is distributed like this. Unfortunately I
    couldn't tell you exactly what to do off the top of my head (but hopefully someone else can).
Can someone point me to a good example of this (perhaps the Cython code)?

The question I asked about optional dependencies specifically was: what sections do I need in my pyproject.toml (if any) for tests and documentation?  On the docs, I have a .readthedocs.yaml, requirements.txt and docs/requirements.txt, so that one is handled in a different way -- it works.  But is that the right way to do it?  [It's fairly consistent with the readthedocs documentation, at least.]

The custom build script generates the list of include directories and library directories dynamically, adding $HOME/.local/{include,lib}, respectively, if the HOME environment variable is defined, and passes the linker options for the C libvna library and libm.  The reason for looking in $HOME/.local is that the ReadTheDocs container build environment runs as a non-privileged user that can't install the C library in /usr/local/lib or another standard place.  Linux has a [relatively new] convention that you can install in $HOME/.local if you don't have admin privilege.  It's a good feature anyway.  Is it straightforward to do this kind of run-time logic in meson/meson-python?

I would appreciate links to code that people generally agree reflect good examples.

Scott

Daniele Nicolodi

Oct 29, 2023, 5:40:47 AM
to cython...@googlegroups.com
On 29/10/23 02:32, 'pdx_s...@yahoo.com' via cython-users wrote:
> Thank you everyone.  I will read up on meson and meson-python.  I know
> many of my questions were vague/open-ended, but that's mostly because I
> don't know the right questions to ask.  I have a few more specific
> questions, though.
>
> First, D Woods pointed out that the wheel I uploaded to PyPI is missing
> a version field.  He writes:
>     D Woods> Your current download is "libvna-0.0.3-py3-none-any.whl"
> which suggests you haven't correctly linked it to a version.
> How can I fix that?

The wheel file name is wrong indeed, but this is not because you "didn't
link it to a version" (and I am not sure what that statement is
intended to mean). It is because your implementation of the setuptools
build command is incomplete: it does not inform the setuptools build
system that the wheel contains Python-version- and platform-specific
components. I still don't understand why you decided to reimplement the
build command yourself.

> What I uploaded to PyPI works only on Python 3.11, Linux, x86_64.  In
> another thread, I was asking how to support many versions.    I will try
> to use cibuildwheel to improve version coverage.
>
> D Woods also said:
>     You can also distribute a generic fallback wheel that's compile
> itself. Cython itself is distributed like this. Unfortunately I
>     couldn't tell you exactly what to do off the top of my head (but
> hopefully someone else can).
> Can someone point me to a good example of this (perhaps the Cython code)?

There is no such thing as a wheel that compiles itself. There are
source distributions (sdists), which are simply an archive of the sources
of your package plus some metadata. `python -m build -s` produces an sdist.

> The question I asked about optional dependencies specifically was: what
> sections do I need in my pyproject.toml
> (if any) for tests and documentation?  On the docs, I have a
> .readthedocs.yaml, requirements.txt and docs/requirements.txt, so that
> one is handled in a different way -- it works. But is that the
> right way to do it?

The `requirements.txt` mechanism is a pip thing that is unrelated to
building and distributing wheels. If you want to add optional
dependencies to your published package, the mechanism to use is the one
for optional dependencies in the package metadata. See

https://packaging.python.org/en/latest/specifications/declaring-project-metadata/#dependencies-optional-dependencies

and

https://setuptools.pypa.io/en/latest/userguide/dependency_management.html#optional-dependencies
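
A minimal sketch of such metadata in pyproject.toml (the group names and packages are illustrative, not from this thread):

```toml
[project.optional-dependencies]
test = ["pytest"]
docs = ["sphinx", "sphinx-rtd-theme"]
```

Users can then install a group with, e.g., `pip install your-package[test]`.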

> The custom build script generates the list of include directories and library directories dynamically, adding $HOME/.local/{include,lib}, respectively, if the HOME environment variable is defined, and passes the library options on the C libvna library and lib math.  The reason for looking in $HOME/.local was that the ReadTheDocs container build environment runs as a non-privileged user that can't install the C library in /usr/local/lib or other standard place.  Linux has a [relatively new] convention that you can install in $HOME/.local if you don't have admin privilege.  It's a good feature anyway.  Is it straightforward to do things like this run-time logic in meson/meson-python?

The location of headers and libraries should be controlled by the users
compiling your package, thus should not be part of the build recipe
definition. For instance, I may want to compile the package pointing it
to a specific version of a dependency installed in a non-standard location.

Conversely, it is the responsibility of the build tooling to discover
in which standard locations the dependencies are installed, and to add
the required compilation flags. These days, the recommended mechanism
for this is for libraries to distribute a pkg-config file with the
required information.

Meson handles all this transparently and fairly easily.
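
For example, a sketch of pkg-config-based discovery in meson (the extension name `_core` and its source file are hypothetical; the 'libvna' pkg-config name and version bound come from elsewhere in this thread):

```meson
# Discover an external C library via its pkg-config file.
py = import('python').find_installation(pure: false)
vna_dep = dependency('libvna', version: '>=0.3.9')

# Build a Cython extension module linked against it.
py.extension_module(
  '_core',
  '_core.pyx',
  dependencies: [vna_dep],
  install: true,
)
```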

Cheers,
Dan

Chris Barker

Oct 30, 2023, 10:11:16 PM
to cython...@googlegroups.com
Coming a bit late to this, but just to make it more complicated ;-)

A bit of background:

As of not too long ago, setuptools was essentially the only option for building packages with compiled extensions. However, it is built as an extension to the old distutils, which was not a very extensible system (at least not easily) -- so a lot of us had a lot of ugly hacks to get things to work, including numpy's custom extensions.

Anyway -- this was ultimately a dead end, and after a few false starts over the years, there are now two reasonable contenders for building packages with compiled extensions. One is meson-python, as is being discussed in this thread, and which is now used in production for numpy/scipy (and other parts of the scipy stack? I'm not sure).

The other is scikit-build.

I have a pretty complex package with a lot of custom C, C++ and Cython, and have decided to go with scikit-build. At the time, it seemed to be a little bit more mature and documented than meson-python, but that can change fast. And, honestly, I made the final choice because I managed to get a core scikit-build developer to give me a hand getting started :-) 

So I can't say whether it's a better or worse option than meson -- but it's powerful, flexible,  well documented, supported, and under active development -- so certainly worth considering.

scikit-build also has some Cython-specific docs.

Anyway, meson or scikit-build aside, you also have to deal with the rest of the Python packaging system -- and while PyPA has been working very hard to get things robust and standardized, they really haven't helped with the compiled-package world -- that's still a big challenge, particularly if your package relies on any external libraries that aren't written in Python. Which I bring up because:
 
>   I have a dependency on an external C library -- is there a way to specify that dependency within the Python build environment?

The short answer is no -- pip/PyPI have no support for dependencies on external libraries. So your options are:

1) Statically link or ship a copy of the dynamic lib with your package -- maybe that's what you are already doing, and for a single, small obscure lib, that's a fine option.

2) Use conda -- conda is designed to handle packages of any type, so it can be (and is) used for managing not just Python packages, but also libraries written in Fortran, C, C++, Rust, etc. So in this case, you'd make a conda package of the lib you need, and then your conda package depends on that -- if you're lucky, and the lib you need isn't too obscure, maybe someone has already done it on conda-forge.

This can be very, very helpful if you need a not-too-obscure lib, as then you don't have to figure out how to build it on all platforms yourself.

3) If you only care about Linux, you can probably just tell your users they need a given library installed in a standard location -- but that's only really viable on Linux.

> What I uploaded to PyPI works only on Python 3.11, Linux, x86_64.

Just a process note -- it's a really bad idea to upload stuff to PyPI before it's ready -- "broken" wheels on PyPI are not helpful to anyone. There is a test version of PyPI (TestPyPI) that you can work with while you figure things out.

But honestly, get everything building first -- uploading to PyPI is the last step (and not the hard one)

-Hope I've done more than just muddy the waters ...

-CHB

--

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris....@noaa.gov

mattip

Oct 31, 2023, 6:17:28 AM
to cython-users
cibuildwheel has a repair step[0] that calls auditwheel (Linux), delocate (macOS), and, experimentally, delvewheel (Windows) to detect the dynamic libraries that your C-extension module needs. These tools know about the standard libraries that ship with the OS and will not bundle those; any other detected DLLs/shared objects will be bundled into your wheel. On POSIX, the RPATH of the C extension is modified so that loading the C extension finds the bundled shared objects. On Windows, delvewheel packages the extra DLLs, but add_dll_directory() must be used to add their location to the search path; this can be done in the package's __init__.py script.
Matti


On Tuesday, 31 October 2023 at 04:11:16 UTC+2 chris....@noaa.gov wrote:
...
1) Statically link or ship a copy of the dynamic lib with your package -- maybe that's what you are already doing, and for a single, small obscure lib, that's a fine option.
...
-CHB

Oscar Benjamin

Oct 31, 2023, 6:37:03 AM
to cython...@googlegroups.com
On Tue, 31 Oct 2023 at 10:17, mattip <matti...@gmail.com> wrote:
>
> cibuildwheel has a repair script[0]] to call auditwheel (linux) delocate (darwin) and experimentally delvewheel (windows) to detect dynamic libraries that are needed for your c-extension module. These tools know about standard libraries that ship with the OS and will not bundle those. Any other detected dlls/shared objects will be bundled into your wheel. On posix, RPATH will be modified in the c-extension so that loading the c-extension will find the bundled shared objects, on windows delvewheel will package the extra dlls but add_dll_directory() must be used to add the location to the search path. This can be done in the package's __init__.py script.

I have a Cython-based project that uses setuptools, cibuildwheel,
auditwheel, delocate and delvewheel:
https://github.com/flintlib/python-flint

The only complication I had with delvewheel was that I needed to use
strip on MinGW generated DLLs although I am told that this might not
be necessary any more.

I haven't found it necessary to use add_dll_directory() but maybe I am
missing something. I just tell delvewheel where to look for the DLLs
that were built and it bundles them into the wheel and then everything
seems to work after the wheel gets installed:
https://github.com/flintlib/python-flint/blob/9252d179da5116f70b5449c44408768b3b73e183/bin/cibw_repair_wheel_command_windows.sh#L48-L53

When is it necessary to use add_dll_directory()?

Is add_dll_directory() needed at build time or at run time?

See also this new tool called repairwheel:
https://discuss.python.org/t/repairwheel-single-cross-platform-interface-to-auditwheel-delocate-and-delvewheel/31827

--
Oscar

Peter Schay

Oct 31, 2023, 9:10:01 AM
to cython...@googlegroups.com
Hello,

Here are some widely used and relatively small pypi projects which include extension modules.  It's a fairly meager contribution to this discussion, but hopefully someone will find useful ideas from perusing these:

- cffi
- zstandard
- pycryptodome
- uvloop
- cython (already mentioned of course)

My project is a python+cython utility for my company that, unfortunately, I'm obligated to keep closed-source for now, so it's been easiest to build everything, including Python, into a single binary with just a few shared lib dependencies so it runs on any linux distro.  That has been fun and quite interesting to figure out. 
I look forward to publishing some wheels in the future though, and thanks to everyone for all the useful information in this thread which I have bookmarked.

Regards,
Pete

pdx_s...@yahoo.com

Jan 8, 2024, 12:40:22 AM
to cython-users
The comments so far have been very helpful.  I've made many changes since my last post.  I've converted the build environment from setuptools to meson-python.  I'm using matrix builds with cibuildwheel in github workflows to build for multiple operating systems and python versions.  Linux and MacOS builds look good.  But I'm running into problems on Windows.

I'm pretty closely following SciPy as an example, though my project is a bit simpler in that it doesn't use Fortran or C++.  I'm installing Rtools just for pkg-config (a bit of overkill, probably).

Here's the main part of the GitHub workflow script (most options commented out):

```yaml
jobs:
  build_wheels:
    name: Build wheel for ${{matrix.python[0]}}-${{matrix.buildplat[1]}} ${{matrix.buildplat[2]}}
    runs-on: ${{ matrix.buildplat[0] }}
    strategy:
      matrix:
        buildplat:
        # - [ubuntu-22.04, manylinux, x86_64]
        # - [ubuntu-22.04, musllinux, x86_64]
        # - [macos-11, macosx, x86_64]
        - [windows-2019, win, AMD64]
        python:
        # - ["cp38",  "3.8"]
        # - ["cp39",  "3.9"]
        - ["cp310", "3.10"]
        # - ["cp311", "3.11"]
        # - ["cp312", "3.12"]
    steps:
    - name: Checkout Code
      uses: actions/checkout@v3

    - uses: actions/setup-python@v4
      with:
        python-version: 3.9

    - name: Install Rtools (Windows)
      # Rtools provides pkg-config on Windows
      if: ${{ runner.os == 'Windows' }}
      run: |
        choco install rtools -y --no-progress --force --version=4.0.0.20220206
        echo "c:\rtools40\ucrt64\bin;" >> $env:GITHUB_PATH

    - name: Install cibuildwheel and twine
      run: |
        python -m pip install --upgrade pip
        pip install cibuildwheel twine

    - name: Install Libvna (Windows)
      if: ${{ runner.os == 'Windows' }}
      run: |
        python3 .github/tools/download-libvna-windows
        New-Item -Path C:\opt -ItemType Directory -Force
        Expand-Archive -Path libvna-windows-x86_64.zip -DestinationPath C:\opt
        echo 'C:\opt\libvna-windows-x86_64\bin;' >> $env:GITHUB_PATH
        Write-Host "GITHUB_PATH=$env:GITHUB_PATH"

    - name: Build Wheels
      env:
        CIBW_BUILD: ${{ matrix.python[0] }}-${{ matrix.buildplat[1] }}*
        CIBW_ARCHS: ${{ matrix.buildplat[2] }}
        CIBW_ENVIRONMENT: "PIP_NO_BUILD_ISOLATION=false PIP_PRE=1"
        CIBW_ENVIRONMENT_PASS_LINUX: RUNNER_OS
        CIBW_PRERELEASE_PYTHONS: True
        CIBW_BEFORE_BUILD_LINUX: |
          echo "WHOAMI=`whoami`"
          echo "UNAME=`uname -a`"
          set -e
          yum install -y gcc libyaml-devel
          pip install requests
          python3 .github/tools/install-libvna-rpm
        CIBW_ENVIRONMENT_WINDOWS: >
          PKG_CONFIG_PATH=c:/opt/libvna-windows-x86_64/lib/pkgconfig
          PIP_PRE=1
          PIP_NO_BUILD_ISOLATION=false
        CIBW_BEFORE_BUILD_WINDOWS:
          pip install numpy>=2.0.0.dev0 meson-python cython!=3.0.3 ninja &&
          bash {project}/tools/test-script &&
          powershell -Command "Write-Host GITHUB_PATH=$env:GITHUB_PATH"
        CIBW_BEFORE_BUILD_MACOS: |
          pip install numpy>=2.0.0.dev0 meson-python cython!=3.0.3 ninja
          brew tap scott-guthridge/extra
          brew install scott-guthridge/extra/libvna
      run: python -m cibuildwheel --output-dir wheelhouse
```

The CIBW_BEFORE_BUILD_WINDOWS step completed successfully.  The test script called there sees my C library installed, and pkg-config is returning reasonable-looking paths to the include and lib directories.

It's failing fairly early on in the build wheels step:

```
The Meson build system
Version: 1.3.1
Source dir: D:\a\pylibvna-cicdtest\pylibvna-cicdtest
Build dir: D:\a\pylibvna-cicdtest\pylibvna-cicdtest\.mesonpy-mc6o1dxn
Build type: native build
Project name: libvna
Project version: 0.0.5-13
Cython compiler for the host machine: cython (cython 3.0.7)
Host machine cpu family: x86_64
Host machine cpu: x86_64
Program python found: YES (C:\Users\runneradmin\AppData\Local\Temp\cibw-run-wcok67kg\cp310-win_amd64\build\venv\Scripts\python.exe)
Run-time dependency python found: NO
..\meson.build:13:13: ERROR: Python dependency not found
```

I suspect it's complaining that it can't find the python development environment.  But comparing with the SciPy build, I don't see what I'm doing differently in this respect.

Here's the beginning of the top-level meson.build:

```
project('libvna', 'cython',
  version: '0.0.5-13',
  license: 'GPL-3',
  default_options: [
    'buildtype=debugoptimized',
    'b_ndebug=if-release',
    'c_std=c99',
  ],
)

py_mod = import('python')
py3 = py_mod.find_installation(pure: false)
dep_py = py3.dependency()                                               <----- line 13 FAILING HERE
dep_vna = dependency('libvna', version: '>=0.3.9')

srcdir = meson.source_root()
testdir = join_paths(srcdir, 'tests')
builddir = meson.current_build_dir()
```

Ideas?

I have some other questions, and people will probably comment on other things they see here, but I'm trying to stay focused on the problem at hand for this post.  Likely, there will be other problems after this one.

Scott

pdx_s...@yahoo.com

Jan 9, 2024, 2:37:34 AM
to cython-users
Just to get these into the pipeline, I'm going to ask a couple of my other questions.

(1) My C library uses C99 complex throughout.  I'm building the Windows version of the library in the MSYS2 UCRT64 environment using gcc (which understands C99), and this produces real Windows import and .dll libraries.  After the build, I post-process the header files and documentation to replace every instance of "double complex" with "_Dcomplex" (Microsoft's struct-based implementation).  The two implementations are binary (but not source) compatible, and test programs written in the Microsoft style compile in Visual C against the library and run correctly.  My question is: how does Cython handle Microsoft's implementation of complex when run in the native Windows environment?

My hope is that it automatically converts C99 complex declarations into Microsoft-compatible code under Windows.  But my fear is that it doesn't -- that the Cython source has to "know" about the Windows implementation.  One way around it might be to build C++ bindings around the C library, since Visual C is really a C++ compiler.  Will Cython generate C++, or would I have to use pybind11?  It's work I'd rather not have to do, in any case.  Another possibility would be to do the Cython build in the MSYS2 UCRT64 environment instead of the native Windows environment -- then C complex works normally and the generated binaries are still native-Windows compatible.


(2) On Linux, I'm building in the manylinux2014_x86_64 container, which should be compatible with most Linux distributions.  Question: do I need to also support musllinux?  What platforms does musllinux support that manylinux does not?

Scott