python binary build fails on linux, succeeds on mac

Or Enzer

Dec 28, 2022, 12:28:18 PM
to bazel-discuss
Hi,
I have a py_binary rule that I'm trying to build and run on a Linux VM, after successfully building and running it on a local Mac machine.

I'm getting the following error:
SystemError: ffi_prep_closure(): bad user_data (it seems that the version of the libffi library seen at runtime is different from the 'ffi.h' file seen at compile-time)

I'm using rules_python for the Python toolchain and pip packages, with Python 3.10.
What I've tried:
  1. Changing Python to 3.9
  2. Installing libffi-dev
  3. Switching from Debian to Ubuntu
Yet nothing helped.
Any idea where the libffi library version mismatch is coming from? Any help would be appreciated!
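One way to narrow this down (a diagnostic sketch, not from the original thread) is to locate the compiled cffi extension the interpreter is actually importing; on Linux the reported path can then be passed to `ldd` to see which shared libffi, if any, it links against. Inside a Bazel py_binary the path will point into the runfiles tree rather than site-packages:

```python
# Diagnostic sketch: find the compiled cffi extension module so it can be
# inspected with `ldd <path>` on Linux to see which libffi it loads at runtime.
import importlib.util

spec = importlib.util.find_spec("_cffi_backend")
if spec is None:
    print("cffi is not installed in this interpreter")
else:
    print("cffi extension:", spec.origin)
```

If `ldd` shows a libffi bundled inside the wheel while the `ffi.h` seen at build time came from the system libffi-dev, that mismatch can produce exactly the `ffi_prep_closure()` error above.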



Appendix - full stack trace of error:
Traceback (most recent call last):
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_twisted/twisted/internet/defer.py", line 1693, in _inlineCallbacks
    result = context.run(
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_twisted/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_scrapy/scrapy/core/downloader/middleware.py", line 49, in process_request
    return (yield download_func(request=request, spider=spider))
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_scrapy/scrapy/utils/defer.py", line 72, in mustbe_deferred
    result = f(*args, **kw)
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_scrapy/scrapy/core/downloader/handlers/__init__.py", line 75, in download_request
    return handler.download_request(request, spider)
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_scrapy/scrapy/core/downloader/handlers/http11.py", line 65, in download_request
    return agent.download_request(request)
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_scrapy/scrapy/core/downloader/handlers/http11.py", line 340, in download_request
    d = agent.request(method, to_bytes(url, encoding='ascii'), headers, bodyproducer)
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_twisted/twisted/web/client.py", line 1148, in request
    endpoint = self._getEndpoint(parsedURI)
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_twisted/twisted/web/client.py", line 1132, in _getEndpoint
    return self._endpointFactory.endpointForURI(uri)
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_twisted/twisted/web/client.py", line 1003, in endpointForURI
    connectionCreator = self._policyForHTTPS.creatorForNetloc(
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_scrapy/scrapy/core/downloader/contextfactory.py", line 67, in creatorForNetloc
    return ScrapyClientTLSOptions(hostname.decode("ascii"), self.getContext(),
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_scrapy/scrapy/core/downloader/tls.py", line 40, in __init__
    super().__init__(hostname, ctx)
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_twisted/twisted/internet/_sslverify.py", line 1128, in __init__
    ctx.set_info_callback(_tolerateErrors(self._identityVerifyingInfoCallback))
  File "/home/or/.cache/bazel/_bazel_or/d1cd78077b282a8bd1730bb1cdb7f294/execroot/__main__/bazel-out/k8-fastbuild/bin/ai/blix/core/data/sourcer/web_crawl/runner.runfiles/pypi_pyopenssl/OpenSSL/SSL.py", line 1338, in set_info_callback
    self._info_callback = _ffi.callback(
SystemError: ffi_prep_closure(): bad user_data (it seems that the version of the libffi library seen at runtime is different from the 'ffi.h' file seen at compile-time)

Or Enzer

Dec 29, 2022, 6:04:48 AM
to bazel-discuss
Solved the issue.
To close this one: the fix was to add --no-binary=cffi to my requirements.in file.
That way, cffi is built from source against the same libffi version that is present at runtime (the prebuilt wheel from PyPI was probably built against a different version).
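For reference, a minimal requirements.in illustrating the fix (the package list besides cffi is a guess based on the packages visible in the stack trace above):

```
# requirements.in
# Force pip to build cffi from source instead of installing the prebuilt
# manylinux wheel, so the extension is compiled against the libffi
# headers/library installed on this machine (requires libffi-dev).
--no-binary=cffi
cffi
scrapy
twisted
pyopenssl
```

pip honors per-requirement options like `--no-binary` inside a requirements file, so this carries through when rules_python compiles and installs the locked requirements.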
