The old pipeline will be deprecated very soon, so please try the new one out in your repo as soon as you can (see that pull request for how to enable it). Your users will benefit from a much snappier Python experience.
It seems consistent with large-scale monorepo tooling: I should be able to run tests in, e.g., a common library without needing intimate knowledge of every individual subproject to do it. In this case, I have generic library code and also some Django projects. Django has its own testing peculiarities, but there is a plugin to make it work with pytest, and it works fine, except that once Django is "started" it can't be torn down (afaik), and even if it could be torn down, I'm not sure that would still fit into the pytest/unittest wrapper.
So despite being slow, running each python_test target in its own interpreter is a viable solution, and some of that performance hit can be recovered with sharding and multicore execution.
Also, I'm not having any luck getting Python coverage to work in recent builds; it seemed to work fine earlier in the 1.3 series. Is that expected, or possibly a configuration issue on my end?
So the issue is not that you have colliding dependencies (e.g., incompatible versions of 3rdparty deps) but that everything runs in a single pytest invocation? Because that, specifically, wouldn't be hard to change. The old "nofast" mode actually created a new chroot for each test target, including resolving all of its 3rdparty requirements, so it was horribly slow but also completely isolated. It sounds like your requirements may be less severe?
> Also, I'm not having any luck getting Python coverage to work in recent builds; it seemed to work fine earlier in the 1.3 series. Is that expected, or possibly a configuration issue on my end?

How are you specifying that config? We upgraded to non-prehistoric versions of pytest, pytest-cov and coverage, which necessitated a change in how those options work on our end.
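For reference, with current pytest-cov the coverage options are passed as pytest arguments, along these lines (the package name `mylib` is a placeholder, and how Pants forwards these options may differ):

```ini
# setup.cfg sketch; `mylib` is a placeholder package name.
[tool:pytest]
addopts = --cov=mylib --cov-report=term-missing
```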