Unfortunately, packaging Python is more complicated than just PyPI. I've never seen that talk, but I work on the system in question.
To build Python hermetically, you need to control Python itself and all the transitive native libraries it might load. Each PyPI library expands that scope of potential dependencies. You also have to decide whether your entire production environment will support exactly one version of each external library (this is the "vendoring" discussion). These choices extend to native code as well: how many versions of zlib do we need? How do you track critical security fixes? For stability, supporting multiple versions is not really tenable. You also need to deal with multiple OS releases and various incarnations of the runtime (glibc etc.) that may not be fully compatible. We solve this with a "portable" runtime that makes every Linux OS capable of running binaries produced by Bazel.
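Concretely, a one-version policy tends to look something like this in the third-party tree. This is an illustrative sketch, not the actual internal layout; the paths and target names are assumptions:

```python
# third_party/py/zlib_wrapper/BUILD (hypothetical layout)
# Exactly one version of each external library is checked in;
# every consumer in the repo depends on this single target, so a
# security fix lands in one place and reaches everyone at once.
py_library(
    name = "zlib_wrapper",
    srcs = ["zlib_wrapper.py"],
    deps = ["//third_party/zlib"],  # the one blessed native zlib
    visibility = ["//visibility:public"],
)
```

The trade-off is that every upgrade is repo-wide: you can't pin an old version for one team, which is exactly why the vendoring discussion is contentious.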
Once you have constrained the problem a bit, you can write a program with some simple heuristics to generate accurate BUILD files for Python. PyPI and native code generally must be handled by hand as a one-time cost. The tools assist you, but some issues (e.g. ambiguous imports) still have to be resolved manually.
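To give a flavor of what those heuristics look like, here's a minimal sketch of an import-scanning BUILD generator. Everything here is an assumption for illustration (the `//third_party/py/...` dep mapping, the stdlib allowlist, the target naming), not the real tool:

```python
# Hypothetical sketch: scan Python sources for imports and emit a
# py_library rule whose deps point at assumed //third_party targets.
import ast


def find_imports(py_source):
    """Return the set of top-level module names imported by a source file."""
    modules = set()
    for node in ast.walk(ast.parse(py_source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                modules.add(alias.name.split(".")[0])
        elif isinstance(node, ast.ImportFrom):
            # Skip relative imports; they resolve within the same target.
            if node.module and node.level == 0:
                modules.add(node.module.split(".")[0])
    return modules


# In practice this would be the full stdlib list for the pinned interpreter.
STDLIB = {"os", "sys", "json", "re"}


def generate_build(target_name, files):
    """files: {filename: source}. Maps non-stdlib imports to assumed deps."""
    deps = set()
    for src in files.values():
        for mod in find_imports(src):
            if mod not in STDLIB:
                deps.add(f"//third_party/py/{mod}")
    src_lines = "".join(f'        "{s}",\n' for s in sorted(files))
    dep_lines = "".join(f'        "{d}",\n' for d in sorted(deps))
    return (
        "py_library(\n"
        f'    name = "{target_name}",\n'
        f"    srcs = [\n{src_lines}    ],\n"
        f"    deps = [\n{dep_lines}    ],\n"
        ")\n"
    )
```

The "ambiguous imports" problem shows up immediately: if two packages both provide a module named `utils`, no amount of static scanning can tell you which target to depend on, and a human has to break the tie.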
The by-product of these constraints is that I don't think our rules for Python/Go/TypeScript/etc. are portable or externally useful in their current form. A lot of the decisions were influenced by the timeline and the internal environment. I'm not sure there is a clear path to open-sourcing something like that in a way that's actually helpful to the community, since the scope is so large.