It is sometimes possible for a Python module defined by (or depended on by) the build to shadow a module in the standard library. This can lead to errors. For example (paraphrasing katzdm), you could indirectly depend on the "enum34" PyPI package, which provides a backported "enum" module for older versions of Python that lack enum in their standard library. On newer versions of Python this shadows the standard library's enum and breaks it, unless the standard library is allowed to take precedence over enum34 in the PYTHONPATH.
(Or, from my own experience teaching programming to kids, you might name your local utility file "turtle.py" and suddenly get confusing errors when you "import turtle" in your main program.)
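To make the mechanism concrete, here's a small standalone sketch (not Bazel code) showing why the ordering matters: a local file with a standard-library name wins when its directory is prepended to sys.path, but the standard library wins when it is appended. It uses colorsys as the shadowed module purely for illustration.

```python
import importlib
import os
import sys
import tempfile

# Create a local "colorsys.py" that shadows the stdlib module of the
# same name. (colorsys is chosen only because it's rarely pre-imported.)
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "colorsys.py"), "w") as f:
    f.write("SHADOWED = True\n")

# Prepend the directory: the local file is found first and the
# stdlib module is hidden.
sys.path.insert(0, workdir)
sys.modules.pop("colorsys", None)  # forget any cached import
shadowing = importlib.import_module("colorsys")
print(hasattr(shadowing, "SHADOWED"))    # True  (local file won)
print(hasattr(shadowing, "rgb_to_hls"))  # False (stdlib hidden)

# Append instead: the stdlib module takes precedence again.
sys.path.remove(workdir)
sys.path.append(workdir)
stdlib = importlib.reload(shadowing)
print(hasattr(stdlib, "rgb_to_hls"))     # True  (stdlib won)
```

This is exactly the difference between the current prepend behavior and the proposed append behavior, reduced to plain sys.path manipulation.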
PR #6532 would prevent this by having Bazel append custom import paths to PYTHONPATH, rather than prepend them. But since this is an observable difference to the user, I'd like to get some feedback on whether this is better viewed as a bugfix, or as a feature that needs to be controllable by the user, perhaps via an attribute on py_library or py_binary. Are there legitimate use cases for overriding the standard library / the system PYTHONPATH?
My suspicion is that we can get away with making this change unconditional, and require users who want the old behavior to hack sys.path at startup, or (in the future) customize the launcher stub script.
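For users who do want the old precedence back, the startup hack could be as small as the following sketch. The function name and the example path are hypothetical; the point is just that moving import roots back to the front of sys.path is a one-liner per root.

```python
import sys

def prefer_local_roots(roots):
    """Move the given directories to the front of sys.path,
    restoring prepend-style precedence for them."""
    for root in reversed(roots):
        if root in sys.path:
            sys.path.remove(root)
        sys.path.insert(0, root)

# Hypothetical import root that should shadow the stdlib:
prefer_local_roots(["/path/to/my/imports"])
print(sys.path[0])  # /path/to/my/imports
```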
Thoughts?