Alternative(?) approach for sharing modules between applications


Günther Jena

Jul 31, 2015, 9:26:57 AM
to pypa-dev
Hi,

in our company we're using Subversion. We use various Python modules (our own and third-party) in different versions. The applications we develop have differing dependencies on the versions of these shared modules.

One possibility is using virtualenv and installing the modules from a local PyPI server. On every initial checkout we then need to create a virtualenv, activate it, and install the dependencies from requirements.txt.
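As a sketch, the checkout-and-run procedure would look roughly like this (the repository URL and the index URL pypi.example.local are hypothetical placeholders, not our actual setup):

```shell
# Initial checkout of the application (hypothetical URL)
svn checkout https://svn.example.local/myapp/trunk myapp
cd myapp

# Create and activate an isolated environment
virtualenv venv
. venv/bin/activate

# Install pinned dependencies from the local index (hypothetical host)
pip install --index-url https://pypi.example.local/simple -r requirements.txt

# Only now is the application ready to run
python main.py
```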

Disadvantages:
* A relatively complex procedure for a simple task like "check out and run"
* It's possible to forget creating the virtualenv and end up working with the modules installed in the global site-packages
* Need for a local PyPI server (though you could instead use URLs pointing to your VCS)

So we came up with another solution, and I'd like to ask for your opinion:
In the path of the application we use svn:externals (comparable to git submodules) to "link" to the specified module (from its release path and with a pinned revision number to keep it read-only), so the module is placed locally in the application's directory. An "import mylib" then works as if the module were installed in site-packages or in a virtualenv. This could be extended to put releases of wx, numpy, and other frequently used libraries into our repository and link them locally as well.
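A sketch of how such an externals definition might look, assuming a conventional tags layout like /mylib/tags/1.2.0 (repository URL, paths, and revision number are hypothetical):

```shell
# Pin mylib at revision 4711 from its release (tag) path into the
# application's working copy; the -rREV form keeps it fixed even if
# the tag is later (incorrectly) modified.
svn propset svn:externals \
    'mylib -r4711 https://svn.example.local/mylib/tags/1.2.0/mylib' .

# The next update fetches the pinned external into ./mylib
svn update
```

With the external checked out next to the application's entry point, "import mylib" resolves from the script's directory without any further installation step.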

The advantages are:
* After the initial checkout you're ready to run (a really important point for me)
* Version dependencies are pinned (like with requirements.txt)

I've never seen such a solution elsewhere, so maybe we're missing something?

Best Regards
  Günther Jena