Moving complete Trac installation to pip


Mo

Nov 14, 2019, 3:21:51 AM
to Trac Users
Hi,

Our current installation on Gentoo Linux was done with the Linux distribution's package manager for the Trac core, while each plugin was built manually as an .egg and that .egg copied to $PROJECTDIR/plugins.

Trac version updates always lag behind in the package manager, so I started creating my own packages for it.
Updating the add-ons required a 'git pull' for every single plugin to see whether there were changes, and then rebuilding the .egg.

I learned how to install local Python libraries inside the ~trac/ home directory, like

pip2.7 install --user xhtml2pdf

and how to make uwsgi use that:

# cat /etc/uwsgi.d/trac-pp.ini
[uwsgi]
plugins = python27
chown-socket = trac:nginx
uid = trac
gid = trac
workers = 6
socket = /run/uwsgi/%n.sock

env = TRAC_ENV=/mnt/data/trac/projects/trac-pp
env = PYTHON_EGG_CACHE=/mnt/data/trac/.python-eggs
module = trac.web.main
callable = dispatch_request

pythonpath = /mnt/data/trac/.local/lib64/python2.7/site-packages

Now, migrating from Trac 1.2.3 to Trac 1.4.x, I checked the installation guide's section on pip and saw that all Trac releases are available there, along with a lot of the plugins.
Is it a good idea to move the whole Trac installation towards using pip and install into the ~trac/ home? How would I check all user packages for updates, and how would I update them all?

About Python 2.7: there is still the requirement of a version ≥ 2.7 and < 3.0, which is currently 2.7.16 here on Gentoo stable. For how much longer? Even pip2.7 warns that "Python 2.7 will reach the end of its life on January 1st, 2020".

Best regards.

Mo

Nov 14, 2019, 5:09:39 AM
to Trac Users
On Thursday, November 14, 2019 at 09:21:51 UTC+1, Mo wrote:

Is it a good idea to move the whole Trac installation towards using pip and install into the ~trac/ home? How would I check all user packages for updates, and how would I update them all?

At least for the plugins it does not seem to be a good solution: only about half of our plugins are available via pip, and some of those are not the latest version, e.g. TracAccountManager-0.5.1.dev0-py2.7.egg vs. TracAccountManager (0.5.0) on pip.

Ryan Ollos

Nov 14, 2019, 5:55:54 AM
to Trac Users
A correction of your terminology: TracAccountManager 0.5.0 is available on "PyPI", and passing a package name to pip, like "pip install TracAccountManager", will install it from PyPI.

But you can also install directly from a source code repository with pip. See (1) for examples:


* Replace pip with pip2.7 as needed
* For Git repositories, use "git+" prefix rather than "svn+"
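
For example, an install straight from a repository might look like this (the URLs below are placeholders — substitute the plugin's actual repository path):

pip2.7 install svn+https://trac-hacks.org/svn/someplugin/trunk
pip2.7 install git+https://github.com/example/some-trac-plugin.git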

Mo

Nov 14, 2019, 9:03:17 AM
to Trac Users
On Thursday, November 14, 2019 at 11:55:54 UTC+1, RjOllos wrote:


* Replace pip with pip2.7 as needed
* For Git repositories, use "git+" prefix rather than "svn+"

Thanks. But when installing from a custom link, do I need to run another 'pip install -U' against exactly the same link for later updates, or does the installation remember its source URI?

Btw., all this updating via pip is not optimal. From what I learned on #python@freenode, pip has no 'pip update' and no 'pip update all', which every package manager should have a convenience function for. There is only
pip2 install --user -U <pkglist>
and
pip2 list --user --outdated
whose output I could use as input for the first.
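Chained together, something like this might work (an untested sketch; the column parsing may need adjusting for other pip versions):

pip2 list --user --outdated | awk 'NR>2 {print $1}' | xargs -r -n1 pip2 install --user -U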
I could also maintain a <requirements file>, but that would mean updating it for every new package.

Finally, after talking about these shortcomings of pip, I was told that "poetry (for projects) and pipx (for executables) are more suitable tools for end users", and that an 'update all' is a "woefully ill-advised misfeature in library managers like pip and npm".
I'm no Python dev yet and a bit confused... so pip is more of a library manager than a package manager?
But Trac still advises using pip if one is not doing the "python2.7 setup.py bdist_egg" build manually.

Ryan Ollos

Nov 14, 2019, 9:35:38 AM
to Trac Users
On Thu, Nov 14, 2019 at 6:03 AM Mo <burcheri...@gmail.com> wrote:
On Thursday, November 14, 2019 at 11:55:54 UTC+1, RjOllos wrote:


* Replace pip with pip2.7 as needed
* For Git repositories, use "git+" prefix rather than "svn+"

Thanks. But when installing from a custom link, do I need to run another 'pip install -U' against exactly the same link for later updates, or does the installation remember its source URI?

No, I don't think it remembers.

Instead, check-out/clone the repository and run "pip2 install ." from the root of the repository (or you may be able to provide a path rather than using "." from the root).
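
For example (the clone URL here is just a placeholder for whichever plugin repository you use):

git clone https://github.com/example/some-trac-plugin.git
cd some-trac-plugin
pip2 install --user .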
 
Btw., all this updating via pip is not optimal. From what I learned on #python@freenode, pip has no 'pip update' and no 'pip update all', which every package manager should have a convenience function for. There is only
pip2 install --user -U <pkglist>
and
pip2 list --user --outdated
whose output I could use as input for the first.
I could also maintain a <requirements file>, but that would mean updating it for every new package.

Sure, but writing a requirements.txt that lists all of your installed packages is not much work at all; in fact, it will save you time. "git+https" and "svn+https" URLs can be used in the requirements file. I believe you can also use local paths to a repository checkout/clone (check the docs on that).

Then, "pip2 install -Ur requirements.txt", and you never have to worry about listing packages to update.

That gets you the same result as if the installation remembered its source URI. And requirements.txt serves as documentation for your installed packages.
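
A minimal sketch of what such a requirements.txt could contain (the package names, pins and repository URLs here are only examples):

Trac==1.4
TracAccountManager
svn+https://trac-hacks.org/svn/someplugin/trunk#egg=TracSomePlugin
git+https://github.com/example/some-trac-plugin.git#egg=SomeOtherTracPlugin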
 
Finally, after talking about these shortcomings of pip, I was told that "poetry (for projects) and pipx (for executables) are more suitable tools for end users", and that an 'update all' is a "woefully ill-advised misfeature in library managers like pip and npm".
I'm no Python dev yet and a bit confused... so pip is more of a library manager than a package manager?

pip is a package installer. It can update packages from PyPI and will update to the latest version. When installing from a repository, I believe it rebuilds the package every time - it may not be able to determine if the installed version is the latest relative to the repository.

I haven't used poetry or pipx. There are all sorts of derivatives and projects that build on pip that appear to provide minor conveniences.
 
But Trac still advises using pip if one is not doing the "python2.7 setup.py bdist_egg" build manually.

Yes, pip is better than easy_install, as it installs in the newer wheel format and has an "uninstall" feature. But you can use easy_install or "python setup.py install" if you prefer.

The other issue is that plugins installed to the environment or shared "plugins" directory must be eggs. I believe you may be able to force-build an egg using pip, but by default it will install a wheel package. Wheel packages are fine, and preferred, if installing to the global or virtual environment.

The easiest thing to do in my opinion is:
* Write a requirements.txt
* Install from PyPI as much as possible. If a package isn't on PyPI or needs to be updated on PyPI, just ask the author to publish it.
* Use a virtual environment. You can also delete the virtual environment and recreate it with all plugins in seconds if you have a requirements.txt
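
As a rough sketch of that workflow (the virtualenv path is just an example):

virtualenv -p python2.7 /srv/trac/venv
/srv/trac/venv/bin/pip install -r requirements.txt

Recreating the environment is then just a matter of deleting that directory and running the two commands again.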

- Ryan

Mo

Nov 15, 2019, 2:42:22 AM
to Trac Users
Thank you very much.
I think it's a good idea to use such a requirements file, even though I still think it is a package manager's job to populate such a file after a 'pip install', just like rpm, apt, yum or portage do.

Who is responsible for publishing to PyPI? Does every trac-hacks plugin author need to push to PyPI, and should I file a request to do so on trac-hacks? The usual plugin wiki page does mention the SVN source, but says nothing about availability on PyPI.
I don't know much about PyPI, but if pip had something like separate repositories, it would be better to find all plugins in a dedicated Trac repository instead of pushing every single Trac plugin to the big PyPI repo. But it seems to be one big repository, and Trac components are usually named like "Trac*". That makes the packages hard to find: for instance, 'pip2 search AccountManager' does not find it, while 'pip2 search TracAccountManager' finds the (outdated) 0.5.0 version.

You say it is also possible to run 'pip2 install .' from the root of an SVN or git cloned repo. That would only replace my .egg building and copying; I would still need to pull or sync my repos. I guess maintaining a requirements file is best, as you say: 'pip2 list --user --outdated' shows me the available updates and 'pip2 install --user -Ur requirements_file' applies them. The only gap is when there is a new version in SVN that is not yet published on PyPI, so I still need to monitor those a bit.

About the virtual environment that was recommended to me many times: I don't see the advantage yet. As the whole Trac installation is now concentrated in ~trac/, this $HOME is isolated from the root installation anyway. I'm also snapshotting the home directories daily with btrfs.
Virtualenv sounds like perlbrew: a portable, complete installation with interpreter and libraries. That would be helpful if I needed separate installations for the trac user account. Do I need this? My isolated Python environment is ~trac/.local/, managed by 'pip2 install --user'.

For now, only packages like dev-lang/python-2.7.17, dev-python/pip-19.3.1 and dev-python/setuptools-41.5.1 are provided by the Linux root installation. Trac itself and the Python packages Trac requires, like pymills, reportlab or xhtml2pdf, are installed by pip into the ~trac/ home, and Trac plugins are installed as .eggs to ~trac/projects/trac-pp/plugins/. I'm going to reduce the local .eggs and move those to pip as well.

Some parts I found in the Linux distribution, like dev-python/simplejson-3.16.0, I installed as root, but I guess if those are only required by Trac I would also move them to the ~trac/ home, as PyPI can be more up to date than some Linux distributions.

Jonathan Laufersweiler

Nov 15, 2019, 3:19:20 PM
to trac-...@googlegroups.com
You say it is also possible to run 'pip2 install .' from the root of an SVN or git cloned repo. That would only replace my .egg building and copying; I would still need to pull or sync my repos. I guess maintaining a requirements file is best, as you say: 'pip2 list --user --outdated' shows me the available updates and 'pip2 install --user -Ur requirements_file' applies them. The only gap is when there is a new version in SVN that is not yet published on PyPI, so I still need to monitor those a bit.

With compatibility between various plugins and Trac versions being highly version-sensitive, packages on PyPI etc. being only sporadically updated, and the occasional need to make instance-specific customizations at the module level, I've been using pip's VCS capability to target specific branches and tagged versions and install from source rather than from eggs or wheels.
Requirements files support specifying VCS options as well: https://pip.pypa.io/en/stable/reference/pip_install/#requirements-file-format
Pip can still fetch supporting libs and such from PyPI, if needed.
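
For example, a requirements line pinning a plugin to a specific tag might look roughly like this (the URL and tag are placeholders, not a real plugin):

git+https://github.com/example/some-trac-plugin.git@1.2.0#egg=TracSomePlugin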

About the virtual environment that was recommended to me many times: I don't see the advantage yet. As the whole Trac installation is now concentrated in ~trac/, this $HOME is isolated from the root installation anyway. I'm also snapshotting the home directories daily with btrfs. Virtualenv sounds like perlbrew: a portable, complete installation with interpreter and libraries. That would be helpful if I needed separate installations for the trac user account. Do I need this? My isolated Python environment is ~trac/.local/, managed by 'pip2 install --user'.

It sounds like your setup fulfills many of the same needs that a virtual environment would. If it's working for you, I wouldn't see much point in scrapping it. Virtual environments can offer further capabilities in terms of other Python tooling interacting with them programmatically, such as using Pipenv to make reproducible builds that include a virtual environment and invoke things within its context. A good tool to have in the toolbox, even if you stick with what you've got for your Trac setup.

Some parts I found in the Linux distribution, like dev-python/simplejson-3.16.0, I installed as root, but I guess if those are only required by Trac I would also move them to the ~trac/ home, as PyPI can be more up to date than some Linux distributions.

I recommend isolating any Python 2 dependencies Trac has from  your system Python sooner rather than later. Gentoo gives you a great deal of flexibility there, of course, but distro maintainers generally are moving faster on transitioning to Python 3 than Trac is (understandable with Trac's small developer pool and the DB API & Jinja transitions also demanding their attention).

Best,
--Jonathan Laufersweiler


RjOllos

Nov 17, 2019, 3:25:04 PM
to Trac Users


On Thursday, November 14, 2019 at 11:42:22 PM UTC-8, Mo wrote:
Thank you very much.
I think it's a good idea to use such a requirements file, even though I still think it is a package manager's job to populate such a file after a 'pip install', just like rpm, apt, yum or portage do.

Who is responsible for publishing to PyPI? Does every trac-hacks plugin author need to push to PyPI, and should I file a request to do so on trac-hacks?

Yes
 
The usual plugin wiki page does mention the SVN source, but says nothing about availability on PyPI.

Several plugins are available on PyPI. Here is the list:


 

 
I don't know much about PyPI, but if pip had something like separate repositories, it would be better to find all plugins in a dedicated Trac repository instead of pushing every single Trac plugin to the big PyPI repo. But it seems to be one big repository, and Trac components are usually named like "Trac*". That makes the packages hard to find: for instance, 'pip2 search AccountManager' does not find it, while 'pip2 search TracAccountManager' finds the (outdated) 0.5.0 version.

 
You say it is also possible to run 'pip2 install .' from the root of an SVN or git cloned repo. That would only replace my .egg building and copying; I would still need to pull or sync my repos. I guess maintaining a requirements file is best, as you say: 'pip2 list --user --outdated' shows me the available updates and 'pip2 install --user -Ur requirements_file' applies them. The only gap is when there is a new version in SVN that is not yet published on PyPI, so I still need to monitor those a bit.

I suggest trying to get everything published to PyPI and install from there. If you need some changes not yet on PyPI, install from the SVN repository manually for that case. Often the SVN repository won't be stable, but changes pushed to PyPI are presumably tested and stable.

Mo

Nov 18, 2019, 7:47:15 AM
to Trac Users
On Sunday, November 17, 2019 at 21:25:04 UTC+1, RjOllos wrote:

Ok, for instance there is
https://pypi.org/project/TracXMLRPC/
which I found in the wiki.

But I can't find it in any search:
https://pypi.org/search/?q=tracxmlrpc&o=&c=Framework+%3A%3A+Trac
It seems to have the wrong framework classifier, because without the framework filter I can find it:
https://pypi.org/search/?q=tracxmlrpc

The command 'pip2 search TracXml' does not find it either; it only works if I know the exact name, like 'pip2 search TracXmlRpc'.
How can I search by patterns or regexp?

RjOllos

Nov 18, 2019, 11:08:37 AM
to Trac Users


On Monday, November 18, 2019 at 4:47:15 AM UTC-8, Mo wrote:
On Sunday, November 17, 2019 at 21:25:04 UTC+1, RjOllos wrote:

Ok, for instance there is
https://pypi.org/project/TracXMLRPC/
which I found in the wiki.

But I can't find it in any search:
https://pypi.org/search/?q=tracxmlrpc&o=&c=Framework+%3A%3A+Trac
It seems to have the wrong framework classifier, because without the framework filter I can find it:
https://pypi.org/search/?q=tracxmlrpc

The classifier was missing from the plugin metadata. Fixed in:

If you notice the issue for other plugins, please report it.

TracXMLRPC 1.1.8 has been published to PyPI. The Framework is now shown on the TracXMLRPC page, but it looks like filtered results are cached, so we'll have to check later whether TracXMLRPC shows up when filtering by Framework::Trac.


[Attachment: Screen Shot 2019-11-18 at 08.04.37.jpg]



Searching for TracXMLRPC still shows 1.1.7, even though the link directs to version 1.1.8. That is why I think there must be caching involved.


[Attachment: Screen Shot 2019-11-18 at 08.07.32.jpg]



To find plugins on PyPI, you can check the package name in setup.py and search for that explicit name.
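
For example, from a checkout of the plugin (a rough illustration):

grep "name=" setup.py
pip2 search TracXMLRPC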

Mo

Nov 21, 2019, 6:05:06 AM
to Trac Users
Hi again,

I know that .egg files in $ENV/plugins are simply loaded. But what about plugins installed via pip2 into ~/.local/lib64/python2.7/site-packages/ ?
I just removed TracIncludeMacro-3.2.0.dev0-py2.7.egg and installed the (older) TracIncludeMacro via pip2:
TracIncludeMacro   3.1.0  
This installed ~/.local/lib64/python2.7/site-packages/TracIncludeMacro-3.1.0.dist-info/
but the plugin does not load anymore in Trac.

Best regards
- Mo

Mo

Nov 21, 2019, 9:06:08 AM
to Trac Users
On Monday, November 18, 2019 at 17:08:37 UTC+1, RjOllos wrote:

If you notice the issue for other plugins, please report it.

Please find all PyPI related tickets here:
https://trac-hacks.org/query?status=!closed&keywords=~pypi
 
Best regards,
- Mo

RjOllos

Nov 21, 2019, 12:21:10 PM
to Trac Users


On Monday, November 18, 2019 at 8:08:37 AM UTC-8, RjOllos wrote:

TracXMLRPC 1.1.8 has been published to PyPI. The Framework is now shown on the TracXMLRPC page, but it looks like filtered results are cached, so we'll have to check later whether TracXMLRPC shows up when filtering by Framework::Trac.

Mo

Nov 22, 2019, 5:07:46 AM
to Trac Users

RjOllos

Nov 22, 2019, 10:46:12 AM
to Trac Users
Plugins installed to the environment plugins directory do not need to be explicitly activated, but those installed to site-packages do need to be explicitly activated.

You can see which plugins are activated on the /about page, or the plugin admin page. 
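
For a plugin installed to site-packages, activation typically happens in trac.ini, something along these lines (the component prefix below is only a guess for TracIncludeMacro — check the plugin's documentation or the plugin admin page for the exact name):

[components]
includemacro.* = enabled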

- Ryan

Mo

Nov 25, 2019, 6:34:36 AM
to Trac Users
On Sunday, November 17, 2019 at 21:25:04 UTC+1, RjOllos wrote:

I suggest trying to get everything published to PyPI and install from there.
 
Is there any reason why some plugins are not published on PyPI, like https://trac-hacks.org/ticket/13690#comment:1 ?
Maybe comment directly on that ticket...

Ryan Ollos

Nov 25, 2019, 10:39:05 AM
to Trac Users
He says in the ticket that he doesn't want to take the time to publish to PyPI, so I think that's the answer.

Mo

Nov 27, 2019, 3:10:51 AM
to Trac Users
On Monday, November 25, 2019 at 16:39:05 UTC+1, RjOllos wrote:

He says in the ticket that he doesn't want to take the time to publish to PyPI, so I think that's the answer.

Is it possible to host multiple versions on PyPI, like a development and a release version?
Would it be possible, for such plugins that have no release cycle, to have a generic development version on PyPI that always mirrors the latest SVN revision? I mean, does the author really need to run a command for every PyPI release?

Ok, alternatively this would be one of those plugins where I add an svn+http URI pointing to the SVN repository to the requirements file. But I guess with those links a 'pip2 list --user --outdated' or 'pip2 install --user -Ur requirements_file' would not work, because there is no easy way to check whether there is an update on svn+http, other than re-installing.

Best regards,
- Mo

Mo

Dec 3, 2019, 7:25:02 AM
to Trac Users

As I understand it, he just doesn't like publishing to PyPI:
https://trac-hacks.org/ticket/13706

Jun Omae, would you mind if one of the other developers just added the release to PyPI?

Another idea could be that some developer maintains a "Trac distribution" with central package management, so that not every single plugin developer needs to publish to PyPI. I mean, after a developer has released code in a repository, it's up to the Linux distributors to fill their package repositories with stable and tested versions. So maybe someone could maintain all the PyPI distributions instead of every plugin developer having to push to PyPI.

It is not really required that Trac packages live on a general Python index like PyPI; can pip use a dedicated repo URL for install and search?
Using svn+https://... with pip has the disadvantage that no 'list --outdated' is possible and every 'install' re-installs everything. Therefore I'm going to try to replace the last svn+https URLs in my requirements file by requesting releases on PyPI.

Best regards,
- Mo

Dimitri Maziuk

Dec 3, 2019, 10:54:29 AM
to trac-...@googlegroups.com
On 12/3/2019 6:25 AM, Mo wrote:
...
> Another idea could be that some developer maintains a "Trac distribution"
> with central package management, so that not every single plugin developer
> needs to publish to PyPI. I mean, after a developer has released code in a
> repository...
Why not just roll a Dockerfile is what I don't get? You can `RUN git
pull` from that repository and not bother with pypis, virtualenvs,
maxicondas, distro packages, or any of that python mess. And there's
like half a dozen of them out there already.

Dima

Mo

Dec 4, 2019, 3:48:41 AM
to Trac Users
On Tuesday, December 3, 2019 at 16:54:29 UTC+1, Dimitri Maziuk wrote:
Why not just roll a Dockerfile is what I don't get?

Could you please describe what problem you are trying to solve with a docker container? Usually I use a $HOME, chroot, virtualenv, perlbrew or whatever to get separate environments; then I use containers if $HOME, chroot etc. are not sufficient; then I use virtual machines if containers are not sufficient. I tend to rank the solutions in that order.
What we are trying to solve is having a package manager to maintain the Trac installation with all the plugins, which nobody would want installed all in one place (while I tend to go in that direction...). So what would you solve with the container?
It could give me separate Trac installations, but even then I still prefer different Trac directories, different $HOMEs or something. I have no issue separating the different Trac installations here.

Best regards,
- Mo

Dimitri Maziuk

Dec 4, 2019, 11:05:02 AM
to trac-...@googlegroups.com
On 12/4/2019 2:48 AM, Mo wrote:

> Could you please describe what problem you are trying to solve with a
> docker container?

I am not the one having a month-long thread, in year 2020, on how to
distribute an application with all its dependencies.

No problem here, moving right along,
Dima

Mo

Dec 5, 2019, 4:56:28 AM
to Trac Users
On Wednesday, December 4, 2019 at 17:05:02 UTC+1, Dimitri Maziuk wrote:
I am not the one having a month-long thread, in year 2020, on how to
distribute an application with all its dependencies.

Thanks for your reply. So you would like to distribute an application with all its dependencies. Does that mean distributing Trac with all plugins? Does that mean distributing a Trac instance with all plugins installed, as a Docker container? I don't think that was the initial intention of this month-long thread in 2019.
Best regards,
- Mo

RjOllos

Dec 5, 2019, 8:11:35 PM
to Trac Users
You mention more than once that "some developer would maintain" things. The central premise of your requests seems to be moving work from yourself to this "other developer" so that you can run a simple command to update and maintain your site.

Here is the hard truth: There are very few of us committing our time to the Trac project and ecosystem. Our time is valuable. You have built a custom site using Trac and a collection of plugins, and it's on you to maintain the site and make everything work together. It's up to you to track the changes that have been released, test (a staging site would be recommended), and update the plugins for your site. There is no central repository of packages that is maintained by "some developers", aside from what is on PyPI, and it takes effort to publish packages.

We can dream of other solutions that would be better, but I don't have many hours available, let alone the hundreds of hours that are required to implement them. If you want to implement them, by all means go ahead.

You will have to do the work. If that's not okay with you, consider finding a company to host your Trac and do the work, or migrate away from Trac to a (paid) solution. Or maybe you'll find an open source tool you like better. You may even find a better community.

If you want my recommendation: write a requirements.txt, write some scripts for maintaining/upgrading your site, and set up an RSS feed so you can track changes to plugins as they happen on trac-hacks and GitHub. Set up a staging site, pull in changes, test them and then deploy them to your production site. That's what I do for the several Trac sites I maintain. It is work.
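
As a rough sketch of such an upgrade script (the paths are examples, adjust them to your environment):

pip2 install --user -Ur /path/to/requirements.txt
trac-admin /path/to/trac-env upgrade
trac-admin /path/to/trac-env wiki upgrade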

Dimitri Maziuk

Dec 6, 2019, 11:09:29 AM
to trac-...@googlegroups.com
On 12/5/2019 7:11 PM, RjOllos wrote:
>
> If you want my recommendation: write a requirements.txt, write some scripts
> for maintaining/upgrading your site, and set up an RSS feed so you can track
> changes to plugins as they happen on trac-hacks and GitHub. Set up a staging
> site, pull in changes, test them and then deploy them to your production
> site.

Or, as I implied earlier, write them down as 'RUN wget <some_plugin>' lines
in your Dockerfile, deploy with `docker build` and test with `docker run`.
Then tag the images and roll back to an older version if there's a problem
you didn't catch in testing.
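
A bare-bones sketch of such a Dockerfile (the base image, version pin and plugin URL are placeholders, not a vetted setup):

FROM python:2.7
RUN pip install Trac==1.4
RUN pip install git+https://github.com/example/some-trac-plugin.git
COPY trac-env /var/trac/env
CMD ["tracd", "--port", "8000", "/var/trac/env"]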

Dima