pip2.7 install --user xhtml2pdf
# cat /etc/uwsgi.d/trac-pp.ini
[uwsgi]
plugins = python27
chown-socket = trac:nginx
uid = trac
gid = trac
workers = 6
socket = /run/uwsgi/%n.sock
env = TRAC_ENV=/mnt/data/trac/projects/trac-pp
env = PYTHON_EGG_CACHE=/mnt/data/trac/.python-eggs
module = trac.web.main
callable = dispatch_request
pythonpath = /mnt/data/trac/.local/lib64/python2.7/site-packages
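For context, the socket above (owned trac:nginx via chown-socket) would typically be consumed by an nginx server block along these lines. This is only a sketch: the server name is an assumption, and the socket name follows from uwsgi's %n placeholder, which expands to the ini file name without extension (so trac-pp.ini gives trac-pp.sock):

```nginx
server {
    listen 80;
    server_name trac.example.org;  # assumption: replace with your host name

    location / {
        include uwsgi_params;
        # %n in /run/uwsgi/%n.sock expands to the ini file name: trac-pp
        uwsgi_pass unix:/run/uwsgi/trac-pp.sock;
    }
}
```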
Is it a good idea to move the whole Trac installation over to pip, installed into the ~trac/ home? How would I check all user-installed packages for updates, and how would I update them all?
$ pip install svn+https://trac-hacks.org/svn/accountmanagerplugin/trunk
* Replace pip with pip2.7 as needed
* For Git repositories, use the "git+" prefix rather than "svn+"
pip2 list --user --outdated
pip2 install --user -U <pkglist>
On Thursday, November 14, 2019 at 11:55:54 UTC+1, RjOllos wrote:
$ pip install svn+https://trac-hacks.org/svn/accountmanagerplugin/trunk
* Replace pip with pip2.7 as needed
* For Git repositories, use the "git+" prefix rather than "svn+"
Thanks. But when installing from a custom link, do I need to run another pip install -U from exactly the same link for later updates, or does the installation remember its source URI?
Btw. all this updating via pip is not optimal. From what I learned on #python@freenode, pip has no 'pip update' and no 'pip update all', which every package manager should offer as a convenience function. There is only

pip2 list --user --outdated

and

pip2 install --user -U <pkglist>

where I could use the output of the first as input for the second.
I could also maintain a <requirements file>, but that would mean updating it for every new package.
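For what it's worth, the missing 'pip update all' can be approximated by chaining the two commands. A minimal sketch, assuming the columnar output format of pip >= 9 (the sample listing below is made up for illustration, not real output):

```python
import subprocess

def outdated_user_packages(listing):
    """Extract package names from `pip list --user --outdated` columnar output."""
    lines = listing.strip().splitlines()
    # Skip the "Package Version Latest Type" header and the dashed separator.
    return [line.split()[0] for line in lines[2:] if line.strip()]

def update_all():
    """Poor man's 'pip update all': list outdated, then upgrade them."""
    out = subprocess.check_output(
        ["pip2", "list", "--user", "--outdated"]).decode()
    pkgs = outdated_user_packages(out)
    if pkgs:
        subprocess.check_call(["pip2", "install", "--user", "-U"] + pkgs)

# Fabricated example of the columnar format this sketch assumes:
sample = """\
Package            Version Latest Type
------------------ ------- ------ -----
Trac               1.2.3   1.4    sdist
TracAccountManager 0.4.4   0.5.0  sdist
"""
print(outdated_user_packages(sample))  # -> ['Trac', 'TracAccountManager']
```

Parsing pip's human-readable output is fragile; newer pip versions offer `pip list --format=freeze`, which is easier to consume.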
Finally, after talking about these shortcomings of pip, I was told that "poetry (for projects) and pipx (for executables) are more suitable tools for end users", and that an "'update all' is a woefully ill-advised misfeature in library managers like pip and npm".
I'm no Python dev yet and a bit confused about... so pip is more a library manager than a package manager...
But still, Trac advises using pip unless building manually with "python2.7 setup.py bdist_egg".
You say it is also possible to use 'pip2 install .' from the root of an svn or git cloned repo. That would only replace my .egg building and copying; I would still need to pull or sync my repos. I guess maintaining a requirements_file is the best approach, as you say: 'pip2 list --user --outdated' shows me the available updates and 'pip2 install --user -Ur requirements_file' applies them. The only gap would be when there is a new version in SVN that is not published on PyPI yet, so I still need to monitor those a bit.
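Such a requirements file can mix PyPI names with VCS sources, so a single `pip2 install --user -Ur requirements.txt` covers both. A sketch only — of the entries below, the accountmanagerplugin URL appears earlier in this thread, while the other names are examples; the `#egg=` fragment must match the name in the plugin's setup.py:

```text
# requirements.txt (sketch)
Trac
TracXMLRPC
svn+https://trac-hacks.org/svn/accountmanagerplugin/trunk#egg=TracAccountManager
```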
About the virtual environment that was recommended to me many times: I don't see the advantage yet. As I have concentrated the whole Trac installation in ~trac/ now, this $HOME is isolated from the root installation anyway, and I btrfs-snapshot the home directories daily. virtualenv sounds like perlbrew: a portable, complete installation with interpreter and libraries. That would be helpful if I needed several separate installations under the trac user account. Do I need this? My isolated Python environment is ~trac/.local/, managed by 'pip2 install --user'.
Some parts I took from the Linux distribution, like dev-python/simplejson-3.16.0, I installed as root, but I guess if those are only required by Trac I would also move them to the ~trac/ home, as PyPI can be more up to date than some Linux distributions.
--
You received this message because you are subscribed to the Google Groups "Trac Users" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/trac-users/b7e36fb6-c903-4c43-83cf-a467c46fb21c%40googlegroups.com.
Thank you very much.
I think it's a good idea to use such a requirements_file, even though I still think it's part of a package manager's job to populate such a file after a 'pip install', just like rpm, apt, yum or portage do.
Who is responsible for publishing to PyPI? Does every trac-hacks plugin author need to push to PyPI, and should I file a request to do so on trac-hacks?
The usual plugin wiki page mentions the SVN source, but says nothing about availability on PyPI.
I don't know much about PyPI, but if pip had something like separate repositories, it would be better to find all plugins in a dedicated Trac repository instead of pushing every single Trac plugin to the big PyPI repo. But it seems to be one big repository, and Trac components are usually named like "Trac*". That makes the packages hard to find: for instance, 'pip2 search AccountManager' does not find it, while 'pip2 search TracAccountManager' finds the (outdated) 0.5.0 version.
You can filter by framework:
On Sunday, November 17, 2019 at 21:25:04 UTC+1, RjOllos wrote:
You can filter by framework:
OK, for instance there is
https://pypi.org/project/TracXMLRPC/
which I found in the wiki. But I can't find it in any search:
https://pypi.org/search/?q=tracxmlrpc&o=&c=Framework+%3A%3A+Trac
It seems to have the wrong Framework classifier, as I can find it without the framework filter:
https://pypi.org/search/?q=tracxmlrpc
If you notice the issue for other plugins, please report it.
TracXMLRPC 1.1.8 has been published to PyPI. The Framework is now shown on the TracXMLRPC page, but it looks like filtered results are cached, so we'll have to check later whether TracXMLRPC shows up when filtering by Framework :: Trac.
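The Framework shown on that page is a Trove classifier in the package metadata, and one can check it directly via PyPI's JSON API (https://pypi.org/pypi/<name>/json). A sketch with the network call left out — the sample dict below is an abbreviated stand-in, not the real response:

```python
def has_trac_framework(metadata):
    """Check a PyPI JSON API response dict for the Framework :: Trac classifier."""
    return "Framework :: Trac" in metadata["info"].get("classifiers", [])

# Abbreviated stand-in for what https://pypi.org/pypi/TracXMLRPC/json returns;
# the real response contains many more fields.
sample = {"info": {"classifiers": ["Framework :: Trac",
                                   "Programming Language :: Python :: 2.7"]}}
print(has_trac_framework(sample))  # -> True
```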
I suggest trying to get everything published to PyPI and install from there.
He says in the ticket that he doesn't want to take the time to publish to PyPI, so I think that's the answer.
What I don't get is: why not just roll a Dockerfile?
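For the record, a minimal Dockerfile along those lines might look like this. A sketch only, untested: the base image tag, the plugin choice and the data path are assumptions, and python:2.7 images are end-of-life:

```dockerfile
# Sketch: base image, plugin list and paths are assumptions.
FROM python:2.7-slim

# subversion is needed for pip's svn+ URLs
RUN apt-get update && apt-get install -y --no-install-recommends subversion \
    && pip install Trac \
    && pip install svn+https://trac-hacks.org/svn/accountmanagerplugin/trunk

ENV TRAC_ENV=/data/trac-pp
VOLUME /data
EXPOSE 8000
CMD ["tracd", "--port", "8000", "/data/trac-pp"]
```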
I am not the one having a month-long thread, in the year 2020, on how to distribute an application with all its dependencies.