I've searched for an answer to this question but didn't find one. Why do Salt's execution modules rely so heavily on external programs, even when Python libraries are available?
Isn't it easier to write a few lines of code:

# some pre-processing
result = pip.install(...)
# some post-processing

instead of the roughly 300 we have now (https://github.com/saltstack/salt/blob/develop/salt/modules/pip.py#L158), most of which are simple lines that just build up a command string?
Moreover, pip is itself a Python library! It seems to me that this connection:
python code (salt) <-> python code (pip)
would be much faster, more portable, and safer than this:
python code (salt) -> shell (unknown shell, probably sh) -> python code (pip)
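To illustrate the cost of that shell round-trip, here is a small self-contained sketch (not Salt's actual code; the version string is made up) comparing an in-process function call with spawning a fresh interpreter through a subprocess, which is roughly what shelling out to the pip program does on every invocation:

```python
import subprocess
import sys
import timeit

def direct():
    # In-process "library" call: just a Python function.
    return "pip 1.5.6"

def via_shell():
    # Round-trip through a new interpreter process, as shelling
    # out to the pip program would do.
    return subprocess.check_output(
        [sys.executable, "-c", "print('pip 1.5.6')"]
    ).decode().strip()

# Both give the same answer...
assert direct() == via_shell()

# ...but the subprocess version pays full interpreter startup each time.
t_direct = timeit.timeit(direct, number=20)
t_shell = timeit.timeit(via_shell, number=20)
print("direct: %.6fs, shell: %.6fs" % (t_direct, t_shell))
```

On any machine the subprocess variant is orders of magnitude slower, and it also inherits the quoting and environment pitfalls of whatever shell is in between.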
What is the reason not to wrap Python libraries? I can think of only one: dependencies. Yes, I agree, that's sometimes a problem. But wait, the programs Salt relies on have to be installed too! For the end user there is no difference between installing a Python library and installing a program; it's all just packages. OK, I'm not counting Windows as an operating system; Windows users struggle anyway... Also, speaking of Windows users: they can install Python packages (but not programs) painlessly with pip.
Well, it seems this is tightly bound up with...
Packaging
There are two ways to package Salt in this case:
1. Put all dependencies into setup.py install_requires.
2. Make separate packages for separate modules.
The first approach is very easy. The big problem is that Salt (with all its dependencies) would end up requiring several hundred MiB of disk space.
The second is a little trickier, but I think it's better. Create a package for every execution module (salt-modules-pip) and fill in its classifiers. The base Salt installation would require only the most common and universal modules; the others can be installed later.
"Insanity! I don't want to install every module of Salt by hand." — you could say. Well, just create salt-modules package with setup.py that will determine installed OS, appropriate modules (you don't need win_repo on Debian, sure), look on PyPi for all available modules for this platform (classifiers, do you remember them?) and install them all for you. That would provide real granularity for Salt in terms of packaging.
I think it's quite easy to automate the creation of these module packages. I don't remember the name of the tool, but there is one that builds install_requires from a module's imports. Classifiers can be built into the modules, or they can be converted from grain values.
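The import-scanning half of that tool is straightforward with the standard library. Here is a rough sketch of collecting the top-level names a module imports (mapping import names to PyPI package names is the hard part, and is skipped here):

```python
import ast

def top_level_imports(source):
    """Collect the top-level names a piece of Python source imports."""
    names = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            names.add(node.module.split(".")[0])
    return sorted(names)

SOURCE = """
import os
import pip.req
from yaml import safe_load
"""

print(top_level_imports(SOURCE))  # ['os', 'pip', 'yaml']
```

Filter out the standard library from that list and you have a first approximation of install_requires.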
This could also solve another problem. I tried to wait for the 2014.7 release, but I can't wait any longer, because 2014.1 is missing some bug fixes and modules I need. I ended up backporting several modules (or patches) by hand. That is definitely wrong. My proposal fixes this by splitting one big package into a bunch of small ones that can be updated independently.
To sum up.
Pros:
1. Less code to write.
2. More stable.
3. Faster.
4. Modules are updated more frequently.
Cons:
1. Need to build tools for packaging.
2. Need to maintain individual versions of each module package.
I'm looking forward to a heated discussion.
--
Regards, Roman Inflianskas
--
You received this message because you are subscribed to the Google Groups "Salt-users" group.