On Saturday, 2 February 2013 02:47:14 UTC+1, Richard Crowley wrote:
> I think the single-file form is a huge win for Blueprint (among other
> things, it enabled the use of the blueprint(5) file format in AWS
> CloudFormation). Which brings us to your YAML suggestion.
> I agree, it has many advantages.
> My usual kneejerk reaction to someone suggesting YAML is to point out
> that it's impossible to tell if it's been truncated. That's a low
> risk in this case, however.
> The next concern is in adding a dependency that's not a part of the
> Python 2.6 standard library. This can be addressed: I'm not opposed
> to distributing a YAML implementation with Blueprint.
Yes, I agree that truncation can be a problem, but the risk here is very low.
I don't think PyYAML is a big dependency problem; it's available in most distribution repos.
By the way, Ansible, a lightweight Python configuration-management tool that is becoming popular, also uses YAML. It would be nice for Blueprint to have an Ansible export (which could also use this library).
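To make the truncation point concrete, here is a small stdlib-only illustration (the package name and version are invented for the example). A cut-off JSON document always fails to parse, which is the safety property YAML lacks:

```python
import json

# A truncated JSON document is always detectable, because the parser
# insists on seeing the closing braces and brackets.
doc = json.dumps({"packages": {"apt": {"php5": ["5.3.10-1"]}}})
truncated = doc[:len(doc) // 2]

try:
    json.loads(truncated)
    detected = False
except ValueError:
    detected = True

print(detected)  # True: json.loads refuses the cut-off document

# The equivalent YAML document, cut off at a line boundary, would
# usually still parse as a valid (but shorter) document -- which is
# exactly the truncation concern quoted above.
```

Since a blueprint is normally written and read in one piece rather than streamed, that failure mode is unlikely to bite here.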
> One further proposal: does the workflow that's frustrating you get
> better with a blueprint-diff-file(1) tool that works like git-diff(1)
> to display the differences in a file's contents in two different
> blueprints?
Sure, you can write "wrapper" scripts around git-diff, around editors (decode the JSON file contents to a temporary location, open it in an editor, encode the result, remove the temp file), and around everything else you might need, but that complicates things: you come to depend on these scripts, and you lose the ability to use plain standard tools.
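To show what such an editor wrapper would have to do, here is a minimal sketch. The `files`/`content` layout is illustrative, not Blueprint's exact JSON schema, and the `$EDITOR` handling is deliberately simple:

```python
import json
import os
import subprocess
import tempfile

def edit_blueprint_file(blueprint_path, target):
    """Decode one file's contents out of a blueprint JSON document,
    open it in $EDITOR, then write the edited result back.

    The "files" dict layout below is illustrative only; a real wrapper
    would have to match Blueprint's actual schema (encodings, modes...).
    """
    with open(blueprint_path) as fp:
        bp = json.load(fp)
    content = bp["files"][target]["content"]

    # Decode to a temporary file, hand it to the user's editor...
    fd, tmp = tempfile.mkstemp(suffix=os.path.basename(target))
    try:
        with os.fdopen(fd, "w") as fp:
            fp.write(content)
        subprocess.check_call([os.environ.get("EDITOR", "vi"), tmp])
        # ...then re-encode the edited result into the blueprint.
        with open(tmp) as fp:
            bp["files"][target]["content"] = fp.read()
    finally:
        os.unlink(tmp)

    with open(blueprint_path, "w") as fp:
        json.dump(bp, fp, indent=2)
```

Even this small sketch has to worry about temp-file cleanup, editor failures, and re-serialization, which is the complication I mean.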
I'm not sure what exactly the project's goals are, but I will try to explain my point of view.
Blueprint works great as it is if you use it as a reverse-engineering tool for a server.
But consider the opposite approach: I have a blueprint "source file" which describes the target server state.
I can use git to version this "source file" (which blueprint already does), check it out on my development machine, and maybe even integrate it as a subtree/submodule in the project that needs this server configuration.
Then, for example, I just want to set the PHP memory_limit variable to 1GB in php.ini in the blueprint source inside my repo, push that to a couple of servers, and have them automatically restart their daemons. I don't want to connect to a server, make the change there, reverse-engineer it, and pull the changes back into my repo (though that process may work better when you need to explore different options, edit multiple files, etc.).
Blueprint can already do all of that; it's just hard to edit and track its "source files".
I could use exported Chef/Puppet files instead of blueprint JSON files and track/edit them in a separate repo, but that creates another (unnecessary) layer, and those tools are bloated, slow, and have huge dependencies.
An Ansible exporter could solve some of these problems, but Blueprint still has a big advantage: it can be converted to anything else when needed, including a shell script.
I think that has huge potential in configuration management. In a simple scenario, a package list, files (plus templates), and service definitions are all you need. For more complex scenarios, that could be extended with some kind of external callbacks to scripts, which could be written in any language.
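As a sketch of that simple scenario, here is roughly what "convert to a shell script" could look like. The manifest layout (packages/files/services) is invented for illustration and is not Blueprint's actual schema:

```python
def manifest_to_shell(manifest):
    """Render a minimal packages/files/services manifest into a plain
    shell script.  The dict layout here is illustrative only."""
    lines = ["#!/bin/sh", "set -e"]
    for pkg in manifest.get("packages", []):
        lines.append("apt-get install -y %s" % pkg)
    for path, body in sorted(manifest.get("files", {}).items()):
        # Quoted heredoc so the file body is written literally.
        lines.append("cat > %s <<'EOF'\n%sEOF" % (path, body))
    for svc in manifest.get("services", []):
        lines.append("service %s restart" % svc)
    return "\n".join(lines) + "\n"


script = manifest_to_shell({
    "packages": ["php5"],
    "files": {"/etc/php5/apache2/php.ini": "memory_limit = 1G\n"},
    "services": ["apache2"],
})
print(script)
```

That single representation can just as easily be rendered to Chef, Puppet, or Ansible output instead, which is the flexibility I value in Blueprint.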