I looked at the git module for this, but my use case is broader: I
am investigating using ansible to replace our masterless puppet
infrastructure, which makes extensive use of file resources, and we
are limited (almost hamstrung) by puppet's requirement that file
sources be on the local file system. We have
about 1G worth of files, mostly binaries, almost all tiny, some of
which churn quite a bit, and most of which come from third-party
vendors, so using a VCS isn't a good match for this. What I would
need is the ability to put our files behind a fast but dumb
load-balanced httpd instance.
My main concerns about a module of this type are performance- and
efficiency-related: every invocation would require either pulling
down the file to compare with the local copy, or having the remote
server transmit the checksum to the client (which makes
the server less dumb). It would be nice to support all the options
that copy supports (like first_available_file) but that might not be
reasonable.
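
For the "less dumb" variant, the checksum exchange could be as
simple as the server publishing a sidecar checksum file next to each
payload. A rough sketch of that idea in Python (Python 3 syntax just
to keep it short; the .sha256 sidecar convention and the function
names are assumptions for illustration, not a proposed interface):

    import hashlib
    import urllib.request

    def local_sha256(path):
        # Hash the local copy, if there is one.
        h = hashlib.sha256()
        try:
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
        except FileNotFoundError:
            return None
        return h.hexdigest()

    def fetch_if_changed(url, dest):
        # Skip the transfer entirely when the remote checksum matches.
        remote_sum = urllib.request.urlopen(url + ".sha256").read()
        remote_sum = remote_sum.decode().split()[0]
        if remote_sum == local_sha256(dest):
            return False   # no change, nothing downloaded
        urllib.request.urlretrieve(url, dest)
        return True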
If other people see value in a module like this, I can attempt to
create one, although I would have to strongly resist the temptation
to make it simply do os.system("wget") ...
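
For comparison, doing the fetch natively really isn't much more code
than shelling out. A rough sketch of the "dumb server" path,
downloading to a temp file and only replacing the destination when
the content differs (again, the names are made up and the mode,
ownership, and error handling that a real module needs are omitted):

    import hashlib
    import os
    import shutil
    import tempfile
    import urllib.request

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def pull_file(url, dest):
        # Fetch url into a temp file, then move it into place only
        # if it differs from what is already on disk.
        fd, tmp = tempfile.mkstemp()
        os.close(fd)
        try:
            with urllib.request.urlopen(url) as resp, \
                 open(tmp, "wb") as out:
                shutil.copyfileobj(resp, out)
            if os.path.exists(dest) and \
               sha256_of(dest) == sha256_of(tmp):
                return False   # unchanged
            shutil.move(tmp, dest)
            return True
        finally:
            if os.path.exists(tmp):
                os.remove(tmp)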