On Sep 8, 2014 2:21 PM, "Brett Swift" <brett...@gmail.com> wrote:
>
> why isn't puppetlabs_spec_helper installing dependencies of my dependencies?
...
> but puppetlabs_spec_helper doesn't. <grumble grumble>
>
> I didn't see a ticket for this on tickets.puppetlabs.com. Is this a feature request, a defect, or PEBKAC?
Assuming you're talking about modules installed with "forge_modules" (which I wrote the first cut of) rather than "repositories": I consciously made it pass the PMT's --ignore-dependencies flag, with the expectation that you should know what your dependencies are and be explicit about them.
That said, I have found myself annoyed with having to remember to add all of the (>1)th-order dependencies, especially for our mass of internal modules.
It also brings up the broader question of whether you really should need to track the transitive closure of your dependencies. Other packaging systems don't make you, so should you really have to here?
It could be added as a configuration parameter, but then the next question is where that should live. A configuration section in .fixtures.yml? An environment variable? Then I get side-tracked thinking that maybe the name of .fixtures.yml itself should be selectable via an environment variable, so you could test against different combinations of dependency versions, and so on.
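Just to make that concrete, a strawman .fixtures.yml might look something like this (the install_dependencies key is purely hypothetical, nothing reads it today):

    fixtures:
      forge_modules:
        stdlib: "puppetlabs/stdlib"
        firewall: "puppetlabs/firewall"
      # hypothetical knob: also pull in each module's own dependencies
      install_dependencies: true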
Wil
I see your point about wanting to know what your dependencies are, and maybe I'm missing something about how Puppet's module path works. As I understand it, Puppet will resolve to the first module it finds on the path if it sees multiple copies. Modules should use semantic versioning, so let's assume a hypothetical: I'm developing a module with two direct dependencies, each of which depends on stdlib, but one requires 3.x.x and the other 4.x.x. I assume Puppet can only use one of those... so which one?
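To make that concrete with made-up module names, say modulea and moduleb each declare stdlib in their metadata.json, but with incompatible ranges:

    modulea/metadata.json (excerpt):
      "dependencies": [
        { "name": "puppetlabs/stdlib", "version_requirement": ">= 3.0.0 < 4.0.0" }
      ]

    moduleb/metadata.json (excerpt):
      "dependencies": [
        { "name": "puppetlabs/stdlib", "version_requirement": ">= 4.0.0 < 5.0.0" }
      ]

Whichever copy of stdlib ends up first on the modulepath wins, so one of those constraints silently loses.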
It does raise one simple question, though: why wouldn't puppetlabs_spec_helper forgo .fixtures.yml and use metadata.json instead? It's a tight coupling... but maybe a coupling that would be a good idea?
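For what it's worth, here's a rough sketch of the kind of thing I mean; the fixture path and the decision to shell out to the PMT are just illustrative, not a proposal for how the rake task should actually be built:

    require 'json'

    # Read the module's own metadata.json and install each declared
    # dependency into the fixture path, letting the PMT resolve the rest.
    metadata = JSON.parse(File.read('metadata.json'))
    (metadata['dependencies'] || []).each do |dep|
      # dep['name'] is a forge name like "puppetlabs/stdlib"
      system('puppet', 'module', 'install',
             '--target-dir', 'spec/fixtures/modules',
             dep['name'])
    end

That way the fixtures come straight from metadata.json, and the transitive dependencies come along for free.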