----- Original Message -----
> Currently the master is responsible for validating resource parameters
> and properties. That is, it checks that the user doesn't attempt to
> use a parameter or property that doesn't exist (for example, 'group {
> foo: path => bar }'), and it checks that the user doesn't attempt to
> specify an invalid value for a parameter or property (for example,
> 'file { "/tmp/foo": recurse => "cheese" }').
>
>
> Jesse Wolfe and I have been thinking about this in connection with
> http://projects.puppetlabs.com/issues/4409 , and we propose changing
> this so that validation is done on the agent rather than the master.
>
>
> Advantages we're anticipating:
> - It is no longer necessary for the master to be aware of types at
> all, so when a module defines its own native type, it is not necessary
> to copy it into the master's lib directory
> - It becomes possible to use different types in different
> environments; this is especially important when using a "test"
> environment to try out changes to a native type on a limited set of
> nodes before pushing them to all nodes.
>
>
> Disadvantages:
> - If a catalog fails due to a type validation error, it will be an
> execution error on the agent rather than a compilation error, so the
> agent will not be able to fall back to the previous catalog.
We won't be caching these failing catalogs to disk, right? The previous good
catalog will still be in the client cache and be used?
Unfortunately, we validate on the client only after the catalog is written to disk. At least, that's how it looks to me.
--
While one person hesitates because he feels inferior, the other is
busy making mistakes and becoming superior. -- Henry C. Link
---------------------------------------------------------------------
Luke Kanies -|- http://puppetlabs.com -|- +1(615)594-8199
If we can't find a way around that, I'd say this change would be a big
step backwards.
--
R.I.Pienaar
The question is whether that step backwards is larger than the step
forwards... namely, being able to reliably distribute types/providers
in modules across environments.
We could also make the change contingent on fixing the validate-before-cache behavior.
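Making the change contingent on that fix seems reasonable. The ordering being asked for could be sketched roughly like this (a hypothetical Ruby sketch; the Cache class and per-resource validation are illustrative stand-ins, not Puppet's real internals):

```ruby
# Hypothetical sketch of validate-before-cache: a catalog that fails
# validation never overwrites the last known-good cached copy, so the
# agent can still fall back to it.
class Cache
  def initialize; @good = nil; end
  def write(catalog); @good = catalog; end
  def read; @good; end
end

# Resources are modeled as callables that raise on invalid parameters.
def apply_catalog(new_catalog, cache)
  new_catalog.each { |resource| resource.call }  # validate every resource first
  cache.write(new_catalog)                       # only cache after validation succeeds
  new_catalog
rescue => e
  warn "catalog failed validation, falling back to cache: #{e.message}"
  cache.read
end
```

With that ordering, a validation failure on the agent behaves like a compile failure does today: the previous good catalog stays in the cache and gets used.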
--
Every generation laughs at the old fashions, but follows religiously
the new. -- Henry David Thoreau
At face value I'd say running old cached catalogs is a big deal
and we really do not want to lose this capability.
But there's a second dimension - I have for a long time recommended people
disable the cached catalog because it can cause huge screwups. Imagine
you've made a big change - let's say a new version of apache that requires
new config files. You made a typo somewhere, so your manifest isn't compiling,
but your machines are now getting the new config file, and apache restarts and
then fails because the new apache needs the new config. It's a contrived
example, but in that basic pattern, cached catalogs are a huge fail.
So I am not too concerned about them, but I do like that the cached catalog
age tells me about failing compiles and alerts my monitoring about manifest
errors. We should perhaps extend the --summarize output, as well as the new
summaries being cached to disk, to include the compile state so that we can
monitor it that way.
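The monitoring trick described above can be sketched in a few lines of Ruby. The default path and threshold here are assumptions (cache locations vary across Puppet versions and configurations), so pass in the real ones:

```ruby
# Minimal sketch: treat a stale cached catalog as a sign that compiles
# are failing. Path and max_age (seconds) are assumptions; adjust for
# your Puppet version and run interval.
def check_catalog_age(path, max_age = 7200)
  return "CRITICAL: no cached catalog at #{path}" unless File.exist?(path)
  age = (Time.now - File.mtime(path)).to_i
  if age > max_age
    "CRITICAL: cached catalog is #{age}s old - compiles may be failing"
  else
    "OK: cached catalog is #{age}s old"
  end
end
```

Wired into a Nagios-style check, this gives an alert on manifest errors without ever running a stale catalog.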
--
R.I.Pienaar