I've been maintaining some Solaris packages for Facter and Puppet for
a while now. These are dependent on, and are part of, the "Blastwave"
or "Community Software (CSW)" project. Unfortunately that project has
gone through some sort of fork/coup d'etat/meltdown, which meant that
for much of last week the owner of Blastwave pulled the webservers and
mailing lists down. The mirrors were still working. It is now highly
probable that a new "opencsw" project will launch, and I'm sure
efforts will be made to keep Blastwave CSW running too.
I'm not sure how this is going to pan out, nor which, if any, of the
above projects I'm going to want to make the effort to support in
future. However, if you're looking for the Solaris binaries, and the
supporting Ruby packages, they are still on the mirrors, and the most
recent unreleased versions are on my private web server:
http://garylaw.net/packages/
The spat is a nice illustration of how not to run a community effort,
how to alienate and demotivate large numbers of volunteers, and of the
dangers of having one person in charge of the domain names and another
in charge of the GPG key used to sign software packages. Sigh.
Gary
--
Gary Law
Email/googletalk: gary...@gmail.com
iChat/jabber/AIM: gary...@mac.com
> Anyone want to share experiences on how they might get packages pushed
> down to hundreds of globally-distributed heterogeneous Solaris boxes
> that need to be micromanaged? We use a mix of pkg-add and sparc64
> RPMs.
My few systems are neither globally-distributed nor terribly
heterogeneous. That having been said, here's a link on how I made my own
repository compatible with pkg-get:
http://blogs.cae.tntech.edu/mwr/2008/05/21/making-solaris-packages-from-commercial-software/
My setup isn't complete yet, since pkg-get doesn't allow for multiple
repositories. But with the Maple example given in the link above, I'd
expect something like:
exec { "install-maple11":
creates => "/opt/maple/11/bin/maple",
command => "pkg-get -s ftp://host/path/to/repository/ -U ; pkg-get
-s ftp://host/path/to/repository/ install maple11"
}
would be a first step. And if you weren't already using blastwave or a
similar service with another pkg-get repository, then it should be as
simple as:
package { "maple11":
provider => blastwave,
ensure => installed
}
--
Mike Renfro / R&D Engineer, Center for Manufacturing Research,
931 372-3601 / Tennessee Technological University
On 12 Aug 2008, at 02:36, Dave Thomas @ Tandberg Television wrote:
>
> Oh that's you. Just tried them with CSW on some kind of Solaris 11
> beta -- being so used to Linux I thought the pkg manager would scream
> not to install another Ruby but we did and ended up with two. haha.
Yeah, it's arguable what the best policy is. Blastwave tries (or tried)
to have dependencies only on itself, across all versions of Solaris
from 8 to 10, and to maintain exactly the same revisions of its
distributed software across those versions. It's taxing to meet this
requirement, and it means we end up ignoring Sun's Ruby. However, it
also means you avoid getting locked into poorly compiled or out-of-date
vendor-distributed versions of Ruby, supporting libraries and so on.
>> Unfortunately this project has
>> gone into some sort of fork/coup d'etat/meltdown which meant that for
>> much of last week the owner of the Blastwave pulled webservers and
>> mailing lists down.
>
> Scary... I'm new to Puppet but am making a big push to see if I can
> win over hearts and minds. We are a 50-50 Solaris/ Linux shop. Over
> the long term should we have contingency plans to maintain everything
> from source?
No. My intention is to maintain Solaris packages for Puppet and
Facter, based on the stable version from . At the moment, that's
through the old CSW/Blastwave; in future it might be through the 'new'
Blastwave, the forked OpenCSW, Sunfreeware, or some other method. I
want to avoid maintaining multiple versions in multiple repositories,
and I want to avoid Solaris users having to compile from source or use
different versions of Ruby.
pkgadd can use a URL as the source for the package "device". I wound up
creating a simple pkgadd define thus:
define pkgadd {
  $pkgrepo = "http://pca.itgasiapac.com/pkgs"
  package { "$name":
    source    => "$pkgrepo/$name.pkg.$hardwareisa",
    ensure    => installed,
    adminfile => "puppet",
    require   => File["/var/sadm/install/admin/puppet"],
  }
}
The admin file mentioned is pushed out via our "base" class, which is
used on all hosts (we're all-Solaris).
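For illustration only, that part of the class would look something like
the sketch below (the resource path matches the define above, but the
source location and modes are assumptions rather than our actual
manifest):

class base {
  # pkgadd admin file referenced by the pkgadd define; it typically
  # sets options like instance=overwrite and the various *=nocheck
  # keywords so pkgadd runs non-interactively
  file { "/var/sadm/install/admin/puppet":
    owner  => "root",
    group  => "sys",
    mode   => "0444",
    source => "puppet:///base/pkgadd-admin",
  }
}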
So to make sure a package goes out:
pkgadd { "ITGlsof": }
does the trick.
At the moment we're pulling all packages from a single host, but it'd be
easy enough to make $pkgrepo vary by geographic location as we've also got a
custom fact that figures that out for us based on IP address. At that point
you'd just rsync the package repo to local caches.
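As a rough sketch (the fact name and the per-site hostnames here are
invented for illustration), the repo URL in the pkgadd define could
then become a selector on that fact:

# pick a nearby package repository based on a custom $location fact
$pkgrepo = $location ? {
  "sydney" => "http://pkgs-syd.example.com/pkgs",
  "london" => "http://pkgs-lon.example.com/pkgs",
  default  => "http://pca.itgasiapac.com/pkgs",
}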
Obviously this is only something we use with locally-built packages, but
very few of our production hosts use Blastwave or any other third-party
repo, so this works for us.
We also have a simple pkgrm define which we can call from, e.g., our
postfix module to remove sendmail.
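A minimal sketch of such a define, assuming the same admin file as the
pkgadd define uses (the package name in the usage example is just
illustrative):

define pkgrm {
  # remove a Solaris package non-interactively, using the same admin file
  package { "$name":
    ensure    => absent,
    adminfile => "puppet",
  }
}

# e.g. from the postfix module:
pkgrm { "SUNWsndmu": }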
Matt