Through many conversations with the community and at work, it has become clear
that the DevOps movement is doing an excellent job of improving the
status quo when it comes to systems administration (even if I do
personally feel it's a bit too much "dev" and not enough "ops")
through tools such as Chef, Puppet and all the other code that has
been written.
Where I feel we are missing a huge area is that of canonical sources of truth.
At present on my network I have the following systems which store
information about my servers/AMI instances/Printers/Desktops etc:
1) Nagios
2) Our "homebrew" inventory system
3) Puppet/Cobbler
4) Legacy Build System
5) Legacy "command and control" shell scripts
6) Change Management System
7) Support Ticket System
This means that in order to add a new host to the network and be
confident that it will get picked up by all of our systems and
scripts, I need to update seven places, which seems crazy to me!
I am planning on starting to write a very basic API-based system which
can act as a source of truth for many of the above systems, however
before I do so, I thought I'd run the idea past the community to see
what you all feel would be useful in a system such as this.
At present, I only plan on including the following information:
Hostname
Asset ID (internal asset tag where appropriate)
Support Service ID (Dell Service Tag/AWS Instance ID etc.)
IP address(es)
Datacentre (Telehouse/Amazon AWS/RackSpace/etc.)
Location (Room/Rack/AWS Zone/etc.)
Owner (the primary user - probably pulled from LDAP or similar -
responsible for this system)
The idea being that this system would expose information over the API
which could then be plugged in to monitoring, Puppet/Cobbler, Change
Management Systems and all the other wonderful systems that are
around.
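To make the idea concrete, here's a rough sketch of the kind of record such an API might hand back for a single host. All field names and values here are illustrative assumptions, not a proposed spec:

```python
import json

# Hypothetical payload for something like GET /hosts/<hostname>.
# Every field name/value below is made up for illustration.
host_record = {
    "hostname": "web-01.example.com",
    "asset_id": "INV-00421",              # internal asset tag
    "support_service_id": "ABC1234",      # e.g. Dell Service Tag / AWS instance id
    "ip_addresses": ["192.0.2.10", "2001:db8::10"],
    "datacentre": "Telehouse",
    "location": "Room 4 / Rack 12 / U23",
    "owner": "mmw",                       # primary user, e.g. an LDAP uid
}

print(json.dumps(host_record, indent=2))
```

Monitoring, Puppet/Cobbler and the rest would then consume this one record rather than each keeping their own copy.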
What do people think about this? Good idea? Bad Idea? :P
All comments welcome,
Matt
It'd be a key-value store for each host, with each host identified by a unique
key (perhaps the MAC address of the on-board ethernet port).
There'd be some way for users to perform CRUD operations on a host and its
key/value pairs; probably a web frontend.
There'd be a way for automated processes to put data into the SOTDB, for
example facter could populate the host's entry with key/value pairs of
the facts from that host.
There'd be multiple ways to get the data out of the SOTDB: it'd have a
template engine to produce config files (Nagios configs, etc.). It'd have
a command-line tool to produce a list of hosts, or host values, based on
query strings, suitable for piping into shell scripts, e.g.:
$ sotls datacenter=pao1 service=webserver
foo-1
foo-2
foo-3
It'd have a web-api, REST or whatever. The automation tool, puppet, chef,
facter, etc.. would be able to access data about the host within its own
context.
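The core of that query tool is just filtering key/value pairs. A minimal sketch of how a hypothetical `sotls` might work (host names and facts below are invented to match the example output above):

```python
# Toy in-memory stand-in for the SOTDB; a real tool would hit the web API.
hosts = {
    "foo-1": {"datacenter": "pao1", "service": "webserver"},
    "foo-2": {"datacenter": "pao1", "service": "webserver"},
    "foo-3": {"datacenter": "pao1", "service": "webserver"},
    "bar-1": {"datacenter": "iad2", "service": "database"},
}

def sotls(**criteria):
    """Return hostnames whose key/value pairs match every criterion."""
    return sorted(
        name for name, facts in hosts.items()
        if all(facts.get(key) == value for key, value in criteria.items())
    )

print("\n".join(sotls(datacenter="pao1", service="webserver")))
```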
Nice-to-haves:
Some trigger mechanism where if a record was updated it'd kick off certain
actions (regenerate DNS, notify a kickstart server, etc..)
Open issues:
How do you describe dependencies/topology in as generic a manner as
possible so it can be broadly used in different contexts?
Which is authoritative? What's discovered on your network or what's in
your SOTDB? If the SOT says a system should be one thing, but the machine
claims it is something else - do you change the machine or the SOT?
I might go see what's out there and see if any of them could be expanded.
I looked at iClassify at one point, it is a rails-based key-value store
with a web-api - seemed like it would be a good starting point for
building out something more powerful.
Best Regards,
-- Greg
> I am planning on starting to write a very basic API-based system which
> can act as a source of truth for many of the above systems, however
> before I do so, I thought I'd run the idea past the community to see
> what you all feel would be useful in a system such as this.
I have given the same requirement some thought previously, and have
often wondered whether it could be solved through the use of LDAP?
Granted, one would probably have to spend some time extending/developing
additional schemas to satisfy all the requirements. I'd also be
interested to hear what other folks think. My reason for thinking that
LDAP would make a good "single truth" is that most Unix/Linux flavours
have a basic set of tools for querying LDAP, and most development and
scripting languages have fairly well established LDAP APIs.
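In the LDAP approach, most consumers would just issue search filters against the directory. A small sketch of building an RFC 4515-style filter string from key/value criteria (the `device` object class and attribute names are assumptions; a real implementation would also escape special characters per the RFC):

```python
def ldap_host_filter(**criteria):
    """Build a simplified LDAP search filter matching all criteria.

    Note: no RFC 4515 escaping of special characters is done here;
    this only shows the shape of the query.
    """
    clauses = "".join(f"({attr}={value})" for attr, value in sorted(criteria.items()))
    return f"(&(objectClass=device){clauses})"

# The result could be handed to ldapsearch or any LDAP client library.
print(ldap_host_filter(l="pao1", ou="webservers"))
```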
Any thoughts?
Chris
On 29 September 2011 08:33, Matthew Macdonald-Wallace wrote:
--
Chris Mulder
+27 82 040 6434
--
In theory, there is no difference between theory and practice.
In practice, there is. ... Yogi Berra
It occurs to me that there's a good model that comes out of the
distributed simulation world - rather than try to maintain a centralized
"world model," each simulator maintains a local copy - which are kept
synchronized by a publish-subscribe protocol whereby each node publishes
updates, other nodes subscribe to the information streams that they're
interested in. DIS and HLA are the primary protocols used; DDS is
similar and has a lot of traction in places like the Navy, for linking
distributed sensors to distributed weapons.
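The shape of that model is easy to see in miniature: a bus, topics, and subscribers that each maintain their own local copy. A toy sketch (DIS/HLA/DDS do vastly more; all names here are invented for illustration):

```python
from collections import defaultdict

class Bus:
    """Minimal publish-subscribe hub: topics map to callback lists."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, update):
        for callback in self.subscribers[topic]:
            callback(update)

bus = Bus()
dns_view = {}  # one subscriber's local copy of the world model (host -> ip)
bus.subscribe("host.updated",
              lambda u: dns_view.update({u["hostname"]: u["ip"]}))

# A node publishes an update; every interested subscriber syncs locally.
bus.publish("host.updated", {"hostname": "foo-1", "ip": "192.0.2.10"})
print(dns_view)
```

Applied to the inventory problem, Nagios, DNS generation and the build system would each subscribe to the streams they care about rather than polling a central store.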
Aaron Nichols wrote:
>
> I know the concept of a central API to query these facts may have some
> gaps - but that seems like an easy problem to solve if it isn't already
> solved.
>
Puppet has a central API for querying fact data called the inventory
service (introduced in 2.6.7). The documentation is here:
http://docs.puppetlabs.com/guides/inventory_service.html.
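Per the linked documentation, the inventory service exposes facts for a node over HTTP at a per-environment endpoint. A sketch of constructing such a request (the master hostname and node name are placeholders; port 8140 and the URL shape follow the docs, and real use needs SSL client certificates, which are omitted here):

```python
from urllib.request import Request

def inventory_request(master, node, environment="production"):
    """Build (but don't send) a request for one node's facts from the
    Puppet inventory service. SSL cert handling is left out of this sketch."""
    url = f"https://{master}:8140/{environment}/facts/{node}"
    return Request(url, headers={"Accept": "yaml"})

req = inventory_request("puppet.example.com", "web-01.example.com")
print(req.full_url)
```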
Regards
James Turnbull
--
Author of:
* Pro Puppet (http://tinyurl.com/ppuppet)
* Pro Linux System Administration (http://tinyurl.com/linuxadmin)
* Pro Nagios 2.0 (http://tinyurl.com/pronagios)
* Hardening Linux (http://tinyurl.com/hardeninglinux)
Thanks for all the responses.
I was going to go down the home-brew route and possibly even start a
project to define a "standard" format for these kinds of things to be
adopted by other open-source solutions; however, it's become clear to
me that this is quite possibly a "horses for courses" issue and there
cannot be a single solution.
I'll check out nventory and the other recommendations made on the
thread and write up a blog post summarising this in the next few days.
Kind regards,
Matt