On Fri, 22 Feb 2013 00:55:13 -0500
Doug Tangren <d.ta...@gmail.com> wrote:
> Still wrapping my head around Mark's brain dump
> ( https://github.com/sbt/adept/wiki/NEScala-Proposal ) and am currently
> thinking about the metadata side of things.
>
> I'm totally for the split between metadata repo and artifact repo. +1 on
> that.
>
> What I'm not clear on, though, is whether the thought was for one metadata
> repo that others push changes to, or many metadata repos. I feel like having
> one uber metadata repo containing metadata about every library adds a bit of
> friction to publishing and management. It introduces questions like who can
> push and who manages/oversees these pushes and pulls. It creates a sense of
> stress for new authors who are all ready to go but need access to this
> "blessed" publishing circle. It took months for me to get blessed into
> scala-tools, thankfully less for Sonatype (though it felt like a 1000 step
> process). I believe that in order to have a flourishing community of Scala
> libraries you want to remove that kind of friction.
We are definitely in agreement on having fewer than 1000 steps in the publishing process.
> So then I started thinking about many metadata repos and what that means.
>
> When thinking about a frictionless model for hosting metadata for
> implicit.ly's ls that still guarantees authenticity, I came up with a scheme
> where the user used a tool (sbt) to serialize metadata within their
> project's repo, committed it, pushed it to GitHub, then told the ls service
> to synchronize with this hosted version on GitHub. This system has some of
> the properties Mark outlined as pluses, using a dvcs to handle many tasks
> out of the box. The metadata is now versioned and hosted, and authentication
> is handled via implicit knowledge of push access to a given repo.
The authentication is weak, though. It isn't part of the metadata (no auditing, for example), it depends on the hosting service, and it isn't granular (you can't trace a change to the actual individual who pushed it). I like signing git commits, but that has the problem of how to deal with merging.
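For concreteness, here is a rough sketch of the kind of per-project metadata file the flow above might serialize and commit. The case classes, field names, and file layout are all hypothetical, not an agreed adept or ls format:

    // Hypothetical shape of the metadata a build tool could serialize
    // into the project's repo before committing and pushing.
    case class ArtifactRef(hash: String, locations: Seq[String])
    case class ModuleMetadata(
      organization: String,
      name: String,
      version: String,
      dependencies: Seq[String],   // coordinates of required modules
      artifacts: Seq[ArtifactRef]  // hashes + download locations, no jars in git
    )

    object WriteMetadata {
      import java.io.{File, PrintWriter}

      // Write a simple line-oriented file under e.g. adept-metadata/ in the repo.
      def write(m: ModuleMetadata, dir: File): File = {
        dir.mkdirs()
        val out = new File(dir, s"${m.organization}-${m.name}-${m.version}.properties")
        val pw = new PrintWriter(out)
        try {
          pw.println(s"organization=${m.organization}")
          pw.println(s"name=${m.name}")
          pw.println(s"version=${m.version}")
          pw.println(s"dependencies=${m.dependencies.mkString(",")}")
          m.artifacts.foreach(a => pw.println(s"artifact=${a.hash}|${a.locations.mkString(",")}"))
        } finally pw.close()
        out
      }
    }

Whatever the format ends up being, the point stands that git only tells you which account pushed the commit, not who is vouching for the contents.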
> I was wondering if this could be expanded to fit Mark's vision of a local
> repo of metadata. What if publishing meant storing metadata in a specific
> location (or branch) of an author-owned git repo and pushing to a remote
> like GitHub? Then instead of telling a remote service to sync with that
> repo, you just register your git url with it once. Adept could grab git urls
> for repos containing metadata from that service, clone them, and then just
> git pull to fetch changes from the locally cloned repos. This is kind of
> like how bower ( http://twitter.github.com/bower/ ) works. The bower
> service basically just tracks names and git repos.
I don't think storing the metadata in the actual repository will work because you have to clone the whole repository to get at the metadata. It also means you have a tool like adept writing to your git repository directly. (Of course we won't have any serious bugs, but I'd still rather not touch people's source repository.)
I like the general idea of author-managed repositories and aggregating repositories, though. We probably need to aggregate binaries as well as metadata. I think that is the idea behind Bintray (we'll have to wait for Josh to confirm).
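As a rough illustration of the register-once, pull-based flow Doug describes, a client could keep registered metadata repos up to date by cloning on first sight and pulling thereafter. The cache layout, the shape of the registry data, and shelling out to git are all assumptions here, not part of adept:

    import java.io.File
    import scala.sys.process._

    object MetadataSync {
      // registered: (repo name, git url) pairs obtained from the registry service
      def sync(registered: Seq[(String, String)], cacheDir: File): Unit = {
        cacheDir.mkdirs()
        registered.foreach { case (name, gitUrl) =>
          val local = new File(cacheDir, name)
          if (local.isDirectory)
            Process(Seq("git", "pull", "--ff-only"), local).!             // incremental update
          else
            Process(Seq("git", "clone", gitUrl, local.getAbsolutePath)).! // first-time clone
        }
      }
    }

The nice property is that the registry itself only has to hand out names and git urls; the heavy lifting of history, transport, and incremental updates comes from git.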
> Thoughts on one uber repo vs. many author-managed repos?
I'm interested in alternatives to a single repository. They are expensive to host, harder to mirror, block work when they go down, etc. I think some things already proposed for adept may mitigate this and make a centralized approach more feasible, but I'd like to see where torrents and author-managed repos, or at least less centralized repos, go.
-Mark
> -Doug Tangren
> http://lessis.me