Anyone given any thoughts on inter-repo dependencies?
A simple approach would be to have a url to the other repository when declaring the artifact.
This means that a repository would depend on another repository and would have to be pulled as well to be able to have all the metadata offline.
The issue is that we would be doing the same thing with metadata as ivy does with dependencies.
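For illustration only (the field names here are made up, not anything adept defines today), the "URL in the declaration" idea could look roughly like this:

    // Hypothetical sketch of the "URL per dependency" approach.
    case class DependencyRef(
      org: String,
      name: String,
      version: String,
      repositoryUrl: Option[String] // points at the repo that holds the metadata
    )

    // e.g. DependencyRef("org.example", "example-core", "1.0.0",
    //                    Some("https://adept.example.com/public"))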
What I am doing now is to hash org, name, version, and the contents of each artifact, creating a unique id (hopefully).
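A rough sketch of how that id could be computed (SHA-256 here; adept may well use a different hash, and this is not its actual code):

    import java.io.File
    import java.nio.file.Files
    import java.security.MessageDigest

    // Hypothetical sketch: derive the unique id from the coordinates plus the
    // bytes of every artifact file.
    def uniqueId(org: String, name: String, version: String, artifacts: Seq[File]): String = {
      val md = MessageDigest.getInstance("SHA-256")
      md.update(s"$org:$name:$version".getBytes("UTF-8"))
      artifacts.foreach(f => md.update(Files.readAllBytes(f.toPath)))
      md.digest().map("%02x".format(_)).mkString
    }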
When adding an artifact to your local repository you would add it with the hash along with the dependencies. The dependencies could now be artifacts/modules which you have from other repositories. These dependencies would then be put into your local repository as well.
When you push back to the remote repository, you push the artifact and its dependencies.
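Roughly, an entry in the local repository could then carry the dependency ids directly, so pushing an entry also pushes what it depends on (all names here are made up for illustration):

    // Hypothetical sketch: entries are addressed by their hash and list the
    // hashes of their dependencies, so pushing an entry means pushing its
    // dependency entries too, which keeps the remote repo self-contained.
    case class RepositoryEntry(
      id: String,                // the unique id / hash from above
      org: String,
      name: String,
      version: String,
      dependencies: Seq[String]  // ids of entries possibly taken from other repos
    )

    def push(remote: scala.collection.mutable.Map[String, RepositoryEntry],
             local: Map[String, RepositoryEntry],
             entry: RepositoryEntry): Unit =
      if (!remote.contains(entry.id)) {
        remote(entry.id) = entry
        entry.dependencies.flatMap(local.get).foreach(push(remote, local, _))
      }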
The pro is that once you have the repository where you found the artifact you wanted, you have all the metadata that you need. The con is that you actually have to include artifacts from other repositories.
Any thoughts?
> Anyone given any thoughts on inter-repo dependencies?
Whether the metadata, auth, and file are centralized or decentralized, it'd be nice to let one instance of adept act as the metadata/auth/file server for the others (like git). This allows locking down network access in a firewalled environment, or locking down a SNAPSHOT to a specific artifact in a single location.
At the most basic, you'd have a tool that merges and splits repositories.
For git repositories, it is mostly straightforward to merge/split. If you want things to stay signed, the merge tool has to verify the commits being merged are signed and then sign the merge commit. If a human does the merge, they can sign it. Automated signing by a machine is a bit more complicated I think.
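As a sketch of the automated case, assuming the metadata repositories are plain git repos and a reasonably recent git (with verify-commit and merge -S) is on the PATH; this is not an existing adept tool:

    import java.io.File
    import scala.sys.process._

    // Hypothetical sketch: only merge if every incoming commit is validly signed,
    // then create the merge commit signed with the machine's own key.
    def mergeSigned(repoDir: File, branch: String, machineKeyId: String): Boolean = {
      val out = Process(Seq("git", "rev-list", s"HEAD..$branch"), repoDir).!!.trim
      val incoming = if (out.isEmpty) Seq.empty[String] else out.split("\n").toSeq
      val allSigned =
        incoming.forall(sha => Process(Seq("git", "verify-commit", sha), repoDir).! == 0)
      allSigned && Process(Seq("git", "merge", "-S" + machineKeyId, branch), repoDir).! == 0
    }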
> A simple approach would be to have a url to the other repository when declaring the artifact.
> This means that a repository would depend on another repository and would have to be pulled as well to be able to have all the metadata offline.
> When adding an artifact to your local repository you would add it with the hash along with the dependencies. The dependencies could now be artifacts/modules which you have from other repositories. These dependencies would then be put into your local repository as well.
> When you push back to the remote repository, you push the artifact and its dependencies.
> The pro is that once you have the repository where you found the artifact you wanted, you have all the metadata that you need. The con is that you actually have to include artifacts from other repositories.
Note: why not keep a public key ring associated with the repository and just add Bob's key directly to it? Then users can opt in to finer-grained security later if they want.
When merging two repos, you need to merge the key store as well (or just keep it as flat keys named by PGP id).
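A minimal sketch of the "flat keys named by PGP id" option, with an assumed keys/ directory layout (nothing adept-specific):

    import java.io.File
    import java.nio.file.{Files, StandardCopyOption}

    // Hypothetical sketch: each repo keeps trusted public keys as keys/<pgp-id>.asc,
    // so merging two repos' key stores is just a file-level union.
    def mergeKeys(from: File, into: File): Unit = {
      val target = new File(into, "keys")
      target.mkdirs()
      Option(new File(from, "keys").listFiles()).getOrElse(Array.empty[File])
        .filter(_.getName.endsWith(".asc"))
        .foreach { key =>
          Files.copy(key.toPath, new File(target, key.getName).toPath,
                     StandardCopyOption.REPLACE_EXISTING)
        }
    }

A conflict then only shows up when two repos disagree about the file for the same key id, which is exactly the case a human should look at.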
> Two methods I mentioned in some other thread were using a hard-coded GitHub repo to advertise the key,
> and making it available online.
>
> Today, to sign a jar with PGP we create foo.jar.asc file.
> To sign this automatically, an adept server can verify the signature, and add a secondary signature foo.jar.467cc13.asc upon merge
> where the 467cc13 part is a unique id per adept instance, like Adept One.
> Team Proxy can find out Adept One's id, so it can go straight to verifying foo.jar.467cc13.asc,
> and not bother with Bob's signature.
Actually, I believe we can sign the signature, even more meta and crazy...
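To make the convention in the quote concrete (the file layout is just the example from above, the instance id is illustrative):

    import java.io.File

    // Hypothetical sketch: prefer the proxy's counter-signature
    // (foo.jar.<instanceId>.asc) and fall back to the publisher's foo.jar.asc.
    def signatureToCheck(artifact: File, instanceId: String): Option[File] = {
      val counterSig = new File(artifact.getPath + "." + instanceId + ".asc")
      val primarySig = new File(artifact.getPath + ".asc")
      if (counterSig.exists()) Some(counterSig)
      else if (primarySig.exists()) Some(primarySig)
      else None
    }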
>> A simple approach would be to have a url to the other repository when declaring the artifact.
>
> Does this mean we would have a separate namespace of artifacts for each metadata repository?
> Or are you thinking more like resolver += "adept.foo.com"?
>>
>> This means that a repository would depend on another repository and would have to be pulled as well to be able to have all the metadata offline.
>
> Similar to git, if the local repository worked as a server (a remote repo) for others, it would at least be able to grab all known artifacts.
> It would also be nice to proxy pulling and pushing too.
>>
>> When adding an artifact to your local repository you would add it with the hash along with the dependencies. The dependencies could now be artifacts/modules which you have from other repositories. These dependencies would then be put into your local repository as well.
>>
>> When you push back to the remote repository, you push the artifact and its dependencies.
>
> Why are the deps pushed back to the remote repo?
>
>> The pro is that once you have the repository where you found the artifact you wanted, you have all the metadata that you need. The con is that you actually have to include artifacts from other repositories.
>
> Since files are addressed by hash, hopefully we'll see fewer cache corruption issues.
> I don't see the need to move deps files around.
>
Yeah, the cache can even be self-cleaning. We do this for Scala now. No major reported issues in a while.
I think the only corruption/issues we have to worry about are the metadata repositories and any local index/db we build on top of that...
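A sketch of what self-cleaning can mean when files are addressed by hash (illustration only, not the actual resolver code):

    import java.io.File
    import java.nio.file.Files
    import java.security.MessageDigest

    // Hypothetical sketch: cache entries are stored under their own content hash,
    // so corruption is detected by re-hashing and the bad entry is simply deleted.
    def cleanCache(cacheDir: File): Unit =
      Option(cacheDir.listFiles()).getOrElse(Array.empty[File]).filter(_.isFile).foreach { f =>
        val md = MessageDigest.getInstance("SHA-256")
        val actual = md.digest(Files.readAllBytes(f.toPath)).map("%02x".format(_)).mkString
        if (actual != f.getName) f.delete() // file name is expected to be the hash
      }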
> Note: why not keep a public key ring associated with the repository and just add Bob's key directly to it? Then users can opt in to finer-grained security later if they want.
Yeah, I see what you're saying. Check out the lib I have in the sbt-pgp plugin and the check-pgp-signatures implementation.
If we sign other users' keys, we still require those keys to be available when verifying, I think.
Let me try some of this out and get back to you.
The real question is: what do we want for the ux of security? What use cases and scenarios are we trying to solve? I can document the ones I think are high priorities.
One comment: if the metadata is signed, the jars don't need to be. The metadata contains a strong hash of the jars. (A rough sketch of this follows after the quote below.)

On Mon, 25 Feb 2013 01:40:55 -0500, eugene yokota <eed3...@gmail.com> wrote:
> Here are some of the scenarios:
> https://github.com/sbt/adept/wiki/Security-Scenarios
> The ux should be seamless from the build user's point of view.
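Picking up the "sign the metadata, not the jars" comment above, here's a rough sketch under the assumption that the (already signed and verified) metadata carries a strong hash per artifact; the field names are made up:

    import java.io.File
    import java.nio.file.Files
    import java.security.MessageDigest

    // Hypothetical sketch: as long as the metadata file itself is signed and
    // verified, checking a downloaded jar is just re-hashing it; no per-jar
    // .asc files are required.
    case class SignedMetadata(artifactHashes: Map[String, String]) // file name -> sha-256

    def jarMatchesMetadata(meta: SignedMetadata, jar: File): Boolean = {
      val md = MessageDigest.getInstance("SHA-256")
      val actual = md.digest(Files.readAllBytes(jar.toPath)).map("%02x".format(_)).mkString
      meta.artifactHashes.get(jar.getName).exists(_ == actual)
    }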