Publishing artifacts


Fredrik Ekholdt

Aug 7, 2013, 5:01:05 AM
to adep...@googlegroups.com
Currently, on the adept-dm/adept repo, publishing artifacts means pushing metadata to a git repo (so you need credentials for that) and uploading the files somewhere else (if they are not published already).

The optimal solution preferably builds on existing infrastructure and, most importantly, is easy to set up, requiring only one set of credentials and one repository to relate to.

One solution could be to have adept daemons and a custom protocol. Another could be to allow metadata outside of git repositories and use a simple file server.

What is the best solution? Has anybody got other ways of doing this?

Josh Suereth

Aug 7, 2013, 8:29:46 AM
to adept-dev
You have a lot of potential options here.  I think the core requirements are:

(1) A git repository of metadata, where users can select a specific commit to use for their resolution.
(2) An accessible location to grab files (using some protocol).

Here's a few ideas to simplify hosting:

  • Packaging a local repo for simple-file-serve-hosting
    • Here you can migrate files into a directory for a simple HTTP file server and update the metadata appropriately
    • You can compress/zip the git repository into a single file which this repo uses when obtained
    • Every time you want to update the repository, you need to redo this process.
    • If you preserve commits in the zipped git repository, then adept can treat the zip file as the entire repo.  It can download it to a throwaway cache and import from it.  The downside is that you'd need to constantly check for new versions of this file, and it defeats the caching we wanted; however, I see repository imports being an infrequent thing (like running apt-get update every week to check for new releases/fixes).
  • A repository manager solution
    • Here, we need a way to "import" the metadata from one adept repository into another.  This kind of finagling needs to be mostly automated (if possible).  This is pretty much a requirement for adept anyway.  The person "importing" metadata would be responsible for any git merges, signing the commits as trusted once complete, etc.
    • The repository manager owns the git repository.  When you publish, you're issuing a "merge this part of my metadata into the remote repo" request.  The repository manager is responsible for doing authentication and deciding whether this metadata is OK for you to manipulate.
    • The repository manager can migrate the file locations specified in the metadata to be locations which it can host.  Alternatively, we start using something like BitTorrent, and the new repository manager updates the torrent with itself as a seed.
  • Split hosting
    • A low-tech solution.  For this we use GitHub to host metadata and use pull requests to update, and Bintray for hosting the raw artifacts.  This one feels wrong from a convenience standpoint.
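As a rough sketch of the "package a local repo" option, a `git bundle` based flow could look like this. The repository name and paths are my own illustrations, not anything adept defines:

```shell
#!/bin/sh
# Illustrative sketch only: bundle a metadata git repository (history
# included) into a single file that a plain HTTP file server could host
# next to the artifacts. "adept-metadata" is a made-up repo name.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the metadata repository.
git init -q adept-metadata
cd adept-metadata
git -c user.email=a@b -c user.name=demo commit -q --allow-empty -m "metadata"

# `git bundle` preserves commits, so this one file is the entire repo.
git bundle create ../adept-metadata.bundle --all
cd ..

# A consumer downloads the bundle into a throwaway cache and imports it
# by cloning straight from the file.
git clone -q adept-metadata.bundle imported
```

Checking for updates then amounts to re-downloading the bundle when it changes, which matches the infrequent apt-get-update style workflow described above.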

In any case, I think *CORE* to the ability to have a central repo are these use cases:

  • Merging two repositories together (metadata)
  • Merging in *partial* metadata from one repository into another (I want to push "my" organization's packages to adept-central in one lump command, or my project's packages to my company's central repo).
  • Automatically modifying the artifact location description in some bulk fashion, like a hook in the framework.
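The *partial*-merge use case can be approximated with plain git today. The sketch below pulls just one organization's subdirectory from a source repo into a central one; the `org-a/` layout and repo names are assumptions for illustration, and a real repository manager would also sign the resulting commit:

```shell
#!/bin/sh
# Illustrative sketch: import only the org-a/ slice of a source metadata
# repo into a central repo. All names here are made up.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Source repo containing two organizations' metadata; we only want org-a.
git init -q source
cd source
mkdir -p org-a org-b
echo '{"id":"org-a/lib"}' > org-a/lib.json
echo '{"id":"org-b/lib"}' > org-b/lib.json
git add .
git -c user.email=a@b -c user.name=demo commit -qm "source metadata"
cd ..

# Central repo that should receive only the org-a slice.
git init -q central
cd central
git -c user.email=a@b -c user.name=demo commit -q --allow-empty -m "init"

# Fetch the source commit, then check out just the org-a/ paths from it;
# `git checkout <commit> -- <path>` stages those paths too, so we can
# commit the slice directly.
git fetch -q ../source HEAD
git checkout -q FETCH_HEAD -- org-a
git -c user.email=a@b -c user.name=demo commit -qm "import org-a metadata"
```

This is only the file-level slice; reconciling conflicting metadata and trust (signatures) would sit on top of it, as described above.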

At least, that's my $.02

