No, and nothing comes up in searches...
> I found this Google Summer of Code idea [2] ("Metalink integration
> with Proxy/Cache" heading) for intermediate caching proxies to support
> Metalink
>
> And I found these two related Metalink discussion threads [3] [4]
>
> I think a minimum viable product would add support for RFC 6249:
>
> 1. If the response status code is 3XX
> 2. Scan "Link: <...>; rel=duplicate" headers for a URL that already
> exists in the cache
2b. Scan "Digest:" header fields for matches, i.e. files with the same hash.
> 3. If found, update the "Location: ..." header with this URL and pass
> on the response
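The decision in those steps can be sketched in plain C. This is only an illustration, not Traffic Server plugin code: the set of cached URLs is passed in as an array, standing in for a real cache lookup.

```c
#include <stddef.h>
#include <string.h>

/* Given the target URLs of "Link: <...>; rel=duplicate" headers on a
 * response, return the first one already present in the cache, or NULL
 * to leave the original "Location: ..." header untouched.  Only 3xx
 * responses are considered.  The cached-URL array is a stand-in for a
 * real cache lookup. */
static const char *
pick_cached_duplicate(int status,
                      const char *const *dup_urls, size_t ndup,
                      const char *const *cached_urls, size_t ncached)
{
    if (status < 300 || status > 399)
        return NULL;
    for (size_t i = 0; i < ndup; i++)
        for (size_t j = 0; j < ncached; j++)
            if (strcmp(dup_urls[i], cached_urls[j]) == 0)
                return dup_urls[i];
    return NULL;
}
```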
>
> I volunteer here at the Agahozo-Shalom Youth Village in Rwanda. We use
> Apache Traffic Server to overcome slow/unreliable downloads (and save
> bandwidth). But because many sites distribute the same files from
> different mirrors, it's frustratingly unpredictable whether a
> download will be fast (cache hit) or slow (cache miss): following the
> same link may be fast one time and slow the next.
>
> I think Apache Traffic Server could achieve better user experience (in
> some cases) by implementing RFC 6249.
do you live in Rwanda? you're currently a college student?
I agree! this would be an excellent solution. please apply! :)
but first, you'll want to contact the project, since no one from
metalink has been in contact with them. we'd need someone from their
project that is familiar w/ it to mentor the project.
try an introductory post on their dev list:
"Impress developers or help others by participating on our dev
discussion list or follow the latest development on our commits list.
Report issues or bring patches to our Bug Tracker"
> [1] http://trafficserver.apache.org/
> [2] http://sourceforge.net/apps/trac/metalinks/wiki/GsocIdeas
> [3] http://groups.google.com/group/metalink-discussion/browse_thread/thread/b59e5d99cf879529/f364acba525c7108
> [4] http://groups.google.com/group/metalink-discussion/browse_thread/thread/1849486ced99e778/7c2afbe0f3f8c41
--
(( Anthony Bryan ... Metalink [ http://www.metalinker.org ]
)) Easier, More Reliable, Self Healing Downloads
We are also thinking of using RFC 3230, Instance Digests in HTTP.
Given a response whose "Location: ..." target is not cached but that
carries a "Digest: ..." header, the plugin would check whether another
URL with the same digest already exists in the cache and, if so,
rewrite the "Location: ..." header with that URL.
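As a rough plain-C illustration of the matching step, two RFC 3230 digest values (e.g. "SHA-256=<base64>") could be compared like this; the algorithm token is case-insensitive, the base64 value is not:

```c
#include <ctype.h>
#include <string.h>

/* Compare two RFC 3230 "Digest:" header values such as
 * "SHA-256=X48E9...".  The first '=' separates the algorithm token
 * (compared case-insensitively) from the digest value (compared
 * exactly, including any base64 padding).  Returns 1 on a match. */
static int
digest_equal(const char *a, const char *b)
{
    const char *ea = strchr(a, '=');
    const char *eb = strchr(b, '=');
    if (!ea || !eb || (ea - a) != (eb - b))
        return 0;
    for (const char *p = a, *q = b; p < ea; p++, q++)
        if (tolower((unsigned char)*p) != tolower((unsigned char)*q))
            return 0;
    return strcmp(ea + 1, eb + 1) == 0;
}
```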
Still more ideas include:
* Remember URLs for the same file, so that future requests for any of
these URLs use the same cache key. One problem is how to prevent a
malicious domain from distributing false information about URLs it
doesn't control; this could be addressed with a whitelist of domains.
* Make decisions about the best mirror to choose, e.g. the one that is
most cost efficient, fastest, or most local.
* Use content digests to detect or repair download errors.
Finally, can anyone in the Metalink community recommend a reusable
C/C++ solution for checking whether a "Link: ..." header has a
"rel=duplicate" parameter? For now I am parsing these headers from
scratch with memchr(), but I expect that I am neglecting some
accumulated wisdom on getting all the RFC rules right, and maybe on
interoperating with nonconformant implementations. Please let me know
if you know a better way.
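For what it's worth, one of the RFC 5988 rules a from-scratch parser easily misses is that "rel" may carry a quoted, space-separated list of relation types (rel="describedby duplicate"), not just a bare token. A hedged sketch of a check that handles both forms (it does not handle escaped characters inside quoted strings, and a naive strstr() can be fooled by "rel=" appearing inside another parameter's quoted value):

```c
#include <stddef.h>
#include <string.h>

/* Check whether a Link header's parameter list contains rel=duplicate,
 * accepting both the bare form (rel=duplicate) and RFC 5988's quoted,
 * space-separated list form (rel="describedby duplicate").  `params`
 * points just past the "<URI>" part, e.g. "; rel=\"duplicate\"; pri=1".
 * A sketch only: quoted-string escapes are not handled, and "rel="
 * occurring inside another quoted parameter value would mislead it. */
static int
link_has_rel_duplicate(const char *params)
{
    const char *p = strstr(params, "rel=");
    if (!p)
        return 0;
    p += 4;
    if (*p == '"') {
        /* quoted list: look for "duplicate" as a whole word inside */
        const char *end = strchr(++p, '"');
        size_t len = end ? (size_t)(end - p) : strlen(p);
        while (len >= 9) {
            if (strncmp(p, "duplicate", 9) == 0 &&
                (len == 9 || p[9] == ' '))
                return 1;
            const char *sp = memchr(p, ' ', len);
            if (!sp)
                break;
            len -= (size_t)(sp + 1 - p);
            p = sp + 1;
        }
        return 0;
    }
    /* bare token: terminated by ';', space, or end of string */
    return strncmp(p, "duplicate", 9) == 0 &&
           (p[9] == '\0' || p[9] == ';' || p[9] == ' ');
}
```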