Fwd: [Wikitech-l] 34 TB Wikimedia Commons files on archive.org: you can help


Kartik Mistry

Aug 1, 2014, 10:53:46 AM
to GNU/Linux Users Group, Mumbai, India, floss-...@googlegroups.com
Fwd :)


---------- Forwarded message ----------
From: Federico Leva (Nemo) <nemo...@gmail.com>
Date: Fri, Aug 1, 2014 at 2:42 PM
Subject: [Wikitech-l] 34 TB Wikimedia Commons files on archive.org: you can help
To: Announce Mailing List <Wikimedia...@lists.wikimedia.org>,
Wikimedia Commons Discussion List <comm...@lists.wikimedia.org>,
Wikimedia developers <wikit...@lists.wikimedia.org>
Cc: wikiteam...@googlegroups.com


WikiTeam[1] has released an update of the chronological archive of all
Wikimedia Commons files, up to 2013. Now at ~34 TB total.
<https://archive.org/details/wikimediacommons>
I wrote to (I think) every mirror in the world, but apparently nobody
apart from the Internet Archive (and mirrorservice.org, which took
Kiwix) is interested in such a mass of media.
The solution is simple: take a small bite and preserve a copy yourself.
One slice only takes one click, from your browser to your torrent
client, and typically 20-40 GB on your disk (biggest slice 1400 GB,
smallest 216 MB).
<https://en.wikipedia.org/wiki/User:Emijrp/Wikipedia_Archive#Image_tarballs>
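For those who prefer the command line to a browser, a slice can also be fetched and seeded with a headless BitTorrent client such as aria2. A minimal sketch (the torrent filename below is hypothetical; pick a real one from the archive.org item page):

```shell
# Download one slice via its .torrent file and keep seeding it
# for 24 hours afterwards, so others can grab it from you too.
# Replace the URL with an actual slice torrent from
# https://archive.org/details/wikimediacommons
aria2c --seed-time=1440 \
  "https://archive.org/download/wikimediacommons/EXAMPLE-SLICE.torrent"
```

aria2c downloads the .torrent metadata first, then the payload; `--seed-time` is in minutes, so 1440 keeps the slice available to other peers for a day.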

Nemo

P.s.: Please help spread the word everywhere.

[1] https://github.com/WikiTeam/wikiteam

_______________________________________________
Wikitech-l mailing list
Wikit...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


--
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com