Wiki backups


z...@tampaad.net

Jan 31, 2026, 5:50:34 PM
to swarm-...@googlegroups.com
TL;DR: I got backups.
 
The rest of the story:
 
   Okay, when our last wiki disappeared on us, it caught us flat-footed.  I'd occasionally done an XML dump of all the articles, but that was just the raw article text: no graphics, no categories, no templates, nothing else.  And, of course, my latest dump was several months old.
   Remember that I spent some time over the previous decade trying to reach any living human at the hosting site (freewiki.in) about complete wiki backups, but no one ever answered my pleas for help.  All we had was that XML dump.

   Fine, a year ago I bit the bullet and set up wiki software on a domain I own, "TampaAD.net", and spent the next several months laboriously importing that XML dump into my shiny new wiki.  Oh, and re-creating all the categories and templates and stuff.  Some of it still doesn't work, like the ship data infoboxes.  I was able to re-upload all the graphics that _I_ had provided, because I keep a copy of everything I do, but all the graphics provided by others are gone.  That's okay, a lot of it was pretty stupid, but some of it was good and I don't have it to upload.

   Anyway, I'm planning on getting picked up soon and I'm sure the med-tubes will roll me back to my 20's, but what if I croak before I make pickup?  If my better half catches me with her sister that might happen pretty quickly, ya know?  I have no doubt that my wife will immediately cancel all my hobby stuff, and this wiki will disappear just like the last two.

   So, I'm working the problem from two directions.  First, since I control the domain, I can go in the back way and make complete backups of the whole server.  I do so once a month or so.  The problem with that is that I don't know enough about server databases to pick and choose what I want.  I end up with an SQL.TAR.GZ file that is a complete backup of the whole domain.  The web site.  The wiki.  All the email hosted there.
   Sure, if I'm dead I guess I don't care who reads my ZM@ email, but I'm not putting it out on the 'net for everyone to dig through _now_.  Just be aware that my laptop has an 8 MB file on it named "Swarmwiki.zip", which is the compressed version of the 40 MB "Swarmwiki.sql" file.  I can't put THAT out on the 'net, either, because I'm pretty sure that it contains access credentials for everyone with an account on that wiki.
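   If anyone with more database sense than me wants to try, here's roughly what I'd attempt: dump just the wiki database and skip the table where MediaWiki keeps the password hashes.  This is only a sketch, not something I've run against this server; the database name is made up, and it assumes mysqldump can find its login in ~/.my.cnf.

    # Sketch: dump the wiki database minus the credentials table, so
    # the result is safer to pass around.  MediaWiki's "user" table
    # holds password hashes and login tokens; skipping it means a
    # restore would need accounts re-created, but nothing secret
    # leaves the box.
    import datetime
    import gzip
    import subprocess

    DB = "swarmwiki"      # hypothetical database name -- substitute yours
    SKIP = ["user"]       # tables holding credentials

    cmd = ["mysqldump", "--single-transaction", DB]
    cmd += [f"--ignore-table={DB}.{t}" for t in SKIP]

    # Assumes mysqldump picks up its login from ~/.my.cnf.
    dump = subprocess.run(cmd, stdout=subprocess.PIPE, check=True).stdout

    outfile = f"{DB}-{datetime.date.today().isoformat()}.sql.gz"
    with gzip.open(outfile, "wb") as out:
        out.write(dump)
    print("wrote", outfile)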

   Second, I can also do what I did for the last wiki: periodic XML dumps of all the articles.  Only, that was pretty miserable.  You could get a dump of any article, or even several at once if you typed their names into a list.  600+ articles?  Yeah, I did it a couple of times, automating it as well as I could, but it was always awkward.
   This time, though, I found a cheat.  You can also tell it to dump all members of a category, like "Discussion" or "Heresy".  Hmmm.  Fine, I've spent the last three days adding the line [[Category:All]] to every single article in the wiki, all 694 of them.  We now have a category that is, simply, a list of all articles.  And, I've proven it works.
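   (If anyone ever has to redo that tagging job, don't grind it out by hand like I did.  Something like this against the MediaWiki API should tack the category onto every page.  It's an untested sketch: the api.php path and the bot account are placeholders, and you'd want a bot password from [[Special:BotPasswords]] first.)

    import requests

    API = "https://tampaad.net/w/api.php"   # placeholder -- your api.php path
    S = requests.Session()

    # Log in with a bot password; the account name here is made up.
    tok = S.get(API, params={"action": "query", "meta": "tokens",
                             "type": "login", "format": "json"}).json()
    S.post(API, data={"action": "login", "lgname": "SomeBot@backup",
                      "lgpassword": "...",  # fill in your own
                      "lgtoken": tok["query"]["tokens"]["logintoken"],
                      "format": "json"})

    csrf = S.get(API, params={"action": "query", "meta": "tokens",
                              "format": "json"}
                 ).json()["query"]["tokens"]["csrftoken"]

    # Walk every page on the wiki and append the category line.
    # Note: re-running this appends the line again, so run it once.
    params = {"action": "query", "list": "allpages", "aplimit": "max",
              "format": "json"}
    while True:
        batch = S.get(API, params=params).json()
        for page in batch["query"]["allpages"]:
            S.post(API, data={"action": "edit", "title": page["title"],
                              "appendtext": "\n[[Category:All]]",
                              "nocreate": 1, "token": csrf,
                              "format": "json"})
        if "continue" not in batch:
            break
        params.update(batch["continue"])
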
   Go to [[Special:Export]], type "All" into the field asking for a category, and press "Add".  It fills in the list of articles to be dumped for you.  You can either get only the current version of each page, which is 2.6 MB, or include the history for each page for 17 MB.
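   And once the category exists, even the browser trip is optional.  A short script can pull the member list through the API and feed it straight to Special:Export.  Again, just a sketch with the site paths assumed, not gospel:

    import requests

    API = "https://tampaad.net/w/api.php"       # placeholder paths -- adjust
    EXPORT = "https://tampaad.net/wiki/Special:Export"

    # Gather every member of Category:All via the API.
    titles = []
    params = {"action": "query", "list": "categorymembers",
              "cmtitle": "Category:All", "cmlimit": "max",
              "format": "json"}
    while True:
        r = requests.get(API, params=params).json()
        titles += [m["title"] for m in r["query"]["categorymembers"]]
        if "continue" not in r:
            break
        params.update(r["continue"])

    # One POST to Special:Export with the whole list; drop "curonly"
    # to include full page histories instead of just current versions.
    xml = requests.post(EXPORT, data={"pages": "\n".join(titles),
                                      "curonly": 1}).text
    with open("swarmwiki-current.xml", "w", encoding="utf-8") as f:
        f.write(xml)
    print(len(titles), "pages exported")
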
   I've copied a ZIPped version of them both to Dropbox so anyone can have a copy.  As before, though, it's just the text in the articles.  No graphics, no categories, no templates.  If I croak, it's all yours.  Not my problem no more, man!
 
-Zen Master
 
   (Oh, yeah: The 'Main Page' has a supposedly live count of active pages.  It's fed from a system variable named NUMBEROFARTICLES, and it's at 684 today.  The list at [[Special:AllPages]] has 694 entries, though.  The list at [[Category:All]] has 694 entries, too.  So, the XML dump has 694 articles in it.  When you copy either one to a text file, the file has 694 lines.  No idea why the system says we only have 684; if I had to guess, it's because MediaWiki's article count only includes pages containing at least one internal link by default, so ten link-less pages wouldn't register.)
 

thinkin...@gmail.com

Feb 1, 2026, 7:27:45 PM
to swarm-...@googlegroups.com

Hokay, I MIGHT be able to help with something, but we probably need our man Stoner to fix this.  I have a domain, ThinkingHorndog.com, but no servers and no website.  I can probably buy a bigger package from GoDaddy, but I don’t know if that will help.  I know diddly-squat about wikis, and have no suggestions regarding backups.

 

Thinker


z...@tampaad.net

Feb 1, 2026, 7:34:25 PM
to swarm-...@googlegroups.com

   I'd say let sleeping dogs lie.  I mean, I'm already paying for the hosting, why not get some use out of it?  Time enough to worry about it when I stop answering emails and my wife pulls the plug on my domains.

   But, yeah, if Stoner could help the way he did with the last one, it would make the wiki a lot better.

-ZM
