Sorry for posting to the top of this page, but the suggestions down below are terrible ways of backing up a wiki.
One of the best ways is to use rsync, which transfers only the changed wiki files, can compress data during transfer to save bandwidth, works over secure SSH connections, and ships standard on UNIX systems (BSD, Linux, Mac OS X).
Use this command:
  rsync -avz -e ssh email@example.com:/path/to/wiki/ localdir/
For more info, type man rsync. You need remote shell access or a remotely running rsync server, though.
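To restore after a disaster, the same command works with the source and destination swapped (same placeholder paths as above):
  rsync -avz -e ssh localdir/ email@example.com:/path/to/wiki/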
Every couple of days, I do a brute-force backup of my wikis. I FTP a new copy of my wiki directory and all its contents down to my home base. The advantage is a vague feeling of security. The disadvantage is the time and bandwidth of copying every file, changed or not.
There must be a better way. Should I just back up the .db files, or the .kp files, or what? -- JerryMuelver
Perfect! I use tar -cvf mywiki.tar ./wiki/page to package up the .db files. Is this all I need to recover from a disaster? -- JerryMuelver
The diff_log is almost a meg all by itself. I'll run a tar for the whole works, just to see what I get. An early experiment tar'd the entire server.... -- JerryMuelver
(Later) Ran it up to 3.3 meg for the tar file, from 0.8 for just the .db files. Not a huge problem with a satellite download, but server disk space for the tarring process could become a problem eventually. -- JerryMuelver
Use tar -cvzf mywiki.tar.gz ./wiki to compress while tarring the files. You can also move or remove the diff_log at any time--it is not needed by the wiki. --CliffordAdams
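To answer the disaster-recovery question above: unpacking such an archive is a single command, run from the directory above wiki/:
  tar -xvzf mywiki.tar.gz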
My experience: I use an FTP client that can do an incremental download (Anarchie on MacOS), which pulls down just the changed pages. Very fast. If I want separate backups I can then duplicate the local copy in my file system (also very fast). -- EricScheid
A very simple procedure for the backup would be something like ..../wiki.pl?action=backup, which would create a tar.gz of the database and return it as a binary file to the HTTP client. It might be restricted to the admin, and we could also include the wiki.pl script itself in the tar. Or maybe use cpio instead of tar, which would make it easier to select the files to back up. -- AlainMellan
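No such action exists in the stock script; the following is a minimal sketch of what such a handler could look like inside wiki.pl, assuming UseModWiki's $DataDir global points at the database directory:

  # Hypothetical "backup" action -- not part of the stock wiki.pl.
  # Assumes $DataDir is the wiki database directory, as in UseModWiki.
  sub DoBackup {
      my $tarball = '/tmp/wikibackup.tar.gz';
      # Package and compress the database (cpio would work here too).
      system('tar', '-czf', $tarball, $DataDir) == 0
          or die "tar failed: $?";
      # Return the archive to the HTTP client as a binary download.
      print "Content-Type: application/gzip\n";
      print "Content-Disposition: attachment; filename=wikibackup.tar.gz\n\n";
      open(my $FH, '<', $tarball) or die "open failed: $!";
      binmode $FH;
      binmode STDOUT;
      print while <$FH>;
      close $FH;
  }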
With a small Perl script you can tar and zip your wiki directories; afterwards the tar.gz file is sent by email to a specified mail address.
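A minimal sketch of the idea, assuming the MIME::Lite module is installed and a sendmail binary is available -- the paths and address are placeholders:

  #!/usr/bin/perl
  # Sketch: tar and gzip the wiki directory, then mail the archive.
  use strict;
  use warnings;
  use MIME::Lite;

  my $wikidir = '/path/to/wiki';            # directory to back up
  my $archive = '/tmp/wiki-backup.tar.gz';  # temporary archive file
  my $mailto  = 'backup@example.com';       # where to send the backup

  # Tar and gzip the wiki directory.
  system('tar', '-czf', $archive, $wikidir) == 0
      or die "tar failed: $?";

  # Attach the archive to a mail message and send it (via sendmail).
  my $msg = MIME::Lite->new(
      To      => $mailto,
      Subject => 'wiki backup',
      Type    => 'multipart/mixed',
  );
  $msg->attach(
      Type     => 'application/gzip',
      Path     => $archive,
      Filename => 'wiki-backup.tar.gz',
  );
  $msg->send;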
You can automate your email backups with the help of a cron service, e.g. http://www.webcron.org.
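If your host gives you a real crontab instead, an entry like the following (script path hypothetical) would mail a fresh backup every night at 3 a.m.:
  0 3 * * * perl /path/to/wiki-backup.pl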