Hetzner Backup

From Eastnet

This is a Google translation of the Hetzner DokuWiki Backup page at http://wiki.hetzner.de/index.php/Backup.

Requesting backup space

Useful software

Backup with Tartarus

  • On-the-fly backup to FTP server
  • Support for LVM snapshots
  • Encryption (symmetric and asymmetric by key or password)
  • Incremental backups
  • Based on common Unix tools (tar, bzip2, etc.), permitting easy recovery even from the rescue system
  • Configuration through profile files
  • Integrated "hooks" for special cases
  • Removal of old backups from the FTP server with the charon utility
  • Ready-to-copy Tartarus backup configurations with examples and instructions

Backup using backup2l / gpg / ftp

Duplicity - GPG-encrypted, compressed, incremental backups to untrusted storage or over unencrypted protocols such as FTP, which is the only protocol Hetzner offers for its backup servers. It can also use rsync and ssh. On Debian (4.0) it should not be installed with apt-get install duplicity, because the version shipped with Debian (0.4.2) is outdated and has problems with some FTP servers (Error 226: Transfer complete). Instead, the current version (tested with 0.4.9) from http://download.savannah.gnu.org/releases/duplicity/ should be used. Details in the Hetzner forum.

A Duplicity Script
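A minimal sketch of such a Duplicity run. The account name u12345, the backup host, and the source directory are placeholder assumptions, not values from the original page:

```shell
#!/bin/sh
# Sketch only: account name, host and paths are placeholders.
# duplicity reads the FTP password and the symmetric GPG passphrase
# from these environment variables:
export FTP_PASSWORD="ftp-password"
export PASSPHRASE="gpg-passphrase"

# GPG-encrypted backup of /home to the FTP backup space; duplicity
# makes a full backup on the first run and incremental backups
# on subsequent runs:
duplicity /home ftp://u12345@u12345.your-backup.de/home

unset FTP_PASSWORD PASSPHRASE
```

This requires a reachable FTP backup account, so it is only an illustration of the invocation pattern.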

General Tips

Backups should generally be carried out at night, so that they do not degrade network quality for your rack neighbours or for the services you offer yourself.

It has also proven useful to schedule the cronjob at "odd" times, not at 0:00 or 3:00, when many people run their backups, but for example at 1:42 or 2:23.

This spares the network and the backup server, and as a pleasant side effect the backup usually also completes faster than at peak times.
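For example, a crontab entry with such an "odd" start time (the script path is a placeholder):

```
# /etc/crontab: run the nightly backup at 01:42 rather than on the full hour
42 1 * * * root /usr/local/sbin/backup.sh
```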

A very efficient backup strategy using rsync is described at http://www.linux-magazin.de/Artikel/ausgabe/2004/09/backups/backups.html (German).

Historical backup of all MySQL databases into separate files

Often a mysqldump of all databases is made into a single file, which is then compressed. This has a disadvantage: if a single byte of the compressed file is lost, the whole backup may be lost with it.

Moreover, the resulting files are often so large that they become unwieldy. When restoring, you first have to isolate the relevant data from a huge amount of data, and you often have neither the time nor the disk space for that.
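The fragility of a single compressed dump is easy to demonstrate locally; the following sketch corrupts one byte of a small gzip archive (the file name is arbitrary):

```shell
# Create a small gzip archive, then overwrite its last byte (part of
# the CRC/length trailer) to simulate a single corrupted byte:
echo "some data" | gzip -c > demo.gz
SIZE=$(wc -c < demo.gz)
printf '\377' | dd of=demo.gz bs=1 seek=$((SIZE - 1)) conv=notrunc 2>/dev/null
gzip -t demo.gz 2>/dev/null || echo "archive no longer verifies"
```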

My backup script is triggered every night from cron (as root), goes through all databases individually, writes gzipped SQL dump files, and puts each week's dumps into a new folder, so that you have both a daily backup and a weekly, historical backup:

Place it (at least on Debian) as a file in the folder /etc/cron.daily:

#!/bin/sh
echo "Backing up all MySQL databases:"
# Build the list of databases;
# 'secret' is the MySQL root password:
DBASELIST=/tmp/dbaselist
mysqlshow -psecret | awk '{print $2}' | grep -v Databases | sort >$DBASELIST
# Where should all the backups be written?
cd /irgendeinverzeichnis
# One folder per calendar week (year, month, week number):
mkdir -p `date +%Y%m%W`
cd `date +%Y%m%W`
for x in `cat $DBASELIST`; do
    echo "Saving database: $x";
    mysqldump --opt -psecret $x >$x.sql;
done
echo "Deleting old .gz files:"
rm -f *.gz
echo "Zipping files:"
gzip *
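The weekly folder name produced by `date +%Y%m%W` combines the year, the month, and the week of the year; a quick way to see which folder today's run would use:

```shell
# Print the folder name the backup script would create today,
# e.g. something like 20240521 (year 2024, May, week 21):
date +%Y%m%W
```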

Determining free storage space

To find out how much backup space is still available (either in scripts or in backup status e-mails), the program 'lftp' can be used:

# apt-get install lftp

Determine the space used:

 # echo "du -s ." | lftp -u USER,PASSWORD BACKUPSERVER

It is more readable with the -h option:

 # echo "du -hs ." | lftp -u USER,PASSWORD BACKUPSERVER

The command can also be integrated into Tartarus via a hook by inserting the following line into the Tartarus configuration:

 echo "du" | /usr/bin/lftp -u "$STORAGE_FTP_USER,$STORAGE_FTP_PASSWORD" "$STORAGE_FTP_SERVER" | awk -v LIMIT=100 '$2=="." {print ((LIMIT*1024*1024)-$1)/1024 " MiB backup space remaining"}'
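The awk arithmetic assumes that the server's du reports usage in KiB and that LIMIT is the quota in GiB. It can be checked locally by feeding it a fake du line (the numbers are made up):

```shell
# Simulate a backup account with a 100 GiB quota of which
# 52428800 KiB (= 50 GiB) are used:
echo "52428800 ." | awk -v LIMIT=100 '$2=="." {print ((LIMIT*1024*1024)-$1)/1024 " MiB backup space remaining"}'
# prints: 51200 MiB backup space remaining
```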