
Automatically back up Drupal databases through cron

With my hosting provider, all my Drupal sites live under my public_html directory, and each Drupal instance uses a different database. I could write a mysqldump command for each instance, but that list would go out of date very quickly with the imminent upgrades to version 6. Instead I wrote a small script that searches all the sites folders for a settings.php file, extracts the relevant database connection details, and dumps and gzips each database. Each dump is named after its database (all mine are distinct, so no collisions there) with the day of the week appended, so if the script runs every day, the last seven days of backups are kept.
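
For reference, a Drupal 5 or 6 settings.php holds its connection details on a single line of the form $db_url = 'mysql://username:password@hostname/databasename';. The script greps that line out and slices it apart with cut. A minimal sketch of the same slicing on a sample value (the credentials here are made up) shows how each field falls out:

# What grep hands the script from a settings.php line like:
#   $db_url = 'mysql://user:secret@localhost/drupaldb';
db_url="'mysql://user:secret@localhost/drupaldb';"
echo "${db_url}" | cut -d : -f 2 | cut -d / -f 3                                   # user
echo "${db_url}" | cut -d : -f 3 | cut -d @ -f 1                                   # secret
echo "${db_url}" | cut -d : -f 3 | cut -d @ -f 2 | cut -d / -f 1                   # localhost
echo "${db_url}" | cut -d : -f 3 | cut -d @ -f 2 | cut -d / -f 2 | cut -d \' -f 1  # drupaldb

Note that this slicing assumes the password contains no colons, at-signs or slashes.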


#!/bin/sh
# Day name (Monday, Tuesday, ...) keys the backup files so seven days rotate
weekday=$(date +"%A")
backpath=/usr/home/amadain/files
# Each configured site has a settings.php directly under sites/<site>/
for setting in $(find /usr/home/amadain/public_html/*/sites/* -maxdepth 1 -name 'settings.php')
do
  # Grab the quoted connection string, e.g. 'mysql://user:pass@host/dbname';
  db_url=$(grep '^\$db_url' "$setting" | cut -d ' ' -f 3)
  if [ "${db_url}" != "" ]
  then
    username=$(echo "${db_url}" | cut -d : -f 2 | cut -d / -f 3)
    # Skip the placeholder value shipped in an unconfigured settings.php
    if [ "${username}" != "username" ]
    then
      password=$(echo "${db_url}" | cut -d : -f 3 | cut -d @ -f 1)
      hostname=$(echo "${db_url}" | cut -d : -f 3 | cut -d @ -f 2 | cut -d / -f 1)
      # The final cut -d \' strips the trailing '; from the database name
      database=$(echo "${db_url}" | cut -d : -f 3 | cut -d @ -f 2 | cut -d / -f 2 | cut -d \' -f 1)
      echo "MySQL dumping ${weekday}'s DB ${database} on host ${hostname}"
      /usr/local/bin/mysqldump -h"${hostname}" -u"${username}" -p"${password}" --opt --lock-tables "${database}" | /usr/bin/gzip -9 > "${backpath}/${database}_${weekday}.sql.gz"
    fi
  fi
done
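
The cron part is then a single crontab entry that runs the script once a day. The path below is illustrative; I'm assuming the script is saved as /usr/home/amadain/bin/drupal_backup.sh and made executable with chmod +x:

# Dump every Drupal database at 02:30 each morning
30 2 * * * /usr/home/amadain/bin/drupal_backup.sh >/dev/null 2>&1

Redirecting the output stops cron from mailing the echo lines after every run; drop the redirection if you would rather get that mail as a nightly confirmation.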
