
REF http://www.rsnapshot.org/howto/1.2/rsnapshot-HOWTO.en.html 4.3.9. backup_script

I need to back up ALL the MySQL databases, and the backup should pick up new ones dynamically as they are created. Is there an ideal way to do this in bash with minimal code?

Would I need to log in to mysql and get all the databases?

3 Answers

3

It seems you want a bash script that backs up whatever databases exist in MySQL, including ones created later. You can put the MySQL root account credentials in root's ~/.my.cnf, or define them within the bash script under the # [ Define Variables ] section.
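For reference, a minimal credentials file along those lines might look like the sketch below (the password value is a placeholder; restrict access to the file with chmod 600):

# ~/.my.cnf
[client]
user=root
password=your_password_here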

You will need to make the script executable with:

$ sudo chmod +x backupmysql.sh

This will allow you to run the script with the following command.

$ sudo ./backupmysql.sh

You can name the script whatever you like. In this example, I named it backupmysql.sh.

Here is the bash script:

#!/bin/bash

# [ Define Variables ]
HOST=$(hostname -s)
syslogtag=MySQL-Backup
DEST=/var/dba/backup/
DATE=$(date +'%F')
# List all databases, skipping the header row, the system schemas,
# and (as in the original filter) anything ending in "hold"
DBS=$(mysql -u root -Bse 'show databases' | grep -Ev '^Database$|hold$|performance_schema|information_schema')

# [ Individually dump all databases with date stamps ]
for db in $DBS
do
    mysqldump -u root --quote-names --opt --single-transaction --quick "$db" > "$DEST$HOST-$db-$DATE.sql"
    ERR=$?

    if [ "$ERR" != 0 ]; then
        NOTIFY_MESSAGE="Error: $ERR, while backing up database: $db"
        logger -i -t "${syslogtag}" "MySQL Database Backup FAILED; Database: $db"
    else
        NOTIFY_MESSAGE="Successfully backed up database: $db"
        logger -i -t "${syslogtag}" "MySQL Database Backup Successful; Database: $db"
    fi
    echo "$NOTIFY_MESSAGE"
done

If you have large databases, you can replace the mysqldump line in the script with one that compresses the output through gzip as it is written:

mysqldump -u root --quote-names --opt --single-transaction --quick "$db" | gzip -cf > "$DEST$HOST-$db-$DATE.sql.gz"

You can use gunzip to decompress the file.
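For example, a restore of one of the compressed dumps could look like this (the filename is hypothetical and just follows the $HOST-$db-$DATE pattern used above):

# Decompress the dump and load it back into the database
gunzip < /var/dba/backup/myhost-mydb-2024-01-01.sql.gz | mysql -u root mydb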


2

The mysqldump command has an --all-databases option to back up every single database in one pass.

The only downside is that you have to restore them all together; you don't have the luxury of picking and choosing.
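As a sketch, a one-pass dump might look like this (the flags beyond --all-databases are optional but commonly paired with it):

mysqldump -u root --all-databases --single-transaction --quick > all-databases.sql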

Keep in mind that databases usually have an associated directory in your MySQL data directory, so you can always iterate through those to find out which databases exist.
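For illustration, assuming the common Linux default data directory of /var/lib/mysql (check the datadir setting in your my.cnf, and note that newer MySQL versions also keep a few internal directories there):

# Each subdirectory of the data directory corresponds to a database
for dir in /var/lib/mysql/*/; do
    basename "$dir"
done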

2 Comments

You'll want to adjust that to skip things like test and information_schema.
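One way to do that filtering, as a sketch (the sys schema only exists on MySQL 5.7 and later):

mysql -u root -Bse 'show databases' | grep -Ev '^(test|information_schema|performance_schema|sys)$'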
1

For a local MySQL system process I use:

#!/bin/bash

DEFAULTSFILE=/etc/my.cnf.d/.backups.cnf
BACKUPDIR=/root/db_backups
SAVEIFS=$IFS

# get the list of databases, dropping the 'Database' header line

DBLIST=$(mysql -e "show databases" | grep -v '^Database$')
IFS=$'\n'
DBLIST=($DBLIST)

# create backups directory if it doesn't exist

[ -d $BACKUPDIR ] || mkdir -p $BACKUPDIR

# make a single dump file for all databases in case entire server needs to be restored:

echo "Dumping all databases..."
mysqldump --defaults-file=$DEFAULTSFILE --all-databases > $BACKUPDIR/$(date +%Y%m%d).all.sql
echo "Done dumping all databases."

# backup each database into its own file to facilitate discrete restoration:

for (( i=0; i<${#DBLIST[@]}; i++ ))
do
  echo "starting dump of  ${DBLIST[$i]}..."
  mysqldump --defaults-file=$DEFAULTSFILE ${DBLIST[$i]} --skip-lock-tables > $BACKUPDIR/$(date +%Y%m%d).${DBLIST[$i]}.sql
  echo "done dumping ${DBLIST[$i]}."
done
IFS=$SAVEIFS

# gzip the sql files in-place to save space

gzip -f $BACKUPDIR/*.sql

# delete sql files older than 7 days

find $BACKUPDIR -type f -mtime +7 -name '*.gz' -execdir rm -- '{}' \;

For Docker, things are slightly different:

#!/bin/bash
DEFAULTSFILE=/etc/my.cnf.d/.backups.cnf
BACKUPDIR=/root/db_backups
SAVEIFS=$IFS

# get the list of databases, dropping the 'Database' header line and
# excluding information_schema, which hangs during dumps

DBLIST=$(docker exec mysql mysql --defaults-file=/etc/my.cnf.d/.backups.cnf -e "show databases" | grep -Ev '^(Database|information_schema)$')
IFS=$'\n'
DBLIST=($DBLIST)

# create backups directory if it doesn't exist

[ -d $BACKUPDIR ] || mkdir -p $BACKUPDIR

# backup each database into its own file to facilitate discrete restoration:

for (( i=0; i<${#DBLIST[@]}; i++ ))
do
  echo "starting dump of  ${DBLIST[$i]}..."
  docker exec mysql mysqldump --defaults-file=$DEFAULTSFILE ${DBLIST[$i]} --skip-lock-tables > $BACKUPDIR/$(date +%Y%m%d).${DBLIST[$i]}.sql
  echo "done dumping ${DBLIST[$i]}."
done
IFS=$SAVEIFS

# gzip the sql files in-place to save space

gzip -f $BACKUPDIR/*.sql

# delete sql files older than 7 days

find $BACKUPDIR -type f -mtime +7 -name '*.gz' -execdir rm -- '{}' \;

I haven't been able to get --all-databases working in Docker. I don't like all-in-one backups anyway...

I threw these directly into root's crontab and they work fine...
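For illustration, crontab entries along these lines would run them nightly (the script paths are placeholders; adjust to wherever you saved the scripts):

# run the local backup at 02:00 and the Docker variant at 02:30
0 2 * * * /root/backup_mysql_local.sh
30 2 * * * /root/backup_mysql_docker.sh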
