I want to backup a MySQL database. Is it possible to do it in Node.js? If not, what is the best way to do so?

7 Answers


A simple way, without libraries:

var exec = require('child_process').exec;
var child = exec('mysqldump -u root -p[root_password] [database_name] > dumpfilename.sql');

3 Comments

Assuming you have mysql client tools installed.
Using this I can dump only one database; what if I have to dump all databases?
Do I need to replace just the bracketed part or the whole thing? I.e. -p[password_here] or -p password_here?

You can use the exec method as described by siavolt; it is simple, but not the "best way" as requested by the question.

The best and most efficient way, if you care about system memory, CPU, and performance, or if you have a big database, is to use spawn together with a stream. So here is the "best way", in my opinion:

var fs = require('fs');
var spawn = require('child_process').spawn;
var wstream = fs.createWriteStream('dumpfilename.sql');

var mysqldump = spawn('mysqldump', [
    '-u',
    'DB_USER',
    '-pDB_PASSWORD',
    'DB_NAME',
]);

mysqldump
    .stdout
    .pipe(wstream)
    .on('finish', function () {
        console.log('Completed')
    })
    .on('error', function (err) {
        console.log(err)
    });

If your database doesn't have a password, you can remove the '-pDB_PASSWORD', argument.

Remember to replace the DB_XXX placeholders with actual values, e.g. 'root' instead of 'DB_USER', or '-pXXXXXXXX' instead of '-pDB_PASSWORD'.

2 Comments

This worked for me when I changed the 'spawn' line to: var mysqldump = spawn('mysqldump', [ '-u', 'DB_USER', '-pDB_PASSWORD', 'DB_NAME' ]);
Note that this didn't work for me with async/await. I needed to use mysqldump.stdout.pipe(wstream) together with for await (const data of mysqldump.stdout) { /* ... */ }

A simple and powerful way:

npm install mysqldump

Then run:

import mysqldump from 'mysqldump';
// or const mysqldump = require('mysqldump')
 
// dump the result straight to a file
mysqldump({
    connection: {
        host: 'localhost',
        user: 'root',
        password: '123456',
        database: 'my_database',
    },
    dumpToFile: './dump.sql',
});

File path:

An absolute path, e.g. onto the G: drive (note the doubled backslash, since `\` must be escaped in JavaScript strings):

dumpToFile: 'G:\\dump.sql'

or a relative path (relative to the project root):

dumpToFile: './dump.sql'

2 Comments

Clean and efficient!
Note that the GitHub repo of this package was archived by its author in 2021. The closing issue mentions: "It's not made for use in 'production' environments, so there haven't been any security concerns to patch." It's worth keeping that in mind if you start using it.

You could back up MySQL databases using node-cron. I like this because it ensures that your DB backup will always run, regardless of the OS your Node app is hosted on.

npm install node-cron
npm install moment (only for file name formatting)

@app.js (the file that you serve your Node App)

const cron = require('node-cron')
const moment = require('moment')
const fs = require('fs')
const spawn = require('child_process').spawn

// You can adjust the backup frequency as you like, this case will run once a day
cron.schedule('0 0 * * *', () => {
  // Use moment.js or any other way to dynamically generate the file name
  const fileName = `${process.env.DB_NAME}_${moment().format('YYYY_MM_DD')}.sql`
  const wstream = fs.createWriteStream(`/Path/You/Want/To/Save/${fileName}`)
  console.log('---------------------')
  console.log('Running Database Backup Cron Job')
  const mysqldump = spawn('mysqldump', [ '-u', process.env.DB_USER, `-p${process.env.DB_PASSWORD}`, process.env.DB_NAME ])

  mysqldump
    .stdout
    .pipe(wstream)
    .on('finish', () => {
      console.log('DB Backup Completed!')
    })
    .on('error', (err) => {
      console.log(err)
    })
})

...
...
app.listen()

You could hardcode your DB_USER & DB_PASSWORD, but I find it best practice to read them from a .env file, so the same code works in both development & production environments:

ENVIRONMENT = development
LOCAL_API = http://api.url
GOD_TOKEN = godtokenauthmenow
DB_HOST = 127.0.0.1
DB_NAME = myDatabaseName
DB_USER = myUsername
DB_PASSWORD = myPassword
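In practice a listing like the one above is usually loaded at startup with the dotenv package (require('dotenv').config()), which fills process.env. As a rough sketch of what that parsing involves (hand-rolled for illustration, not the real implementation):

```javascript
// Parse KEY = value lines from a .env-style string into an object.
function parseEnv(text) {
  const vars = {};
  for (const line of text.split('\n')) {
    const match = line.match(/^\s*([\w.]+)\s*=\s*(.*?)\s*$/);
    if (match) vars[match[1]] = match[2];
  }
  return vars;
}

const parsed = parseEnv('DB_USER = myUsername\nDB_PASSWORD = myPassword');
console.log(parsed.DB_USER); // 'myUsername'
```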

Comments

var child = require('child_process').exec('mysqldump -u root -p dbname > fileName.sql');

This worked for me.

2 Comments

This will prompt for the password every time the command runs. What if I want to automate the process?
Doesn't the accepted answer already cover this in its explanation?

I prefer the answer given by Balaji because it is simple and clean. However, there are a few improvements to consider:

  • Using environment variables instead of hardcoding values.
  • If the file size increases significantly, zlib can be used for compression.

In my case, the file size reached up to 1 GB, even though the actual data was around 300 MB. This happened because:

  • It dumps everything without filtering.
  • Unlike Workbench or the native mysqldump, it does not compress data or use optimized insert formatting.

A more effective approach is to use the following method:

const mysqldump = require('mysqldump');
const env = process.env.NODE_ENV || 'test';
const config = require(__dirname + '/../config/config.json')[env];
const logger = require('../middleware/log');
const fs = require('fs');
const zlib = require('zlib');
const path = require('path');

async function bkp(){
    try {
        let backupDir = path.join(__dirname, 'db-backups');
        if (!fs.existsSync(backupDir)) {
            fs.mkdirSync(backupDir, { recursive: true });
        }

        let filePath = path.join(backupDir, `${Date.now()}-dump.sql`);
        let compressedFilePath = `${filePath}.gz`;

        await mysqldump({
            connection: {
                host: config.host,
                user: config.username,
                password: config.password,
                database: config.database,
            },
            dumpToFile: filePath
        });

        logger.info("DB backup created successfully, compressing...");

        // 🔹 Compress the backup file using gzip
        const gzip = zlib.createGzip();
        const input = fs.createReadStream(filePath);
        const output = fs.createWriteStream(compressedFilePath);

        input.pipe(gzip).pipe(output).on('finish', () => {
            logger.info(`Backup compressed successfully: ${compressedFilePath}`);
            fs.unlinkSync(filePath); // Remove original uncompressed file to save space
        });

    } catch (error) {
        logger.error("Error in DB Backup", error);
    }
}

Comments

const {exec} = require('child_process');

exec('mysqldump --user=root --password=yourPassword --result-file=D:/dbBackup.sql --databases database_name');

1 Comment

You should provide some explanation, along with supporting information, for your answer.
