Nowadays, most UNIX-based systems ship with several basic backup tools, including dd, cpio, tar, and dump. If the backup software bundled with your server does not meet your needs, there are plenty of backup options on the internet worth checking out. However, you can still be an old-fashioned (and perhaps more cautious) programmer and create a home-made backup with Node.js by downloading an entire folder from your web server and storing it on a hard drive in your home or office.
Requirements
In this article, we are not going to download every single file inside a directory one by one, primarily because that would increase the processing time of the transfer and because it is tricky and error-prone. To keep the backup as simple as possible, we are going to compress the folder directly on the remote server. For this, we will need:
- The command line utility tar for collecting many files into one archive file (available by default on any UNIX-based server).
Note

The script can also work on Windows servers as long as the tar utility is available; in that case, redirect the output to NUL instead of 2>/dev/null.
Obviously, you also need Node.js installed and some patience. Let's get started!
1. Install tar-fs and ssh2 in your Node.js project
Open a new terminal on your computer and switch to the directory of your project. Then install the SSH2 module with the following command:
npm install ssh2
If you need more information about the SSH2 module, please visit the official repository on GitHub.
Once the SSH2 module is installed, you will also need the tar-fs module, which allows you to pack directories into tarballs and extract tarballs into directories. Install the module with the following command:
npm install tar-fs
If you need more information about the tar-fs module, please visit the official repository on GitHub.
After installing the modules, you can write the code that downloads an entire remote directory to your machine as a backup.
2. Create the transfer function
In order to test the script, create a demo file named backup.js and save the following script inside. The function works like this: using the connection object from the SSH2 library, a command is executed in the remote terminal (something like tar cf - /folder/to/download 2>/dev/null, or, if you use compression, tar cf - /folder/to/download 2>/dev/null | gzip -6c 2>/dev/null). This command streams the archived (and optionally compressed) files and folders to our connection, while 2>/dev/null redirects the standard error (stderr) to /dev/null, which discards it (treated as a black hole on Linux/Unix). With the help of the tar-fs module, you can then extract the streamed data into a local directory.
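Before wiring anything into SSH, the construction of that remote command can be sketched as a small standalone helper. Note that buildBackupCommand is an illustrative name used only here, not part of the final script:

```javascript
// Builds the remote shell command that archives (and optionally compresses)
// a directory, discarding error output via 2>/dev/null.
// buildBackupCommand is a hypothetical helper for illustration only.
function buildBackupCommand(remotePath, compression) {
    var cmd = 'tar cf - "' + remotePath + '" 2>/dev/null';
    // gzip accepts compression levels 1 (fastest) through 9 (smallest)
    if (typeof compression === 'number' && compression >= 1 && compression <= 9) {
        cmd += ' | gzip -' + compression + 'c 2>/dev/null';
    }
    return cmd;
}

console.log(buildBackupCommand('/var/www/html', 6));
// tar cf - "/var/www/html" 2>/dev/null | gzip -6c 2>/dev/null
```

Quoting the remote path guards against spaces in directory names, and the -c flag tells gzip to write the compressed data to stdout so it travels over the SSH channel.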
Note that you need to require the tar-fs and zlib modules first:
var tar = require('tar-fs');
var zlib = require('zlib');
/**
* Transfers an entire directory locally by compressing, downloading and extracting it locally.
*
* @param {SSH} conn A ssh connection of the ssh2 library
* @param {String} remotePath
* @param {String} localPath
* @param {Integer|Boolean} compression
* @param {Function} cb Callback executed once the transfer finishes (success or error)
* @see http://stackoverflow.com/questions/23935283/transfer-entire-directory-using-ssh2-in-nodejs
*/
function transferDirectory(conn, remotePath, localPath, compression, cb) {
    var cmd = 'tar cf - "' + remotePath + '" 2>/dev/null';

    // Allow the compression argument to be omitted
    if (typeof compression === 'function') {
        cb = compression;
    } else if (compression === true) {
        compression = 6;
    }

    // Apply compression if desired
    if (typeof compression === 'number' && compression >= 1 && compression <= 9) {
        cmd += ' | gzip -' + compression + 'c 2>/dev/null';
    } else {
        compression = undefined;
    }

    conn.exec(cmd, function (err, stream) {
        if (err) {
            return cb(err);
        }

        var exitErr;
        var tarStream = tar.extract(localPath);

        tarStream.on('finish', function () {
            cb(exitErr);
        });

        stream.on('exit', function (code, signal) {
            if (typeof code === 'number' && code !== 0) {
                exitErr = new Error('Remote process exited with code ' + code);
            } else if (signal) {
                exitErr = new Error('Remote process killed with signal ' + signal);
            }
        }).stderr.resume();

        // Decompress the stream before extracting if compression was requested
        if (compression) {
            stream = stream.pipe(zlib.createGunzip());
        }

        stream.pipe(tarStream);
    });
}
With this single snippet, you will be ready to download a directory.
3. Download a directory
To start the backup of an entire directory of your server, you will need to create a new SSH connection with the ssh2 module. Require the module and create a new instance of it, then configure the settings object (credentials, the URL of your server, etc.) and add the ready listener. Inside the callback, execute the transfer function created in the previous step. Set the parameters as required:
var SSH = require('ssh2');

var conn = new SSH();

var connectionSettings = {
    // The host URL
    host: 'your.server.url.com',
    // The port, usually 22
    port: 22,
    // Credentials
    username: 'root',
    password: '*******'
};

conn.on('ready', function () {
    // Use the transfer function
    transferDirectory(
        // The SSH2 connection
        conn,
        // The remote folder of your UNIX server that you want to back up
        '/var/www/vhosts/yourproject.com/some-folder-to-backup',
        // Local path where the files should be saved
        __dirname + '/backup',
        // Define a compression level (true for the default, 6) or a numerical value from 1 to 9
        true,
        // A callback executed once the transfer finishes
        function (err) {
            if (err) {
                throw err;
            }

            console.log('Remote directory successfully downloaded!');

            // Finish the connection
            conn.end();
        }
    );
}).connect(connectionSettings);
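If your server uses key-based rather than password authentication, the ssh2 connection settings also accept a privateKey buffer. A sketch of the alternative settings object (the key path below is an example; adjust it to your own key file):

```javascript
var fs = require('fs');

var connectionSettings = {
    host: 'your.server.url.com',
    port: 22,
    username: 'root',
    // Read your private key from disk instead of storing a password in the script
    privateKey: fs.readFileSync('/home/youruser/.ssh/id_rsa')
};
```

Key-based authentication is generally preferable for an unattended backup script, since it avoids hard-coding a password in the file.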
Then save the changes on your file and execute the script with:
node backup.js
Once the script finishes, you should find a new folder (backup) in the same folder where the script is located. This script is really useful if you want to create backups of your projects, especially of the files that users upload in projects like WordPress or other kinds of CMS.
Happy coding!