I bought an Aliyun (Alibaba Cloud) server and deployed several small projects on it, with MongoDB as the database. To avoid losing data if the server fails, I set up regular backups.

Part 1: Database backup implementation

Periodic backup breaks down into a few steps:

  1. Export the online MongoDB data and make a tar package
  2. Write shell scripts, export, package and other operations
  3. Set a scheduled task to back up data at 2 am every day
  4. Synchronize a copy of the backup to object storage (here, Qiniu)

1. Export data and pack it into a tar archive

Exporting the data is straightforward, but to avoid surprises it is worth testing the command locally before running it on the server.

# -h database address, -d database name, -o output path, -u username, -p password
mongodump -h 127.0.0.1:27017 -d UserSystem -o ./backup -u demo -p demo

# package the exported database directory
tar zcvf UserSystem.tar.gz ./UserSystem

# delete the directory, keeping only the archive
rm -rf ./UserSystem
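If you want to sanity-check the archive before deleting the source directory, you can list its contents without extracting it:

# list the files inside the archive
tar tzf UserSystem.tar.gz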

2. Write a shell script

Logging into the server and running the export and packaging commands by hand every time is tedious, so wrap them in a shell script; future backups then only require running the script.

# tasks/backup.sh

#!/bin/sh
backUpFolder=/home/zhongxia/backup
# include the date in the backup name to avoid collisions and make files easy to identify
dateNow=`date +%Y_%m_%d_%H%M`
backFileName=egg_cnode_$dateNow

# go to the backup folder
cd $backUpFolder

# create the backup directory
mkdir -p $backFileName

# export the dbName database
mongodump -h 127.0.0.1:27017 -d dbName -o $backFileName -u demo -p '123321'

# compress the exported data
tar zcvf $backFileName.tar.gz $backFileName

# delete the folder, keeping only the backup archive
rm -rf $backFileName
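Archives will pile up over time. If disk space is a concern, a cleanup line at the end of the script can cap retention; a minimal sketch, assuming seven days of history is enough:

# optional: delete backup archives older than 7 days
find $backUpFolder -name "egg_cnode_*.tar.gz" -mtime +7 -delete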

Once the script is written, run it:

# execute script
sh backup.sh

# if the logged-in user has no permission to operate on the backup folder,
# the simplest (if crude) fix is to open the permissions; adjust to your own situation
chmod 777 /home/zhongxia/backup
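A less blunt alternative to 777 is to hand the folder to the user that runs the script (assuming that user is zhongxia, as the path suggests):

# give the backup user ownership of the folder instead
chown -R zhongxia:zhongxia /home/zhongxia/backup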

3. Set a scheduled task

Logging into the server to run the shell script by hand is still a hassle, and easy to forget.

So set up a scheduled task and let it run by itself, saving time and effort.

To set a scheduled task, open the crontab configuration file:

# edit the current user's crontab
crontab -e

The crontab file may contain some reference comments by default, or it may be empty (mine was empty).

# add a scheduled task
# minute hour day-of-month month day-of-week command

# back up the egg_cnode database at 2:00 a.m. every day
00 02 * * * sh /home/zhongxia/task/backup.sh

00 02 * * * means the command runs at 2:00 a.m. every day. You can search online for the details of crontab syntax; I won't cover it further here, since I use it rarely and am not very familiar with it.
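To confirm the task was registered, list the current user's crontab:

# list the current user's scheduled tasks
crontab -l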

4. Upload to Qiniu

The whole point of backing up was to avoid losing data when the server dies, yet right now the backups still live on that same server. If it goes down, they go down with it, which is hardly better than no backup at all.

So upload the backups to object storage, or anywhere else off the server; in this case, Qiniu.

4.1. Write a script to upload files to a Qiniu private bucket

As a front-end developer, JS is what I know best, so I use Node to upload to Qiniu.

// upload_qiniu.js

const qiniu = require("qiniu");

// set your bucket name, AK and SK; they can be found in the key management section of the Qiniu console
var bucket = "<bucket>";
var accessKey = "<accessKey>";
var secretKey = "<secretKey>";

var mac = new qiniu.auth.digest.Mac(accessKey, secretKey);
var options = {
  scope: bucket
};
var putPolicy = new qiniu.rs.PutPolicy(options);

var uploadToken = putPolicy.uploadToken(mac);
var config = new qiniu.conf.Config();

// Get the file name and directory to back up from the environment variable
// This is done to facilitate execution in shell scripts
var params = process.env.NODE_ENV.split("@");
var fileName = params[0] + ".tar.gz";
var localFile = params[1] + "/" + fileName;

var formUploader = new qiniu.form_up.FormUploader(config);

formUploader.putFile(uploadToken, fileName, localFile, null, function(respErr, respBody, respInfo) {
  if (respErr) {
    throw respErr;
  }
  if (respInfo.statusCode == 200) {
    console.log(respBody);
  } else {
    console.log(respInfo.statusCode);
    console.log(respBody);
  }
});
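Because the script reads the file name and directory from NODE_ENV, you can test the upload by hand before wiring it into backup.sh (the archive name below is just an example):

# format is <fileName without .tar.gz>@<backup folder>
export NODE_ENV=egg_cnode_2020_01_01_0200@/home/zhongxia/backup && node upload_qiniu.js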

4.2. Add the upload step to the shell script

# ... script above omitted; append these lines at the end of backup.sh

# upload to Qiniu
export NODE_ENV=$backFileName@$backUpFolder && node /home/zhongxia/task/upload_qiniu.js

4.3. Verify

# run the script to see if it works
sh backup.sh 

If no error is reported and the file shows up in the Qiniu console, it works.

Part 2: Supplementary content

1. Create a dedicated MongoDB user for backups

You can create a user that can only read and write the one database and use it for backups. The point is to limit user permissions.

For a personal project, do whatever you like.

mongo
# log in as root
db.auth({user:'root', pwd:'123123123'})

show databases;
use UserSystem;

# create the backup user
db.createUser({user:'backup_user', pwd:'123456', roles:[{role:'readWrite', db:'UserSystem'}]});

# log out and log in again as the new user
exit
mongo -u backup_user -p 123456

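If the account is only ever used by mongodump, read access alone is enough; a sketch of a more restricted variant, with the same hypothetical credentials:

# a read-only user is sufficient for mongodump
db.createUser({user:'backup_user', pwd:'123456', roles:[{role:'read', db:'UserSystem'}]});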

2. To back up multiple databases, write a separate script for each

Why multiple scripts instead of one script that backs up everything? One script can certainly work.

But with separate scripts, a failure while backing up one database cannot take down the backups of all the others, and each scheduled task can be configured independently without affecting the rest. In practice that means one script and one crontab entry per database, as sketched below.
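A possible layout (the script names here are hypothetical):

# one crontab entry per database; a failure in one does not block the others
00 02 * * * sh /home/zhongxia/task/backup_user_system.sh
30 02 * * * sh /home/zhongxia/task/backup_egg_cnode.sh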

3. How to download files from a Qiniu private bucket?

Files in a private bucket cannot be downloaded through the default test domain Qiniu provides; you need to bind a custom domain name.

For how to bind the domain name, see Qiniu's official documentation; for DNS resolution, go to the DNSPod console => DNS management and add a CNAME record.

After that, you can log in to the Qiniu console and download the backup files directly.
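If you would rather script the download than click through the console, the same Node SDK can sign a time-limited URL for a file in a private bucket. A minimal sketch; the domain and key below are placeholders for whatever you bound and uploaded:

// download_url.js: sign a private download URL with the Qiniu SDK
const qiniu = require("qiniu");

var accessKey = "<accessKey>";
var secretKey = "<secretKey>";
var mac = new qiniu.auth.digest.Mac(accessKey, secretKey);
var config = new qiniu.conf.Config();
var bucketManager = new qiniu.rs.BucketManager(mac, config);

// the custom domain bound to the bucket, and the key of the uploaded archive
var domain = "http://backup.example.com";
var key = "egg_cnode_2020_01_01_0200.tar.gz";
// the signed URL stays valid for one hour
var deadline = parseInt(Date.now() / 1000) + 3600;
console.log(bucketManager.privateDownloadUrl(domain, key, deadline));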

4. How to restore a backup into the mongo database

# decompress the tar package
tar xvf UserSystem.tar.gz

# restore the UserSystem backup in the current directory into the user-system database
# usage mirrors mongodump, the command used to export the data
mongorestore -h 127.0.0.1:27017 -d user-system -u demo -p 123123 ./UserSystem
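By default mongorestore leaves documents that already exist in the target database alone; the --drop flag drops each collection before restoring it. Destructive, so use with care:

# drop existing collections before restoring
mongorestore -h 127.0.0.1:27017 -d user-system --drop -u demo -p 123123 ./UserSystem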

To export or import a single collection, look up the mongoexport and mongoimport commands, as sketched below.
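A quick sketch of both, assuming a hypothetical collection named users:

# export a single collection to JSON
mongoexport -h 127.0.0.1:27017 -d UserSystem -c users -o users.json -u demo -p demo

# import it back
mongoimport -h 127.0.0.1:27017 -d UserSystem -c users --file users.json -u demo -p demo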

5. How to upload local files to the server

A database backup downloaded locally needs to be uploaded to the server before it can be restored there.

For details, see the Linux scp command.

# copy from local to server
scp local_file remote_username@remote_ip:remote_folder

# copy from server to local
# just swap the two arguments after scp
scp remote_username@remote_ip:remote_folder /home/space/music

# if your server's SSH port is 9999 instead of 22
scp -P 9999 local_file remote_username@remote_ip:remote_folder
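For example, pulling one backup archive down from the server (the user, IP, and file name are placeholders):

# download a backup archive from the server to the current local directory
scp zhongxia@remote_ip:/home/zhongxia/backup/egg_cnode_2020_01_01_0200.tar.gz ./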