
Full backup of MySQL periodically (1)

Incremental backup of MySQL (2)

Upload MySQL backup to a private cloud (3)

Contents

  • Introduction
  • Review
  • Back up to a private cloud – Python
  • Back up to a private cloud – Node.js
  • Back up to a Git private repository

Introduction

Previous posts in this series:

Full backup of MySQL periodically (1)

Incremental backup of MySQL (2)

The first two articles covered MySQL full backups and incremental backups, as well as how to restore a database from them. That already improves the safety of our data, but it is not enough: if the entire database server is hijacked, or some other catastrophic failure occurs, the backups are useless, because they live on the same server. So in this article we will also upload the backup to another location (a private cloud) at backup time, and then a single server failure no longer matters. Hence this post.

Review

From the previous two articles we already have a script that backs up the database, shown below. In its last few lines I have added commands that upload the backup file to a private cloud; they are commented out, so uncomment the one you want to use. Now let's level up.

#!/bin/bash
#Get the current time
date_now=$(date "+%Y%m%d-%H%M%S")
backUpFolder=/home/db/backup/mysql
username="root"
password="123456"
db_name="zone"
#Define the backup file name
fileName="${db_name}_${date_now}.sql"
#Define the full path of the backup file
backUpFileName="${backUpFolder}/${fileName}"
echo "starting backup mysql ${db_name} at ${date_now}."
/usr/bin/mysqldump -u${username} -p${password} --lock-all-tables --flush-logs ${db_name} > ${backUpFileName}
#Change to the backup directory
cd ${backUpFolder}
#Compress the backup file
tar zcvf ${fileName}.tar.gz ${fileName}

date_end=$(date "+%Y%m%d-%H%M%S")
echo "finish backup mysql database ${db_name} at ${date_end}."

#Upload the backup file to the private cloud using Node.js (uncomment to use)
#/root/node/v8.11.3/bin/node /usr/local/upload.js $backUpFolder $backUpFileName

#Upload the backup file to the private cloud using Python (uncomment to use)
#python /usr/local/upload.py $backUpFolder $backUpFileName
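Since this series is about periodic backups, the script above is typically driven by cron. A minimal crontab sketch (the script path and log path are assumptions; adjust them to where you actually saved the script):

```shell
# run the backup script every day at 02:30 (hypothetical paths)
30 2 * * * /bin/bash /home/tasks/backup_mysql.sh >> /var/log/backup_mysql.log 2>&1
```

Edit the crontab with `crontab -e` and paste the line in.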
$backUpFolder    # backup file directory
$backUpFileName  # full path of the backup file

These are the parameters passed to upload.py.

#Upload the backup file to the private cloud using Node.js
#/root/node/v8.11.3/bin/node /usr/local/upload.js $backUpFolder $backUpFileName

These two parameters are passed to upload.js.
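One detail worth noticing about these two parameters: in the shell script, `backUpFileName` is built as `${backUpFolder}/${fileName}`, so the second argument is already a full path, not a bare file name. A quick sketch (with a made-up date) shows what the upload scripts actually receive:

```shell
# reproduce the variable expansion from the backup script (sample date)
backUpFolder=/home/db/backup/mysql
fileName="zone_20240101-000000.sql"
backUpFileName="${backUpFolder}/${fileName}"
echo "${backUpFileName}"
```

So the upload scripts should use the second argument as the local path directly, rather than joining it onto the folder again.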

Back up to a private cloud – Python

The private cloud used here is Qiniu Cloud; if you use a different provider, adapt the upload code accordingly. In the following script, fill in your own access_key, secret_key, and bucket_name.

from qiniu import Auth, put_file, etag
import qiniu.config
import sys

print('The number of arguments is:', len(sys.argv), '.')
print('Parameter list:', str(sys.argv))

# Directory where the backup files are stored
backUpFolder = sys.argv[1]
# Full path of the backup file (the shell script passes ${backUpFolder}/${fileName})
backUpFileName = sys.argv[2]

# Fill in your own Access Key and Secret Key
access_key = 'your_key'
secret_key = 'your_key'
# Create an authentication object
q = Auth(access_key, secret_key)
# Bucket to upload to
bucket_name = 'test'
# Object key (file name) after uploading to Qiniu
key = backUpFileName.split('/')[-1]
# Generate an upload token; an expiration time (in seconds) can be specified
token = q.upload_token(bucket_name, key, 3600)
# Local path of the file to upload
localfile = backUpFileName
ret, info = put_file(token, key, localfile)
print(info)
assert ret['key'] == key
assert ret['hash'] == etag(localfile)


Back up to a private cloud – Node.js

Again, the private cloud used here is Qiniu Cloud; if you use a different provider, adapt the upload code accordingly. In the following script, fill in your own accessKey, secretKey, and bucket.

let qiniu = require("qiniu");
let path = require("path");

// skip "node" and the script path
let args = process.argv.slice(2);
console.log(process.argv);
console.log('The first argument is:', args[0]);
console.log('The second argument is:', args[1]);
// the backup script passes the backup directory and the full path of the backup file
let filePath = args[1];
let file = path.basename(filePath); // used as the object key
console.log(filePath);
let bucket = "test"; // the name of your storage space

// Fill in your own Access Key and Secret Key
let accessKey = 'access_key';
let secretKey = 'secret_key';

let config = new qiniu.conf.Config();
// Zone of the bucket; see the Qiniu docs: https://developer.qiniu.com/kodo/sdk/1289/nodejs#form-upload-file
config.zone = qiniu.zone.Zone_z0;
let mac = new qiniu.auth.digest.Mac(accessKey, secretKey);
let options = {
    scope: bucket,
};
let putPolicy = new qiniu.rs.PutPolicy(options);
let uploadToken = putPolicy.uploadToken(mac);

let localFile = filePath;
let formUploader = new qiniu.form_up.FormUploader(config);
let putExtra = new qiniu.form_up.PutExtra();
let key = file;
// Upload the file
formUploader.putFile(uploadToken, key, localFile, putExtra, function (respErr, respBody, respInfo) {
    if (respErr) {
        throw respErr;
    }
    if (respInfo.statusCode == 200) {
        console.log(respBody);
    } else {
        console.log(respInfo.statusCode);
        console.log(respBody);
    }
});

[Screenshot: the upload script uploading a local readme.md file – the upload succeeded.]

Back up to a Git private repository

Private repositories on GitHub are paid. You can use Gitee (码云) instead as your private repository; its private repositories are free.
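Before the push in the backup script can work, the backup directory has to be a Git repository with the private remote configured. A self-contained sketch of that one-time setup, using a local bare repository to stand in for the Gitee/GitHub remote (replace the remote path with your real repository URL, the identity with your own, and the temporary directory with /home/db/backup/mysql):

```shell
set -e
# stand-in for the private remote -- in practice use your Gitee/GitHub URL
remote=$(mktemp -d)
git init --bare "$remote"

# the backup directory -- in practice /home/db/backup/mysql
backupDir=$(mktemp -d)
cd "$backupDir"
git init
git config user.email "backup@example.com"  # hypothetical identity
git config user.name "backup"
git remote add origin "$remote"

# first commit and push, so later `git push` in the script just works
echo "-- dummy dump --" > zone_20240101.sql
git add .
git commit -m 'commit by script'
git push -u origin HEAD
```

After this setup, the plain `git add` / `git commit` / `git push` at the end of the backup script will keep the remote repository up to date.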

#!/bin/bash
#Get the current time
date_now=$(date "+%Y%m%d-%H%M%S")
backUpFolder=/home/db/backup/mysql
username="root"
password="123456"
db_name="zone"
#Define the backup file name
fileName="${db_name}_${date_now}.sql"
#Define the full path of the backup file
backUpFileName="${backUpFolder}/${fileName}"
echo "starting backup mysql ${db_name} at ${date_now}."
/usr/bin/mysqldump -u${username} -p${password} --lock-all-tables --flush-logs ${db_name} > ${backUpFileName}
#Change to the backup directory
cd ${backUpFolder}
#Compress the backup file
tar zcvf ${fileName}.tar.gz ${fileName}

date_end=$(date "+%Y%m%d-%H%M%S")
echo "finish backup mysql database ${db_name} at ${date_end}."

#Commit and push the backup to the private Git repository
git add .
git commit -m 'commit by script'
git push
