Automatically Back Up Data to Google Drive


1. Install Gdrive

First, we need to install gdrive, a third-party CLI tool that lets you transfer files to your Google Drive from scripts.
[root@www ~]# cd /usr/bin/
[root@www bin]# wget -O drive https://drive.google.com/uc?id=0B3X9GlR6EmbnMHBMVWtKaEZXdDg
[root@www bin]# chmod 755 drive
[root@www bin]# drive
In this step, after you run "drive" for the first time, it will ask for a verification code, like this:

[root@www bin]# drive
Go to the following link in your browser:
https://accounts.google.com/o/oauth2/auth?client_id=367116221053-7n0vf5akeru7on6o2fjinrecpdoe99eg.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&response_type=code&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive&state=state

Enter verification code:

Open the link in your browser and Google will display the authorization code.

Copy the code and paste it into your VPS console at the "Enter verification code:" prompt.
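Before scripting against the tool, it is worth checking that the binary is actually reachable on the PATH. A minimal sketch (the check_cmd helper name is my own, and it is demonstrated here with sh, since drive may not be installed on every machine):

```shell
# Confirm a command exists on the PATH before a script relies on it.
# On your server, call it with "drive" instead of the demo value "sh".
check_cmd() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1 is installed"
    else
        echo "$1 not found" >&2
        return 1
    fi
}

check_cmd sh
```

Once step 1 is done, `check_cmd drive` on the VPS should report that drive is installed.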

2. Create the Backups Folder

Next, we need to create a /backups folder to store the backup files:
[root@www bin]# mkdir /backups
[root@www bin]# cd /backups/
Then create the file googledrive.sh:
[root@www backups]# nano googledrive.sh
and paste in this script:
-------------------------------------------------------------------------------------------
#!/bin/sh
##-----------------------Database Access--------------------------##
DB_NAME="my-database-name"
DB_USER="my-database-user"
DB_PASSWORD="my-database-password"

##--------Folder(s) you want to back up (space-separated)---------##
NameOfFolder="albennet"
SourceOfFolder="/home"
BackupLocation="/backups"
LOG_FILE="/backups/backup-error.log"
date=$(date +"%Y-%m-%d")
## This backs up the folder /home/albennet and saves the archives in /backups

if [ ! -d "$BackupLocation" ]; then
    mkdir -p "$BackupLocation"
fi

# Delete local backups older than 10 days
find "$BackupLocation" \( -name '*.zip' -o -name '*.sql.gz' \) -mtime +10 -exec rm {} \;

# Dump and compress the database (gzip output, hence .sql.gz rather than .sql.tar)
mysqldump -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" | gzip > "$BackupLocation/$date-$DB_NAME.sql.gz"

for fd in $NameOfFolder; do
    # Name of the backup file
    file="$fd-$date.zip"

    # Zip the folder you want to back up
    echo "Starting to zip the folder and files"
    cd "$SourceOfFolder" || exit 1
    zip -r "$BackupLocation/$file" "$fd"
    sleep 5

    # Upload the archive to Google Drive
    drive upload --file "$BackupLocation/$file"
    sleep 5
done

# Upload the database dump to Google Drive
drive upload --file "$BackupLocation/$date-$DB_NAME.sql.gz"
if [ $? -eq 0 ]; then
    echo "Your data was successfully uploaded to Google Drive!"
    echo "Your data was successfully created and uploaded to Google Drive!" | mail -s "Your VPS backup from $date" youremail@yourdomain.com
else
    echo "$date: error uploading data to Google Drive" >> "$LOG_FILE"
fi
-------------------------------------------------------------------------------------------
Then save the file.
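To see concretely what the script's file naming and its 10-day cleanup do, here is a self-contained sketch that uses a throwaway directory in place of /backups (the folder name albennet is taken from the example above):

```shell
# Simulate the script's archive naming and its find-based pruning.
demo=$(mktemp -d)                       # stand-in for /backups
date=$(date +"%Y-%m-%d")
file="albennet-$date.zip"               # same pattern as the script
touch "$demo/$file"                     # today's backup: kept
touch -d "15 days ago" "$demo/old.zip"  # older than 10 days: pruned
find "$demo" -name '*.zip' -mtime +10 -exec rm {} \;
ls "$demo"                              # only today's archive remains
```

Note that `touch -d "15 days ago"` relies on GNU coreutils, which is standard on a Linux VPS.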

3. Change Permissions

The next step is to make the script executable:
[root@www backups]# chmod +x googledrive.sh
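Before wiring the script into cron, it is worth letting the shell parse it once with sh -n, which reports syntax errors without executing anything. A small demonstration on a throwaway script (on the server, point it at /backups/googledrive.sh instead):

```shell
# sh -n parses a script without running any of its commands.
script=$(mktemp)
printf '#!/bin/sh\necho ok\n' > "$script"
sh -n "$script" && echo "syntax OK"
rm -f "$script"
```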

4. Create a Scheduler

The next step is to create a cron job that runs googledrive.sh automatically:
[root@www backups]# nano /etc/crontab
And insert this line:
0 2 * * * root /backups/googledrive.sh
Then save and restart crond.
[root@www ~]# /etc/init.d/crond restart
This means that at 2:00 AM every day, cron runs the script automatically to perform the backup and store it in Google Drive.
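For reference, the schedule fields plus the user column that the system-wide /etc/crontab expects break down like this:

```
# min  hour  day-of-month  month  day-of-week  user  command
  0    2     *             *      *            root  /backups/googledrive.sh
```

A per-user crontab (edited with crontab -e) omits the user column; only /etc/crontab carries it.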
The uploaded files will then appear in your Google Drive.

Backups are very important. Make them before it's too late!

 

Copyright Albenet Hosting Sunday 22-Oct-2017 All rights reserved.