Automatically Back Up Your Website

Is your website backed up? What would happen if you woke up one morning and found your web server had crashed, been hacked, been infected by malware, or that someone had deleted your entire site? Scary! The same thing happened to me, and I am not alone: statistics show that 58% of small and medium businesses are not prepared for data loss.
If you still don't back up your website, don't stress; this article will help you get your backup plan straight.


How to back up your website:

You need to back up your website's data the same way you back up your computer's data. On a database-driven website, there are two things to back up: the files that make up the face of your website (PHP/Python/Perl scripts, images, icons, CSS, JavaScript files, etc.) and the data stored in the database. Furthermore, a good backup system keeps both a local and a remote copy of the backed-up data.

Scenario:

This guide will show you how to automatically back up your website every night: a simple bash script and a cron job will do all the backup work and safely upload the local files to an external FTP server while you are away from your desk. Sounds good? So here is a quick guide to backing up your website, including its database, using a bash script and a cron job.

This guide assumes you have:

  • A dynamic website based on LAMP (Linux, Apache, MySQL and PHP/Perl/Python).
  • Shell access to your web server via SSH.
  • Basic Linux command-line skills, such as how to make new directories and chmod permissions on files.
  • The ability to run bash scripts at the command line on your server and to set up cron jobs.
  • The location of your website files on the server.
  • Your MySQL database name, and the username and password you use to log in to MySQL.
  • Access to a remote FTP server to transfer the backup copy.
  • An FTP client package installed on the server.

If you have all these prerequisites, we are ready to go!

Now let's prepare a smart bash script that backs up two things: the source code of our website and the entire database contents. So if your website ever blows up, you can simply restore it from the backup, and everything will work as before.

Log in to your web server via SSH first, and then create a directory named mybackup under your home directory.

[BroExperts@lxweb ~]$ mkdir mybackup

Now create a file named web-backup.sh

[BroExperts@lxweb ~]$ touch web-backup.sh

Now copy and paste the contents shown below into the web-backup.sh file.

[BroExperts@lxweb ~]$ vi web-backup.sh
#!/bin/bash
#Purpose = Website Source & Database Backup
#Created on 09-05-2018
#Author = Hafiz Haider
#Version 1.0
################### SCRIPT START #####################
## 1: TIME STAMP
TIME=`date +%b-%d-%y`
## 2: YOUR WEBSITE NAME
WEBSITE=BroExperts.com
## 3: HERE I DEFINE WEBSITE BACKUP FILE NAME FORMAT
FILENAME=$WEBSITE-backup-$TIME.tar.gz
## 4: LOCATION OF WEBSITE SOURCE CODE DIRECTORY ON WEB SERVER
WEBDIRECTORY=/var/www/html/broexperts
## 5: DESTINATION OF BACKUP FILE ON LOCAL SERVER
BACKUPDIR=/home/BroExperts/mybackup
## 6: MySQL DATABASE CREDENTIALS
DBUSER=root
DBPASS=redhat
DB=broexperts
## 7: WEBSITE BACKUP COMMAND
tar -cpzf $BACKUPDIR/$FILENAME $WEBDIRECTORY
## 8: DATABASE BACKUP COMMAND
mysqldump -u $DBUSER -p${DBPASS} $DB | gzip > $BACKUPDIR/dbbackup_${DB}_${TIME}.bak.gz
## 9: FINAL COMMAND TO GENERATE SINGLE ZIP FILE CONTAINING WEB AND DATABASE BACKUP
zip -rm $BACKUPDIR/Full_Backup_${WEBSITE}_${TIME}.zip $BACKUPDIR/*.gz
## 10: TRANSFER FILES TO REMOTE FTP SERVER
HOST=192.168.2.132
USER=ftpuser
PASSWORD=password
ftp -inv $HOST <<EOF
user $USER $PASSWORD
lcd $BACKUPDIR
mput Full_Backup_${WEBSITE}_${TIME}.zip
bye
EOF
################### SCRIPT END #####################

This script zips up your website data, including the database, and saves one copy of the backup locally and one copy on the remote FTP server.


Backup Script Explanation

To deploy this script in your environment, read the commented lines that start with two hashes plus a number ("## 1") and fill in the right values for your setup.
The script first makes a tar archive of the website contents from /var/www/html/broexperts (see ## 4:) and places it in the user's home directory under /home/BroExperts/mybackup (see ## 5:). It then logs in to the MySQL database server and creates a backup of the specified database (see ## 8:). Next, it combines the separately created website-source and database backup files into one zip file (see ## 9:). Finally, the script transfers a copy of the backup to the remote FTP server (see ## 10:).
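Restoring is simply steps ## 7 through ## 9 in reverse: unzip, untar the source, and pipe the database dump back into mysql. The round trip is worth rehearsing before you need it; here is a minimal sketch that drills the tar part on a throwaway directory (the paths here are temporary stand-ins, not your real web root):

```shell
#!/bin/bash
# Restore drill on a throwaway directory -- mirrors the tar flags used in ## 7.
set -e
WORK=$(mktemp -d)                       # stand-in for the real web root (assumption)
mkdir -p "$WORK/site"
echo "hello" > "$WORK/site/index.html"

# "Back up" the site the same way the script does (tar -cpzf):
tar -cpzf "$WORK/site-backup.tar.gz" -C "$WORK" site

# Simulate the disaster:
rm -rf "$WORK/site"

# Restore the files from the tarball (tar -xpzf). A real database restore
# would follow the same pattern:
#   gunzip < dbbackup_broexperts_<DATE>.bak.gz | mysql -u $DBUSER -p$DBPASS $DB
tar -xpzf "$WORK/site-backup.tar.gz" -C "$WORK"
cat "$WORK/site/index.html"             # prints: hello
```

Note that for a real backup made with an absolute $WEBDIRECTORY, GNU tar strips the leading slash when creating the archive, so extract with `tar -xpzf <file> -C /` to put the files back in their original place.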

To make this script executable, run this command:

[BroExperts@lxweb ~]$ chmod +x web-backup.sh 

To make the whole process automatic, schedule the script with crontab. To run it at 1:01 am every morning, your crontab entry would look like this:

[BroExperts@lxweb ~]$ crontab -e
1       1       *       *       *      /home/BroExperts/web-backup.sh
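A nightly cron job will happily fill the disk over time, so some housekeeping helps. Here is a hedged sketch of a retention rule you could append to the script or run as its own cron job; the 14-day cutoff is an assumption, and the demo uses a throwaway directory rather than your real backup path:

```shell
#!/bin/bash
# Retention demo on a throwaway directory (your real BACKUPDIR goes here instead).
set -e
BACKUPDIR=$(mktemp -d)
touch "$BACKUPDIR/Full_Backup_new.zip"
touch -d "20 days ago" "$BACKUPDIR/Full_Backup_old.zip"   # age one file (GNU touch)

# Delete local backups older than 14 days:
find "$BACKUPDIR" -name 'Full_Backup_*.zip' -mtime +14 -delete

ls "$BACKUPDIR"   # only Full_Backup_new.zip remains
```

The same `find … -mtime +14 -delete` line, pointed at /home/BroExperts/mybackup, keeps two weeks of local backups while the FTP copies accumulate remotely.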

Using this script and automation technique, all of your website data will be backed up regularly and stored in a secure, separate location. No matter the disaster – human error, hacking, corrupt files, hardware or software failure – you will be able to restore your files at any time from backups. That's all!

If you need any help with this guide, please feel free to post in the comments section. Thanks!
