
The importance of autonomous backups


Why do I need backups?

The simple answer: you never know when you will need them, but when you do, you really need them. With the internet rampant with viruses and worms infecting machines that then have to be discarded or completely wiped to get rid of the malicious code, you may well need one. You could even suffer the dreaded drive failure, and trust me, keeping regular backups is much, much cheaper than data recovery services. Ransomware could lock up your files until you pay X BTC to get them back, with no assurance the criminals will even comply and hand over the key… who has time for that?

This article ties in well with our other article on using UPS software to script routines on a power failure!

Some wild statistics on why you should back up your data:

More than 140 thousand hard drives crash every week.
Nearly two thirds of businesses that suffer major data loss close their doors within 6 months of the event.
31% of users have experienced total data loss, losing everything, at least once.

I suggest planning ahead and avoiding all this trouble.

So how many should I keep?

My general philosophy is to keep nine backup files: three storage mediums, with backups three deep on each. Two of the mediums should be on-site and one offsite. Keeping them three deep lets you backtrack in the event that data is damaged or missing from a more recent backup. Redundancy is definitely key here. It’s all about saving yourself time and effort later, when whatever can happen inevitably does happen and you lose your data.
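The three-deep rotation described above can be sketched as a small shell function. This is a minimal example, not part of the backup script itself; the destination paths and the `backup-*.tar.gz.enc` filename pattern are assumptions you would adjust to match your own naming.

```shell
#!/bin/bash
# Sketch of a three-deep rotation: keep only the 3 newest backup archives
# in a given destination and delete the rest. Paths below are examples.
rotate() {
  local dest="$1"
  # List backups newest-first, skip the first 3, remove everything older.
  ls -1t "$dest"/backup-*.tar.gz.enc 2>/dev/null | tail -n +4 | while read -r old; do
    rm -f -- "$old"
  done
}

rotate /mnt/nas/Backup      # on-site medium 1 (example path)
rotate /mnt/backup/Backup   # on-site medium 2 (example path)
rotate "$HOME/GDrive/Backup" # offsite medium (example path)
```

Run after each backup and every medium stays exactly three deep, no matter how long the machine has been running.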

How should I back my data up?

Well, there is a wide range of solutions. If you run a business, I suggest investing in enterprise software, and possibly enterprise equipment like tape drives if you need long-term data storage. If you are a home user, you can usually turn on backups on your phone that happen automatically on whatever timetable you want (for your apps, app data, address book, etc.).

If you run a Linux workstation, I’ve written a pretty simple backup script that you are welcome to use; it’s what I run on my own Linux workstations.

#!/bin/bash
# Backup to NAS and copy to other partition
# Marshall Whittaker / oxagast
logd=$(date "+%Y%m%d%H%M%S")
# What to backup.
backup_files="/home/marshall"
# Log file (example path; change to taste).
log="/var/log/backup.log"
# Where to backup to (example paths: local NAS, second partition, offsite mount).
desta="/mnt/nas/Backup"
destb="/mnt/backup/Backup"
destc="/home/marshall/GDrive/Backup"
# Create archive filename.
#day=$(date +%A)
day=$(date +%d-%m-%y)
hostname=$(hostname -s)
archive_file="backup-$hostname-$day.tar.gz.enc"
echo "$logd Starting backup:" | tee -a "$log"
# Print start status message.
echo "$logd Backing up $backup_files to $desta/$archive_file ..." | tee -a "$log"
# Estimate the compressed size (roughly 85% of raw) so pv can show progress.
size=$(du -bs /home/marshall | cut -f 1)
newsz=$(echo "$size * 0.85" | bc)
szint=$(printf "%.0f" "$newsz")
# Backup the files using tar, then compress and encrypt the stream.
tar --exclude='/home/marshall/.bin/' --exclude='/home/marshall/Work/customers' --exclude='/home/marshall/GDrive' \
--exclude='/home/marshall/.local' --exclude='/home/marshall/.cargo' --exclude='/home/marshall/.cpan' \
--exclude='/home/marshall/snap' --exclude='/home/marshall/.cache' --exclude='/home/marshall/Downloads' \
--exclude='/home/marshall/ISOs' --exclude='/home/marshall/Photos' --exclude='/home/marshall/Videos' \
--exclude='/home/marshall/Pictures' --exclude='/home/marshall/VMs' \
-cvf - $backup_files 2>/tmp/file.tar.backup.lst | pv -ptrabI -N "Generating backup file" -s $szint | gzip | \
openssl enc -aes-256-cbc -e -out "$desta/$archive_file" -pbkdf2 -pass pass:"nicetryskid"
echo -n "$logd Files backed up: "
wc -l < /tmp/file.tar.backup.lst
echo "$logd Moving offsite..." | tee -a "$log"
rclone mount GDrive: /home/marshall/GDrive &
rsync -Px "$desta/$archive_file" "$destb/$archive_file"
rsync -Px "$desta/$archive_file" "$destc/$archive_file"
echo "$logd Removing out of date backups..." | tee -a "$log"
number_current=$(find /home/marshall/GDrive/Backup/ -name "backup-$hostname-*.tar.gz*" -type f -mtime -16 | wc -l)
if [[ $number_current -ge 1 ]]; then
  find /home/marshall/GDrive/Backup/ -name "backup-$hostname-*.tar.gz*" -type f -mtime +16 -delete
  echo "$logd Removed stale backup..." | tee -a "$log"
fi
echo "$logd Checking sizes..." | tee -a "$log"
du -b "$desta/$archive_file" "$destb/$archive_file" "$destc/$archive_file" | tee -a "$log"
echo "$logd Dismounting offsite backup server" | tee -a "$log"
umount /home/marshall/GDrive
# Send a notification to the desktop.
kdialog --passivepopup "Backup Complete at $desta/$archive_file" --icon drive-multidisk --title "Backup"
echo "$logd Backup complete!" | tee -a "$log"
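A backup you can’t restore is no backup at all, so it’s worth testing the reverse of the pipeline at least once. This is a sketch assuming the same `openssl enc -aes-256-cbc -pbkdf2` and `gzip` settings as the script above; the archive path, passphrase, and restore directory are placeholders for your own values.

```shell
#!/bin/bash
# Restore sketch: reverse the backup pipeline (decrypt, decompress, extract).
# The archive path, passphrase, and restore directory below are examples.
archive="/mnt/nas/Backup/backup-$(hostname -s)-01-01-25.tar.gz.enc"
restore_dir="/tmp/restore"
mkdir -p "$restore_dir"
if [ -f "$archive" ]; then
  openssl enc -aes-256-cbc -d -in "$archive" -pbkdf2 -pass pass:"nicetryskid" \
    | gunzip \
    | tar -xvf - -C "$restore_dir"
else
  echo "No archive at $archive; adjust the path for your setup."
fi
```

Note that the decrypt side must use the same cipher and `-pbkdf2` key-derivation flag as the encrypt side, or openssl will emit garbage instead of a tarball.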

Some setup is required: setting the encryption password, changing the variables for what you want to back up and where, and configuring the rclone utility so offsite backups to Google Drive work. Then you will need to add a crontab entry to run it weekly, or however often you want. I apologize for this not being easier to configure and use; there are certainly more user-friendly options out there.
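For the crontab entry, something like the following (added via `crontab -e`) runs the script every Sunday at 3 AM; the script path is an example, so point it at wherever you saved yours.

```
# m h dom mon dow  command
0 3 * * 0  /home/marshall/.bin/backup.sh >> /var/log/backup.log 2>&1
```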

Go ahead and set some utility up. I suggest doing it today, because… you just never know.

If you enjoy my work, sponsor or hire me! I work hard keeping oxasploits running!
Bitcoin Address:

Thank you so much and happy hacking!
This post is licensed under CC BY 4.0 by the author.