
The importance of autonomous backups

By Marshall Whittaker
Posted Oct 20, 2022 · Updated Dec 8, 2022 · 4 min read
Tags: backups, data security, infosec, script

Why do I need backups?

The simple answer: you never know when you will need them, but when you do, you really need them. With the internet rampant with viruses and worms, an infected machine often ends up discarded or completely wiped to get rid of the malicious code. You could also hit a dreaded drive failure, and trust me, keeping regular backups is much, much cheaper than data recovery services. Ransomware could demand X BTC to get your data back, with no assurance the attackers will even comply and hand over the key… who has time for that?

This article ties in well with our other article on using UPS software to script routines on a power failure!

Some wild statistics on why you should back up your data:

  • More than 140 thousand hard drives crash every week.
  • Nearly two thirds of businesses that suffer major data loss close their doors within six months of the event.
  • 31% of users have experienced total data loss, losing everything, at least once.

I suggest planning ahead and avoiding all this trouble.

So how many should I keep?

My general philosophy is to keep nine backup files: three storage media, each holding backups three deep. Two of the media should be on-site and one offsite. Keeping three generations on each medium lets you backtrack if data turns out to be damaged or missing from a more recent backup. Redundancy is definitely key here. It’s all about saving yourself time and effort later, in the event of a whatever-can-happen-will-happen event where you inevitably lose your data.
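
As a rough sketch of the “three deep” rotation, a one-liner like the following keeps only the three newest archives on a given medium (the path and filename pattern here are placeholders, not taken from the script below):

# Keep only the three most recent backups on this medium; prune the rest.
# ls -1t sorts newest-first by modification time; tail -n +4 selects everything after the third.
ls -1t /var/storage/Backups/backup-*.tar.gz.aes | tail -n +4 | xargs -r rm --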

How should I back my data up?

Well, there is a wide range of solutions. If you run a business, I suggest investing in enterprise software, and possibly enterprise equipment like tape drives if you need long-term data storage. If you are a home user, you can usually turn on automatic backups on your phone, on whatever schedule you want, for your apps, app data, address book, and so on.

If you run a Linux workstation, I’ve written a pretty simple backup script that you are welcome to use; I run it on my own Linux workstations.

#!/bin/bash
#
# Backup to nas and copy to other partition
#
# Marshall Whittaker / oxagast
#
# Log file (must be set before the first tee below).
log="/var/log/backup.log"
# Timestamp used to prefix log entries.
logd=$(date "+%Y%m%d%H%M%S")
echo "$logd Starting backup:" | tee -a $log
# What to back up.
backup_files="/home/marshall"
# Where to backup to.
desta="/var/storage/Backups/"
destb="/var/backups/USB/"
destc="/home/marshall/GDrive/Backup/"
# Create archive filename.
#day=$(date +%A)
day=$(date +%d-%m-%y)
hostname=$(hostname -s)
archive_file="backup-$hostname-$day.tar.gz.aes"
# Print start status message.
echo "$logd Backing up $backup_files to $desta/$archive_file ..." | tee -a $log
# Backup the files using tar.
# Estimate the compressed archive size (~85% of raw) so pv can show progress.
size=$(du -bs "$backup_files" | cut -f 1)
newsz=$(echo "$size * 0.85" | bc)
szint=$(printf "%.0f" $newsz)
# Tar the home directory, excluding caches and bulky media, then compress and
# encrypt with AES-256-CBC; change the passphrase on the openssl line below.
tar --exclude='/home/marshall/.bin/' --exclude='/home/marshall/Work/customers' --exclude='/home/marshall/GDrive' \
--exclude='/home/marshall/.local' --exclude='/home/marshall/.cargo' --exclude='/home/marshall/.cpan' \
--exclude='/home/marshall/snap' --exclude='/home/marshall/.cache'  --exclude='/home/marshall/Downloads' \
--exclude='/home/marshall/ISOs' --exclude='/home/marshall/Photos' --exclude='/home/marshall/Videos' \
--exclude='/home/marshall/Pictures' --exclude='/home/marshall/VMs' --exclude='/home/marshall/Work/customers/rahaf'  \
 -cvf - $backup_files 2>/tmp/file.tar.backup.lst | pv -ptrabI -N "Generating backup file" -s $szint | gzip | \
openssl enc -aes-256-cbc -e -out "$desta/$archive_file" -pbkdf2 -pass pass:"nicetryskid"
echo "$logd Files backed up: $(wc -l < /tmp/file.tar.backup.lst)" | tee -a $log
echo "$logd Moving offsite..." | tee -a $log
rclone mount GDrive: /home/marshall/GDrive &
sleep 5  # give the FUSE mount a moment to come up before writing through it
rsync -Px "$desta/$archive_file" "$destb/$archive_file"
rsync -Px "$desta/$archive_file" "$destc/$archive_file"
echo "$logd Removing out of date backups..." | tee -a $log
# Only prune backups older than 16 days if at least one newer one exists.
number_current=$(find /home/marshall/GDrive/Backup/ -name "backup-$hostname-*.tar.gz*" -type f -mtime -16 | wc -l)
if [[ $number_current -ge 1 ]]; then
  find /home/marshall/GDrive/Backup/ -name "backup-$hostname-*.tar.gz*" -type f -mtime +16 -delete
  echo "$logd Removed stale backup..." | tee -a $log
fi
echo "$logd Checking sizes..." | tee -a $log
du -b "$desta/$archive_file" "$destb/$archive_file" "$destc/$archive_file" | tee -a $log
echo "$logd Dismounting offsite backup server"| tee -a $log
umount /home/marshall/GDrive
# sending notification to desktop
kdialog --passivepopup  "Backup Complete at $desta/$archive_file" --icon drive-multidisk --title "Backup"
echo "$logd Backup complete!" |  tee -a $log

Some setup is required: set the encryption password, change the variables for what you want backed up and where, and configure rclone (via rclone config) so the offsite Google Drive copy works. Then add a crontab entry to run it weekly, or however often you want; a sample entry is shown below. I apologize for this not being easier to configure and use; there are certainly more user-friendly options out there.
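
For example, assuming the script is saved as /usr/local/bin/backup.sh (a path of your choosing) and made executable, an entry like this in root’s crontab runs it every Sunday at 3 AM:

# m h dom mon dow  command
0 3 * * 0  /usr/local/bin/backup.sh >> /var/log/backup-cron.log 2>&1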

Go ahead and set something up. I suggest doing it today, because… you just never know.

This post is licensed under CC BY 4.0 by the author.