Hackviking
He killed Chuck Norris, he ruled dancing so he took up a new hobby…

25 Sep 2017

Automated Raspberry Pi Backup – complete image

I love my Raspberry Pi projects and I run a lot of specialist "mini" servers at home, doing everything from torrent sharing of Linux distros to media streaming and media playback. But all Raspberry Pis, and other single board computers that rely on SD cards, sooner or later reach a point where they trash the card and won't boot again.

Every time I run into that situation I can't remember exactly what was running on that particular Raspberry Pi, or how it was set up. I want backups: not just the one I usually make right after installation, but one from last night or thereabouts. So I put up an NFS share on my NAS to store the backups; a USB stick connected directly to the Raspberry Pi works just as well. Here is a step-by-step guide to how I automated the backups on all my Raspberry Pis. The script creates a complete image of the SD card while the Raspberry Pi is running. You can write that image to a new SD card, pop it into the Pi, and it will be like nothing ever happened!
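The full walkthrough is behind the link, but the heart of it is a plain dd of the SD card device to a dated image file on the mounted share. A minimal sketch, assuming the card shows up as /dev/mmcblk0 and the share is mounted at /mnt/backup (both are assumptions, adjust to your setup):

sudo dd if=/dev/mmcblk0 of=/mnt/backup/$(hostname)-$(date +%Y%m%d).img bs=1M

Writing that .img back to a fresh card with dd in the other direction gives you the drop-in replacement described above.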

Continue reading...

11 Sep 2017

MS SQL: Automatically restore latest backup

A common need is to restore the latest production backup to a test system or user acceptance test system on a regular basis. Depending on the size of your database this can be time consuming, so you would prefer to have it done during the night, right after the backups run. If you don't have a third party backup solution with this feature built in, it can be a bit tricky: the automated backups created by a Microsoft SQL Server maintenance plan have somewhat unpredictable file names.

The solution is a simple T-SQL script that you can put in a maintenance plan, to run every night or on whatever schedule you please. The script grabs the latest backup and performs the restore.

The script can be downloaded from GitHub - T-SQL Automatic restore of latest backup
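The linked script is pure T-SQL meant to run inside a maintenance plan. As a rough shell illustration of the same idea (the server names, the database name MyDb, and the trusted-connection setup below are placeholders, not taken from the original script), you can ask msdb for the newest full backup file and feed it to RESTORE:

# Hypothetical sketch: query msdb on the production server for the newest
# full backup of MyDb, then restore it over the test copy.
BACKUP_FILE=$(sqlcmd -S prodserver -E -h -1 -W -Q "SET NOCOUNT ON; \
  SELECT TOP 1 mf.physical_device_name \
  FROM msdb.dbo.backupset bs \
  JOIN msdb.dbo.backupmediafamily mf ON bs.media_set_id = mf.media_set_id \
  WHERE bs.database_name = 'MyDb' AND bs.type = 'D' \
  ORDER BY bs.backup_finish_date DESC;")
sqlcmd -S testserver -E -Q "RESTORE DATABASE [MyDb] FROM DISK = N'$BACKUP_FILE' WITH REPLACE"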

Continue reading...

20 Mar 2017

Kodi central db backup

Using a central database for all your Kodi media players is convenient. Only one of them needs to scan for new content, or you can even update the database directly. It also holds state across all the devices: paused movies, watched episodes and so on. If you have a large library it takes a long time to scan it all again, so you should keep the database backed up. I didn't, but now I do!
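The central database is plain MySQL, so a scheduled mysqldump covers it. A minimal sketch, assuming a kodi user and the version-suffixed schema name of your Kodi release (MyVideos107 here, yours may differ):

mysqldump -u kodi -p MyVideos107 > /mnt/nas/backups/kodi_videos_$(date +%Y%m%d).sql

For a scheduled job you would put the credentials in ~/.my.cnf instead of relying on the interactive -p prompt.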

Continue reading...

10 Apr 2016

Pi: BtSync satellite – spin down hard drive

My BitTorrent Sync satellite has finally synced my 6 TB of data over the local network. The initial sync took several days, but so far it seems pretty quick at picking up new files and syncing them. Before I move it to my office I want to make sure I get some peace and quiet there, so I need it to spin down the hard drive when it isn't syncing data. I had the same issue with the BitTorrent Sync server in my apartment always spinning up my NAS, but this case turned out to be a bit different.

Since this node uses a USB disk instead of the network shares on a NAS, it can actually do some basic work, like indexing, without spinning up the drive. I don't know if it's down to TrueCrypt or something built in, but there is some cache that lets the btsync daemon list the files on disk without waking it. So I don't have to reconfigure the indexing interval like I had to on the node that uses the NAS; that node talks over the network to the NFS shares on the NAS, which spins up its disk every time something accesses it, so there I had to set the sync interval to 12 hours. For my backup solution that is just fine.

The second thing I was sure I would have to change was my script for the LCD display. Since it reads a JSON file with user credentials from the encrypted disk every 45 seconds, I thought it would spin up the drive. But no, that also ended up cached somewhere, and everything is working great at the moment. I have tested throwing new files in there and it synchronizes just fine! The disk spins up, writes the data and goes back to sleep again after 30 minutes.

To achieve this we need hdparm. If you're on a Raspberry Pi you need sudo in front of these commands:

apt-get install hdparm

Then we can run it from the command line. The -S value counts in multiples of five seconds, so 120 means ten minutes:

hdparm -S120 /dev/sda
/dev/sda:
setting standby to 120 (10 minutes)

To make it persistent after a reboot, just nano /etc/hdparm.conf and add this at the end of the file:

/dev/sda {
spindown_time = 120
}
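To verify that the drive actually spins down you can ask hdparm for the current power state (same device assumption as above):

hdparm -C /dev/sda
/dev/sda:
drive state is: standby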

So this is the last step before I can move it to my office and really test out the geo-location backup.

3 Apr 2016

Pi: Geo-location backup with BtSync

Building a geo-location backup for your NAS is a good idea! Spreading the risk over two or more locations increases the value of your backups a lot. Most people confuse redundancy with backup: if you only have a USB disk backup of your NAS, it only protects you against hardware failure. If there is a fire or a break-in you will still lose your data. A lot of people take a USB disk to a second location, like their office, to mitigate this problem. But honestly, how often will that backup get made if you have to remember to carry the disk back and forth? We want automatic backups to our offsite location, in this case my office. So we are going to build a BitTorrent Sync "satellite".
Continue reading...

15 Feb 2016

AWS EC2 Linux: Simple backup script

I have a small EC2 box running a couple of WordPress sites. To back them up I wrote a quick bash script that dumps out the databases and zips up all the files. The script runs daily; the amount of disk space doesn't really matter since the sites are really small. If you have larger sites you might want to split the script into a weekly one for files and a daily one for databases; the principle is the same.

Prerequisites

  • Credentials for MySQL with access to the database in question. In this example I run with root.
  • A folder where you want to store your backups. Use a separate folder, since the script cleans out old backups.
  • Path to the www root folder for the sites.

Script

First we create a script and open it for editing in nano:

nano backup.sh

The first line of any bash script is the shebang:

#!/bin/bash

Then we create a variable to keep the current date:

_now=$(date +"%m_%d_%Y")

Then we dump out the databases to .sql files:

mysqldump --user root --password=lamedemopassword wp_site_1 > /var/www/html/backup/wp_site_1_$_now.sql
mysqldump --user root --password=lamedemopassword wp_site_2 > /var/www/html/backup/wp_site_2_$_now.sql

Here we use the $_now variable we declared at the beginning of the script, so we can easily find the correct backup if we need to do a restore. The next step is to zip up the www contents of each site:

tar -zcvf /var/www/html/backup/www_backup_site1_$_now.tar.gz /var/www/html/wp_site_1
tar -zcvf /var/www/html/backup/www_backup_site2_$_now.tar.gz /var/www/html/wp_site_2

Once again we use the $_now variable to stamp the file name with the date of the backup.

Then we want to clean out backups older than x days. In this example I remove all backup files older than 7 days.

find /var/www/html/backup/* -mtime +7 -exec rm {} \;

The first part, find /var/www/html/backup/* -mtime +7, selects everything in the backup folder modified more than 7 days ago. Then -exec feeds each result to a command, in this case rm. The {} is replaced by the file found, and the \ escapes the terminating ; so the shell doesn't interpret it and find receives it instead. So for each file found, find executes rm and removes that file.
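If you want to see what would be removed before trusting the cleanup, run the find part on its own first; without -exec it just prints the matching files:

find /var/www/html/backup/* -mtime +7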

Save the backup.sh file and exit nano. Now we need to make the script executable:

chmod 755 backup.sh

Then we can do a test run of the script:

sudo ./backup.sh

Now check the folder to verify that the files were created and contain the actual data. If you're satisfied with the result you can move the script into the cron.daily folder.

sudo mv /home/ec2-user/backup.sh /etc/cron.daily/

Now the daily cron job will create a backup of the WordPress sites, both files and databases.