Hackviking - He killed Chuck Norris, he ruled dancing so he took up a new hobby…

21 Jul 2017

Transfer Google Drive and photos between accounts

I finally got around to transferring everything from my old gmail.com based Google account over to my custom domain Google account. I had been running them in parallel for ages, switching between them to access different services. I read a number of blog posts and forum discussions on how to do this, and also took a look at Google's own documentation.

According to Google you can merge two Google accounts, but only if both are in the same organisation. Since my "new" account was the only one actually in an organisation and the old one isn't, this wasn't an option. I couldn't figure out from the documentation alone whether merging two gmail.com based Google accounts is possible, or if it has to be two proper organisation/custom domain accounts.

Google Drive

I started out with Google Drive by sharing all my files from one account to the other. Then I could just copy the files on the new account. The issue I ran into was that all the filenames then started with "Copy of". I didn't want that, so I started looking at other options.

Next I looked at the API, with the plan to create a script that shared everything from the first account and then copied it on the second one. The kicker would be to remove the "Copy of" part of the name on all the files. I started messing around with the JavaScript library provided by Google but didn't find an easy way to authenticate two accounts at the same time. I didn't want to spend too much time on it, so I moved on.

By just selecting all the files and choosing download, I received a zip file with all my files. Most of my files had been created on Google Drive with the built-in apps, and they got converted to Excel and Word files respectively when downloaded. Then I downloaded the Google Drive app for Windows, made sure I turned on the "convert uploaded files to Google..." option, and just dropped all the files in there.

Google Photos

This was a bit more tricky than Google Drive. I found a nice option for sharing my entire library between the accounts, and could set the new account to automatically save the shared photos to its own library. It did however take ages, so I quit that and just used Google Takeout to download two huge zip files with all my photos. I then configured the Google Drive app to back up the folder with the unzipped content and add photos and videos directly to Google Photos.

Yes, I lost all my albums, but the improvements to the Google Photos Assistant give me correct suggestions for creating albums matching my trips and events daily now.

14 Mar 2017

Windows: Set DNS and add to domain from command line

I just got 18 virtual servers delivered from a private cloud supplier. Since none of them are joined to our domain, I need to access them one by one and set them up. After they are joined to the domain it's easier to manage them. So I wanted a quick way to add our internal DNS servers and join the machines to the domain. Doing this manually is time consuming, error prone and straight up boring. Doing it from the command line is fast, correct and less boring; a sketch of the commands involved is shown below.
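As a rough sketch only: the interface name, DNS addresses, domain and account below are placeholder assumptions, not values from the post. The two netsh commands set a primary and a secondary DNS server on the NIC, netdom joins the machine to the domain (prompting for the password), and the last command reboots the box:

netsh interface ip set dns name="Local Area Connection" source=static addr=10.0.0.10
netsh interface ip add dns name="Local Area Connection" addr=10.0.0.11 index=2
netdom join %COMPUTERNAME% /domain:corp.example.com /userd:CORP\Administrator /passwordd:*
shutdown /r /t 0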

Continue reading...

18 Aug 2016

Google Compute Engine: Monitor disk usage with Stackdriver

Setting up monitoring of your cloud servers is not only useful for getting alerts when a server goes down or runs out of disk. It's also very good to have historical data on performance metrics when troubleshooting issues. In this post I cover the basic setup and the issues you can run into using Stackdriver for monitoring your Google Compute Engine servers. There are several Stackdriver features already plugged into Google Cloud; we will focus on monitoring, keeping track of our virtual infrastructure and its utilization. Out of the box, just by enabling monitoring in the Google Cloud console, it will collect five basic metrics for all your instances:

  • CPU Usage
  • Disk Read I/O
  • Disk Write I/O
  • Network Inbound Traffic
  • Network Outbound Traffic

To get to a basic monitoring level we need at least memory and disk usage. Then we can start to look into more metrics regarding applications and their performance.

Monitoring agent

To be able to collect these metrics we need to install the Stackdriver agent on the servers. Google has install instructions for the agent in their documentation, where you can get a basic understanding of the whole setup process. It's actually very straightforward, as long as you didn't change any of the "Cloud API access scopes" when you created the VM. If you did, make sure that your VM has at least "Write only" access for the "Cloud Monitor API". Then you can just download the agent from the link in the install instructions and you will see additional metrics come in. The additional metrics are:

  • Memory usage
  • Open TCP connections
  • Page File Usage
  • Volume Usage (disk usage)

Issues with Volume Usage (disk usage)

So far I have installed the agent on 20 Windows machines, and all of them report all the additional metrics except for disk usage. In high-throughput and data-intensive solutions this is one of the most important metrics. After hours of troubleshooting and browsing documentation and forums, I realized that I'm not the only one having this problem, but no one had a good solution for it. I then noticed that the download link for Windows in the install instructions was named stackdriverInstaller-GCM-17.exe, while the latest I could find directly from Stackdriver was stackdriverInstaller-39.exe. This leads me to believe that the version linked from the install instructions is outdated.

The GCM-branded one is an automatic install, no input needed. The one downloaded from Stackdriver needs the API key to install. I couldn't find a good download page on the Stackdriver homepage, but after Googling I found a Support Center entry linking to their repo. While this entry is from April 21st 2015 and may itself be outdated, I tried different version numbers in the download link and 39 seems to be the latest one. Anyhow, it's much newer than 17 at least, but as stated in the Google install instructions there is no way to check which version of the Stackdriver agent is currently installed on Windows.

Enough about that! This install requires you to input your Stackdriver API key. If you open up the Stackdriver web UI via the link in the Google Cloud Console and go to "Account settings" and then "Agent", you will find it there. Account settings are found under the project dropdown next to the Stackdriver logo in the top left corner of the UI. Just copy the "Key" and paste it into the install wizard.

Dashboards and filtering

Now you can create dashboards with different metric charts to get a good overview of your system. The charts can be filtered on resource name via regex. So far I have not been able to filter out specific drive letters. In the view, as well as in the underlying JSON, the data is in the format {instance name}(C:), for example. So a regex like ^.*\(C:\) should match all C: drives, but it doesn't work. It's not a big issue, but there are a few improvements that I hope will come shortly. We have to remember that at this point the Stackdriver functionality comes to Google Cloud as a beta and does not have any SLA at all.

15 Feb 2016

AWS EC2 Linux: Simple backup script

I have a small EC2 box running a couple of WordPress sites. To back them up I wrote a quick bash script that dumps out the databases and also zips up all the files. The script runs daily; the amount of disk space doesn't really matter since the sites are really small. If you have larger sites you might want to split the script into a weekly one for the files and a daily one for the databases; the principle is the same.

Prerequisites

  • Credentials for MySQL with access to the databases in question. In this example I run as root.
  • A folder where you want to store your backups. You should use a separate folder, since we clean out old backups in the script.
  • Path to the www root folder for the sites.

Script

First we create a script and open it for editing in nano:

nano backup.sh

The first line of any bash script is the shebang:

#! /bin/bash

Then we create a variable to keep the current date:

_now=$(date +"%m_%d_%Y")

Then we dump out the databases to .sql files:

mysqldump --user root --password=lamedemopassword wp_site_1 > /var/www/html/backup/wp_site_1_$_now.sql
mysqldump --user root --password=lamedemopassword wp_site_2 > /var/www/html/backup/wp_site_2_$_now.sql

Here we use the $_now variable we declared at the beginning of the script, so we can easily find the correct backup if we need to do a restore. The next step is to zip up the www contents of each site:

tar -zcvf /var/www/html/backup/www_backup_site1_$_now.tar.gz /var/www/html/wp_site_1
tar -zcvf /var/www/html/backup/www_backup_site2_$_now.tar.gz /var/www/html/wp_site_2

Once again we use the $_now variable to mark the file name with the current date for the backup.
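As an aside, restoring one of the database dumps is just the reverse operation. A minimal sketch, where the date stamp in the file name is hypothetical (a backup taken on February 15 2016):

mysql --user root --password=lamedemopassword wp_site_1 < /var/www/html/backup/wp_site_1_02_15_2016.sql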

Then we want to clean out backups older than x days. In this example I remove all backup files older than 7 days.

find /var/www/html/backup/* -mtime +7 -exec rm {} \;

The first part, find /var/www/html/backup/* -mtime +7, finds everything in the backup folder modified more than 7 days ago. The -exec flag then runs a command for each result, in this case rm. The {} is replaced by each file found, and \; terminates the command; the backslash escapes the semicolon so the shell doesn't interpret it. So for each file found, find executes rm and removes that file.
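For reference, this is the whole script assembled, exactly the pieces from above with comments added; adjust credentials, database names and paths to your own setup:

#! /bin/bash

# Date stamp used in all backup file names
_now=$(date +"%m_%d_%Y")

# Dump each database to a dated .sql file
mysqldump --user root --password=lamedemopassword wp_site_1 > /var/www/html/backup/wp_site_1_$_now.sql
mysqldump --user root --password=lamedemopassword wp_site_2 > /var/www/html/backup/wp_site_2_$_now.sql

# Zip up the www contents of each site
tar -zcvf /var/www/html/backup/www_backup_site1_$_now.tar.gz /var/www/html/wp_site_1
tar -zcvf /var/www/html/backup/www_backup_site2_$_now.tar.gz /var/www/html/wp_site_2

# Clean out backups older than 7 days
find /var/www/html/backup/* -mtime +7 -exec rm {} \;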

Save the backup.sh file and exit nano. Now we need to make the script executable:

chmod 755 backup.sh

Then we can do a test run of the script:

sudo ./backup.sh

Now check the backup folder to verify that the files were created and contain the actual data. If you're satisfied with the result you can move the script into the cron.daily folder.

sudo mv /home/ec2-user/backup.sh /etc/cron.daily/

Now the daily cron job will create a backup of the WordPress sites, both files and databases.

11 Feb 2016

AWS EC2 Linux: Enable SSH password logon

Amazon AWS EC2 instances are by default secured with SSH key pairs. This is great for security, until you need to provide a UAT duplicate for an external user or developer. You don't want to share your private key with them, and setting up a new one is more work than this quick fix. Security isn't as important on a UAT or test system as on a production system, so for this use case we can accept lower security.

To enable password logins, we first need to set a password on the ec2-user account. It's highly recommended that you select a strong password!

sudo passwd ec2-user

Then we need to allow password authentication. Open the SSH daemon settings, find the line PasswordAuthentication no and change it to PasswordAuthentication yes.

sudo nano /etc/ssh/sshd_config
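If you prefer a non-interactive edit, a sed one-liner can make the same change; this sketch assumes the line currently reads exactly PasswordAuthentication no and is not commented out:

sudo sed -i 's/^PasswordAuthentication no/PasswordAuthentication yes/' /etc/ssh/sshd_config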

Then we need to restart the ssh service.

sudo service sshd restart

Now you can log in to your Amazon AWS EC2 instance with only a password. To secure the server again, just change the PasswordAuthentication line back to no and restart the service.

4 Feb 2015

Amazon EC2 Linux – Add additional volumes

EBS Mappings

Adding additional storage to your Amazon EC2 instance has several advantages. You can select the right storage type for each use case. Why use a fast SSD-backed volume for storing nightly backups, when magnetic storage is slower but comes at a much lower price?

First you need to provision storage and assign it to your instance. Amazon provides a good guide on how to add additional volumes to your instances; a sketch of the instance-side steps is shown below. There are several advantages to using several different volumes. As I wrote in my guide to moving MySQL storage, you will not risk filling up the boot disk, which would make the system halt. Other advantages include selecting storage that fits your purpose and price range, as mentioned above. External volumes can also easily be migrated between instances if and when the need arises. It is also easier when you need to extend your storage space: instead of making a snapshot of the entire instance and then launching a new one with a bigger drive, you can attach new storage and migrate the data. This approach makes the downtime much shorter.
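As a minimal sketch of what that looks like on the instance once the volume is attached; the device name /dev/xvdf and the mount point /data are assumptions, check lsblk for your actual device:

lsblk                          # identify the newly attached device, e.g. /dev/xvdf
sudo mkfs -t ext4 /dev/xvdf    # create a filesystem on the empty volume
sudo mkdir /data               # create a mount point
sudo mount /dev/xvdf /data     # mount the volume
echo '/dev/xvdf /data ext4 defaults,nofail 0 2' | sudo tee -a /etc/fstab    # mount on every boot; nofail lets the instance boot even if the volume is missing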

When selecting the correct storage for your solution there are a few things to keep in mind. EBS comes in three basic flavors, each with its own benefits and disadvantages, so it is important to make an educated decision.
Continue reading...

4 Sep 2014

Amazon AWS EC2 Linux Swapfile

Amazon EC2 Linux instances come without swap. Sooner or later this becomes a problem, with service hangups or crashes as a result when you run out of memory. I found a lot of instructions on the web about how to add a swap file, but none of them take the storage type into consideration, so you may end up paying a lot for very little performance gain. This article will guide you through swap files on Amazon EC2 Linux hosts. A minimal version of the procedure is sketched below.
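As a minimal sketch of the procedure (a 1 GB swap file; the full article below covers which storage type to place it on):

sudo dd if=/dev/zero of=/swapfile bs=1M count=1024    # create a 1 GB file
sudo chmod 600 /swapfile                              # swap must not be readable by other users
sudo mkswap /swapfile                                 # set the file up as swap space
sudo swapon /swapfile                                 # enable it immediately
echo '/swapfile swap swap defaults 0 0' | sudo tee -a /etc/fstab    # enable it on every boot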

Continue reading...

30 Oct 2013

Google Code Project Home Page: Tips & Tricks

When I was updating the home page for Picasa Web Downloader on Google Code I found two things that might interest others! 🙂

Paypal Donation Buttons

I found more written on this subject than on any other Google Code markup issue. I have to say that there is a really easy way! Just create a donation button on PayPal, copy the URL for the image, and use the email donation link to create a common <a> tag in the markup, like this:

<a href="https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=XFGNUAPH5YR8C"><img src="https://www.paypalobjects.com/en_US/SE/i/btn/btn_donateCC_LG.gif" /></a>

Just be sure to close the img tag with /> otherwise it will mess up the markup.

Download Links

Google stopped supporting download links because of misuse. I found a blog entry written by one of their techs suggesting using Google Drive instead. In what way can you misuse free shared storage on Google Code that you can't misuse on Google Drive? Anyway, I put up the link, but when someone clicks the link for the latest release they get a view of the zip file contents on Google Drive instead of the actual download. There is an easy fix for this! The link you get when you share a file looks like this:

https://drive.google.com/file/d/0B7xgtMzrLFNNTE1XUUtjNXJsYVU/edit?usp=sharing

Just take the ID part (0B7xgtMzrLFNNTE1XUUtjNXJsYVU) and put it in this line below:

https://docs.google.com/uc?export=download&id=YourIndividualID

That will send the user directly to the download!
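As a quick check, the rewritten link also works from the command line; this uses the example ID above, and -L makes curl follow Google's redirects:

curl -L -o latest.zip "https://docs.google.com/uc?export=download&id=0B7xgtMzrLFNNTE1XUUtjNXJsYVU"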


16 Oct 2013

Google Code “Get started” is wrong

I was fiddling with my project on Google Code, and as I remembered it you could have a download tab for compressed files with releases and so on. But I couldn't find the download tab, so I referred to the "Getting started guide" for Google Code, and there it was, a reference to the downloads tab!

[Screenshot: google code release sharing]

But I still couldn't find it! After a while I came across this: http://google-opensource.blogspot.se/2013/05/a-change-to-google-code-download-service.html
People misused it too much, so it has been removed... That's too bad; it was nice to have it all in one place. Anyway, an update is needed to the "Getting started guide"!

14 Nov 2011

Get all from Spotify

I have been checking people's Spotify settings whenever I get access to their computers for service and the like, and almost no one gets everything they can out of Spotify. Most of the computers I checked use the premium version; the free version isn't much fun anymore with all the limitations. If you have a premium account most songs are available in 320 kbps, but the default setting is only 160 kbps.

In the menu Edit -> Preferences... this can be corrected! Check "High quality streaming" and uncheck "Set the same volume level for all tracks" to get the full dynamic range of the track.

On your smartphone, go to Settings (the small gear icon in the lower right-hand corner). Set both Stream and Sync to High Quality, and if you ever have bad reception, or anything else that lowers your data transfer speed, just change the stream setting temporarily. I usually sync all my tracks, because Spotify uses a lot of battery power when streaming all the time!