Ubuntu and dnsmasq

June 1st, 2013

To disable dnsmasq in Ubuntu 12.04+:

Edit the NetworkManager configuration file:
sudo nano /etc/NetworkManager/NetworkManager.conf

and comment out the line “dns=dnsmasq” in the [main] section:
#dns=dnsmasq

Finally, restart the NetworkManager:
sudo restart NetworkManager
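
If you want to confirm that dnsmasq is out of the picture, a quick check along these lines should do (a rough sketch: the exact resolv.conf contents depend on your setup, but it should no longer point at a local loopback address):

ps aux | grep [d]nsmasq
cat /etc/resolv.conf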


Delete the nth line from a file

June 1st, 2013

To delete the nth line from a file, use sed (replace n with the line number you want to remove):

sed -i 'nd' filename
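
For example, to delete the 5th line of a file called notes.txt in place (the filename is just for illustration):

sed -i '5d' notes.txt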

Decrypting Remmina Passwords

April 24th, 2013

This morning I was trying to change my password on a remote server, accessed using Remmina. My login password for this machine was saved in the Remmina config file, which automatically logs me in. Since I rarely had cause to actually type this password in, I had forgotten it. Because I couldn’t remember my current password, I couldn’t change it.

Remmina stores passwords in its config files (found in ~/.remmina/); they are encrypted using Triple DES and base64 encoded. The encryption key is also base64 encoded, and stored in the Remmina preferences file (~/.remmina/remmina.pref). I attempted to use some online decryption tools, but wasn’t getting very far, and decided to try a different tack.

After a little Googling (well, Duck Duck Go-ing), I came across this post by mateuscg which showed how to decode passwords saved by Remmina. It’s a neat bit of code that uses Remmina’s own routines and a tiny bit of extra C code:

First, grab the Remmina source from GitHub:
git clone https://github.com/FreeRDP/Remmina.git

We’ll need the following files from the Remmina source:

  • remmina_crypt.c
  • remmina_crypt.h
  • remmina_pref.c
  • remmina_pref.h
  • remmina_string_array.c
  • remmina_string_array.h

We’ll also need to create an empty file called “config.h”.
Put all of these files in a folder somewhere.
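
Something like the following gathers everything into one place (the remmina/src/ path inside the checkout is an assumption; adjust it to wherever the files actually live in your copy of the source):

mkdir remminapwd && cd remminapwd
cp ../Remmina/remmina/src/remmina_crypt.[ch] .
cp ../Remmina/remmina/src/remmina_pref.[ch] .
cp ../Remmina/remmina/src/remmina_string_array.[ch] .
touch config.h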

Next, create a new file called remminapwd.c in that folder, containing the following source:

#include <glib/gprintf.h>
#include "remmina_pref.h"
#include "remmina_crypt.h"

int main(int argc, char **argv)
{
   if (argc == 2) {
      remmina_pref_init();
      g_printf("%s\n", remmina_crypt_decrypt(argv[1]));
      return 0;
   } else {
      /* No (or too many) parameter(s) given, exit with code 1 to signal error */
      return 1;
   }
}

Compile using:
gcc -DHAVE_LIBGCRYPT -o remminapwd remminapwd.c remmina_crypt.c remmina_pref.c remmina_string_array.c -Wall `pkg-config --libs --cflags gtk+-2.0` `libgcrypt-config --cflags --libs`

Usage is as follows:
./remminapwd "[base64 encoded, 3DES encrypted password goes here]"
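
The encrypted string to feed in can be lifted straight from the relevant connection file. On my machine it appears as the value of the password= line (the field name is an assumption based on my own config, so check your .remmina files):

grep password= ~/.remmina/*.remmina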

Thanks very much mateuscg – awesome stuff.


Binding the middle click event to a keyboard shortcut

January 25th, 2013

Just so I don’t forget how to do this:

1. Install xsel and xvkbd:

sudo aptitude install xsel xvkbd

2. Add a keyboard shortcut (in GNOME, use gnome-control-center keyboard), with the command:

sh -c 'xsel | xvkbd -file - 2>/dev/null'

Based on the accepted answer and comments found here.


Packaging files for deployment using Git

June 5th, 2012

I regularly need to move a number of changed files from staging to production server environments following change requests from clients. I’ll often be asked to add feature X, and this can lead to many new and changed files. In the past, I’ll admit to having employed a rather manual approach to deploying the files, and it’s easy to miss a file, which breaks the whole update.

Git can really help ease the pain here. By generating a list of all changed files between commits, branches or tags, we can create a tarball containing everything we need, send the bundle to the server and deploy in an instant:

git diff --name-only <commit a> <commit b> | tar -c[z]vf <tarball_filename> -T -

In my case, I want to modify the filenames added to the tarball slightly, so I use:

git diff --name-only branchA branchB | tar -cvf package.tar -T - --transform 's|^|www/|'

(This prepends “www/” to each filename stored in the tarball, mirroring the directory structure used on the cPanel server.)
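
The same approach works between tags or individual commits. For example, to package everything that changed between two release tags as a gzipped tarball (the tag names here are just for illustration):

git diff --name-only v1.0 v1.1 | tar -czvf release-1.1.tar.gz -T -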

Once done, SCP or FTP the tarball you’ve created to the production server, and unpack using:

tar -x[z]vf <tarball_filename>

Done.

Hat tip to Deryck Lio for the idea.

Edit:

In a similar vein, it’s also easy to use this technique to create a backup of the files on the live server, which can be rolled back quickly in the event of a problem. Create a list of the changed files on the staging / development box:

git diff --name-only | sed 's|^|www/|' > changed_files

Copy the file “changed_files” to your production server, and create a backup of those modified files:

tar -c[z]vf <backup_filename> -T changed_files

In the event of a disaster after deploying the new code, roll back using:

tar -x[z]vf <backup_filename>


Mentawais

March 22nd, 2012

This year I was finally lucky enough to fulfill one of my surfing dreams – a boat trip to the Mentawais Islands. We were on board the “Naga Laut”, and had nine awesome days surfing Nipussi, EBay, Telescopes, Macaronis, HT’s, and Bang Bangs – often scoring the break to ourselves. A few pictures of the trip are below:

  • Outrigger canoes at Padang
  • Local Padang kids surfing in filthy water after a storm
  • Sleeping quarters aboard the Naga Laut
  • Board Rack
  • Perfect Tropical Island
  • Accommodation near Playgrounds
  • Me on a nice right-hander at Nipussi
  • Local fishermen at Telescopes
  • My temporary island home in Playgrounds after being caught in a storm
  • Pro surfers at E-Bay
  • Future Indonesian champion at HT's
  • Me on another backhand disaster at Telescopes
  • A slightly better attempt at Telescopes
  • Sunset at Playgrounds
  • Another sunset


Validating image uploads with PHP

December 8th, 2011

I was asked by a colleague recently how I ensure that a file uploaded to a web server is of a particular type. For example, if I have an image gallery, how do I ensure that only valid images are uploaded, and not (as happens more regularly than you’d hope) an image embedded in an MS Word document.

There are a number of naive methods for achieving this, the most common of which are “check the mime type reported by the browser when uploading”, or the really dumb “check the file extension”. Honestly – do you really think a malicious user wouldn’t think of changing a file extension?

I prefer to use the Linux file command. It identifies files based on their magic number, and can easily distinguish images from malicious executables. Here’s some simple example code:

	function validImage($filename)
	{
		// Check that the requested file exists
		if (!file_exists($filename) || is_dir($filename))
			throw new Exception('Specified file does not exist');

		// Quote the filename to guard against shell metacharacters in uploaded names
		exec('file -bi ' . escapeshellarg($filename) . ' 2> /dev/null', $output, $returnvalue);

		if (((int) $returnvalue === 0) && (count($output) > 0))
		  return preg_match('/^image\/.*/i', implode(' ', $output));

		return false;
	}

The above function attempts to identify a given file, and determines its MIME type. If the MIME string begins with the phrase “image/”, then we can assume that the file is indeed an image file. Note that this does not necessarily mean that the file is valid or uncorrupted.
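
Usage is straightforward, something along these lines (the form field name “photo” is hypothetical):

	// 'photo' is a hypothetical form field name
	if (validImage($_FILES['photo']['tmp_name']))
		echo 'Looks like an image';
	else
		echo 'Not an image - rejecting';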

There are a few provisos to be aware of before using this technique:
a) You must be on a Linux-based server (duh)
b) Your PHP installation must allow for calls to exec

This method can be extended to identify any type of uploaded file. You don’t have to use the MIME type reported by file either – if you know what you’re looking for, you can match against the normal output of the command.

If you’re specifically looking to identify images, an alternative is to use ImageMagick’s identify command. This can be slightly more problematic in that ImageMagick will report on PDF documents and video files too – but by carefully inspecting the output you might be able to extract the file type and other useful information (eg image dimensions) in one command.
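
For example, identify can report the image format and dimensions in a single call (a sketch: upload.jpg is a placeholder, and the output details may vary between ImageMagick versions):

identify -format '%m %wx%h' upload.jpg

Here %m is the image format, and %w and %h are the width and height in pixels.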


Javascript snippets

October 20th, 2011

Today I had to convert a whole bunch of CSS colours from RGB(x,y,z) to HTML hexadecimal format. There are loads of these colour format converters on the web, and they all seem to do the same thing: input the R, G, and B components into separate boxes, press a button and you’ll get back an HTML colour string.

There’s a problem with all of these though – you have to copy and paste each RGB component into a separate box. If you have lots to do, this will take ages. Why do none of these converters allow you to paste in the whole RGB triplet at once?

In frustration, I wrote this little Javascript function to do the job:

function rgbToHex(rgb) {
	var result = '#',
	    format = function(n) {
	        // Parse each component as base 10 and zero-pad single hex digits
	        var v = parseInt(n, 10);
	        result += (v < 16 ? '0' : '') + v.toString(16);
	    };

	rgb.split(',').forEach(format);
	return result;
}
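
A quick example of the expected input and output (the colour value is arbitrary):

rgbToHex('255, 99, 71');   // returns "#ff6347"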

Not particularly complicated, and it works a treat.

Oh yeah… This requires Javascript 1.6, which means a decent browser. IE < 9 need not apply – sorry.


Preventing Googlebot from indexing your website

September 21st, 2011

I know – this seems like the opposite of what most people want from Google – but there’s a good reason for it on occasion:

Recently one of the sites we were building was indexed by Google during the development phase. Somehow Googlebot managed to find a way onto the development server, crawled an almost complete e-commerce site, and proudly displayed the pages in its search results.

Very soon after that, the site went live and Google crawled that too. It decided that the live version contained duplicate content from the development version, and dropped those pages from its listings altogether – not good. This issue was quickly resolved by some permanent redirects on the development server (and resubmitting the live URL for Google to index), but this should not have been necessary in the first place.

A better way to resolve this issue is to stop Googlebot seeing your pages at all. There are a number of ways to achieve this – the most obvious being a restrictive robots.txt file. There are other ways however, and my preferred method is to use some mod_rewrite trickery. Place the following in your .htaccess, virtual host or other config files:

RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} google [NC]
RewriteRule .* - [R=404,L]

Of course, you’d probably want to modify that HTTP_USER_AGENT line to include the user agents of other bots too.
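
For example, the condition could be broadened to catch several common crawlers at once (the substrings are illustrative; check the actual user-agent strings of the bots you care about):

RewriteCond %{HTTP_USER_AGENT} (google|bingbot|slurp|yandex) [NC]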

An easy way to test if this is working is by using the User Agent Switcher plugin for Firefox.

Why have I settled on this method? I have a bash script which I run whenever I create a new site. It sets up a new vhost on our development server, creates appropriate directories, sets appropriate permissions, and reloads Apache2’s configuration. It also includes the above in the vhost file, meaning that I don’t have to worry about remembering to create a robots.txt (and ensures I don’t copy the restrictive robots.txt file to the production server accidentally).


MySQL and Negative Unix Timestamps (Dates pre-1970)

March 24th, 2011

Writing this post as I keep forgetting how to store pre-1970 timestamps in a MySQL datetime field.

The Unix timestamp is an integer representing the number of seconds since midnight UTC on January 1st, 1970. I regularly store datetime variables from PHP in a database using:

$db->ExecuteSQL('INSERT INTO table (datefield) VALUES (FROM_UNIXTIME(?))', 'i', $datevalue);

However, this comes unstuck when dealing with pre-1970 dates. Although PHP can easily handle negative timestamp values, MySQL’s FROM_UNIXTIME function cannot, and things start going awry. To fix this issue, use the following:

$db->ExecuteSQL('INSERT INTO table (datefield) VALUES (DATE_ADD(FROM_UNIXTIME(0), INTERVAL ? SECOND))', 'i', $datevalue);

Similarly, converting back to a Unix timestamp using MySQL’s UNIX_TIMESTAMP function will also fail if the date value is before the Unix epoch. To retrieve these values as a Unix timestamp from the database, use:

$db->FetchValue('SELECT TIMESTAMPDIFF(SECOND, FROM_UNIXTIME(0), datefield) AS datefield FROM table');
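
The equivalent raw SQL, using a pre-1970 date as a worked example (the table and column names are placeholders, and the comment assumes a UTC session time zone):

-- 631152000 seconds before the epoch, i.e. 1950-01-01 00:00:00
INSERT INTO table (datefield) VALUES (DATE_ADD(FROM_UNIXTIME(0), INTERVAL -631152000 SECOND));

-- Reads back -631152000
SELECT TIMESTAMPDIFF(SECOND, FROM_UNIXTIME(0), datefield) AS datefield FROM table;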