Category: Tips & Tricks

Peering Inside My Inbox

Posted by – October 7, 2010

I was talking to my friend John Richards from Perturbed Normal (a near-vacant blog that I think shows promise, but has hardly any posts as of yet).  I often bounce ideas off him, as I trust his opinion and we have more skill overlap than most people I know (who else will discuss cryptographic hash functions with me?).

Long story short, I was wondering if he had any cool tips for Gmail or Google Reader labels, as I’ve been doing a bit of spring cleaning and I feel there must be a better way that I just don’t see.  We then agreed (and I plan to hold him to it) that we’ll each blog about what we have and what we can find.  So I thought I’d kick things off by showing everyone how I organize my inbox.

My Gmail label sidebar.


So as it stands at the time of writing I have 13 labels, 2 of which simply record which of my email addresses a message was sent to.  One of them is an ‘Employment’ label which I use only once every few years while I look for a steady income source (see: job).  For all intents and purposes these 3 labels are hidden from the left menu.

So the labels I use routinely are:

Accounts – Used to keep all my registration emails for forums, sites, whatever.  This keeps them in an easy-to-find place if I forget my username or other details.

Attachments – Ever tried searching for an attachment when you’re not sure where or when it came from?  This label helps keep them together.  Gmail has improved quite a bit though, so you can actually just search for has:attachment and get exactly the same thing.  I use the label out of habit and laziness.

Backups – Whenever I back up WordPress, for example, I can just enter my email address and have the backup mailed there.  It’s a good place to store backups of things that aren’t too large.  I mostly use a filter to auto-label these emails, but more on that later.

Business – Whenever I do consulting or freelance work I want to hold onto those emails.  This is where they go.  I’m contemplating creating a different email address altogether though, as I feel this causes clutter.

Family – Anything from or relating to family.

[Employer] – Emails from and about work at my current employer.  Should probably have a separate email for this too, but my employers are also good friends of mine.

Friends – Anything from or related to friends.  I’m tempted to crunch this into a combined Family/Friends label, but I’m not short on room, and finer-grained organization doesn’t hurt.

Notifications – Emails from Facebook, Google Alerts, the library, or anything else that is a notification of some kind.  This way I can bulk-delete so they don’t pile up, but I also don’t miss bill notifications when things are due.

Receipts – Receipts and invoices for purchases.  Nice to have them all in one spot for easy reference.

Support/Questions – Whenever I send an email to a web host or practically any business or group, it goes here.  Sometimes I need to find these further down the line, such as during the whole Linksys NAS issue I had a while ago.

I’ve also gone ahead and selected colours that I don’t find offensive.  I tend to gravitate towards colours just dark enough to take white text.  It gives my inbox a nice feel, though depending on your theme the schemes might look different.

Speaking of colours, I’m hesitant to add any more labels because I’ve pretty much run out of colours to use.  I could use different shades, but that reduces my ability to know what an email is out of the corner of my eye.  Then again, perhaps I’m just being pedantic.  Any ideas are welcome, as I’ve always been bad at colour selection.


To make my life easier I’ve created a number of filters to apply labels and move my email around.  I don’t think many of them are innovative, but they might help someone out.  Here are some examples:

Attachments – Anything that has an attachment is labeled ‘Attachments’ (duh!).  This is now just a checkbox on the create-filter page.

Friends, Family, Work – All email addresses associated with each group get their own filter and are labelled accordingly.

Backups – Anything sent to a ‘+backups’-style variant of my address (see below) is automatically labeled ‘Backups’.

In case you weren’t aware, any mail sent to your address with ‘+anything’ appended before the @ will still arrive in your inbox.  This is a good way to tag the address you give a business and see whether they’re selling it.  I think most spammers and businesses have learned this trick though and strip the +whatever from your email address before they sell it, so your results may vary.
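To see why the trick is so easy to defeat, here’s a quick sketch (Python, purely illustrative) of the normalization an address broker might apply before selling a list:

```python
def strip_plus_tag(address):
    """Collapse a plus-addressed email (user+tag@host) back to user@host,
    the way a list broker might scrub addresses before selling them."""
    local, _, domain = address.partition("@")
    base = local.split("+", 1)[0]  # drop everything after the first '+'
    return base + "@" + domain

print(strip_plus_tag("someone+storename@gmail.com"))  # someone@gmail.com
```

One line of string handling and the tag is gone, which is why the results are hit and miss.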

Notifications – All the email addresses I routinely get notices from (whether it’s Facebook or a utility company saying my e-bill is ready) get labeled ‘Notifications’.

Text Alerts – I don’t have a smartphone, nor do I see myself purchasing one anytime soon.  This is mostly due to the horrible oligopoly Canada has in cell phone providers; the data costs are simply way too high.

This means I do a lot of text messaging but not mobile emailing.  I have filters that forward critical alerts (such as server crashes) to the email-to-SMS address my provider gives me, so I receive them as texts on my phone.
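For the curious, sending an alert through such a gateway only takes a few lines.  This is a Python sketch; the gateway address is a made-up example (your provider publishes its own, usually number@carrier-domain), and the truncation limit is my assumption:

```python
import smtplib
from email.message import EmailMessage

# Hypothetical carrier gateway; substitute the address your provider gives you.
SMS_GATEWAY = "5555551234@sms.example-carrier.com"

def build_alert(subject, body, limit=140):
    """Build a short email that an email-to-SMS gateway can deliver as one text."""
    msg = EmailMessage()
    msg["From"] = "alerts@example.com"
    msg["To"] = SMS_GATEWAY
    msg["Subject"] = subject
    msg.set_content(body[:limit])  # truncate so it fits in a single SMS
    return msg

def send_alert(msg, host="localhost"):
    """Hand the message to a local mail server for delivery."""
    with smtplib.SMTP(host) as conn:
        conn.send_message(msg)
```

A monitoring script only needs to call build_alert() and send_alert() when something goes wrong.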

Got Ideas?

I’m more than happy to hear any ideas on how I can improve my system, whether it’s more labels, better filters or, hopefully, something I haven’t even thought of yet.

I’ll try to get to work on my Google Reader labels, but to be honest they’re a big bucket of fail.  In comparison, my Gmail is rockin’.

Anyway, your move, John.

Why I Don’t Play New Releases

Posted by – March 25, 2010

Video Games: Why waste good technology on science and medicine?


In my previous post, Finally, I’m Back In Business, I mentioned that I planned on doing video game reviews, and said not to expect new releases.  I thought I’d take the time to explain why, as many people may not think the way I do.


Assuming you buy a game fairly new (within the first few months) and don’t opt for the collector’s edition, you’re looking at about $60.  That’s a lot of money to drop on just one game, especially one you haven’t played.

You might buy a game like Dragon Age: Origins and end up with potentially a hundred hours of gameplay.  Or you might end up with Halo 3: ODST, which is only about 5 hours long.  One has a great cost per hour; the other is dreadful.  And even if you try to avoid short or poor games, how can you be sure?  Most gaming sites and magazines almost feel rigged: if a publisher advertises in a magazine, the reviews seem less harsh, because the last thing the magazine wants is to lose a big-name publisher like EA or Activision.

If you wait a while, you might be able to score a Platinum Hits edition of a game, which brings the price down to $20.  Or, even better, buy a good used copy and save even more.  That means for every ONE new release you buy, you could potentially buy THREE or more older releases.  It’s really a no-brainer.
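Putting rough numbers on the cost-per-hour idea (prices and play times are just the examples above, not exact figures):

```python
def cost_per_hour(price, hours):
    """Dollars paid per hour of gameplay."""
    return price / hours

print(cost_per_hour(60, 100))  # a 100-hour RPG at full price: 0.6 $/hour
print(cost_per_hour(60, 5))    # a 5-hour campaign at full price: 12.0 $/hour
print(cost_per_hour(20, 5))    # the same short game a year later: 4.0 $/hour
```

Waiting turns even the short game into a defensible purchase.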


When the wife and I pre-ordered Fable 2, we were excited.  We had played Fable II Pub Games for hours, raking in money and items to use in the main game.  However, only a few short hours in, I ran into what is called “The Abbot Glitch”, which forced me to start a new character and follow precise instructions to avoid triggering it again.  All the time I had spent in the pub games was a complete waste, as that progress was now unrecoverable.

To make things worse, I ran into more glitches that were annoying and prevented me from playing the game the way I wanted to, which is pretty bad considering choice is the whole experience the Fable franchise is trying to sell.

If I had waited a few months, I would have played the game with all its patches and avoided all the pain I endured.  And considering how many developers push games out before they’re fully done or tested, Fable 2 is hardly an exception.

Annoying Multiplayer Experiences

I think everyone who has ever played on Xbox Live has run into hordes of screaming pre-teens who do nothing but swear, suck at the game, or simply team-kill and become a nuisance.  They tend to get their parents to buy them new releases, especially in long-running franchises such as Halo, Call of Duty, etc.  If you give them a few months to move on to the next big release, you’ll usually be left with good players who genuinely like the game and still play, or newcomers like yourself who picked the game up on sale.

Of course, this isn’t really possible if you have friends who HAVE TO play new releases, and if you don’t get the game at the same time, you miss out.  If that’s the case, I’d say find some new friends, or simply stop caring and have fun on your own terms, not theirs.

That just about sums up my thoughts on new releases.  I personally don’t notice a difference; the games don’t get any worse, only better.  You also tend to avoid the bad releases that get good reviews purely through fanboyism.  I’d like to hear what others think, though: can you think of a reason why I’d want to play titles hot off the press?

Safety of Information In the Cloud

Posted by – October 19, 2009

This will be me someday...


Most of us have probably heard about how T-Mobile failed pretty epically by losing all of their customers’ Sidekick data (the device has no storage of its own; it’s all stored in the cloud).  Luckily, Microsoft has stated they have been able to recover “most, if not all” of the information.  But consider the fact that when this news broke, T-Mobile openly admitted to not having backups.

Like many Internet users I rely on Google a great deal of the time.  I use their email service, their RSS reader, their office document suite and, of course, their search engine.  It appears to me Google has a lot of really smart people in their ranks, and I not only assume but can pretty confidently say I KNOW they keep backups.  How comprehensive those backups are, and how often things are backed up, I don’t know.

So it seems I put a lot of faith in Google to keep my information safe, but then again I would have thought T-Mobile had the same obligation.  This is why I was quite amazed to hear of The Data Liberation Front, which is a team... well, I’ll just quote the website:

The Data Liberation Front is an engineering team at Google whose singular goal is to make it easier for users to move their data in and out of Google products.  We do this because we believe that you should be able to export any data that you create in (or import into) a product. We help and consult other engineering teams within Google on how to “liberate” their products.

So basically Google is working to let YOU (the user) keep and control your own backups without having to find crazy workarounds, unlike many other services (whose business model is to make it difficult for you to leave).

While most of their help will only benefit the technically minded (such as knowing to use forwarding or POP mail to grab a copy of all your email as a backup), it’s definitely a step in the right direction.  While cloud computing is making lives easier, it also raises the stakes for catastrophic data loss, and we need to be careful to ensure a bad day at Google isn’t a bad year for us.

I’m probably going to write a few guides over the next several weeks on configuring tools and scripts to automate the backup process (as I have been backing up Gmail and other web services for years).
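As a preview of those guides, here’s a rough Python sketch of pulling your mail down over POP3.  It assumes you’ve enabled POP access in Gmail’s settings, and the file it writes is a crude concatenation of messages rather than a proper mbox:

```python
import poplib

def message_blob(lines):
    """Join the raw lines poplib returns into one byte blob."""
    return b"\n".join(lines) + b"\n\n"

def backup_inbox(user, password, out_path, host="pop.gmail.com", port=995):
    """Download every message the POP3 server exposes into a single file."""
    conn = poplib.POP3_SSL(host, port)
    conn.user(user)
    conn.pass_(password)
    count, _size = conn.stat()
    with open(out_path, "wb") as out:
        for i in range(1, count + 1):
            _resp, lines, _octets = conn.retr(i)
            out.write(message_blob(lines))
    conn.quit()
    return count
```

Run it from cron and a bad day at Google stays a bad day, not a bad year.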

The Search for Good Web Hosting or Why DreamHost Sucks

Posted by – July 27, 2009

Some Random Server Room

I need to get me one of these.

[EDIT: This is an old article.  Arvixe was bought out by EIG and now sucks.  Ugh.]

As you may or may not have noticed, we’ve changed servers; in fact my blog has even moved to a new address.  This was due to a recent failure of our main server, Hal.  I was able to get it back up, but we still haven’t figured out why it went down.  We decided it best to move to web hosting and keep Hal strictly for shells and the daemons we want to run.  Thus began my quest for a good hosting company.

This came just a week or two after I had been looking for hosting for the company I work for.  For them we decided on DreamHost, which I thoroughly enjoyed for a week, right up until we attempted to get an account there for Freedom-Uplink.

To test whether DreamHost would fit our needs, I added our domain in their control panel but didn’t change the domain’s nameservers.  I was then able to configure pretty much everything we needed, so at that point we decided to sign up and start using them.

When we signed up we used a promo code I found through Google to waive the setup fee, as we were going to pay monthly (in case we found a better solution or ran out of funds).  Everything was going swell until I tried to add the domain to our new control panel.  It stated that the domain already existed on the DreamHost servers under a promo’d account, and thus we couldn’t use a promo code if we wanted to keep our domain.

I tried troubleshooting the issue just in case, by making sure the domain was not installed on my employer’s account.  I then contacted their support, and this is where things went downhill.

It turns out that my employers also happened to use a promo code when they signed up, to save $50 or so.  And since I was able to “test” my domain and see their control panel, I was not eligible for their promo, as it’s for new customers only.  I explained that we’re a non-profit (though not registered in the US), but they were unsympathetic and wanted to charge us $50, which is almost our whole hosting budget for the year.

I kept trying to explain the situation: that I am a new customer who just happened to be able to try before buying, and who now wanted to give them money for, hopefully, years to come.  I even explained that my employer would let me host the domain under their account if I wanted, but that I preferred this setup, which meant more money for them.  They were still uninterested.  The closest I got to anyone who understood was a rep who acknowledged that I wasn’t abusing the system at all, but said the policy is there and they have to be vigilant.

Needless to say, I canceled my account with them and got a refund.  Am I ever glad I did, too, as I found a hosting company that is several times better than DreamHost.

Arvixe's Logo. Best. Hosting. Ever.


After searching around and conversing (see: monologuing) in our IRC channel #Uplink, Sabrebutt suggested Arvixe.  I had not really heard of them beyond a few mentions of their good pricing, and since I wasn’t looking for hosting at the time, I promptly forgot about them.  I revisited them, though, and after looking over their plans decided it’s practically the same deal as DreamHost, but with better customer support (24/7 chat, phone or email) and much better reviews online.

So I signed up, and it’s been the best move I’ve made when it comes to web hosting.  Their customer support is extremely helpful and will pretty much bend over backwards for any weird request (Freedom-Uplink is a lot more complicated than I remember it being).  I’m so impressed that I might look into getting a promo code made so we can entice Freedom-Uplink users to sign up.

So while it was a pain, it worked out extremely well.  And while DreamHost seemed quite awesome at first, compared to hosts like Arvixe it has a long way to go.

I’d also like to point out that since we recently moved, please let me know if you find broken links or anything not working as it should.  I think I fixed FeedBurner, but I won’t know until after this is posted.

I’ve also NOT received any payment or incentive from Arvixe in any way to sing their praises.  Nor will I ever sell out like that.

My Quest For Good Quality YouTube Video Uploads

Posted by – April 3, 2009

The Internet's most popular video sharing site.


Yesterday I spent a good chunk of my day trying to find the best settings, format and codec for the highest-quality YouTube videos.  The good news is I think I found the right recipe; the bad news is YouTube’s quality still sucks.

Over the course of my experiment I probably uploaded the same video twenty-five times or more, in a range of sizes, the largest being 400MB for a 13-second clip!  (Just to see if there was a point of overkill.)  So you know how the testing worked, let me explain the clip.

For testing I used a 13-second clip of my wife feeding a pigeon by hand.  I chose it because it’s very short compared to the 10-minute limit set by YouTube, so it was easy to upload multiple times to test quality.  It also shows a range of motion, from head tilting to wing flapping, in decent outdoor lighting.  The original format was a DV AVI file, the highest quality my Sony DCR-HC52 produces.

The first thing I did was upload the clip in its original format (DV AVI), with a total size of a little less than 400MB.  This is completely insane considering the max file size for YouTube is 1GB; at this rate the max length would be less than a minute.  Either way I tried it, and the quality honestly wasn’t so great.  I think that’s mostly because the tools Google uses aren’t optimized to convert from DV to their MP4 and FLV formats.

I think I did twenty or so conversions to FLV, DivX, Xvid, MPEG-2 and MPEG-4, WMV and H.264, and many more, on the advice of various forums and guides.  I don’t want to bore you all with the details (I’m sure I’ve lost some people already), but the important thing to note is that Google states H.264, MPEG-2 and MPEG-4 are preferred.  Unfortunately they drop the ball by not giving the settings (bitrate, resolution) that produce the best-quality videos.

So, what is the best way to encode your videos for YouTube?

Use the H.264 codec with a bitrate of 2000 kbps.  (You can play with this, but 2000 seems to be the sweet spot from what I can tell.  More is better, but you probably won’t notice; you can likely get away with less too, but I want the best bang per MB.)  Also set the resolution to the highest native resolution of the source.  For example, DV from MiniDV tapes has a max resolution of 720×480, so that’s what I use.
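For a rough sense of file size at that bitrate, the arithmetic is simple (the 128 kbps audio figure below is my assumption, not something YouTube specifies):

```python
def estimated_size_mb(video_kbps, audio_kbps, seconds):
    """Approximate encoded size in MB: kilobits/second times duration,
    divided by 8 (bits per byte) and 1024 (kB per MB)."""
    return (video_kbps + audio_kbps) * seconds / 8 / 1024

# My 13-second test clip at the 2000 kbps sweet spot:
print(round(estimated_size_mb(2000, 128, 13), 2))  # about 3.38 MB
```

Compare that to the 400MB DV original and you can see why finding the diminishing-returns point matters.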

This was a lot of work for such a simple idea: the higher the quality you give YouTube, the better it’ll look.  The goal, though, was to find where diminishing returns set in, and I feel I’ve found it.  I’m still unhappy with the quality of the video, and comparing my videos with others suggests it’s now the fault of YouTube rather than the source.  This is setting me on another quest to find a better place to host my web videos, which I’ll of course touch on once I have an answer.

P.S.  I really wish I had more information to give you, especially considering how many hours I spent on this topic.  I hope it’s useful to someone; it’s definitely a good reminder for me not to attempt such a thing again.

Setting Up SSH Tunnels With Putty

Posted by – March 3, 2009

Always ensure secure network connections.


I’ve been playing around with VPNs and SSH tunnels to try to get my iPod touch to use something secure when I’m connecting to random wireless networks.  Needless to say, it’s not working so great.  I want the iPod to tunnel everything through SSH to my server at home, but Apple has never thought of that, nor can I find any application to do it (and probably won’t, as it would need to run in the background, which Apple doesn’t allow, at least when you’re not jailbroken).  That leaves me little choice but to jailbreak, or else I can’t have secure connections without setting up a VPN over IPsec, which is about as fun as it sounds.

While I was toying around with different things, it occurred to me that many people don’t know how to secure their traffic and prevent people from listening in.  So I’m going to show you how to take a Windows PC or laptop and route web traffic through a shell account you have SSH access to.  I’ll then show you how to set up Firefox and the SwitchProxy extension to use the tunnel efficiently, as well as the basic premise for making any other program you have use the tunnel too.

First, I should explain why you’d want to go to all this trouble.  Whenever you use someone else’s connection, whether it’s a wireless access point at a coffee shop, a shopping mall, a neighbour’s, or even a wired jack on a school’s network, the bulk of your web traffic is sent as plain text.  This means anyone who wants to can probably listen in on anything you say to your friends on an IM client, or even read your email if you’re not enforcing SSL.  Even on a WPA- or WEP-enabled wireless connection, your data is easy enough to sniff if the attacker has enough time to crack the key.  I know people who will even go to a coffee shop and set up their own laptop to act as an access point, collecting information from anyone who connects to it, in a classic man-in-the-middle attack.

Alright, the first thing you need to do is open Putty.  If you don’t have Putty already, get it; it’s one of the best (if not the best) terminal programs for Windows!  Now that it’s open, go to ‘SSH > Tunnels’ in the left-hand menu.  In this section, click the radio button marked ‘Dynamic’, put ‘9999’ (or any port of your choosing, provided it’s not in use) in the ‘Source port’ text box, and click “Add”.

Setting Up The SSH Tunnel in Putty.


Now go to the ‘Session’ menu on the left side again and enter the server information.  Then name it and click ‘Save’.  It should look something like this:

Saving the Session in Putty.


Now that the session is saved with your tunnel settings, you’re ready to go.  Log in to your shell and just leave it there for now (you can do anything you’d normally do, except log out, which will close the tunnel), and open Firefox.  Go to Tools > Options, select the ‘Advanced’ tab and click ‘Settings’ where it says “Connection: Configure how Firefox connects to the Internet”.

Firefox connection settings, to put in the address of the SSH tunnel.


Now select “Manual proxy configuration”, enter ‘localhost’ as the “SOCKS Host” and ‘9999’ for the port (unless you specified something else earlier).  Accept all changes.  You’re now browsing the web through Firefox, securely, via your new SSH tunnel.  Keep in mind that if you close your Putty terminal, you’ll get ‘connection refused’ messages until you either reconnect to the shell or go into your settings and remove the proxy.

Firefox Proxy Settings.

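If you ever hit those ‘connection refused’ errors and aren’t sure whether to blame Firefox or the tunnel, a tiny check like this (a Python sketch, using the port chosen earlier) tells you whether anything is listening at all:

```python
import socket

def tunnel_is_up(host="localhost", port=9999, timeout=2.0):
    """Return True if something accepts connections where the tunnel should be."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, etc.
        return False
```

If it returns False while Putty claims to be connected, double-check that the ‘Dynamic’ forward was actually added before you saved the session.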

Now that you have the basic premise of setting up an SSH tunnel through Putty, we’re going to install the SwitchProxy Firefox extension to make switching to secure browsing simple and quick.  So go ahead and grab a copy of SwitchProxy from the Mozilla Add-ons website.  Install it, then restart Firefox (as required).  You’ll notice that the bottom right corner now says “Proxy: None”.  You’ll also notice an annoying toolbar, which luckily you can right-click and remove.

Right-click that bottom right corner and select “Manage Proxies”, click “Add”, then select “Standard”.  Name it, enter ‘localhost’ for the ‘SOCKS proxy’ and ‘9999’ for the port, and finally select “SOCKS v5” and save your changes.  You can now right-click SwitchProxy in the bottom corner, select ‘SSH Tunnel’ (or whatever you named it) and switch effortlessly back and forth between secure and default connections.

Adding the SSH tunnel to SwitchProxy.


Phew.  That seemed like a bit of work, but it’s well worth having this set up for whenever you find yourself in unknown territory.  Keep in mind you can set ‘localhost’ and port ‘9999’ as the proxy in ANY program in order to secure it.  Pidgin, MSN and AIM are all good candidates, as are POP3 and IMAP mail clients if they aren’t (and even if they are) SSL-enabled.

I hope this guide helps at least someone out there.  If anyone has any ideas on how to tunnel on an iPod touch, be sure to let me know.

Edit: You may also want to go into Firefox’s about:config (by entering it into the address bar) and change network.proxy.socks_remote_dns to true.  This sends DNS requests through the tunnel as well, for added anonymity.

Using Gmail’s IMAP Through Alpine

Posted by – July 5, 2008

Alpine is the FOSS version of Pine, which is now dead software. It’s an excellent command-line, text-only email and newsgroup reader, which I use because I prefer to do all my work through an SSH shell, as opposed to carrying it around on a USB drive as some others choose to do.

I thought I’d quickly help anyone who may be having trouble setting up Google’s Gmail IMAP service in Alpine, as it took me a bit of messing around to come up with a solid answer. It’s a pretty easy setup once you know Gmail’s server addresses, and especially easy if you’ve used IMAP in the past.

Simply set the following variables in your .pinerc configuration file (in your home directory). The hosts below are Gmail’s standard IMAP and SMTP servers; replace USERNAME with your Gmail username:

# Over-rides your full name from Unix password file. Required for PC-Alpine.
personal-name=YOUR NAME

# Sets domain part of From: and local addresses in outgoing mail.
user-domain=gmail.com

# List of SMTP servers for sending mail. If blank: Unix Alpine uses sendmail.
smtp-server=smtp.gmail.com:587/tls/user=USERNAME@gmail.com

# Path of (local or remote) INBOX, e.g. ={mail.somewhere.edu}inbox
# Normal Unix default is the local INBOX (usually /usr/spool/mail/$USER).
inbox-path={imap.gmail.com:993/ssl/user=USERNAME@gmail.com}INBOX

Of course, this is just the tip of the iceberg. I’d strongly suggest checking out this page as well, as it’s full of useful information on using Alpine, plus other tips and tricks.