
Certifiably Insane

The process of moving the site over to a “secure,” certificate-based setup has been what most reasonable people would describe as “messy.” For the most part it’s working, but it’s not working well — there’s a great chance that when you first arrive your browser will scream OH MY GOD THIS SITE DOESN’T MATCH UP WITH ITS CERTIFICATE AND I’M NOT SAYING YOU CAN’T GO THERE BUT I WON’T BE HELD RESPONSIBLE WHEN HE STEALS YOUR IDENTITY AND SKIPS OFF TO ARUBA WITH YOUR LIFE SAVINGS, which isn’t the kind of first impression I relish making.

So below I discuss, in as much detail as I understand it (because this entire process is pretty foreign to me), what I’ve done, why I’ve done it, what I still need to fix, and when the insanity will end.

If you don’t have enough time to read the whole thing I’ll reveal the surprise ending: the insanity doesn’t end. It’s all downhill from here…

The Beginning: Firesheep

The site had been back online for no more than a week when I looked at my Twitter feed and saw Jeff Darlington tweet (as @gpfjeff):

Firesheep makes me want to weep for the Internet and laugh maniacally, both simultaneously: http://bit.ly/9vil65

Naturally, I was curious. So I followed the link.

The link led to a post from Firesheep’s creator, describing in detail exactly how he was about to make my life miserable just before NaNoWriMo started. In short, to drive home the point about how insecure the current web landscape is, he created a Firefox plugin that made it ridiculously easy for anyone running it to intercept the information passed between a website and someone trying to log in to that website, and then to log in using that intercepted information.

I then did a little googling and learned from various sites and articles that in the two days since the thing was published, more than 200,000 people had downloaded it.

This was a Pandora’s Box moment for me. The plugin was so easy to use that anyone, it seemed, could use it… and while I didn’t think there was a significant chance that someone would use the plugin specifically to screw with one of the few user accounts people have made on my site, I didn’t feel like it was a good idea to ignore the problem and assume nothing would happen to me or to any of my visitors. When I was still on Keenspot I had my site trashed by a guy who was able to exploit the WordPress front-end I was using to publish news on the site. I didn’t like it and I’ve never forgotten it.

But let’s be completely, brutally honest here: unless you’re using the same password on this site that you use to log in to your bank, chances are very small that someone logging into your account on this site and raising mischief will impact your life in any important way. And some people do leave their car doors unlocked because they don’t think their cars are worth stealing. And in the grand scheme of things, if a real honest-to-god malicious hacker wanted to break in to this site, well, I don’t have the technical know-how to stop him. All that aside: I still think I ought to try to do what I can, however insufficient that effort might be.

All of which led up to my plan: according to its author, Firesheep and similar exploits could be thwarted by enabling end-to-end encryption on a website. So I resolved to enable SSL on my site.

This meant five things:

  1. Generating a site key.
  2. Getting a signed certificate from a “trusted” certificate provider.
  3. Configuring my site to use them.
  4. Turning SSL on.
  5. Redirecting all incoming traffic so that it is forced to use the secure site.

Preparing for the Switch-Over

The simplest way to protect your site is to set up end-to-end encryption. What is that, you ask? Well, basically you set up an encryption key on your site that your web server (in my case, Apache 2) uses to encrypt the information passing between it and a browser — most importantly, any cookies it sends to your browser. This was pretty easy to do. There are plenty of instructions on how to do it on the web, and it’s just a matter of a command or two followed by answering a series of questions as the key is generated.
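
For the curious, the commands involved look roughly like this. This is a sketch with example filenames rather than my exact invocation, and with openssl the questions actually come when you generate the signing request the certificate provider wants, not the bare key:

  # Generate a 2048-bit private key for the server
  openssl genrsa -out eviscerati.net.key 2048
  # Generate a certificate signing request (CSR) -- this is the step that
  # asks the series of questions (country, organisation, common name, etc.)
  openssl req -new -key eviscerati.net.key -out eviscerati.net.csr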

However, when a web browser encounters a site encrypted this way, the first thing it will do is warn you. This might seem odd, since the purpose of a key is to protect the user, but for every person who invents a more secure way of doing things, there will be a whole horde of people eager to misuse it. The key supplies identifying information about the site, but it doesn’t offer a way to prove that the identifying information is correct. When I was generating my key I could have, if I were less than forthright in my dealings with you, had my key identify itself as being part of a national chain of banks. Some phishers create sites that look like online banking sign-in pages, generate a key that identifies itself as part of the bank, and then send out emails asking people to verify their account information.

So browsers don’t assume a site is legitimate based on the information in the encryption key. Most modern browsers look for a site certificate signed by a third-party organisation to verify that the site is actually what it claims to be, and only then move on with the business of actually rendering the page. If a browser doesn’t find a certificate, or if the certificate doesn’t look valid, it throws up a warning page and forces you to decide whether to trust the encryption on your own or to stay away from the page.

Many of my readers are tech savvy, but not all. Some are like me — we know more than some people, less than a lot, and generally know just enough to try something new that fails so spectacularly it takes us days, weeks or even months to clean up the mess. Others don’t know a lot about computers per se but are all too familiar with the things computer companies have done to screw them over. I didn’t want to drive anyone away, so the last thing I wanted was someone pointing a browser at ubersoft.net and getting a message saying “the site you think you’re getting might be that site, but might not. Do you want to continue? Keep in mind that if you do, [company of said browser] will not be held responsible for anything that happens to you from this point forward.”

So I wanted to get a certificate. The first place I went was Verisign, a company that issues these third-party certificates for other sites. They were the only company I knew of, initially, that did this, so I figured I had to use them.

Their cheapest certificate was $400 a year.

Two friends suggested using GoDaddy, and they were in fact significantly cheaper than Verisign. But when I started exploring the certificate options available, I ran into another problem: these certificates are awfully limited in scope.

Certificates do not, as I had originally thought, certify a website. They certify domains. Specifically, they certify the validity of a single domain name, and no other. You could buy certificates that covered multiple domain names, but they cost more. You could buy certificates that covered a single domain name and an unlimited number of subdomains (i.e. first.ubersoft.net, second.ubersoft.net, third.ubersoft.net, etc.), but those also cost more.

So here was my dilemma. EvisceratiNet consists of five domain names:

  • eviscerati.net
  • eviscerati.org
  • eviscerati.com
  • ubersoft.net
  • ubersoft.org

Using the marvellous power of the .htaccess file I redirect all the eviscerati domains to www.eviscerati.net. The ubersoft domains go to www.ubersoft.net.1
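
For anyone who hasn’t played with .htaccess before, the rules involved look something like the sketch below. These are example rules rather than my actual file, and the ubersoft domains get an equivalent pair of their own:

  RewriteEngine On
  # Send any eviscerati domain that isn't already www.eviscerati.net there
  RewriteCond %{HTTP_HOST} ^(www\.)?eviscerati\.(net|org|com)$ [NC]
  RewriteCond %{HTTP_HOST} !^www\.eviscerati\.net$ [NC]
  RewriteRule ^(.*)$ http://www.eviscerati.net/$1 [R=301,L]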

So I wasn’t sure what kind of certificate I should buy. I knew I needed a certificate that would be valid for at least two sites (www.eviscerati.net and www.ubersoft.net) but I didn’t know if I’d need more. I decided to get a five-domain certificate from GoDaddy, because they didn’t offer two-domain certs, and while I intended to continue redirecting everything to my two “primary” domains I had no idea whether that redirection was going to occur before or after anyone reached the site.

So I got the five-domain certificate and told it to recognise www.eviscerati.net, www.eviscerati.org, www.eviscerati.com, www.ubersoft.net, and www.ubersoft.org. A few minutes later I got an email telling me the certificate was ready, and I downloaded it and then ftp’d it to my server.

This seemed like a perfectly reasonable solution at the time, but I’d overlooked something pretty important, which I’ll get to a little down the road.

Switching to SSL: the First Attempt

On Thursday night, after putting my daughter to bed, I managed to generate my key, upload the certificate, and modify everything in Apache that I thought I needed to modify in order to make the switch to SSL. All I needed to do, I thought, was restart Apache and change the redirects in the .htaccess file to point to port 443.2 So I took the site offline, turned off Apache, then turned it back on.
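
For reference, the Apache side of the change looks roughly like the sketch below: example file paths, mod_ssl assumed to be enabled, and not my actual configuration, which, as you’re about to find out, wasn’t quite right.

  Listen 443
  <VirtualHost *:443>
      ServerName www.eviscerati.net
      DocumentRoot /var/www/eviscerati
      SSLEngine on
      SSLCertificateFile      /etc/apache2/ssl/eviscerati.crt
      SSLCertificateKeyFile   /etc/apache2/ssl/eviscerati.net.key
      # GoDaddy also supplies an intermediate certificate bundle
      # (the filename here is just an example)
      SSLCertificateChainFile /etc/apache2/ssl/gd_bundle.crt
  </VirtualHost>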

Well, I tried. Apache wouldn’t restart.

It took me a few hours to figure out that I had actually configured the SSL stuff incorrectly in Apache. I managed to sort that out, sort of,3 and eventually Apache was willing to turn itself back on. So then I fired up my browser and typed “https://www.eviscerati.net” — fully expecting to see my site in all its glory — and what I found instead was a jumbled, convoluted string of characters in my address bar and Firefox screaming that the Unwashed Horde was descending on us and that we were all going to die.4

After a few hours of troubleshooting I had turned off SSL and the site returned to its normal, insecure method of operation. The problem appeared to originate from my Single Sign-On server, which allowed a user logging into eviscerati.net to automatically be logged in to ubersoft.net and vice-versa. So I regretfully decided that in order to make this work, the SSO setup was going to have to be dismantled.5

Switching to SSL: the Second Attempt

Friday night, after putting my daughter to bed, I turned off SSO and moved all my user account data back into my main site database. That was easy. Then I turned on SSL. That was also easy. SSL worked! If I pointed my browser at https://www.eviscerati.net or https://www.ubersoft.net I was able to see the site. I had a warning that some of the content wasn’t secure (more on that later) but this was where I’d wanted to be yesterday.

This is where I realized I had overlooked an important piece of information about how the Internet handles domain names: “www.eviscerati.net” and “eviscerati.net” are different domain names. “eviscerati.net” is a domain, and “www.eviscerati.net” is technically a subdomain of that, just like “mail.eviscerati.net” and “forums.eviscerati.net” and “secretproprietaryinformation.eviscerati.net” could be subdomains of eviscerati.net if I wanted to use them. The certificate I bought was for www.eviscerati.net (and .org, .com, www.ubersoft.net and .org) only. Which meant if you typed “eviscerati.net” into your browser it would warn you that you were about to be manipulated by an evil, identity stealing genius who had been laughed out of his profession by his peers and swore that one day he would show them, he would show them all.

By the time I gave up and tried to get a few hours’ sleep, everything sort of worked if you typed https://www.eviscerati.net or https://www.ubersoft.net into your browser window. Beyond that, you wound up getting messages about incompatible certificates and cranky browsers and dire warnings of doom.

Switching to SSL: the Second Attempt, Part Three

I managed to resolve most of the egregious SSL certificate warnings by being repetitive with my .htaccess file: I inserted a series of commands to take each permutation of an http:// address and redirect it to an https:// address (I’ll sketch what those rules look like below). This appears, based on my testing with IE, Firefox, Konqueror, Chrome, and Rekonq, to have made the initial “OMGWTF” warning from browsers pretty much go away. That said, I’ve discovered that if you type an https:// address the certificate doesn’t cover into your browser, i.e.

  • https://eviscerati.net
  • https://eviscerati.org
  • https://eviscerati.com
  • https://ubersoft.net
  • https://ubersoft.org

no redirection occurs and you get a browser certificate warning. I haven’t found a way to fix this, and I suspect I can’t fix it from .htaccess at all: the browser checks the certificate during the secure handshake, before the server ever gets a chance to redirect anything.
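
The redirect rules I mentioned above look something like this. Again, this is a sketch rather than my exact file, with a matching rule for the ubersoft domains:

  RewriteEngine On
  # Anything arriving over plain http gets bounced to the secure www. address
  RewriteCond %{HTTPS} !=on
  RewriteCond %{HTTP_HOST} eviscerati [NC]
  RewriteRule ^(.*)$ https://www.eviscerati.net/$1 [R=301,L]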

Another lingering problem is that my site, despite now being 95% properly encrypted and certified, still displays unencrypted content from other sites.

For example, I use Project Wonderful for advertising. None of the ad images are stored locally on my site; they’re pulled from Project Wonderful’s servers. Those links consist of URLs that point to JavaScript files and images, and they are not secure links (https) but your standard, run-of-the-mill links (http). This makes it technically possible for someone to intercept and exploit one of those links and compromise my site by sending malicious code in place of whatever I actually wanted to get.6

Another example is Creative Commons. I use a CC license for most of my content on the site, and the links I display at the bottom of each comic refer to information and images on the Creative Commons website. Unfortunately, CC doesn’t appear to support SSL at all, so those images are not secure.7

In order to make a web browser completely happy, any content displayed on your page needs to be secure… which means I need to get those things fixed. It also means I need to go through my site with a fine-tooth comb and locate every instance where I lazily linked to an image on my own site improperly, with a hard-coded http:// address instead of a secure or relative one.
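
The fine-tooth comb, for what it’s worth, will mostly be a lot of searching along these lines. The paths are examples, and anything living in the Drupal database will have to be hunted down separately:

  # Find hard-coded http:// references in theme and template files
  grep -rn 'http://' /var/www/eviscerati/sites/all/themes
  # Find image and script tags that point at insecure addresses
  grep -rn 'src="http://' /var/www/eviscerati --include='*.php' --include='*.html'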

This is probably going to take a while, and until it’s completely resolved every web browser that cares about certificates will display a warning symbol, either in the address bar or the window footer, indicating that there is insecure content displayed on the page. That’s not the way I’d prefer to have things but I’ll have to live with it for a little while until I can get it all sorted out.

Final Thoughts

I would like to be able to wrap this up in a neat little bow, tell you all what I learned from it, and promise that my ability to maintain a site that serves content to readers without putting them or that content at risk has improved dramatically, but I can’t say that.

The painful truth about this experience is that it underscores exactly how damn confusing it is to make even marginally complicated sites work. Throwing an extra (and extremely important) requirement into the mix — that it be secure — has been dizzying and unrelentingly confusing. At this point I think I have about 90-95% of what I want, but that last 5-10% looks like it might never close unless I’m willing to throw more money at the problem — too much money for me, at the moment.

I don’t begrudge guys like Eric Butler for taking a hard line where security is concerned, and I can understand his motives for releasing Firesheep — he saw an industry that was asleep at the wheel as far as security went and decided to do something that would force them to wake up and pay attention — but I do begrudge the fact that he did it and proceeded to dump the mess in my lap. Those of us who are learning as we go, largely because we simply don’t have the money to get hosting at a site that does all that stuff for us, aren’t going to make it in a world where people in the know consider us acceptable losses for the greater good of forcing the big sites to act more responsibly. Some day, some concerned security expert (or gleefully destructive malcontent) is going to release a tool that requires site modifications I won’t be able to do on my own, and if I’m not independently wealthy at that point, then I’m probably just going to have to close up shop. I feel like the countdown to that has already begun, and I wonder if I’m going to make it to March.

I’ll keep struggling through as best I can, and when possible I’ll post what I learn on this site, as a record for any other struggling proto-site admins like me. And in the meantime we’ll all have to cross our fingers and hope that eventually we get everything right.

Footnotes

  1. I use the “www.” in my URLs because I read somewhere that doing so kept cookies from subdomains from overwriting cookies on your top-level domain, and I wanted the option of using subdomains for this and that at some point in the future. I have no idea, at this point, if I am remembering what I read correctly or if it is something that is no longer true, but that was the standard I adopted.
  2. For those of you not in the know: most web traffic goes through port 80 — port 80 is the standard access point on a server for http: and that’s where everything goes by default. Anything with a url starting with https: is going to port 443, and you have to configure your site to work with port 443. Also, if you want to make sure that people visiting your site are visiting it securely, you have to make sure that http: is redirected to https:, or more specifically, that port 80 traffic is redirected to port 443 traffic.
  3. Unfortunately anything I do with computers deserves the “sort of” qualifier tacked on the end of it. This includes “publishing a webcomic.”
  4. Around that time there was also a lot of talk about zombies on Twitter. I’m not sure it was a coincidence.
  5. I say “regretfully decided…” because I had been trying to get SSO to work with Drupal for at least three years with no success, and when I finally get it working, and working almost perfectly, it’s up for one week before I have to shut it down.
  6. The irony here is that Project Wonderful supports SSL when you browse to it, it just doesn’t send encrypted ads. That’s OK, though. We’ll be able to work that out.
  7. Though to be fair I can just copy that image onto my site and display it locally.
