Hi, neighbourino! I say that since Easter is a time for being with family and, in the case of many, gorging on chocolate-based products. I had plenty of it for my birthday (imagine a box of paper filled to the brim with confectionery)… In my previous blog post (linked here), I suggested that one of my projects would be to upgrade this website to use a new web framework. I had some time today, and decided to move to the future, that is, using HTML5 and CSS3, made possible thanks to Twitter Bootstrap.
Now, I’ve built my own sites in times past, including the one that used to sit here (if you’re lucky, it might be on the Wayback Machine, or cached on Google). I was proud of the design, and even though it used tables here and there, it served its purpose. The only issue was that Fancybox (a fantastic jQuery plugin), which handled users clicking links in the navigation bar to take them to the next page, wasn’t very good on tablets. I manhandled the code to work on iPads and the like, but it wasn’t clean.
Being at university, I saw the benefits and cool things that responsive design can provide today. Twitter Bootstrap is a very common HTML framework, and it’s great if you have dynamic websites, for example those served via PHP. That said, as I mentioned in my previous post, I hand-build each page (with love, sweat and sometimes tears!), and it suits my needs perfectly. The framework itself is very pleasant to use and provides a wealth of cool things you can do, such as placing items on a grid, and there is a plethora of additional plugins for Bootstrap, such as validation tools and date-pickers (if you look at the source of these pages, you’ll see some referenced). Whilst these aren’t used at present, they might come in useful later on. Bootstrap did also require some manhandling to perform the tasks I wanted, especially the search bar at the top, which is simply a redesigned Google Custom Search form. I know it’s open-source, but some consistency in the code would be very useful.
I also used Bootstrap for my Final Year Project, which was well received. Now it’s on this website, it looks very good, and again, I just built a few templates, one for the blog and one for general pages, and off I went! It was then a matter of transferring the old content from the previous system to the new one and making the appropriate changes (if you look at the previous post, you’ll notice quotations are formatted nicely now).
I received a text from a friend over a week ago letting me know about Heartbleed. What was very interesting about it was the large number of Linux/*NIX-based web servers which use the OpenSSL package, not to mention the number of clients which use it for SSL transactions. Now, whilst it was a genuine oversight, it does beg the question as to whether more time and effort should go into the open-source movement. Even government bodies were using OpenSSL, so why can we not collaborate more? That said, more people getting involved can cause issues with the overall process: numerous ‘pull’ requests each day, and there might be less-experienced people contributing to something which is fundamental. I wrote a kernel module for a basic DNS firewall, and even then I managed to brick the kernel and was forced to either use a snapshot or the emergency kernel on Ubuntu (my distribution at the time). I say Ubuntu was my distro at the time, as I’m now a Debian convert; I find XFCE better in Debian, and I don’t necessarily agree with Canonical allowing Amazon to see your searches. Also, the replacement of X Server with Mir has been delayed numerous times. I have a rule: why replace something when it does a good job?
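As an aside, finding out whether a machine is in the firing line starts with a one-liner: Heartbleed affected OpenSSL 1.0.1 through 1.0.1f, with the fix shipping in 1.0.1g. A rough sketch (bearing in mind that distros like Debian and Ubuntu often backport fixes without bumping the version string, so the package changelog is the real authority):

```shell
# Print the installed OpenSSL version; 1.0.1 up to and including
# 1.0.1f were vulnerable to Heartbleed, 1.0.1g carried the fix.
openssl version

# On Debian/Ubuntu, distro patches may not change the version string,
# so check the package changelog for the CVE instead, e.g.:
#   zcat /usr/share/doc/openssl/changelog.Debian.gz | grep -i heartbleed
```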
Anyway, I digress… The response by companies both large and small in tackling this vulnerability was no mean feat. Not only did the servers need to be patched, but their certificates had to be revoked and re-issued. I had a certificate for another server of mine from StartCom (or StartSSL as they are known to others), which is free and lasts for one year, but that server was using a vulnerable version of OpenSSL (just). StartCom, unlike other (very) reputable CAs, were charging for the revocation of certificates, whilst Terena, Verisign and other CAs were revoking and re-issuing free of charge. It’s understandable that StartCom need to charge for something, but the amount they wanted was astronomical. I certainly won’t use them again; I’ll stick to self-signed certificates. That also allows a better chain of trust, if you ask me, since I generated the certificate and can prove that it’s mine, whereas root CA certificates serve corporations well. My Debian VPS wasn’t affected, as it was running ‘old-stable’ and so used an older major version of OpenSSL, whereas my Ubuntu server (I had no choice in the matter; Canvas wouldn’t work on any other distro) was vulnerable, and I took all the required measures to get a new certificate that day.
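For what it’s worth, rolling your own self-signed certificate is a single `openssl` invocation. A minimal sketch (the `/CN=example.com` subject and the file names are just placeholders for your own hostname and paths):

```shell
# Generate a 2048-bit RSA key and a self-signed certificate valid for
# one year, without a passphrase on the key (-nodes).
# "/CN=example.com" is a placeholder; use your server's hostname.
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout server.key -out server.crt \
    -days 365 -subj "/CN=example.com"

# Inspect the result to confirm the subject and validity dates.
openssl x509 -in server.crt -noout -subject -dates
```

The trade-off, of course, is that browsers will warn about it until you install the certificate in their trust store yourself.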
This just goes to show that no matter what, you can’t be too careful. The concept certainly scares me, given the bug was around for two years before being raised; I think we’ve only seen the tip of the iceberg.
2013 and 2014 were not good years for data security either: Morrisons had personnel data accessed, and more data was stolen from a variety of sources. My favourite security blog, Naked Security by Sophos, is fantastic, and Graham Cluley and Paul Ducklin explain the concepts well. I highly recommend it to everyone.
So… my second post. What do you think? Let me know also what you like/dislike about the new site. If you liked this post, or have any comments on it, let me know: @richyjt on Twitter, or click ‘Contact Me’ in the navigation bar or below and drop me an email.