Category Archives: Opinion

Thoughts on computers and the IT industry.

When is a pencil and paper better than a computer?

In this article in MacUser Howard Oakley notes that a number of schools have recently banned the use of wireless networks due to the unknown effects of the radio waves used. He then connects this with the declining number of people taking science subjects at those same schools and their ability to understand the likely risks of said networks.

It’s an interesting piece, but what strikes me is that as the general population’s understanding of how the world works dwindles, our reliance on high technology increases ((As this article asks, in relation to decreasing interest in science degrees, “do they just totally not care about where things like web search and MP3 codecs and 3D graphics and peer-to-peer protocols come from”?)).

One incredible thing is that sometimes we move to a highly technical solution despite there being little advantage in it. Or at least, as far as I can see, the only advantage is that it is digital and new.

My favourite example is that of electronic voting machines. It’s easy to point and laugh at all the problems that they’ve been causing, particularly in the recent elections in the US. But despite the problems, despite every indication that it is often the machines, rather than the electorate, that choose who wins an election, there is still a drive to increase their use.

The main thing that I want to know about the voting machines is this. What problem are they solving? What part of the old system was so broken that it required a complicated, flawed and unreliable new system? ((It’s also worth noting that before the new electronic machines, the US had problems. Remember the “hanging chads” problem with Bush’s first election? That was a flaw in a method of automatically counting votes.))

Some say that the current system is inefficient or labour intensive or slow. Unfortunately, as far as I can see, that’s either untrue or by design. The system needs to be both anonymous and yet track fraud, two ends of the privacy spectrum. The model in use is similar to public key cryptography in that working out who made a particular vote is not impossible. In fact, in principle, it’s very easy. But unless you have plenty of time to manually check thousands of ballot papers it’s going to take a while. This is by design.

Similarly, the effort required to count the votes in the first place is also a benefit. It makes it more difficult for any individual to have a dramatic effect on the final outcome. This is a good thing.

And slow? It’s simple to make quicker: throw more people at it. That makes it quicker and reduces any inherent bias.

I love new technology and gadgets, I’m fascinated by how they work and the effect that some of them have on society. But in the end, you have to use the right tool for the job. And the right tool does not always have an embedded computer.

Blessed is the Tool Maker

In the fifth part of Douglas Adams’ Hitchhiker trilogy, Arthur Dent makes his living among a group of stone-age settlers by utilising the one skill he had that was relevant to that world: sandwich making.

I guess we all have a special skill. But my point in this article is that if you’re a software professional, your special skill (unless you’re stranded on a stone-age planet) should be making software tools.

This realisation dawned on me over a long time. It’s well known that developers range in ability by at least an order of magnitude, but it’s less clear why. I’m certainly not qualified to answer that question in its entirety, however I’m convinced that developers can punch well above their weight simply by learning to write software that automates dull tasks, such as writing other programs.

Now a lot of people reading this will be thinking, “What’s the big deal?” Is this not obvious? Does every developer not do this? Originally I thought they did, but it recently became clear to me that this is not the case.

As is my habit, I had written a small utility program. A colleague at another company was working through the same problem, so I shared it with him; the script had only taken me a couple of hours to write. I made no claims that it was perfect but I knew that it did most of the heavy lifting required.

Next came the surprise.

A couple of days later I conversationally asked how he’d got on with my script. It turns out that it didn’t work straight away; there was a minor bug. As far as I can tell he immediately abandoned my code and spent five hours doing the same thing by hand. He’d not mentioned the bug to me before and, for the record, it took me about ten minutes to fix the problem that he’d encountered.

So, let’s make things very clear here. This otherwise smart person had done five hours work rather than spend a few minutes looking through a program or, even more bafflingly, ask me a simple question ((There’s an argument that the problem here is his communication skills. I’m not convinced that is the case in this circumstance. He’s normally fine. And I think I’m fairly approachable!)).

What’s the difference? Why do some people go out of their way to write simple programs while others go out of their way to do repetitive, mind-numbing tasks?

I think it’s simple. Writing a program to write another program just does not occur to many people; they have difficulty thinking in abstract terms and, in general, can’t go “meta.”

Looking back to my university days I remember a number of parallels. Some people had problems with the assembler course, but the most telling example was our compilers course. A compiler is literally a program that writes another program, normally translating a high-level language such as C++ into assembler or machine code, and to write one you need to keep two separate worlds in your head simultaneously. First, there is the compiler itself: the lexical tokens, the grammar, and the memory you’re allocating and using as it runs. Second, there is the program that the compiler will write: the stack it will use, the memory it will need to allocate at run-time. Both are just computer programs, yet many, if not most, of the people on the course had difficulty getting their head around either the concept or some level of the implementation.
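Those two worlds are easier to see with a toy example. The sketch below is a miniature “compiler” that translates a tiny reverse-Polish arithmetic language into Python source and then runs the generated code; the language and function names are my own invention for illustration. Writing it means tracking both the compiler’s state (the tokens) and the behaviour of the program it emits (the stack):

```python
# A toy "compiler": a program that writes another program.
# World one: the compiler's own state (the token stream it is parsing).
# World two: the behaviour of the code it emits (a stack machine).

def compile_expr(source: str) -> str:
    """Compile a space-separated RPN expression into Python source."""
    lines = ["def generated():", "    stack = []"]
    for token in source.split():
        if token.isdigit():
            lines.append(f"    stack.append({token})")
        elif token in "+-*":
            lines.append("    b = stack.pop(); a = stack.pop()")
            lines.append(f"    stack.append(a {token} b)")
        else:
            raise ValueError(f"unknown token: {token!r}")
    lines.append("    return stack.pop()")
    return "\n".join(lines)

# The generated program is ordinary Python text...
code = compile_expr("2 3 + 4 *")   # (2 + 3) * 4
# ...which we can then execute like any other program.
namespace = {}
exec(code, namespace)
print(namespace["generated"]())    # → 20
```

Even at this scale you can feel the mental juggling: an off-by-one in the compiler shows up as a bug in a program you never directly wrote.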

So far you might think that I’m bragging, but really I’m not. There’s another key part of this. It’s not just a matter of building software tools, it’s knowing when it’s not worth the effort. Had my colleague known that it would take me five hours to fix the problem then he would have made the right move (if we discount the possibility that the script could help other people if fixed).

Similarly, on a number of occasions I have spent far longer building a tool than it would have taken to do it by hand. Sometimes the desire to solve a programming challenge means that I lack the perspective to see when it’s not such a good idea to script the task. Tool building is not without risk. Those truly great developers, no doubt, have both the tool building ability and the ability to recognise when it makes sense to do so.

But overall I still think that one very significant factor in making some developers more productive than others is their ability to write software tools, particularly programs that write other programs.

The end of WMA?

The sky is falling! EMI have announced that they are to allow distribution of their content without DRM. From next month, you’ll be able to buy albums from iTunes without the digital rights management chains of Apple’s FairPlay and in higher quality (twice the bit-rate). This is clearly good news, and EMI’s move can’t help but encourage the other major labels to follow.

But one thing missing from the articles is that this also pretty much spells the end of Microsoft’s WMA.

Right now, when you buy a song from iTunes you get a file with AAC encoding. AAC is the follow-up to MP3 and is both higher quality and, unlike the latter, requires no payment for distributing a player. [ Update 2007/04/10: Okay, I got this bit wrong. There are royalties for selling a player or encoder. However, distributing content is free. For a low margin service such as the iTunes Store this makes perfect sense. Plus, the fact that AAC is not controlled by a single organisation makes it more desirable overall. ] That is, it’s an industry standard. What is non-standard about iTunes is the FairPlay DRM system.

WMA is Microsoft’s attempt to tie music playing to Windows. Both the file format and the DRM scheme it uses are proprietary, tying you to Windows Media Player (only now becoming usable with version 11) and one of the few PlaysForSure devices you see, dusty and unloved, next to the latest iPods. Even Microsoft’s Zune uses a different scheme.

Previously there was an advantage, if more potential than actual, in that WMA gave you a greater choice of on-line music store and music player. But the new EMI songs will be in AAC format, which is playable on most recent portable music players, including the Zune.

Why would Creative licence WMA in the future given that AAC is free?

And those stores that compete with iTunes? They can also use AAC, which doesn’t require payment to Microsoft for its use and can be used on an iPod (which WMA can’t).

Why would Yahoo licence WMA in the future given that AAC is free?

Microsoft have spent the last five years chasing the iPod and Apple’s “closed” system. With the Zune they have finally achieved parity. Only now they find that the landscape has changed again. How will they respond?

Follow-up: Belkin Wi-Fi Phone

Back in November I wrote about my then-new Belkin Wi-Fi Phone for Skype. At the time I was fairly pleased with the concept but less so with the actual implementation.

The phone’s hardware was fine. The unit as a whole was reasonably solid. The buttons were a bit wobbly and the screen was smaller than you might initially think, but there was nothing to complain about too much.

The software, however, was more problematic. The main issue we experienced was the unit drifting on- and off-line when it was left unattended. The only way of keeping the unit on-line all the time was to leave it plugged in. Not exactly optimal.

On trying to upgrade the firmware we found that the update software was packaged as a Windows executable, not ideal for this Macintosh-only home. Of course, the idea is that the phone can be used without a computer, so it’s slightly comical that the system requirements for fixing known bugs are so specific.

I tried reporting the bugs and asked Belkin technical support for a solution. My first problem was that it wasn’t easy to navigate their website. The Skype phone was so new that it was not possible to enter its SKU into their site, which meant I couldn’t raise a support call. After some time playing around I eventually found a way of doing it.

My effort was rewarded with… nothing. I got no reply at all. Maybe they were confused by the UK details as it turns out that I’d posted my question to the US technical support department. I still think it was rude to get no reply whatsoever.

I waited until January 2nd to resubmit the question. This time I was able to do so from the UK site and, fortunately, this time I got a response.

My experience with them has been mixed. In general the information given has been good, and my messages had clearly been read by a real person. With other companies I often get replies asking me to do something I’ve already tried, or something that is clearly not possible. ((My favourite is from a guy at a company that will remain nameless. He kept asking me to click the Start menu and select Internet Explorer. I explained that I had a Macintosh which had neither of those things. That would have been fine, except he was very insistent that I did, in fact, have a Start menu. And every computer had Internet Explorer. Clearly I was crazy. I eventually gave in, told him that I’d done as he asked and clicked Safari on my Dock.)) But not with Belkin. And they offered to replace the phone without any quibbling.

I packaged the phone up and mailed it back to Belkin. I was quite impressed that they paid for the postage. Realising that our SkypeIn number was now unavailable when we didn’t have a computer switched on, I forwarded calls to our mobile numbers. ((I was a little hesitant to do this as I didn’t find much documentation on what to expect. Do both phones ring? What if one is switched off? What happens if we’re logged in on a computer, too?))

I would also commend them on returning calls. Their call centre was always busy, but after five minutes on hold they’d always take my number and then actually call back. Sure, it would have been easier if I could have got through to a real person straight away, but this approach was a pleasant surprise.

However, what seems to have been lacking is communication. Responses often took over a week to arrive, and sometimes never arrived at all. In one instance I was told that the Customer Services department would be sending me the phone within a couple of days. They say this did happen, but I received no notification and ended up writing again a couple of weeks later when the phone failed to arrive. One advisor told me that the parcel had no tracking number; another said that they couldn’t find my address.

So after all this effort what is the verdict? Does the replacement phone work?

It’s early days yet, but the answer seems to be a qualified “yes.” The phone operates correctly under battery power. While it does occasionally appear offline when viewed from another Skype window, calling my SkypeIn number from my mobile has, so far, usually resulted in the Belkin ringing. This is not 100% correct in my opinion, but it is good enough for our use.

The other definite plus is the reduction in the annoying echo that pretty much every caller on the old phone mentioned. As before, the call sounds fine from the Skype phone. From the other end of the SkypeOut ((One thing that works against the Belkin phone is not a problem with the device itself. In January Skype changed their call plans, requiring a “termination” charge for every call. In the case of many UK calls this will actually double the cost. This makes the case for switching away from having a landline less convincing than it was.)) call, however, things are much improved. And, so far, we’ve not had any problems with reliability. We’ve not experienced any dropped calls or crashes during a call. This is something I intend to keep a close eye on in the future.

The “qualified” comes from the things that still do not work correctly. For example, left unattended while we were at work yesterday, the phone somehow just got stuck at 15:50. This didn’t immediately arouse suspicion as it does tend to lose time very quickly. Pressing a button brought the otherwise blank screen to life giving the appearance of activity, but nothing actually worked. I had to restart it.

Of course one of the main reasons we sent it back was so that we could reliably run it on battery power. We might have to reconsider this now we know how long it lasts between charges. And the answer is: not long. It has needed charging every couple of days so far, and that’s without it being used for calls.

Overall the replacement phone is a great improvement over the original model. The call quality is improved and it is possible to operate it without mains power, which does mean that it does what it says on the box. Still, the glitches and the battery life stop me from unreservedly recommending the phone.

Backup

Reading this article by David Pogue reminded me of my own search for a reliable and easy backup solution. I came to a rather different answer so I thought it might be worth detailing a little history, the options I considered and the one I eventually chose.

Ancient History

In the olden days — i.e., going back a couple of years — I split the files into four categories:

  1. System files such as the operating system itself and all my applications
  2. Email
  3. Small files such as Word documents and Excel spreadsheets
  4. Big files such as movies, photos and music

Category one, system files, I didn’t back up at all, figuring that I had all the discs and licence keys so I could recreate the whole thing if necessary.

I was much more careful with emails. While my archive gets a little spotty in places where I’ve not been able to transfer messages from one computer to another, I have some going back to 1997 and have no intention of losing them. So I used the IMAP server ((There are two types of email server in common use at most ISPs. “POP” is the most common and is designed to allow you to download messages but not to store them for the long term. “IMAP” is rather more sophisticated; email programs tend to cache messages that are permanently stored on the server.)) provided by Apple’s .Mac service. This had the added bonus of allowing me to access my emails on my desktop, laptop and work machines.

For small files, category three, I also used part of the .Mac service, this time the iDisk. iDisk is basically an online disk ((Technically speaking, it’s just a web server accessed using the WebDav protocol. OS X cached the files so they were available off-line too.)), automatically copying anything saved there onto a server on the internet. Everything was magically and almost immediately backed up. Again, this allowed me to access my files pretty much anywhere.

I backed up larger files on a more ad hoc basis. I copied my photos onto CD (or DVD for larger projects) more or less as soon as they were downloaded from my camera. Usually. As ever, laziness had a tendency to set in and it didn’t always happen for a while after the event; not good. Music downloaded from iTunes followed a similar pattern.

Everything ripped from a CD was not backed up at all. I had no easy way of backing up over 20Gb of data and, in theory, I had all the source data anyway.

What happened next

There are (at least) three problems with this system. First is .Mac’s reliability, something I’ve already talked about. Second is the piecemeal approach, where some things are saved immediately and other things might not be backed up at all. And finally, if my hard-disk did ever die, how long would it take to restore everything?

When I got my new MacBook I realised how long it takes to set up all my applications. And when I wiped the hard-disk of my “old” iMac ready for sale on eBay I found how long it can take to install the operating system and get it up to date. For the record, it took the best part of a day and that was without restoring any data ((First install OS X 10.4.0. The plan was to install iLife and then run software update, but iLife required OS X 10.4.3. Unfortunately, one of the problems with 10.4.0 was that WiFi didn’t work so it took a while to work around that. Then I updated to 10.4.8, which was over 100Mb of files. Then I was able to install iLife, which also required some very large updates. In all, there were over 250Mb of updates to just iLife and OS X.)).

In reality, despite these problems, I was far too lazy to bother changing things. Until I decided on two things: not to renew my .Mac subscription and to get a new laptop not just to replace my ailing iBook but also to replace my iMac. Without .Mac I had no choice but to re-evaluate my options.

I am going to concentrate only on the backup side of things in this article, although that was not the only thing I changed. For example, rather than use an alternative IMAP provider for my email, I decided to move to Google Applications for Domains. My experience there is perhaps worth another post but for now I will move on.

For backups I considered two main approaches, online backups as promoted by David Pogue in the first link in this post and traditional backup software, copying data to an external disk.

The answer

I considered a few different online backup solutions. Their pricing structures varied slightly and I’m sure that they all worked, but after I thought about it I realised that none of them were ideal. Pogue mentions the main problem in passing: timing. I have about 100Gb of data that needs to be backed up. How long would that take, even on a broadband connection? Let’s see, 256kbps (the upload speed on my cable broadband connection) works out to around 20k/sec. A little bit of maths ((100 * 1024 * 1024 / 20 / 60 / 60 / 24 = 60.68)) suggests it could take 60 days to upload the whole disk! In the event of a disaster, it would be quicker (download speeds are ten times faster than upload speeds) to download everything but it would still be nearly a week! Even if my maths is bad (very much a possibility), I still think my bandwidth would make online backup impractical.
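The footnoted sums above are easy to sanity-check. This short snippet reproduces the estimate using the figures from the post (100Gb of data, roughly 20k/sec sustained upload, downloads ten times faster):

```python
# Back-of-the-envelope check of the backup-timing estimate.
# Figures from the post: 100 GB of data, ~20 KB/s sustained upload
# (what a 256 kbps link manages after protocol overhead).

data_kb = 100 * 1024 * 1024           # 100 GB expressed in KB
upload_rate_kbs = 20                  # KB per second
seconds = data_kb / upload_rate_kbs
days = seconds / 60 / 60 / 24
print(f"Initial upload: {days:.1f} days")       # → Initial upload: 60.7 days

# Downloading at ten times the speed still takes most of a week.
print(f"Full restore: {days / 10:.1f} days")    # → Full restore: 6.1 days
```

At roughly two months for the first full upload, the conclusion holds: on this connection, online backup of a whole disk is impractical.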

Note that I think online backup is a fantastic idea in principle, just not one that works for me. If you mainly have Word documents to backup I’m sure it’s cheap and easy.

So, software. I had been using Apple’s Backup software, which is mainly for use with .Mac but also works with external hard-disks. However it was slow and, according to reports on the Internet, very unreliable. The one time I ever needed it I had managed to get back my files but I didn’t really want to risk it another time.

My LaCie external hard-disk came with SilverKeeper backup software. I had a quick play with it, but it looked overly complicated. Having had to write backup software at work, I did have a tendency to look for “complete” solutions — i.e., get me that file I deleted three weeks ago — but on further thought I realised that I didn’t need anything so sophisticated at home.

Most other software I looked at was just as complicated and, worse, they all had the same failing: they don’t back up the operating system itself. I’d already seen that it takes nearly a day just to restore the system software before you even start looking at the data.

In the past I had played with Carbon Copy Cloner (CCC), software that clones a whole disk. I liked the idea — as the backup is just a complete, bootable copy of the main disk — but I didn’t like the amount of time it took to copy the whole disk every time.

In the end I decided on Shirt Pocket’s SuperDuper!, which is like CCC except it has a “Smart Update” feature that only copies the files that have changed, making a weekly backup take, on average, less than ten minutes. Nice.
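The general idea behind an incremental update like this is simple: walk the source disk and copy only files whose size or timestamp differs from the copy already in the backup. The sketch below shows that technique in Python; it is my own illustration of the approach, not SuperDuper!’s actual implementation (which, among other things, also handles deletions and ownership):

```python
# A sketch of a "Smart Update"-style incremental copy: mirror src
# into dst, skipping any file whose size and modification time
# already match the backed-up copy. Illustrative only.
import os
import shutil

def smart_update(src: str, dst: str) -> int:
    """Mirror src into dst, copying only new or changed files.
    Returns the number of files copied."""
    copied = 0
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            s_stat = os.stat(s)
            if os.path.exists(d):
                d_stat = os.stat(d)
                # Same size and mtime? Assume unchanged and skip it.
                if (s_stat.st_size == d_stat.st_size
                        and int(s_stat.st_mtime) == int(d_stat.st_mtime)):
                    copied += 0
                    continue
            shutil.copy2(s, d)   # copy2 preserves the timestamp
            copied += 1
    return copied
```

Because unchanged files are skipped, the second and subsequent runs touch only what has changed since the last backup, which is why a weekly run can finish in minutes rather than the hours a full clone takes.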

So that’s the system I’m using at the moment. Would something like that work for you too?

Hiring

Talking about Google’s old and new hiring practices seems to be all the rage at the moment, so I thought that I would get in on the act.

I got through two phone interviews for a technical consultant role here in the UK before being rejected. My second interviewer told me that he’d had fourteen interviews before being hired. That’s just an absurd number. How much holiday and sick leave can you take at short notice without arousing suspicion?! (Both interviews were too long, and required Internet access, for me to do them at work.)

By the end of the second interview I was in two minds about whether to take things any further anyway. I wanted to work for Google, but could I go through fourteen interviews? I was concerned about the money, as no figure was on the job spec and big names often pay low salaries, offering stock options to compensate. I can’t pay my mortgage with stock options! And the work on the consulting side didn’t sound quite as appealing as the kind of thing you hear about on the development side.

Most significant was the style of interview. They asked brain-teasers, which I tend to think are a lousy way to size up a candidate: either you know the trick and can do it instantly, you get lucky, or you need a hint. None of these really shows how smart you are, how well you can program a computer, how you interact with clients or, indeed, any other aspect of the job. The interviewer was also clearly typing away in the background while I was trying to answer the questions, only half listening, which was just plain rude.

Most communications were friendly and personal, right up to the last. The rejection email was signed, impersonally, “Google Staffing.”

So overall I’m not terribly impressed with Google recruitment. Okay, maybe I’m biased against them as they turned me down but as an interviewer I’ve always considered part of my job as leaving a positive impression of the company even with candidates that are not going to be hired. Google failed in this.