Tag Archives: history

How Britain Broke The World

Popular opinion is that the whole of the UK was against Blair’s invasion of Iraq. Over a million people marched in London.

I wasn’t one of them.

I’m not sure that I was as politically engaged then as I am now, but the main reason that I wasn’t there was that I wasn’t entirely against the intervention. Sure, I never believed the justifications that were given. The claimed ability to attack within 45 minutes seemed unlikely, and the connection to Al Qaeda didn’t seem plausible either. Blair deserved all the resistance he got for such obvious untruths.

So if I didn’t believe in the reasons given, why was I not against the invasion? Because the regime was abusive to its own population. We talk about choices and democracy and representation, but what can people do when they have such a corrupt, oppressive and violent government?

Other countries tend to say it’s not their concern. Does that mean it’s okay to let people suffer because they were unlucky enough to be born in the wrong country? I say no1. The international community has a responsibility to the world’s population, wherever they live2.

I should add that my (limited) support of the policy was about the idea of an intervention. The execution of the idea was clearly a mess, but no one marching knew that.

Anyway, the book.

As I tend to do, I second-guess myself. Was my opinion, if not correct, then at least justifiable? If I didn’t know better, should I have known better? I got “How Britain Broke the World” by Arthur Snell to answer that question, and others.

It starts in 1997 with Kosovo and finishes with Brexit in 2021. I do think it strays from the title at times, which comes across as the book equivalent of click-bait. However, it largely answers my questions.

The simple chronological structure helps put the individual events into perspective. I’d forgotten some of them, and the details of many. In the end, I think my opinion on the Iraq invasion is similar to that of many of the interventions: the argument for doing something was there but the execution was poor3.

If there’s something to take away, it’s that we don’t learn.

You can see that because, weirdly, this is a very current book. By which I mean that shortly after reading various sections, I’d come across some contemporary event that was about the same thing. Russia. The Middle East. The US-UK “special relationship.” It’s all there and it’s all ongoing.

I can’t say I’m now an expert on any of these events or situations. It’s all complicated. Many of the challenges we have come from people trying to give simple solutions to complex problems4. But I can say that I am better informed than I was. To paraphrase Donald Rumsfeld, I now have fewer unknown unknowns.


  1. It’s a slightly odd realisation to finally figure out that you don’t believe in the concept of a country. Not that I deny countries exist, obviously, but in the sense that your potential shouldn’t be constrained by the place you happen to have been born. ↩︎
  2. Deciding which rights are universal has been a challenge, too. The Universal Declaration of Human Rights seems pretty good to me, though arguably it comes from a liberal, western perspective, so perhaps I would? ↩︎
  3. I realise this isn’t a wholly original take. I’m just slow on the uptake sometimes. ↩︎
  4. That’s Brexit in a nutshell. ↩︎

History

A few years ago I had a job where every new recruit would go through a long process of shock and gradual acclimatisation to the main software product.

What it did doesn’t matter as much as how it was built: it was an application developed on top of a proprietary programming language and user interface designer. The reaction was always the same. Why? Why?! Why would you reinvent Visual Basic on Unix? Why would you inflict a programming language even worse than Basic on developers?1

The answer, it turns out, is that the original developers were idiots.

No, of course that’s not true. But if they weren’t idiots, why did almost every developer start from that point of view when they first arrived at the company?

That brings us to Twitter and its new owner. One of his first public proclamations was to declare that there are too many micro-services running and, worse, that most of them do nothing useful! The reply-guys all agree and, between them, argue that it’s entirely possible to rebuild Twitter from the ground up in weeks, possibly even a weekend, given enough pizza and Blue Bottle.

Were the original developers of Twitter also idiots?

I don’t know much about Twitter’s architecture, but I’d be willing to bet that, no, they weren’t stupid either.

If it’s not the original developers, what does it say about the critic? It says that they see the complexity but not the nuance. They see the current state but not the decisions that led up to the current system. They see complexity but, without understanding the whole problem domain, they don’t see why that complexity exists.

In the case of my job, the software predated Visual Basic, which is a pretty good reason for not using it. It also had to work on Unix and be editable on client sites without extra tooling. By the time I worked there, it may have been dated but it was in production at many clients. It worked. Sure, it’s not how you’d architect it now but the decisions that led to the design did make sense.

If it’s dated, then why not rewrite it? That has been covered many times before, but the short answer is that when you design the replacement, you focus so much on the clean, new solution that you forget why you added the warts to the old system. The layers upon layers of fixes and enhancements represent real-world experience. Those micro-services are there for a reason. Not understanding the reason doesn’t change that2.

This is not an argument against evolving the software, only that you should understand what you already have. Sometimes rewriting can be justified. Rationalising a bunch of micro-services isn’t always a ridiculous idea. But there’s an important difference between complex and complicated. Can you know which your inherited system is after a few days on the job?


  1. It was a stack-based language, along the lines of Forth and PostScript. Long-time users could do amazing things with tiny amounts of code. I never quite got there. ↩︎

  2. Logical fallacy: argument from incredulity. ↩︎

Crisis? What Crisis?

Empty shops, rising prices, the laughing stock of Europe, our place in the world in question, people out of work and fuel shortages. But that’s enough about late 2021. I decided that I wanted to learn more about the Seventies, the decade that brought, well, me, the Winter of Discontent, power cuts, the three-day week and shocking fashion sense. There are a few books that cover the same timeline, but I decided on “Crisis? What Crisis?” [affiliate link] by Alwyn W. Turner.

The book is in roughly chronological order, with occasional jumping around to make certain aspects make sense.

Despite it being such relatively recent history, a surprising amount of the material is shocking, or at least uncomfortable. I know the name Enoch Powell and the phrase “rivers of blood,” of course, but even so the more detailed background is both depressing and familiar. The parallels with the modern anti-immigrant movement are obvious.

On the other hand, it made the rise of Margaret Thatcher more understandable to me. I’m not a fan of her politics, but you can appreciate the desire to shake things up. Having said that, I thought her victory in the 1979 election was assured, so it was fascinating to read that it wasn’t, and that, had the election been called just a few months earlier, things might have turned out differently.

Those looking for a change with Thatcher may not have realised what they were letting themselves in for. I guess I’ll have to read the next book about the Eighties to find out.

Looking back, the Seventies is often seen as a “lost” decade, which is why it’s nice that the book concludes with the upsides that we often don’t consider:

For most of the country, for most of the decade, times were really quite good. In retrospect, the 1970s can look like a period of comparative calm and stability. It was still possible for an average working-class family to live on a single wage, very few were required to work anti-social hours, and housing was affordable for most.

Almost by definition, I can’t say how complete the book is but I do get a much better feeling for the decade than I had before, which makes it worth the read.

The Computers That Made Britain

I’m still fascinated by the computers of the eighties. Without well-known standards, every machine was different, not only from those of other manufacturers but also from older machines by the same company. As a user it was terrible. Back the wrong horse and you’d be stuck with a working computer with no software and no one else to share your disappointment with.

But looking back, there’s a huge diversity of ideas all leaping onto the market in just a few years. Naturally, some of those ideas were terrible. Many machines were rushed and buggy, precisely because there was so much competition. Going on sale at the right time could make or break a machine.

Tim Danton’s “The Computers That Made Britain” is the story of a few of those machines.

He covers all the obvious ones, like the Spectrum and the BBC Micro, and others that I’ve not seen the stories of before, like the PCW8256.

While it’s called “The Computers That Made Britain” rather than “Computers That Were Made in Britain,” I would argue with some entries. The Apple II is certainly an important computer but, as noted in the book, it didn’t sell well in the UK. Our school literally had one, and I think that’s the only one I’ve ever seen “in the wild.” Sales obviously aren’t the only criterion, but the presence of these machines presumably pushed out the NewBrain and the Cambridge Z88 (among others). Since this book is about the computers that made Britain, I would have liked to see more about them and less about the already well-documented American machines like the Apples and IBMs.

The chapters are largely standalone, meaning you don’t need to read them in order. I read about the machines I’ve owned first, before completing a second pass on the remaining ones. They’re invariably well researched, including interviews with the protagonists. Some machines get more love than others, though. The Spectrum chapter finishes with a detailed look at all the subsequent machines, right up to the Spectrum Next, though curiously missing the SAM Coupé. But the Archimedes gets nothing, even though there was a whole range of machines. Did they run out of time, or was there a page-count limit?

But those are minor complaints for an otherwise well put together book. Recommended.

It’s published by the company that makes Raspberry Pis, which you could argue is the spiritual successor to the Sinclair and Acorn machines. You can download the book for free, but you should buy it! The above link is for Amazon, but if you’re near Cambridge you should pop into the Raspberry Pi store and pick up your copy there instead.

If this is your kind of book, I would also recommend “Digital Retro” and “Home Computers: 100 Icons that defined a digital generation,” both of which are more photography books than stories.

War?

Eric Schmidt says Google is the new Microsoft and it’s winning the war against Apple. I think he’s missing some perspective.

One of the key things that Steve Jobs realised when he returned to Apple in the late nineties was that the industry is not necessarily a zero sum game.

We have to let go of a few things here. We have to let go of the notion that for Apple to win, Microsoft has to lose.

The current situation is not identical, but I think that the lessons might be substantially the same. While Google believe that they’re winning, it’s not clear to me that they’re playing the same game as Apple and Microsoft. It’s like saying that you’re winning at Scrabble when your opponent is playing chess; sure, you played some great words on triple letter scores, but your chances of getting checkmate are limited.

For all Google’s efforts and market share, most web traffic and ad impressions — the real metrics that they’re interested in — still come from iOS. They have largely succeeded in commoditising the smartphone; unfortunately, their users either don’t surf the web much or, in the case of Android-derived devices like the Kindle Fire, do but don’t go via Google.

Would they not be better off toning down the rhetoric and figuring out how everyone can play nicely? War makes for good headlines but often ends with everyone losing.

For Google to win, Apple does not have to lose.

Spectrum

You’ve probably seen that it’s the Sinclair Spectrum’s thirtieth birthday today. There are lots of great retrospectives — this is probably the best single source I read — so I’m not going to rehash all that. But I thought it might be worth a few words of my own.

Like many Brits my age, the Spectrum was my first computer. Technically it was the family computer, but after a few weeks I was pretty much the only one who used it.

I remember some of the first few hours using it. I remember, for example, ignoring most of the preamble in the manual and diving straight into typing in a programming example. Those who have used a Spectrum will realise that you can’t just type in code; you need to use special keyboard combinations to get the keywords to appear on screen. I didn’t know about that.

After a while I managed to persuade the machine to let me type in the code. The computer didn’t really understand it and I didn’t know why. I can’t remember whether I found the right section in the manual before having to go to bed, but even in my confusion I knew that I was hooked.

And really that was the start of my career in IT. I started out really wanting to play games, but I ended up spending more and more time programming and less and less loading Bomb Jack. Usually I’d see something neat in a program and think “How do they do that?” Then I’d try to figure out what they’d done. I was quite proud of making some text slowly fade into view and then gradually disappear again. Obviously that was after the obligatory:

10 PRINT "Stephen was here!"
20 GO TO 10

I remember all that surprisingly well. Which makes the following line all the more shocking:

It may have been startlingly modern once, but at 30, the Sinclair Spectrum is as close in time to the world’s first commercial computers of the 50s as it is to the latest iPad.

The first commercial computers were created in the early fifties. The first computers — at least computers you’d kind of recognise as computers — were only built a few years before that. Computers are so powerful and connected these days that it’s difficult for me to remember what I even did with them back then. I wonder what we’ll make of the iPad in a few decades? I’m sure it’ll look just as dated.

The last thing I wanted to mention was about that weird, unique program entry system. In short, each key had a number of different keywords printed on it. The J key, for example, had LOAD (used to load a program from tape), VAL, VAL$ and the minus sign. When you entered code, the editor would be in one of a number of modes and, in addition to shift, there was a key called “Symbol Shift.”
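
To make that concrete, here’s a toy model of the J key sketched in Python. The mode names and lookup table are my own simplification for illustration, not how the ROM actually works:

# Toy model of the Spectrum's mode-dependent key entry, using the J key
# as the example. "K" is keyword mode, "L" is letter mode and "E" is
# extended mode; this table is illustrative, not the real ROM logic.
KEY_J = {
    ("K", None): "LOAD ",       # keyword mode: a whole keyword per keypress
    ("L", "symbol"): "-",       # letter mode + Symbol Shift: the minus sign
    ("E", None): "VAL ",        # extended mode: VAL
    ("E", "symbol"): "VAL$ ",   # extended mode + Symbol Shift: VAL$
}

def press_j(mode, shift=None):
    # Anything not in the table is just the letter itself.
    return KEY_J.get((mode, shift), "j")

print(press_j("K"))            # LOAD
print(press_j("E", "symbol"))  # VAL$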

I’ve never seen a sensible explanation for this. They all seem to say it was “easier,” or that it was a cue for users so they’d know all the valid keywords in Sinclair BASIC. I never bought this. Is it really easier to remember a bunch of non-standard keyboard shortcuts than to just type? I don’t think so.

And then when I was at university I did a course on compilers, the software used to turn human-readable code into the binary gibberish that a computer can actually run.

The interesting bit was all around the grammars and recursive descent parsers and the mental challenge of writing a program that wrote other programs. But the first step of the process is called lexical analysis, which takes the jumble of letters that is your program and converts them into “words” that can be processed more easily, so PRINT is a keyword, 123.4 is a number and “Hello” is some text.
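
For flavour, here’s a minimal lexer sketch in Python. The token set and patterns are mine, chosen only to illustrate the idea, not anything from Sinclair’s ROM:

import re

# Split a line of BASIC-ish source into (kind, value) tokens.
TOKEN_RE = re.compile(r"""
    (?P<KEYWORD>PRINT|LOAD|GOTO)    |
    (?P<NUMBER>\d+(?:\.\d+)?)       |
    (?P<STRING>"[^"]*")             |
    (?P<SKIP>\s+)
""", re.VERBOSE)

def lex(source):
    tokens, pos = [], 0
    while pos < len(source):
        match = TOKEN_RE.match(source, pos)
        if match is None:
            raise SyntaxError(f"unexpected character: {source[pos]!r}")
        if match.lastgroup != "SKIP":   # drop whitespace
            tokens.append((match.lastgroup, match.group()))
        pos = match.end()
    return tokens

print(lex('PRINT 123.4 "Hello"'))
# [('KEYWORD', 'PRINT'), ('NUMBER', '123.4'), ('STRING', '"Hello"')]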

Given the resource limitations of writing a whole operating system and BASIC interpreter in 16K, my guess is that it was easier to write a strange keyboard entry system than a lexer.

Can anyone comment on the accuracy of this guess?

But back to nostalgia. From hazy memories, to university, to wild speculation and the iPad. We’ve come a long way. But it was the Sinclair Spectrum that started it all for me. Thank you Clive Sinclair!