Tag Archives: computer

What are Registers?

When people say that Twitter is a cesspool of conspiracy and abuse, I don’t recognise it based on my experience. My Twitter timeline is all jokes and geeky chat1, and that’s where this post takes its cue:

When I started learning assembler, no site ever mentioned what registers were good for. Wish it had said:

CPU talking to a RAM chip is slow, registers are a bit of memory built into the CPU in which you load numbers from RAM, do several calculations, and only THEN write back.

I said that this was a RISC-centric approach and was challenged to come up with a better definition.

It’s a harder question to answer than I initially thought. Every time I came up with a pithy definition, I thought of an exception. And with every clarification I got further and further away from 280 characters.

With no character limit, what is a register?

Wikipedia says that it’s “a quickly accessible location available to a computer’s processor,” which is the definition that I was arguing against. Thanks, Wikipedia.

Nevertheless, I maintain that I’m right. Let’s dig into the definition further.

It wasn’t the “quick access” bit I didn’t like. Registers are faster than main memory2. They’re also faster than on-die caches3.

The RISC-y bit was in the second sentence: load bits of memory, do a few calculations, write back.

To explain why that’s not always true we have to take a quick tour of CPU design history. By necessity I’ve missed out many details or made sweeping generalisations4. I don’t think this detracts from my point.

First, a quick aside: for the sake of completeness, I should point out that there’s nothing sacred about registers. There are CPU architectures that do not have them, primarily stack-based but there may be others. Let’s ignore them for now.

There have been a few waves of CPU and instruction set architecture design, and the use of registers has changed over that time.

| Processor | Registers | Instructions |
| --- | --- | --- |
| Intel 4004⁵ | 16 | 46 |
| MOS 6502 | 5 | 56 |
| Intel 8086 | 8 | ?⁶ |
| Motorola 68000 | 15 | 77 |
| PowerPC | 32 | ? |
| ARM | 31 | ? |

In The Beginning, making a CPU at all was a challenge. Getting more than a few thousand transistors on a single chip was hard! The designers of the first chips optimised for what was possible rather than to make things easy for programmers.

To minimise the number of transistors, there were few registers, and those that were present had predefined uses. One could be used for adding or subtracting (the accumulator). Some could be used to store memory addresses. Others might record the status of other operations. But you couldn’t use the address registers for arithmetic or vice versa.

The answer to the question “what is a register for” at this point is that it saves transistors. By wiring up the logic circuits to a single register and having few of them anyway, you could have enough space to do something.

As it became easier to add transistors to a slice of silicon, CPU designers started to make things easier for programmers. To get the best out of the limited hardware, many developers wrote code in assembler7. Therefore, making it easier for programmers meant adding new, more powerful hardware instructions.

The first CPUs might have had an instruction to “copy address n from memory into a register” and another to “add register 2 to the accumulator.” The next generation built on those foundations with instructions like “add the value at address n to the accumulator and write to address m.” The complexity of instructions grew over time.
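That progression can be sketched in Python. This is a toy model of an entirely invented machine, not any real instruction set: the first three functions play the role of the early, simple instructions, and the last one plays the role of a later, complex memory-to-memory instruction.

```python
# A toy machine: an accumulator, a little RAM, and two generations of
# "instructions". Everything here is invented for illustration.

memory = {0x10: 7, 0x11: 5, 0x20: 0}
acc = 0

# Early style: simple instructions, so adding two values in memory
# takes several steps through the accumulator.
def load(addr):          # "copy address n from memory into the accumulator"
    global acc
    acc = memory[addr]

def add(addr):           # "add the value at address n to the accumulator"
    global acc
    acc += memory[addr]

def store(addr):         # "write the accumulator to address m"
    memory[addr] = acc

load(0x10)
add(0x11)
store(0x20)

# Later, complex style: one instruction doing all of the above at once,
# working directly on memory addresses.
def add_mem(src1, src2, dest):
    memory[dest] = memory[src1] + memory[src2]

add_mem(0x10, 0x11, 0x21)

print(memory[0x20], memory[0x21])  # both 12
```

The single `add_mem` "instruction" is more convenient to write, but, as the next paragraphs note, convenience at the instruction level says nothing about how long the memory traffic underneath actually takes.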

Working on these early machines was hard partly because they had few registers and the instructions were simple. The new instructions made things easier by being able to work directly with the values in memory.

These new instructions were not magic, however. Having a single instruction to do the work didn’t make copying data from memory any quicker. Developers trying to eke out the best performance had an increasingly difficult time figuring out the best combination of instructions. Is this single instruction that takes twenty cycles faster than these other three instructions that nominally do the same thing?

Around the same time, writing code in higher level languages became increasingly popular. The funny thing with compilers and interpreters is that it’s easier to write and optimise them when you use a limited set of instructions. All those esoteric instructions that were designed for people were rarely used when computers wrote the assembler code.

CPU designers figured out that they could make real-world code execute more quickly by heavily optimising a small number of instructions.

This led to completely different CPU designs, called RISC, or reduced instruction set chips. Rather than have a small number of special purpose registers and a large number of complex instructions, RISC insisted on large numbers of general purpose registers and few instructions8. The reduction in the instruction count was partly achieved by making arithmetic operations only work on registers. If you wanted to add two numbers stored in memory, you first had to load them into registers, add them together and write them back out to memory.
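The load/store discipline described above can be sketched the same way. Again this is a made-up model, not a real ISA: the point is simply that arithmetic only ever touches the register file, and memory is reached solely through explicit loads and stores.

```python
# A toy load/store machine: arithmetic operates only on registers;
# memory is reached solely via load and store. (Invented, not a real ISA.)

memory = {0x10: 7, 0x11: 5}
regs = [0] * 8           # a small file of general purpose registers

def load(rd, addr):      # LOAD: memory -> register rd
    regs[rd] = memory[addr]

def store(rs, addr):     # STORE: register rs -> memory
    memory[addr] = regs[rs]

def add(rd, ra, rb):     # ADD works on registers, never on memory
    regs[rd] = regs[ra] + regs[rb]

# Adding two numbers held in memory now takes four instructions:
load(0, 0x10)
load(1, 0x11)
add(2, 0, 1)
store(2, 0x20)

print(memory[0x20])  # 12
```

Four instructions instead of one looks like a step backwards, but each of the four is simple enough to be heavily optimised, which is exactly the RISC bet.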

At this point, the answer to the question “what is a register for” changed. It became a sensible option to throw away the transistors used to implement many of the complex instructions and use them for general purpose registers. Lacking instructions that worked directly on memory, the definition of a register became “temporary storage for fast computations” — pretty much what we started with.

Since then, the original designs, with lots of instructions and a small number of registers (CISC), and the newer one, with lots of registers and few instructions (RISC), have to some extent merged. RISC chips have become less “pure” and have more special purpose instructions. CISC chips have gained more registers9 and, internally at least, have taken on many of the attributes traditionally attributed to RISC chips10.

Let’s loop back to the original question. Are registers a bit of memory built into the CPU in which you load numbers from RAM, do several calculations, and only then write back?

We’ve seen how registers are not necessary. We’ve seen that their importance has waxed and waned. But, if we had to distill the answer down to a single word or sentence, what would that be?

On current hardware, on most machines, much of the time, the answer is: yes, they are a bit of memory built into the CPU in which you load numbers from RAM, do several calculations, and only then write back.

Was I being pedantic for no reason?


  1. I appreciate that there’s an element of white, male privilege here. ↩︎
  2. Even on architectures like Apple’s M1 chip where the main memory is in the same package as the CPU. ↩︎
  3. I’m going to assume you know a fair few concepts here. My focus is more “how did we get here” than “what do these things do.” ↩︎
  4. While I’ve mostly done this deliberately, I’m sure I’ve done it by accident in a few places too. ↩︎
  5. Four bits wide! ↩︎
  6. It was surprisingly hard to get a count of the instructions for most of these. ↩︎
  7. Machine code is the list of numbers that the CPU understands. Assembler is one level above that, replacing the numbers with mnemonics. Even the mnemonics are considered impenetrable by many programmers. ↩︎
  8. Of all the gross simplifications in this piece, this is probably the biggest. ↩︎
  9. I don’t want to get into all the details here but how they’ve managed to do this without substantially changing the instruction set is both fascinating and architecturally horrifying. Rather than add new instructions to address the new registers, they have a concept called “register renaming” where the register names, but not the values, get reused. ↩︎
  10. Again, without wishing to get into too much detail, modern CISC CPUs typically convert their complex instructions into more RISC-like instructions internally, and execute those. ↩︎

Webcam

I’m not entirely sure what I was thinking. In about 2005 I bought an iSight, Apple’s relatively short-lived external webcam. It was a beautiful device. Sleek, easy to use and functional.

At least, I think it was functional.

For a device that cost me well over £100 I didn’t really think it through. No one else I knew at the time had a Mac with iChat. Or a webcam.

Before I finally gave in and sold it on eBay I did use it a few times with my then girlfriend (now wife). And it was really nice; like the future. Having grown up with old, slow computers the idea of playing video on them is still slightly magical to me. To have a computer simultaneously record, compress, transmit, receive, decode and display high resolution videos still strikes me as pretty amazing.

Even now, web chatting once a week, I think it’s neat. My son, before he was two, thought nothing of having long, detailed “conversations” with his grandparents. What’s high-tech to me is entirely normal to him.

And all this leads me to my latest technology purchase: a webcam. I got it for two reasons: firstly, I’ve been using my laptop with the lid closed a lot, which means I can’t use its built-in webcam. The second reason: it only cost £5.

I’ve probably already used it more than I ever used the iSight.

Is it as pretty as the iSight? Is it as well made? No and no. But it’s amazing what £5 can get you these days. I added the following as a review:

Considering the cost it’s remarkably well put together, comes with a decent length USB cable, has a flexible stand and works straight out of the box. The LEDs are a bit of a gimmick and the picture quality is a little muddy compared with the built-in camera on my MacBook, but it’s totally usable and easily forgiven given the price.

In less than ten years, webcams have moved from an expensive toy that I wanted to like but couldn’t actually use to a practically disposable tool — I’m sure there are drinks in your favourite coffee chain that cost more — that I use almost daily. I don’t mind being a foolish early adopter if it helps get genuinely useful technology into the hands of more people. If only all my other toys prove to be quite so useful.

Which Tablet?

I was recently asked to recommend a tablet. I thought my reply might be generally useful, so below is a lightly edited version of what I wrote.

The machine I’d recommend depends. It depends mostly on how much you want to pay and what it might be used for. The good news is that, by and large, you get what you pay for. (Corollary: don’t get any of the really cheap ones. Argos, for example, do a really cheap one. Avoid it.)

The main ones I’d consider are:

Kindle Fire HD 7″ £119

By far the cheapest but very much tied to Amazon — indeed it’s pretty much sold at cost with the expectation that you’ll spend more money with Amazon later on. That means there are fewer apps (games), you can’t download/rent movies from iTunes, etc. But if you just want to surf the web, check email, etc. and play some big-name games it would be fine. Probably worth spending the extra £10 to get the version without adverts (“special offers”) though.

Google Nexus 7 £199

Nicer hardware than the Kindle but mostly what you get is access to the Google App Store, which has far more apps, lots of which are free or very cheap. It runs Android, which is the main competitor to Apple and is generally considered to be pretty good, though I’ve not used it much myself. It’s also not tied just to Amazon (though you still can’t get iTunes) but you can get most of the Amazon stuff. Like the Amazon one, it’s cheap because Google expect to make money from you in other ways.

Apple iPad Mini £249

Better hardware than either of the previous two (metal rather than plastic case) but, arguably, a worse screen than the Nexus (physically bigger but fewer pixels).

iPad gives you all the iPhone and iPad software — which is typically better than Android. Also gives access to iTunes for music, movies and TV shows. The iPad software is often considered to be bit easier and less confusing than Android and you’d get stuff like FaceTime and iMessage (free text messages with other Apple users) which you can’t get on Android.

iPad mini with Retina display £319

As above but with a far nicer screen and is about four times quicker. It will probably last longer as it’s more future proof (but that’s obviously speculation at this point). Possibly hard to get hold of right now as it literally just came out and is “supply constrained.”

Apple iPad Air £399

As above but with a 10″ screen rather than 8″. I have an older version of this, though the mini didn’t exist when I got mine…

(The prices above are “retail” prices. Some of the links go to the same product but for a lower price.)

It’s also worth noting that you can get more expensive versions of all of them that come with more space and/or cellular radios (so you can access the Internet when you’re out of the range of a friendly WiFi network).

It’s even harder to give general advice about this than the tablets themselves. In general, the more you want to download movies and large, complex games, the more capacity you’ll need. If you mostly surf the web or read books even the smallest versions should be okay. (Indeed, that’s what I use.)

The 3G/4G question is tricky. Me, I get the cellular radio because I do travel with my iPad and I have a Pay As You Go SIM which means I don’t pay a penny in months that I don’t use it. But it does cost more. You might prefer to spend the extra to get a larger storage capacity.

When I first got an iPad, it was because users of one of my apps were asking for a version that used the iPad’s bigger screen. I was skeptical that I would actually use it. These days I probably use it more than my Mac. I guess what I’m saying is that it’s worth getting the right product rather than just the cheapest.

Deleting a Google Apps Domain

Imagine the scene…

Okay, that’s a bit dramatic.

Recall that I have an iPhone app called Yummy. It has, or rather had, a website called YummyApp.com. Last year I formed a company called Wandle Software and since then have been merging my various web “properties.” The website moved over to wandlesoftware.com earlier this year, email was the last thing that needed transferring.

My email is hosted using Google Apps — Gmail but without the gmail.com email address if you’ve not heard of it. What I wanted to do was move yummyapp.com from being a “proper” domain to what Google refer to as a domain alias for wandlesoftware.com.

I assumed that what I’d have to do was deactivate the old one, wait a bit and then reactivate it on the new domain.

So I looked at the Dashboard to find the “delete” option.

Nothing.

So I deactivate all the services. I delete all the users except one. And I look again. Still nothing.

Desperate times call for desperate measures. I look in the documentation.

According to the Help I was basically correct. Unfortunately the delete option just wasn’t there.

I reset my cache. I switched to Firefox. And cleared the cache in Firefox, all to no avail.

The Help for Google Apps is actually pretty good. There seems to be a lot of it and, as you’d imagine, search mostly works well. But help is different from support and — long story short — there’s no way to get the latter.

Instead there’s a forum that (I think) is user supported rather than directly by Google. This is fine unless there’s a problem with the software. I asked a question and within an hour got a very helpful reply that said that I needed to raise a support ticket.

So how do you raise a support ticket when that’s a “value add” feature reserved for paying customers? (Even when there’s a fault.)

I cheated.

I took the “free thirty day trial” of Google Apps for Business. I immediately raised a support ticket and only a couple of hours later got a phone call from the US. The guy was personable, efficient and immediately solved my problem. He even offered to stay on the line long enough for me to confirm that it had worked rather than hanging up immediately to improve call times (as many call centres do).

Of course I appreciate why Google can’t provide phone support to all and sundry, but surely there has to be a better way of helping customers who find flaws in the software?

Blocks, both technical and mental

Blocking content from the Internet is getting a lot of press of late. The last couple of weeks has seen the Pirate Bay being blocked by a number of large ISPs and debate over whether the blocking of “adult” content should be opt-in or opt-out.

Unfortunately the enthusiasm to “protect the children” and “protect the copyright holders” seems to have pushed aside much of the debate about whether we should be doing this at all or whether it’s practical.

Whether we should be doing it or not is political. I have my opinions1 but what I want to concentrate on here is whether or not blocking such content is actually possible.

There are a number of different ways of vetting content. They’re not necessarily mutually exclusive, but they’re all deeply flawed.

First, a common one from politicians: the Internet is just like TV and cinema:

Perry said that she has been accused of censorship over the campaign, but argued that the internet was no different to TV and radio and should be regulated accordingly.

No, no it isn’t. There are a handful of TV channels, even taking cable and satellite into account, and a relatively small number of movies released every week. It’s practical to rate movies. TV programmes are distributed centrally, so pressure can be placed on a small number of UK-based commercial entities when they do naughty things.

The Internet is very different. Firstly, counting the number of web pages is rather harder. This is what Wikipedia has to say:

As of March 2009, the indexable web contains at least 25.21 billion pages.[79] On July 25, 2008, Google software engineers Jesse Alpert and Nissan Hajaj announced that Google Search had discovered one trillion unique URLs.

Note that even the smaller number is from three years ago. I’d bet that it’s not smaller now. Clearly the same system of rating and regulation isn’t going to work at that scale. And even if it were possible to rate each of these sites, the UK government has little leverage over foreign websites.

There are basically three ways to automate the process: white list, black list and keyword scanning.

A white list says “you can visit these websites.” Even assuming no new websites are ever added and no new content is ever created, rating those 25 billion pages is not practical. I don’t think we want an official approved reading list.

A black list is the opposite: “you can visit anything except these pages.” We have the same scale problem as with white lists and a few more. Much of the Internet is “user contributed” and it’s not hard to create new sites. If my site is blocked, I can create a new one with the same content very, very quickly. Basically, there’s just no way to keep on top of new content.

Keyword scanning is exactly as it sounds. Your internet traffic is scanned and if certain keywords are spotted, the page is blocked. It’s automated and dynamic, but what keywords do you look for? “Sex”? Well, do you want to block “sex education” websites? “Porn”? That would block anti-porn discussion as well as the real thing.

The scanners can be a lot more sophisticated than this but the fundamental problem remains: there’s no way to be sure that they are blocking the correct content. Both good and bad sites are blocked, and still with no guarantee that nothing untoward gets through.
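A toy version of such a scanner makes the overblocking problem concrete. The word list and sample pages below are invented for illustration; real filters are more sophisticated, but they face the same fundamental trade-off.

```python
# A naive keyword blocker: block any page whose text contains a banned word.
# The banned list and the sample pages are made up for illustration.

BANNED = ("sex", "porn")

def blocked(page_text):
    text = page_text.lower()
    return any(word in text for word in BANNED)

# Overblocking: perfectly legitimate pages trip the filter...
print(blocked("A guide to sex education in schools"))  # True
print(blocked("Why I campaign against porn"))          # True

# ...while pages that avoid the keywords sail straight through.
print(blocked("Explicit material, no keywords used"))  # False
```

Tightening the list blocks more legitimate pages; loosening it lets more real content through. No choice of keywords fixes both at once.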

In all cases, if children can still access “adult” content with relative ease — both deliberately and accidentally — what’s the point?

Of course I’m not in favour of taking content without paying for it or exposing children to inappropriate material. But, to use a cliché, the genie is out of the bottle. Like the reaction to WikiLeaks, there is little point in pretending that nothing has changed or that the same techniques and tools can be used to fight them.

Instead, if you’re a publisher you need to make your content legally available and easier to access than the alternative. iTunes has showed that people are willing to pay. So far, you’ve mostly shown that you’d rather treat paying customers as criminals. That’s not helping.

As for protecting children, it all comes back to being a responsible parent. Put the computer in the living room. Talk to them. Sure, use white or black lists or filtering, just be aware that it can never be 100% effective and that not everyone has children that need protecting. Whatever the Daily Mail and your technically unaware MP say, you can’t just check the connection and declare the problem solved.

  1. I’m basically anti-censorship and in favour of personal responsibility. There are already laws covering the distribution of obscene materials, so why should there be restrictions on legal materials? []

Spectrum

You’ve probably seen that it’s the Sinclair Spectrum’s thirtieth birthday today. There are lots of great retrospectives — this is probably the best single source I read — so I’m not going to rehash all that. But I thought it might be worth a few words of my own.

Like many Brits my age, the Spectrum was my first computer. Technically it was the family computer, but after a few weeks I was pretty much the only one who used it.

I remember some of the first few hours using it. I remember, for example, ignoring most of the preamble in the manual and diving straight into typing in a programming example. Those who have used a Spectrum will realise that you can’t just type in code; you need to use special keyboard combinations to get the keywords to appear on screen. I didn’t know about that.

After a while I managed to persuade my family to let me type in the code. The computer didn’t really understand it and I didn’t know why. I can’t remember whether I found the right section in the manual before having to go to bed but even in my confusion I knew that I was hooked.

And really that was the start of my career in IT. I started out really wanting to play games but I ended up spending more and more time programming and less and less loading Bomb Jack. Usually I saw something neat in a program and thought “How do they do that?” Then I’d try to figure out what they’d done. I was quite proud of making some text slowly fade into view and then gradually disappear again. Obviously that was after the obligatory:

10 PRINT "Stephen was here!"
20 GO TO 10

I remember all that surprisingly well. Which makes the following line all the more shocking:

It may have been startlingly modern once, but at 30, the Sinclair Spectrum is as close in time to the world’s first commercial computers of the 50s as it is to the latest iPad.

The first commercial computers were created in the early fifties. The first computers — at least computers you’d kind of recognise as computers — were only built a few years before that. Computers are so powerful and connected these days that it’s difficult for me to remember what I even did with them. I wonder what we’ll make of the iPad in a few decades? I’m sure it’ll look just as dated.

The last thing I wanted to mention was about that weird, unique program entry system. In short, each key had a number of different keywords printed on it. The J key, for example, had LOAD (used to load a program from tape), VAL, VAL$ and the minus sign. When you entered code, the editor would be in one of a number of modes and, in addition to shift, there was a key called “Symbol Shift.”

I’ve never seen a sensible explanation for this. They all seem to say it was “easier” or a cue for users so they’d know all the valid keywords of Sinclair BASIC. I never bought this. Is it really easier to remember a bunch of non-standard keyboard shortcuts rather than just type? Don’t think so.

And then when I was at university I did a course on compilers, the software that is used to turn human readable code into the binary gibberish that a computer can actually run.

The interesting bit was all around the grammars and recursive descent parsers and the mental challenge of writing a program that wrote other programs. But the first step of the process is called lexical analysis, which takes the jumble of letters that is your program and converts them into “words” that can be processed more easily, so PRINT is a keyword, 123.4 is a number and “Hello” is some text.
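A lexer of that kind can be sketched in a few lines of Python. The token rules below are a made-up fragment loosely modelled on Sinclair BASIC, not the real grammar:

```python
import re

# A minimal lexer: split a line of BASIC-ish source into (kind, text) tokens.
# The keyword list and rules are an invented fragment, not real Sinclair BASIC.
TOKEN_RE = re.compile(r"""
    (?P<KEYWORD>PRINT|GO\ TO|LOAD)
  | (?P<NUMBER>\d+(?:\.\d+)?)
  | (?P<STRING>"[^"]*")
  | (?P<SKIP>\s+)
""", re.VERBOSE)

def lex(source):
    tokens, pos = [], 0
    while pos < len(source):
        match = TOKEN_RE.match(source, pos)
        if not match:
            raise SyntaxError(f"unexpected character at position {pos}")
        if match.lastgroup != "SKIP":          # drop whitespace
            tokens.append((match.lastgroup, match.group()))
        pos = match.end()
    return tokens

print(lex('PRINT "Stephen was here!"'))
# [('KEYWORD', 'PRINT'), ('STRING', '"Stephen was here!"')]
```

On the Spectrum, pressing the P key in keyword mode handed the interpreter the whole PRINT token directly, so it never needed to run anything like the loop above over individual letters.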

Given the resource limitations of writing a whole operating system and BASIC interpreter in 16k, my guess is that it was easier to write a strange keyboard entry system than a lexer.

Can anyone comment on the accuracy of this guess?

But back to nostalgia. From hazy memories, to university, to wild speculation and the iPad. We’ve come a long way. But it was the Sinclair Spectrum that started it all for me. Thank you Clive Sinclair!