1987’s Acorn Archimedes was the first production RISC-based personal computer.
Let’s be honest: 2020 sucks. So much of this year has been a relentless slog of bad news and miserable events that it’s been hard to keep up. Yet most of us have kept up, and the way most of us do so is with the small handheld computers we carry with us at all times. At least in America, we still call these by the hilariously reductive name “phones.”
We can all use a feel-good underdog story right now, and luckily our doomscrolling 2020 selves don’t have to look very far. That’s because those same phones, and so much of our digital existence, run on the same thing: the ARM family of CPUs. And with Apple’s release of a whole new line of Macs based on their new M1 CPU—an ARM-based processor—and with those machines getting fantastic reviews, it’s a good time to remind everyone of the strange and unlikely source these world-controlling chips came from.
If you were writing reality as a screenplay, and, for some baffling reason, you had to specify the most common central processing unit used in most phones, game consoles, ATMs, and innumerable other devices, you’d likely pick one from a major manufacturer, like Intel. That state of affairs would make sense and fit with the world as people understand it; the market dominance of some industry stalwart would raise no eyebrows or any other bits of hair on anyone.
But what if, instead, you decided to make those CPUs all hail from a barely-known company from a country usually not the first to come to mind as a global leader in high-tech innovations (well, not since, say, the 1800s)? And what if that CPU owed its existence, at least indirectly, to an educational TV show? Chances are the producers would tell you to dial this script back a bit; come on, take this seriously, already.
And yet, somehow, that’s how reality actually is.
In the beginning, there was TV
The ARM processor, the bit of silicon that controls over 130 billion devices all over the world and without which modernity would effectively come to a crashing halt, has a really strange origin story. Its journey is peppered with bits of seemingly bad luck that ended up providing crucial opportunities, unexpected technical benefits that would prove absolutely pivotal, and a start in some devices that would be considered abject failures.
But everything truly did sort of get set in motion by a TV show—a 1982 BBC program called The Computer Programme. This was an attempt by the BBC to educate Britons about just what the hell all these new fancy machines that looked like crappy typewriters connected to your telly were all about.
The show was part of a larger Computer Literacy Project started by the British government and the BBC as a response to fears that the UK was deeply and alarmingly unprepared for the new revolution in personal computing that was happening in America. Unlike most TV shows, this one would prominently feature a computer, which would be used to explain fundamental computing concepts and teach a bit of BASIC programming. The concepts included graphics and sound, the ability to connect to teletext networks, speech synthesis, and even some rudimentary AI. As a result, the computer needed for the show would have to be pretty good—in fact, the producers’ demands were initially so high that nothing on the market really satisfied the BBC’s aspirations.
So, the BBC put out a call to the UK’s young computer industry, which was then dominated by Sinclair, a company that made its fortune in calculators and tiny televisions. Ultimately, it was a much smaller upstart company that ended up getting the lucrative contract: Acorn Computers.
An Acorn blooms
Acorn was a Cambridge-based firm that started in 1979, first developing computer systems designed to run fruit machines—we call them slot machines—then turning that hardware into small hobbyist computers based on 6502 processors. That was the same CPU family used in the Apple II, Atari 2600, and Commodore 64 computers, among many others. This CPU’s design will become important later, so, you know, don’t forget about it.
Acorn had developed a home computer called the Atom, and when the BBC opportunity arose, they repurposed their plans for the Atom’s successor into what would become the BBC Micro.
The BBC’s demanding list of features ensured the resulting machine would be quite powerful for the era, though not quite as powerful as Acorn’s original Atom-successor design. That Atom successor would have featured two CPUs, a tried-and-true 6502 and an as-yet undecided 16-bit CPU.
Acorn later dropped that CPU but kept an interface system, called the Tube, that would allow for additional CPUs to be connected to the machine. (This too will become more important later.)
The engineering of the BBC Micro really pushed Acorn’s limits, as it was a pretty state-of-the-art machine for the era. This resulted in some fascinatingly half-assed but workable engineering decisions, like having to replicate the placement of an engineer’s finger on the motherboard with a resistor pack in order to get the machine to work.
Nobody ever really figured out why the machine only worked when a finger was placed on a certain point on the motherboard, but once they were able to emulate the finger touch with resistors, they were satisfied it worked and moved on.
The BBC Micro proved to be a big success for Acorn, becoming the dominant educational computer in the UK in the 1980s.
As everyone with any urge to read this far likely knows, the 1980s were a very important time in the history of computing. IBM’s PC was released in 1981, setting the standard for personal computing for decades to come. The Apple Lisa in 1983 presaged the Mac and the whole revolution of the windows-icons-mouse graphical user interface that would come to dominate computing.
Acorn saw these developments happening and realized they would need something more powerful than the aging but reliable 6502 to power their future machines if they wanted to compete. Acorn had been experimenting with a lot of 16-bit CPUs: the 65816 (the 16-bit variant of the 6502), the Motorola 68000 that powered the Apple Macintosh, and the comparatively rare National Semiconductor 32016.
None of these were really doing the job, though, and Acorn reached out to Intel to ask about using its 80286 CPU in their new architecture.
Intel ignored them completely.