My semi-coherent musings on x86/ARM



It's hard to remember a time when your "real" computer wasn't an x86. Sure, phones have plenty of processing power, but they're not yet where developers spend most of their time (foreshadowing alert).

But how did *that* happen? Did the market pick the best product? Sort of, but a few things had to happen for x86 to become the main computer architecture. In fact, it was one of those "perfect storms":

- The IBM PC happened. And it was relatively open (not free, but clones were possible since schematics were published along with BIOS source code - I still have the manual). Every system was (potentially) a development system, and the "software spiral" accelerated.
- However, there was competition: Motorola's 68K, the 6502 (and follow-ons), etc., from companies like Apple and Acorn (more foreshadowing). And it got worse - Apple, Motorola, and IBM teamed up to crush Intel, but the reverse happened. As it turns out, having 3 owners is a bad idea (Intel figured that out eventually, after BIIN, AKA Billions Invested In Nothing). So, even though Apple had gone through 3 processors (6502, 68K, PowerPC), eventually x86 looked better than the alternatives. In and of itself, that's not necessarily a big deal, but ...
- Apple coming to x86 happened to overlap with the rise of the .com era. During the .com craze, x86 was trying to make it into server land, bolstered by Intel's long-running stalwart "systems" business. Fortunately, the (Clayton) Christensen Effect was in full stride, and it helped Intel make inroads against the incumbents - IBM, Sun, DEC, etc.
- With Macs running OS X, Apple's machines were "close" to what cloud developers were deploying on - Linux. That, along with Apple being "cool", made Macs the laptop of choice for developers.
- And AMD was a great competitor. Not *too much* market share, but enough so Intel didn't have a monopoly.
- As a backdrop, the long-sought "platform independence" (or ISA independence), with attempts going as far back as C in the 70s, never really materialized. Binaries mattered (and still do). So, having the same ISA (the "API of the CPU") has been good for developers - the sketch below makes this concrete.
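
To make the "API of the CPU" point concrete, here's a minimal sketch, assuming a GCC/Clang-style toolchain (the cross-compiler name below is only an illustration and varies by distro). The source is portable; the binary is not - it only runs on the ISA it was built for.

```c
/* isa_check.c - hypothetical example, not from any particular project.
 * The same source compiles anywhere a C compiler exists, but each
 * resulting binary is tied to one ISA, e.g.:
 *   cc -o isa_check isa_check.c                        (native build)
 *   aarch64-linux-gnu-gcc -o isa_check_arm isa_check.c (cross build for ARM)
 */
#include <stdio.h>

int main(void) {
#if defined(__x86_64__)
    puts("This binary targets x86-64");          /* GCC/Clang predefined macro */
#elif defined(__aarch64__)
    puts("This binary targets AArch64 (ARM64)"); /* GCC/Clang predefined macro */
#else
    puts("This binary targets some other ISA");
#endif
    return 0;
}
```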

So x86 took over "personal computing" by beating the competition, aided by Intel's strong management. And, at the same time, x86 took over servers with a natural "up market" movement (the Christensen Effect).

There is a wrinkle, though: x86 ended up not making it to phones. And ARM, the stalwart's stalwart (remember Acorn?), kept going. It had a different business model, even more horizontal than x86's. Now we have ARM in phones, servers, and a few "personal computers" (initially laptops). Somewhat ironic, since ARM's initial market was the personal computer (where it essentially failed). But we also have x86. Is the world going "backwards" to multiple ISAs?

For ARM to unseat x86 (Intel and AMD), it will need its own "storm" (perhaps not a perfect storm, but something needs to happen):

- The (2) x86 vendors could falter, leaving the CPU market open. Could happen, but seems unlikely.
- More likely is that the phone's (mainly the iPhone's) similarity to the developer's main machine (the laptop) could be the "wedge" that ARM uses to become the developer's preferred machine. And code runs best on the machine on the developer's desk. Similarly, in the server/cloud space, ARM will need to push for excellent support from platform-independent frameworks (e.g. Python, JavaScript, K8s, etc.) and have a story for accelerated ML and AI ("graphics processing" isn't the only way to get "high performance"). ARM will need a strong development *and* deployment strategy. There will have to be very strong leadership in this area, including making the right investments.

Both sides have considerable tasks ahead of them. But competition is ultimately good for the consumer. I look forward to the race.