Andy Grove Was Right

The Verge’s Sean Hollister penned an excellent high-level summary of Pat Gelsinger’s ignominious ouster from Intel, under the headline “What Happened to Intel?” A wee bit of pussyfooting here, though, caught my eye:

Just how bad was it before Gelsinger took the top job?

Not great! There were bad bets, multiple generations of delayed chips, quality assurance issues, and then Apple decided to abandon Intel in favor of its homegrown Arm-based chips — which turned out to be good, seriously showing up Intel in the laptop performance and battery life realms. We wrote all about it in “The summer Intel fell behind.”

Intel had earlier misses, too: the company long regretted its decision not to put Intel inside the iPhone, and it failed to execute on phone chips for Android handsets as well. It arguably missed the boat on the entire mobile revolution.

There’s no argument about it. Intel completely missed mobile. iPhones never used Intel chips and Apple Silicon chips are all fabbed by TSMC. Apple’s chips are the best in the industry, also without argument, and the only mobile chips that can be seen as reasonable competition are from Qualcomm (and maybe Samsung). Intel has never been a player in that game, and it’s a game Intel needed not only to be a player in, but to dominate.

It’s not just that smartphones are now a bigger industry than the PC industry ever was, and that Intel has missed out on becoming a dominant supplier to phone makers. That’s bad, but it’s not the worst of it. It’s that those ARM-based mobile chips — Apple Silicon and Qualcomm’s Snapdragon lineup — got so good that they’re now taking over large swaths of the high end of the PC market. Partly from an obsessive focus on performance-per-watt efficiency, partly from the inherent advantages of ARM’s architecture, partly from engineering talent and strategy, and partly from the profound benefits of economies of scale as the mobile market exploded. Apple, as we all know, moved the entire Mac platform from Intel chips to Apple Silicon starting in 2020. The Mac “only” has 15 percent of the worldwide PC market, but the entirety of the Mac’s market share is at the premium end of the market. Losing the Mac was a huge loss for Intel. And now Qualcomm and Microsoft are pushing Windows laptops to ARM chips too, for the same reasons: not just performance-per-watt, but sheer performance. x86 CPUs are still dominant on gaming PCs, but even there, AMD is considered the cream of the crop.

Of all companies, Intel should have seen the potential for this to happen. Intel did not take “phone chips” seriously, but within a decade, those ostensibly toy “phone chips” were the best CPUs in the world for premium PC laptops, and their efficiency advantages extend to data centers too. And Apple has shown that they’re even superior for workstation-class desktops. That’s exactly how Intel became Intel back at the outset of the personal computing revolution. PCs were seen as mere toys by the “real” computer makers of the 1970s and early 1980s. IBM was caught so flatfooted that when it saw the need to enter the PC market, it went to Intel for the chips and Microsoft for DOS — decisions that both Intel and Microsoft capitalized upon, resulting in a tag-team hardware/software dominance of the entire computing industry that lasted a full quarter century, while IBM was left sidelined as just another maker of PCs. From Intel’s perspective, the x86 platform went from being a “toy” to being the dominant architecture for everything from cheap laptops all the way up to data-center-class servers.

ARM-based “phone chips” did the same thing to x86 that Intel’s x86 “PC chips” had done, decades earlier, to mainframes. Likewise, Nvidia turned “graphics cards for video game enthusiasts” — also once considered mere toys — into what is now, depending on stock market fluctuations, the most valuable company in the world. They’re neck and neck with the other company that pantsed Intel for silicon design leadership: Apple. Creating “the world’s best chips” remains an incredible, almost unfathomably profitable place to be as a business. Apple and Nvidia can both say that about the very different segments of the market in which their chips dominate. Intel can’t say that today about any of the segments for which it produces chips. TSMC, the company that fabs all chips for Apple Silicon and most of Nvidia’s leading chips, is 9th on the list of companies ranked by market cap, with a spot in the top 10 that Intel used to occupy. Today, Intel is 180th — and on a trajectory to fall out of the top 200.

Intel never should have been blithe about the threat. The company’s longtime CEO and chairman (and employee #3) Andy Grove titled his 1996 book Only the Paranoid Survive. The full passage from which he drew the title:

Business success contains the seeds of its own destruction. Success breeds complacency. Complacency breeds failure. Only the paranoid survive.

Grove retired as CEO in 1998 and as chairman in 2005. It’s as though no one at Intel after him listened to a word he said. Grove’s words don’t read merely as advice — today they read as a postmortem synopsis of Intel’s own precipitous decline over the last 20 years.