By John Gruber
Good piece by Dr. Drang, pondering why iPad owners are seemingly slow to upgrade their older iPads for new ones:
What’s surprising to me is how slowly iPad software has advanced in the seven years since its introduction. I’ve always thought of the iPad as the apotheosis of Steve Jobs’s conception of what a computer should be, what the Mac would have been in 1984 if the hardware were available. But think of what the Mac could do when it was seven years old:
You could write real Macintosh programs on it, both with third-party development software like THINK (née Lightspeed) C and Pascal and Apple’s Macintosh Programmer’s Workshop. You may not care about writing native apps, but the ability to do so brings with it a lot of other abilities you do care about, like the bringing together of documents from multiple sources.
You had a mature multi-tasking environment in the MultiFinder that worked with essentially every application that ran on the Mac.
You (and all your applications) had access to a real hierarchical file system.
You had what many people still consider the best personal software development kit in HyperCard.
Drang’s list is very much programming-centric, which is an area where the iPad is particularly weak compared to where the Mac was in 1991. It’s a good point — serious general-purpose personal computing platforms should be self-sufficient, by which I mean that you should be able to write software for the platform on the platform itself.
In the very earliest years of the Mac, you needed to write Mac apps using a Lisa; MPW 1.0 shipped in 1986, when the Mac was only two years old. In other words, from 1984 to 1986, the Lisa was to the Mac as the Mac now is to the iPad. It makes sense that making the Mac self-sufficient would have been a higher priority for Apple in the 1980s than it is today for the iPad. The Lisa was a failed platform; requiring developers to buy one in order to create apps for the Mac was a multi-thousand-dollar shit sandwich. And, it was obvious that the Mac should be self-hosting. Conceptually, based on how the system worked and was designed, there was no question that there needed to be a way to write Mac software on the Mac itself.
No one is arguing that there should be a way to develop Apple TV apps on an Apple TV, or even more preposterously, to develop Apple Watch apps on an Apple Watch. Those aren’t general-purpose personal computing platforms. Nor do I think there’s much reason to want Apple to enable the creation of iPhone apps from an iPhone. It is surely technically feasible, and conceptually iOS could handle it from a UI perspective. But as a practical matter, the displays are simply too small to make it worth the effort. The only scenario in which I can imagine a developer using an iPhone to create iPhone software is if the iPhone is the only computing device they own.1
The iPad, on the other hand, clearly could be a platform perfectly suitable for developing iPad apps. I’m not so sure it should be, though. Or that it should be any higher a priority for Apple than it already is (Swift Playgrounds shows they’re moving in this direction, albeit slowly). The Mac needed its own development tools because the Lisa was going away; the iPad doesn’t need its own development tools because the Mac remains a thriving platform with a bright future.
But put software development aside. I think the bigger problem for the iPad is that there are few productivity tasks, period, where the iPad is hardware-constrained. Aldus PageMaker shipped for the Mac in 1985. By 1987 or 1988, it was easy to argue that the Mac was, hands-down, the best platform the world had ever seen for graphic designers and visual artists. By 1991 — seven years after the original Mac — I think it was inarguable. And the improvements in Mac software during those years drove demand for improved hardware. Photoshop, Illustrator, FreeHand (R.I.P.), QuarkXPress — those apps pushed the limits of Mac hardware in those days.
That’s just not true for the iPad. The iPad is a terrific platform for casual use. I think it’s better than a MacBook for reading and watching video. It’s great for casual gaming. I know plenty of people who much prefer the iPad as a tool for writing. Not because iPad writing apps are more powerful, but rather because they’re simpler, less distracting, and easier to focus on. None of those are compelling reasons to upgrade an older iPad for a newer, more powerful one. In fact, those are all good explanations for why owners of older iPads (especially starting around 2012’s iPad 4) see no reason to upgrade.
Apple Pencil changes this. An iPad Pro with Apple Pencil is the best portable drawing platform in the world. But you don’t even have to try it to know that there’s not much appeal if you don’t do some sort of drawing or sketching. And most people never draw anything.2
I don’t think there’s anything inherently wrong with the iPad being a platform where people buy one and use it for 5 or 6 years before thinking about replacing it. But if Apple wants it to be a platform where people are tempted to replace their iPads more frequently, I think the software story needs to change. For that to happen, the iPad experience needs to be less like a big iPhone, and more like a touch-based Mac.
The fact that that doesn’t seem to be Apple’s goal for the iPad is the reason why I, unlike many others, continue to feel confident about the future of the Mac.
This piece on Mac doomsayers I wrote for Macworld in December of 2010 — when the iPad was less than a year old — feels like it could have been written yesterday. Here’s what I wrote then:
At typical companies, “legacy” technology is something you figure out how to carry forward. At Apple, legacy technology is something you figure out how to get rid of. The question isn’t whether iOS has a brighter future than the Mac. There is no doubt: it does. The question is whether the Mac has become “legacy”. Is the Mac slowing iOS down or in any way holding it back?
I say no. In fact, quite the opposite. For one thing, Mac OS X development has been slowed by the engineering resources Apple has shifted to iOS, not the other way around. Apple came right out and admitted as much, when Mac OS X 10.5 was delayed back in 2007. The company’s explanation: It had to shift key engineering resources to help the original iPhone ship on time.
The bigger reason, though, is that the existence and continuing growth of the Mac allows iOS to get away with doing less. The central conceit of the iPad is that it’s a portable computer that does less — and because it does less, what it does do, it does better, more simply, and more elegantly. Apple can only begin phasing out the Mac if and when iOS expands to allow us to do everything we can do on the Mac. It’s the heaviness of the Mac that allows iOS to remain light.
When I say that iOS has no baggage, that’s not because there is no baggage. It’s because the Mac is there to carry it. Long term — say, ten years out — well, all good things must come to an end. But in the short term, Mac OS X has an essential role in an iOS world: serving as the platform for complex, resource-intensive tasks.
Here we are, six years later, and the Mac’s role in the iOS world is only slightly less essential now than it was then.
I’m aware that having a smartphone as one’s only computing device is actually common in Asia, and for people with low incomes around the world. I don’t think there’s much, if any, demand from any of those people to be able to use their iPhones to develop iPhone apps. ↩︎
You can argue that most people in the early 1990s never did any graphic design, either. To which I’d say, yes, exactly. That’s why Apple typically sold fewer than 1 million Macs per quarter back then — often far fewer. Mac sales peaked (in that era) at 4.5 million units for the entire year of 1995. I think iPads with Apple Pencils are a similar niche today, with an appeal measured in millions of units per year, but not tens of millions. ↩︎