By John Gruber
iOS has evolved in a fairly predictable manner over the years. Apple has done a good job tackling the lowest-hanging fruit on the to-do list, year after year. They crossed off a lot of big obvious features over the first few years: third-party apps, cut-copy-paste, enterprise support, push notifications, better multitasking. Last year brought a few more: over-the-air software updates, cloud-based backups and wireless syncing, and a much improved notification interface.
Another good way to predict new iOS features has been to survey the competition and identify the areas where iOS was lacking. Those items from last year, for example, were all areas where Android was ahead.
iOS is by no means feature-complete. But it’s getting harder to identify the low-hanging fruit — the things you just know Apple has to be working on, not just the stuff you hope they are. The biggest one left is mapping. Today brings a report from 9to5Mac that Apple is set to switch the back-end data in iOS’s Maps app from Google to its own mapping services; John Paczkowski confirms it, quoting a source who claims the new Maps will “blow your head off”.
Here’s the thing. Apple’s homegrown mapping data has to be great.
Mapping is an essential phone feature. It’s one of those few features that almost everyone with an iPhone uses, and often relies upon. That’s why Apple has to do their own — they need to control essential technology. I suspect Apple would be pushing to do their own maps even if their relationship with Google were still hunky-dory, as it was circa 2007. (Remember Eric Schmidt coming on stage during the iPhone introduction?) But as things actually stand today between Apple and Google, relying on Google for mapping services is simply untenable.
This is a high-pressure switch for Apple. Regressions will not be acceptable. The purported whiz-bang 3D view stuff might be great, but users are going to have pitchforks and torches in hand if practical stuff like driving and walking directions is less accurate than it was with Google’s data. Keep in mind, too, that Android phones ship with turn-by-turn navigation.
What else remains hanging low on the iOS new-features tree, though? I can think of a few:
Clever inter-application communication. Seems crazy that iOS, the direct descendant of NeXT, doesn’t have anything like Services, which were one of NeXT’s most touted features (and rightfully so). It’s also worth noting that Android has a pretty good Services-esque system in place, called “Intents”, and Windows 8 has an even richer concept called “Contracts”.
Third-party Notification Center widgets. Like the Stocks and Weather ones from Apple — information at a glance, without launching an app.
Third-party Siri APIs. Let other apps provide features you can interact with through Siri.
But that’s about it. And even the Siri API idea seems more like a “nice to have” feature idea than a low-hanging “Apple really has to do this sooner or later” idea. Again, I’m not saying Apple’s iOS to-do list is empty; I’m just saying the list of obvious they-gotta-do-it stuff is getting short.