Software as the Product of Obsession Times Voice

Back in 2009, Merlin Mann and I jointly gave a talk at SXSW titled “Obsession Times Voice”. Regarding how it turned out, I wrote:

My muse for the session was this quote from Walt Disney: “We don’t make movies to make money; we make money to make more movies.” To me, that’s it. That’s the thing.

Merlin and I were talking about independent writers and podcasters, because that’s what we were (and remain), but the concept applies just as well to independent developers. This came to my mind after reading (and linking to) David Smith’s description of the new Pedometer++ today. Not just what it does, but why he spent six years making it. That’s the sort of productive obsession that fascinates me.

Ice water is always refreshing, but it tastes better when you’re on a road trip to hell. It feels like the world of software is bifurcating quality-wise. This whole thing about Adobe’s new craptacular “modern” UI language (a.k.a. “Spectrum”) exemplifies one side of that bifurcation — the bad-and-getting-worse side. Software that is the product not just of an ignorance of long-established principles of interaction design, but of a willful disdain for those principles. What Adobe is now shipping is just inexplicably bad UI, ignoring literally decades of great work and long-mastered concepts, much of which was pioneered by Adobe itself!

The whole thing with MacOS 26 Tahoe is similar. To be clear, the UI crimes in Tahoe are deeply worrisome, but they are nowhere near as severe as those in Adobe’s Spectrum. But the problems with Tahoe are steps down the same fork in the road that Adobe took years ago. Spectrum is where MacOS seems headed under Alan Dye’s leadership, if Tahoe is any indication: cross-platform sameness for the sake of sameness, with a complete disregard for longstanding platform nuances and idioms. In Spectrum’s case those platforms are MacOS and Windows and the web. In Tahoe’s case it’s MacOS and iOS.1

The other side of the software fork is not deserted. It’s just populated, more than ever, by the products of small independent developers who obsess, first and foremost, over quality and artistic vision. Remarkable new software gems exhibiting spectacular UI design appear all the time. They’re just not coming from the biggest companies, the ones whose apps, alas, dominate not just our desktops and pockets but our entire culture today.2

There’s always been software with poorly designed user interfaces. Much of it has been successful financially, sometimes spectacularly so. I’d argue, in all seriousness, that that’s the story of Microsoft in a nutshell. What’s new today is poorly designed software from developers from whom we expect better. In the old days there were people who would argue that prioritizing good user interface design was a waste of time — like spending hours decorating cupcakes destined for kindergarteners who are simply going to mash them into their mouths. (Again: cf. Microsoft’s undeniable market success.) What’s new today is people holding up objectively bad interaction design and proclaiming it to be good, and the product of teams that purportedly prioritize “design”, when it’s clear they have no idea what they’re talking about. It’s one thing to make something poorly designed and shrug on the grounds that it doesn’t matter. It’s another thing to make something poorly designed and hold it up as good design.

We are justified in expecting nothing short of insane greatness from Apple, and solidly good design from Adobe. In principle, all software ought to have well-designed user interfaces. That’s never going to be the case. But software for designers — Adobe’s raison d’être — absolutely demands to be well-designed itself, like how a book on writing must itself be well-written.

Perhaps I was wrong, though, to describe Adobe’s new UI as inexplicable. It’s just indefensible. The explanation for so much software going so rotten from a UI-design perspective is, the more I think about it, related to Nilay Patel’s “Software Brain” theory, which I’ve commented on both directly and indirectly. Here’s Patel’s definition of “software brain”:

The simplest definition I’ve come up with is that it’s when you see the whole world as a series of databases that can be controlled with the structured language of software code. Like I said, this is a powerful way of seeing things. So much of our lives run through databases, and a bunch of important companies have been built around maintaining those databases and providing access to them.

Zillow is a database of houses. Uber is a database of cars and riders. YouTube is a database of videos. The Verge’s website is a database of stories. You can go on and on and on. Once you start seeing the world as a bunch of databases, it’s a small jump to feeling like you can control everything if you can just control the data.

But that doesn’t always work.

You might think it counterintuitive that a movement obsessed with software would be spearheading a severe decline in the design quality of software, but in Patel’s definition, there’s no concept of software as art, as a practice, as a craft. Software brain is purely an obsession with software as a medium in and of itself. A means with no consideration for the end.

Framed in Walt Disney’s adage, software brain makes software only to make more money. The idea of making money in order to make more software — to afford the time and talent to craft it — does not compute. Framed in the metaphor that Steve Jobs used to close his introduction of the original iPad, and returned to again to close his final keynote at WWDC 2011, software brain is nowhere near the intersection of technology and the liberal arts. Software brain is so far down Technology Street that it’s no longer in the same zip code as Liberal Arts Avenue. Another way, perhaps, to define software brain is that it’s the utter rejection of Jobs’s maxim that “technology is not enough”. With software brain, technology is all there is.


  1. I don’t want to belabor the similarities between Adobe’s Spectrum UI system and Apple’s Liquid Glass, because there are significant differences. Foremost, what’s wrong with Spectrum is wrong everywhere. Photoshop with Adobe’s new “modern” UI is, I suspect, just as bad a Windows app as it is a Mac app. Whereas the usability problems with Liquid Glass are lopsided platform-wise. It’s a litany of disasters on MacOS 26 Tahoe, but actually pretty good on Apple’s other version 26 OSes, especially iOS. There are aspects of Liquid Glass on iOS 26 that some people don’t like, but they’re literally skin-deep. Cosmetic details. Functionally, iOS 26 is pretty strong, and Apple made some very nice changes regarding the placement of things like search fields to improve consistency system-wide. I still have iOS 18 running on my year-old iPhone 16 Pro, and there are very few things I prefer in iOS 18 versus iOS 26. Whereas I’d be sick if I had to work in MacOS 26 Tahoe every day.

    That’s my point here. iOS 26 doesn’t suffer in any way — not even one teensy little single way — from MacOS UI idioms being inappropriately applied to the iPhone. On the iPad, maybe there’s a little of that, like, say, the weird way iPadOS 26 uses Mac-style red / yellow / green window control buttons but makes them too small to use, so you need a gesture to temporarily embiggen them before you can use them. But the implementation of “Liquid Glass” on MacOS Tahoe is just riddled with iOS-isms that aren’t appropriate on MacOS. So many decades-old Mac UI nuances and idioms were just ignored. They weren’t changed, they weren’t updated, they were just ignored. You either see that this is true or you don’t, and if you don’t see it, you shouldn’t be designing the Mac user interface. ↩︎︎

  2. Consider the age of television. Television is the broadcast of motion pictures with sound. Cinema is an artform. But at the peak of television’s hegemony over western culture and mass media, the artistic quality of almost everything on TV was terrible. It was slop. It wallowed in its own sloppiness. This, despite the fact that cinematic artists had largely mastered the artform in the decades preceding TV. TV became popular in the 1950s and culturally dominant in the 1960s. But Citizen Kane came out in 1941. The network executives with “TV brain” in the second half of the 20th century didn’t even consider TV as a medium for art. They just cared that it was watched. It was judged only by ratings and ad revenue, not artistic merit. That’s what’s happening with software right now. But remember too, that as dreadful television programming rocketed to stratospheric popularity in the 1970s, that same decade saw a remarkable explosion in innovative filmmaking in movie theaters. Keep the faith. ↩︎︎