Pixel Perfect

I’ve spent my whole life thinking about dots, largely in the form of on-screen pixels. I remember first seeing a Pac-Man coin-op arcade game, wondering how it worked, and deducing the basic gist: the screen was a matrix of dots, like animated graph paper. I loved graph paper. The first font I ever loved — years before I even knew what a font was or took any interest whatsoever in typography — was the one used in Namco video games, a 7-by-7-grid pixel font used to render everything from your score to the current level to the dreaded-but-inevitable “GAME OVER”.

Dots were how computers rendered everything: pixels on screen, dots of ink/toner on paper. The trendline has been the same for all dots, print and pixel alike: big, chunky, jagged, and monochromatic at the outset, but ever smaller, ever more colorful and vibrant as time marches on.

Print always has been ahead, at least with regard to dot-density. The first printers I ever had regular access to were ImageWriters, attached to various members of the Apple II family. The technology behind the ImageWriter celebrated the nature of the beast: dot-matrix. Dot-matrix printers are seldom used today, but all modern printers render their output in a matrix of dots. We just don’t see the dots anymore. By today’s standards, the ImageWriter’s 144 DPI output was crude (and noisy) — but it was far sharper (less fuzzy, smaller dots) than our displays of the era.

I can think of almost nothing I have done professionally over the last 20 years that has not been rendered as a matrix of dots. Print design, web publishing, photography, filmmaking — it’s all just dots. And the smaller the dots, the better the output looks. When the individual dots are indistinguishably small, the illusion is complete.

When I went to college in 1991, my parents bought me an inkjet StyleWriter. I fell in love instantly, because its 360 DPI output was so obviously superior to the dot-matrix ImageWriter output to which I was accustomed. The output looked real. Or at least it did for a few weeks, until I saw the output of a 300 DPI LaserWriter (slightly fewer dots per inch than the StyleWriter, but far crisper and more precise than early ’90s inkjet technology could achieve).

That gave way to 600 DPI HP LaserJets (and eventually a magnificent 1200 DPI model) at the student newspaper. There, doing design destined for high-resolution print, the discrepancy between on-screen and on-paper dot pitch was never more tangible. Our displays offered about 90 pixels per inch, give or take — nowhere near high enough resolution to accurately render print-quality fonts. So, on-screen, our fonts were, well, inaccurately rendered. High-quality fonts of the era included both PostScript and bit-mapped pixel font variants. The printer used the PostScript font, of course, but on-screen, we saw the crude screen fonts. The screen fonts’ glyphs oftentimes bore but a vague resemblance to those of the actual typefaces, but they were legible when rendered with just a handful of relatively large pixels. Print fonts are made of perfectly smooth curves and straight lines; screen fonts are made of dots.

We loved our Macs and loved designing with them, but what we saw on-screen wasn’t our design work. It was a crude pixelated approximation of our design work. You had to print to see what you really had. Much in the way that developers must compile and run to test their apps, designers needed to print and look to check their designs. 600 DPI print output looked real; 90 DPI on-screen output looked not-real.

The crude big-pixel displays of the ’90s and early ’00s were what made designing for the web feel so janky, at least for anyone coming from the world of print. Our low-resolution comps were, on the web, the real shipping product, and our only font choices were those designed for low-resolution screen use. Good web design, at the time, embraced the pixelated nature of browser rendering.

It was all just dots, and looked like dots.


Today’s pre-retina Mac displays are excellent, especially when judged by historical standards. Brighter, more vibrant colors, and — again, by historical standards — smaller, sharper pixels. A regular 15-inch MacBook Pro ships with a 1440 × 900 pixel display at about 110 pixels per inch, and can be configured with a 1680 × 1050 display at about 130 pixels per inch. Both the 11- and 13-inch MacBook Airs sport resolutions of roughly 130 pixels per inch. Far beneath the retina threshold, but much nicer than our sub-100-PPI displays of the ’90s, to say nothing of the mere 72 PPI display on the original 1984 Macintosh.
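Pixels per inch, for anyone who wants to check the arithmetic, is just the diagonal pixel count divided by the diagonal screen size in inches. Here’s a quick back-of-envelope sketch in Python; the panel diagonals (15.4, 11.6, and 13.3 inches) and the Airs’ native resolutions (1366 × 768 and 1440 × 900) are details I’m filling in, not figures quoted above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Resolutions as discussed above, plus nominal panel sizes and Air resolutions (my fill-ins).
displays = [
    ("15-inch MacBook Pro, standard (1440 x 900)", 1440, 900, 15.4),
    ("15-inch MacBook Pro, high-res option (1680 x 1050)", 1680, 1050, 15.4),
    ("11-inch MacBook Air (1366 x 768)", 1366, 768, 11.6),
    ("13-inch MacBook Air (1440 x 900)", 1440, 900, 13.3),
    ("15-inch MacBook Pro with Retina display (2880 x 1800)", 2880, 1800, 15.4),
]

for name, w, h, diag in displays:
    print(f"{name}: {ppi(w, h, diag):.1f} PPI")
```

That last entry, the 2880 × 1800 retina panel, works out to roughly 220 pixels per inch, a figure that comes up again below.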

But we went from 72 PPI in 1984 to 132 PPI in 2012 gradually — a few more pixels per inch every few years. Along the way there was never a moment of celebration, no single great leap forward pixel-density-wise. Even the shift from bulky CRTs to slim flatscreen LCDs didn’t bring about a significant upgrade in terms of pixel size.

But now this. The 15-inch MacBook Pro With Retina Display. This is a boom. A revolution in resolution. The display I’ve been craving ever since I first saw high-resolution laser printer output.

There is much else to praise about the machine. It is wickedly fast, with benchmark scores that place it alongside the Mac Pro. Think about that: a laptop with no performance tradeoff compared to a high-end desktop. It wakes from sleep nearly instantly. It sports the best laptop keyboard I’ve ever used. It’s no Air, but it’s noticeably and appreciably thinner and lighter than any previous MacBook Pro. Like no Apple device since the original 2007 iPhone, the new Retina 15-inch MacBook Pro feels like a device from the near future, something slightly beyond the ken of today’s cutting edge.

But the main thing is the display. That display. This display. Oh my.

At 220 PPI, the MacBook Pro’s display is less dense than the Retina iPad’s (264 PPI), which in turn is less dense than the iPhone 4/4S’s (326 PPI). Part of that is simply a function of viewing distance. You hold your phone closer to your eyes than an iPad, and your iPad closer to your eyes than a MacBook. But there’s something else. Retina text looks better on the MacBook Pro than on the iPhone or iPad, even when you move in pretty close to the screen — and non-retina text and graphics (on the web, or UI elements in not-yet-optimized-for-retina apps) look far worse on the MacBook Pro than they do on the iPad or iPhone (or did, insofar as non-retina graphics are largely a thing of the past on iOS, apart from graphics on web pages, which are often zoomed out anyway).
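That viewing-distance point can be put in rough numbers. A common rule of thumb (not an official Apple definition) is that a display reads as “retina” once a single pixel subtends no more than about one arcminute of visual angle at your typical viewing distance. Here’s a quick sketch; the viewing distances are nothing more than my own guesses:

```python
import math

def retina_threshold_ppi(viewing_distance_in):
    """PPI at which a single pixel subtends one arcminute at the given distance.
    One arcminute is a common rule-of-thumb limit for normal visual acuity,
    not an official definition from Apple."""
    one_arcminute = math.radians(1 / 60)
    return 1 / (viewing_distance_in * math.tan(one_arcminute))

# Viewing distances (inches) are rough guesses of mine, not measurements.
devices = [
    ("iPhone 4/4S", 326, 12),
    ("Retina iPad", 264, 16),
    ("Retina MacBook Pro", 220, 20),
]

for name, density, distance in devices:
    needed = retina_threshold_ppi(distance)
    print(f"{name}: {density} PPI, needs roughly {needed:.0f} PPI at {distance} inches")
```

By that yardstick, each of the three clears the bar at its own distance, which is how 220 PPI on a laptop can read as retina despite falling well short of an iPhone’s density.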

I can only tell you how profound the difference is. I can’t show you. Because if I included example graphics and you viewed them on a non-retina Mac display, your display wouldn’t be capable of rendering what I see. And if you’re using a MacBook Pro With Retina Display yourself, you already know what I mean.

One conclusion I’ve reached in the weeks I’ve been using this machine: sub-pixel anti-aliasing matters. iOS doesn’t offer sub-pixel anti-aliasing; Mac OS X does. And I believe it’s one reason on-screen text looks even better on the retina MacBook Pro than it does on the ostensibly higher-resolution iPad and iPhone. There was an idea — espoused even by yours truly at one point — that with sufficient pixel-per-inch density, sub-pixel anti-aliasing would be superfluous. In practice, it’s the other way around. On the retina MacBook Pro, sub-pixel anti-aliasing no longer carries any trade-offs — no visible color fringes, no slight emboldening of letterforms. It’s just pure icing on the text rendering cake. To say that text is rendered at print quality is to imply that you have one hell of a professional-grade printer.
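For anyone who hasn’t thought about what sub-pixel anti-aliasing actually does, here’s a toy illustration of the general idea (just the idea; this is not how Core Graphics implements it). Ordinary anti-aliasing computes one coverage value per pixel; sub-pixel anti-aliasing computes three, one for each of the red, green, and blue stripes that make up an LCD pixel, which triples the effective horizontal resolution at the cost of the color fringing mentioned above:

```python
def coverage(x_left, x_right, edge):
    """Fraction of the horizontal span [x_left, x_right) inked by a glyph
    that covers everything to the left of `edge`."""
    return max(0.0, min(1.0, (edge - x_left) / (x_right - x_left)))

def grayscale_aa(edge, width=8):
    """Ordinary anti-aliasing: one coverage sample (0-255 of ink) per pixel."""
    return [round(255 * coverage(x, x + 1, edge)) for x in range(width)]

def subpixel_aa(edge, width=8):
    """Sub-pixel anti-aliasing: three coverage samples per pixel, one per
    R, G, B stripe. Real renderers also filter across neighboring stripes
    to tame the color fringing this naive version produces."""
    pixels = []
    for x in range(width):
        r = round(255 * coverage(x, x + 1 / 3, edge))
        g = round(255 * coverage(x + 1 / 3, x + 2 / 3, edge))
        b = round(255 * coverage(x + 2 / 3, x + 1, edge))
        pixels.append((r, g, b))
    return pixels

# A vertical glyph edge falling halfway through pixel 2:
print(grayscale_aa(2.5))  # [255, 255, 128, 0, 0, 0, 0, 0]: pixel 2 is a flat 50% ink
print(subpixel_aa(2.5))   # pixel 2 becomes (255, 128, 0): full R stripe, half G, no B
```

The fringing is that uneven split across the color stripes; the smaller the physical pixels, the harder it is to see, which is presumably why the trade-offs vanish at retina density.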

Good on-screen design targeting this caliber of display cannot embrace the pixel, any more than print design can target individual dots-on-paper from a laser printer. The sort of rich, data-dense information design espoused by Edward Tufte can now not only be made on the computer screen but also enjoyed on one. Regarding font choices, you not only need not choose a font optimized for rendering on screen, but should not. Fonts optimized for screen rendering look cheap on the retina MacBook Pro — sometimes downright cheesy — in the same way they do when printed in a glossy magazine.

Great fonts, intricately designed for high-resolution output, aren’t just allowed; they are necessary for a design that truly sings on this display. In fact, if anything falls down on the software side, it’s Lucida Grande, Mac OS X’s system font. It was a stellar choice by Apple in 2001 and has served ably for more than a decade, primarily because it renders so crisply through Apple’s anti-aliasing algorithms.1 In short, Lucida Grande renders better than most fonts on pre-retina displays. But on the retina MacBook Pro, it looks like what it is: a font optimized for low-resolution displays. There’s a reason you seldom see Lucida Grande used in print.2

When I first started using the retina MacBook Pro, the whole thing felt fake, like I was using a demo version of Mac OS X ginned up in After Effects for shooting closeups of the screen for, say, an Apple commercial in which they didn’t want UI elements to look pixelated. Some degree of pixelation has always been part of my Mac experience.

Consider this cursor (shown at 4× magnification):

Mac OS X arrow cursor.

I’ve been staring at a variation of that cursor since the 1980s. Perfectly vertical on the left, stair-stepped at precisely 45° on the right. Now, though, on this machine, it’s a perfect arrow, as perfectly un-jaggedly straight on the diagonal as it is on the vertical.

The pixels are still there, as you can see when the retina arrow cursor is blown up:

Mac OS X arrow cursor at retina display resolution.

but at actual size, there seemingly are no pixels. Just an ideal arrow.

And as summer has worn on and I’ve used the retina MacBook Pro more and more, my impression has been pulled inside out. Now, only the retina MacBook Pro feels real to me, and all my other Macs feel ersatz.3 Low-resolution approximations of the ideal that now sits before my eyes.


  1. In fact, a well-placed little birdie once told me that Apple’s Core Graphics framework is riddled with special-case exemptions specifically to make individual glyphs in Lucida Grande — and only Lucida Grande — render sharper on low-resolution displays. ↩︎

  2. Which raises some intriguing questions. How long will Lucida Grande remain the Mac OS X system font? If it’s replaced — and I think it should be — by what? Helvetica Neue is an obvious choice, given its use as the system font on the golden child iOS. My longshot bet, though, is Myriad. ↩︎

  3. Which puts me in a dilemma, personally. I gave up using a 15-inch MacBook Pro several years ago. Instead, I use one machine at my desk, connected to a standalone display, and I use an 11-inch MacBook Air when I’m away from my desk. When I’m at my desk I want a big standalone display; when I’m away from the desk I want the smallest, lightest MacBook possible. The 15-inch retina MacBook Pro doesn’t fit this model. It’s way heavier and clumsier than the Air when used as a portable (especially on airplanes, a frequent mobile use case for me), and it would be criminal to put this machine on my desk only to hook it up to a fat-pixeled non-retina Cinema Display. There is simply no doubt in my mind that this is the best computer Apple has ever made, by a long shot, but I don’t think I’m going to buy one for myself. A 13-inch MacBook Pro With Retina Display, though, might be a good tradeoff as a replacement for my current (two-year-old) Air. I just can’t see ever again buying a new non-retina Mac of any sort, the extra weight of a 13-inch MacBook Pro compared to the Air be damned. ↩︎