Linked List: September 5, 2017

Brian X. Chen on the Samsung Galaxy Note 8 

Brian X. Chen, reviewing the new flagship phone from Samsung for The New York Times:

That brings us to what stinks about the Note 8. Some of the biometrics, including the ability to unlock your phone by scanning your face or irises, are so poorly executed that they feel like marketing gimmicks as opposed to actual security features.

The iris scanner shines infrared light in your eyes to identify you and unlock the phone. That sounds futuristic, but when you set up the feature, it is laden with disclaimers from Samsung. The caveats include: Iris scanning might not work well if you are wearing glasses or contact lenses; it might not work in direct sunlight; it might not work if there is dirt on the sensor.

I don’t wear glasses or contact lenses and could only get the iris scanner to scan my eyes properly one out of five times I tried it.

Shipping half-baked features like this is what separates Samsung from Apple.

Pixelmator Pro 

Coming this fall:

The modern, dark single-window interface of Pixelmator Pro has been created exclusively for working with images. Its streamlined, macOS-inspired design provides a completely native Mac app experience and is fully consistent with the look and feel of macOS. And a reimagined, user-centered workflow design makes the professional editing tools in Pixelmator Pro incredibly accessible, even to first-time users.

Pixelmator Pro pushes the boundaries of image editing, using breakthrough machine learning to deliver more intelligent editing tools and features. Integrated via the new, blazing fast Core ML framework, machine learning lets Pixelmator Pro detect and understand various features within images, bringing a number of groundbreaking advancements, such as jaw-droppingly accurate automatic layer naming, automatic horizon detection, stunningly realistic object removal, and intelligent quick selections.

Looks beautiful, and such a great example of an app taking advantage of the APIs in MacOS. Moving from a bunch of floating palettes to a single-window interface feels like the modern way to go.

Behind the Scenes at Apple’s Fitness Lab 

Ben Court, in a feature story for Men’s Health:

Located on a side street in Cupertino — a few miles from Apple’s shiny new headquarters — the single-story building these Apple workers are entering looks like any anonymous suburban office block. Inside, once I clear security and get buzzed past a solid white door, I enter an invite-only secret exercise lab. On a recent morning, about 40 employees are sweating away on different contraptions — rowers, treadmills, cable machines — as 13 exercise physiologists and 29 nurses and medics monitor data. Many of the exercisers are hooked up to a metabolic cart and ECG and are wearing a $40,000 mask apparatus that analyzes their calorie burn, oxygen consumption, and VO2 max. Down one hall there’s a studio for group fitness; behind another white door an endless pool; and over there, three chambers where temperatures can be set to mimic Arctic conditions (subfreezing) to Saharan heat (100°F-plus). At Apple every room has a name, and these climate-controlled chambers are called Higher, Faster, and Stronger.

The labels are appropriate, because the company that transformed the way you enjoy music and video is now sinking its teeth into a meatier challenge: new ways you can optimize your health. “Our lab has collected more data on activity and exercise than any other human performance study in history,” says Jay Blahnik, Apple’s director of fitness for health technologies, in a rare interview. “Over the past five years, we’ve logged 33,000 sessions with over 66,000 hours of data, involving more than 10,000 unique participants.” A typical clinical trial enrolls fewer than a hundred participants.

Donning my Apple Kremlinologist hat, I take this story as a strong sign that we’ll see new Apple Watch hardware at next week’s event. Otherwise, why do it now?

The Red Sox Used Electronic Devices, Including Apple Watch, to Steal Signs Against Yankees 

Michael S. Schmidt, reporting for The New York Times:

Investigators for Major League Baseball have determined that the Boston Red Sox, who are in first place in the American League East and likely headed to the playoffs, executed a scheme to illicitly steal hand signals from opponents’ catchers in games against the second-place Yankees and other teams, according to several people briefed on the matter. [...]

The Yankees, who had long been suspicious of the Red Sox stealing catchers’ signs in Fenway Park, contended the video showed a member of the Red Sox training staff looking at his Apple Watch in the dugout and then relaying a message to players, who may have then been able to use the information to know the type of pitch that was going to be thrown, according to the people familiar with the case.

Baseball investigators corroborated the Yankees’ claims based on video the commissioner’s office uses for instant replay and broadcasts, the people said. The commissioner’s office then confronted the Red Sox, who admitted that their trainers had received signals from video replay personnel and then relayed that information to some players — an operation that had been in place for at least several weeks.

What is it with these New England teams and their need to cheat?

Also: has there ever been a more Daring Fireball-worthy story than this?

The Very Bad Economics of Killing DACA 

Paul Krugman:

So this is a double blow to the U.S. economy; it will make everyone worse off. There is no upside whatever to this cruelty, unless you just want to have fewer people with brown skin and Hispanic surnames around. Which is, of course, what this is really all about.

Tech Industry Calls for Legislative Action After Trump Administration Announces End to ‘Dreamer’ Immigration Program 

Nat Levy has a good roundup of tech industry leaders’ responses to the Trump administration’s announcement today that it will end the DACA program in six months. The CEOs of Microsoft, Google, Facebook, Apple, and more are speaking in unison on this.

Cortana and Alexa, Sitting in a Tree 

Nick Wingfield, reporting for The New York Times:

For the past year, the two companies have been coordinating behind the scenes to make Alexa and Cortana communicate with each other. The partnership, which the companies plan to announce early Wednesday, will allow people to summon Cortana using Alexa, and vice versa, by the end of the year. [...]

Initially, getting the two systems to work together is going to be a little awkward. Someone working with an Alexa device will have to say “Alexa, Open Cortana” followed by their command, while someone starting with a Cortana machine will have to say “Cortana, Open Alexa.”

It’s certainly interesting that Microsoft and Amazon are collaborating on this, but telling one assistant to “open” the other is really awkward. You, the user, shouldn’t have to memorize which tasks go with which assistant.

A.I. assistants remain in their infancy. This feels like getting two babies to talk to each other, when what we really need is to nurture them so that they can mature as fast as possible.