By John Gruber
Brian Krebs:
I cannot recall a previous data breach in which the breached company’s public outreach and response has been so haphazard and ill-conceived as the one coming right now from big-three credit bureau Equifax, which rather clumsily announced Thursday that an intrusion jeopardized Social Security numbers and other information on 143 million Americans.
Bloomberg moved a story yesterday indicating that three top executives at Equifax sold millions of dollars worth of stock during the time between when the company says it discovered the breach and when it notified the public and investors.
Shares of Equifax’s stock on the New York Stock Exchange were down more than 13 percent at the time of publication versus yesterday’s price.
The executives reportedly told Bloomberg they didn’t know about the breach when they sold their shares. A law firm in New York has already announced it is investigating potential insider trading claims against Equifax.
Raise your hand if you believe they really weren’t aware of the breach.
This piece published last month in Apple’s Machine Learning Journal has a table at the bottom with audio clips comparing how the U.S. Siri voice sounded in iOS 9, 10, and 11. It truly is striking how much better she sounds now — and the improvements last year in iOS 10 were pretty good, too.
David Pierce, writing for Wired:
When I ask Acero what he learned about why the voice worked so well, he laughs because the answer is so obvious. “It is natural!” he says. “It was not robotic!” This hardly counts as a revelation for Acero. Mostly, it confirmed that his team at Apple has spent the last few years on the right project: making Siri sound more human.
This fall, when iOS 11 hits millions of iPhones and iPads around the world, the new software will give Siri a new voice. It doesn’t include many new features or tell better jokes, but you’ll notice the difference. Siri now takes more pauses in sentences, elongates syllables right before a pause, and the speech lilts up and down as it speaks. The words sound more fluid and Siri speaks more languages, too. It’s nicer to listen to, and to talk to.
Siri’s voice does sound more natural in iOS 11, and this is most definitely a good thing. It’s the voice assistant equivalent to getting a better UI font or retina graphics for a visual UI. But: if given a choice between a Siri that sounds better but works the same, or a Siri that sounds the same but works better, I don’t know anyone who wouldn’t choose the latter.
That slow pace has cost Apple its lead in many people’s eyes, as Amazon and Google hoover up developer support and race ahead in features. Joswiak at least projects patience. The question, he says, is not how many things Siri could do. “It’s ‘how do you do it right?’ Because what we didn’t want to do is become prescriptive.” He bristles at Amazon’s and Google’s demanding syntax, which requires you to say things like, “Alexa, ask Daily Horoscopes about Taurus” or “OK Google, let me talk to Todoist.” He’d rather wait until you just say what you want, however you want, and have it happen. Apple, as always, prefers doing nothing to doing something halfway.
I get this, and agree with Apple’s sentiment here. The rigid, convoluted syntax required by Alexa is maddening. It’s like speaking a command line, not talking. But even so, Siri, as it stands today, is at best a halfway product. I’m pro-Siri in the voice assistant debate, yet I think it’s generous to describe it even as “halfway”. The whole category is garbage, Siri included. And frankly, it just doesn’t feel like Apple has made as much progress in six years as it should have.
Something went wrong in Siri’s development, and it wasn’t the voice quality.
The Wall Street Journal, under the byline “Yoko Kubota in Tokyo, Tripp Mickle in San Francisco, and Takashi Mochizuki in Tokyo”:
The production delays earlier this summer stemmed in part from Apple’s decision to build new phones using organic light-emitting diode, or OLED, screens similar to those used by rival Samsung Electronics Co. At the same time, Apple decided to ditch the physical home button that contains fingerprint sensors for unlocking the device. Apple tried to embed the Touch ID function, or fingerprint scanner, in the new display, which proved difficult, the people familiar with the process said.
As deadlines approached, Apple eventually abandoned the fingerprint scanner, the people said, and users will unlock the phone using either an old-fashioned password or what is expected to be a new facial-recognition feature. Nonetheless, precious time was lost and production was put back by about a month, according to people familiar with the situation.
By this account, it sounds like Apple is rushing D22 to market, before it’s actually good enough to ship. But it really depends who the Journal’s sources are. With two of the bylined reporters in Tokyo, it seems very possible the sources are in the supply chain, not at Apple.
It all comes down to how good the facial recognition is. If it’s as fast, reliable, trustworthy, and convenient as Touch ID, then omitting Touch ID is a legitimate design choice. Forward progress on biometrics. If it’s worse than Touch ID in any meaningful way, it’s an inexcusable mistake.
Mark Wilson, writing for Fast Company:
Using a technique called the DolphinAttack, a team from Zhejiang University translated typical vocal commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation process lets them take control of gadgets with just a few words uttered in frequencies none of us can hear. […]
An intruder who wanted to “open the backdoor” would already need to be inside your home, close to your Echo. But hacking an iPhone seems like no problem at all. A hacker would merely need to walk by you in a crowd. They’d have their phone out, playing a command in frequencies you wouldn’t hear, and you’d have your own phone dangling in your hand. So maybe you wouldn’t see as Safari or Chrome loaded a site, the site ran code to install malware, and the contents and communications of your phone were open season for them to explore.
It’s a clever hack, and something Apple, Amazon, Google, et al. ought to address. But if you have a passcode on your iPhone (and you should), Siri won’t open websites while locked. It will place phone calls, though.
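The core trick the researchers describe is amplitude-modulating an ordinary voice command onto an ultrasonic carrier: the carrier sits above human hearing, but nonlinearity in a microphone’s hardware demodulates the envelope back into the audible band, where the assistant’s software hears a normal command. Here’s a minimal NumPy sketch of that modulation step, assuming standard DSB-AM; the 25 kHz carrier, 0.8 modulation depth, and function names are illustrative choices of mine, not details from the paper.

```python
import numpy as np

def modulate_ultrasonic(voice, fs, carrier_hz=25_000.0, depth=0.8):
    """Amplitude-modulate a baseband voice signal onto an ultrasonic carrier.

    `voice` is a float array scaled to [-1, 1]; `fs` (the sample rate)
    must exceed twice `carrier_hz` to represent the carrier at all.
    The output is normalized back into [-1, 1].
    """
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM with carrier: (1 + depth * voice) * carrier.
    # A microphone's nonlinear response recovers the (1 + depth * voice)
    # envelope, i.e. the original command, in the audible band.
    return (1.0 + depth * voice) * carrier / (1.0 + depth)

# Stand-in "voice": one second of a 440 Hz tone at 96 kHz, a rate high
# enough to carry a 25 kHz ultrasonic signal.
fs = 96_000
t = np.arange(fs) / fs
voice = 0.5 * np.sin(2 * np.pi * 440.0 * t)
signal = modulate_ultrasonic(voice, fs)
```

Actually playing such a signal would also require speaker hardware that can emit ultrasound cleanly, which is part of why the attack needs the adversary to be physically nearby.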