By John Gruber
Carole Cadwalladr, in an eye-opening piece for The Guardian, “Google, Democracy, and the Truth About Internet Search”:
Here’s what you don’t want to do late on a Sunday night. You do not want to type seven letters into Google. That’s all I did. I typed: “a-r-e”. And then “j-e-w-s”. Since 2008, Google has attempted to predict what question you might be asking and offers you a choice. And this is what it did. It offered me a choice of potential questions it thought I might want to ask: “are jews a race?”, “are jews white?”, “are jews christians?”, and finally, “are jews evil?”
Are Jews evil? It’s not a question I’ve ever thought of asking. I hadn’t gone looking for it. But there it was. I press enter. A page of results appears. This was Google’s question. And this was Google’s answer: Jews are evil. Because there, on my screen, was the proof: an entire page of results, nine out of 10 of which “confirm” this. The top result, from a site called Listovative, has the headline: “Top 10 Major Reasons Why People Hate Jews.” I click on it: “Jews today have taken over marketing, militia, medicinal, technological, media, industrial, cinema challenges etc and continue to face the worlds [sic] envy through unexplained success stories given their inglorious past and vermin like repression all over Europe.”
The top suggestion for a query starting with “are women” was “are women evil”, and the top suggested result displayed with a preview on the results page, beginning with “Every woman has some degree of prostitute in her. Every woman has a little evil in her.”
A few days later, I talk to Danny Sullivan, the founding editor of SearchEngineLand.com. He’s been recommended to me by several academics as one of the most knowledgeable experts on search. Am I just being naive, I ask him? Should I have known this was out there? “No, you’re not being naive,” he says. “This is awful. It’s horrible. It’s the equivalent of going into a library and asking a librarian about Judaism and being handed 10 books of hate. Google is doing a horrible, horrible job of delivering answers here. It can and should do better.”
He’s surprised too. “I thought they stopped offering autocomplete suggestions for religions in 2011.” And then he types “are women” into his own computer. “Good lord! That answer at the top. It’s a featured result. It’s called a ‘direct answer’. This is supposed to be indisputable. It’s Google’s highest endorsement.” That every woman has some degree of prostitute in her? “Yes. This is Google’s algorithm going terribly wrong.”
Turns out, being a passive, hands-off player in the world’s information means that people who put bigotry out there win simply for playing.
In other words, in the knowledge that bigoted, motivated people exist, inaction or indifference is an immoral and unethical decision.
I truly believe Google is staffed by great people who are not bigoted. But as a company, they treat bigotry as mere “opinion”, not as harm.
★ Monday, 5 December 2016