By John Gruber
Kevin Roose, Mike Isaac, and Sheera Frenkel, reporting for The New York Times on Facebook's post-election use of its internal "news ecosystem quality" (N.E.Q.) scores for publishers:
Typically, N.E.Q. scores play a minor role in determining what appears on users’ feeds. But several days after the election, Mr. Zuckerberg agreed to increase the weight that Facebook’s algorithm gave to N.E.Q. scores to make sure authoritative news appeared more prominently, said three people with knowledge of the decision, who were not authorized to discuss internal deliberations.
The change was part of the “break glass” plans Facebook had spent months developing for the aftermath of a contested election. It resulted in a spike in visibility for big, mainstream publishers like CNN, The New York Times and NPR, while posts from highly engaged hyperpartisan pages, such as Breitbart and Occupy Democrats, became less visible, the employees said.
It was a vision of what a calmer, less divisive Facebook might look like. Some employees argued the change should become permanent, even if it was unclear how that might affect the amount of time people spent on Facebook.
Facebook shouldn’t need to inject emergency doses of truth and reality into its news feed. That should be the norm, full stop.
In the past several months, as Facebook has come under more scrutiny for its role in amplifying false and divisive information, its employees have clashed over the company’s future. On one side are idealists, including many rank-and-file workers and some executives, who want to do more to limit misinformation and polarizing content. On the other side are pragmatists who fear those measures could hurt Facebook’s growth, or provoke a political backlash that leads to painful regulation.
This is a good report from the Times, but calling the one side “idealists” and the other “pragmatists” is a disservice to both. Those who want to limit misinformation and polarizing content are good, honest people. There’s nothing “idealistic” about that. And those on the other side, who push misinformation and polarizing content despite knowing how harmful it is, are not “pragmatists”.
Sociopath is the word. The definition fits to a T: a person with a personality disorder manifesting itself in extreme antisocial attitudes and behavior and a lack of conscience. There’s no better word to describe Facebook’s leadership:
The company had surveyed users about whether certain posts they had seen were “good for the world” or “bad for the world.” They found that high-reach posts — posts seen by many users — were more likely to be considered “bad for the world,” a finding that some employees said alarmed them.
So the team trained a machine-learning algorithm to predict posts that users would consider “bad for the world” and demote them in news feeds. In early tests, the new algorithm successfully reduced the visibility of objectionable content. But it also lowered the number of times users opened Facebook, an internal metric known as “sessions” that executives monitor closely.
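To make the mechanics the Times describes concrete, here is a minimal sketch, in Python, of what demoting classifier-flagged posts in a feed-ranking pass might look like. Everything here is hypothetical — the names, the weights, the classifier score — since Facebook’s actual ranking system is not public; the point is only the shape of the trade-off.

```python
# Illustrative only: a toy feed-ranking pass that demotes posts a
# classifier predicts users would rate "bad for the world." All names
# and weights are hypothetical; Facebook's real system is not public.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    engagement_score: float  # baseline ranking signal (likes, comments, etc.)
    p_bad_for_world: float   # classifier's predicted probability, in [0, 1]

def rank_feed(posts: list[Post], demotion_weight: float = 0.5) -> list[Post]:
    """Order posts by engagement, discounted by the predicted
    "bad for the world" probability. A higher demotion_weight trades
    engagement (the "sessions" metric) for less objectionable content,
    which is exactly the trade-off the Times reports."""
    def score(post: Post) -> float:
        return post.engagement_score * (1.0 - demotion_weight * post.p_bad_for_world)
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("measured-news", engagement_score=40.0, p_bad_for_world=0.05),
        Post("hyperpartisan-rant", engagement_score=90.0, p_bad_for_world=0.9),
    ]
    for post in rank_feed(feed):
        print(post.post_id)
    # With demotion_weight=0.5, the rant (90 * 0.55 = 49.5) still edges
    # out the news post (40 * 0.975 = 39.0); raise the weight and it flips.
```

The single knob is the whole story: turn the demotion weight up and “bad for the world” content sinks, along with some engagement; turn it down and the sessions metric recovers.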
Facebook knowingly pushes polarizing misinformation, particularly to conservatives, because it’s addictive. It does so despite knowing exactly what it’s doing, why it’s wrong, and that it’s making the world worse.
Mark Zuckerberg is a sociopath. A real-life Bond villain.
★ Tuesday, 24 November 2020