Republishing on Facebook as “good for the world” or “bad for the world” (NY Times, 2020/11/24)

An online social network republishes content partly through algorithms and partly through the judgements of human beings. The output of either may be viewed as positive or negative.

The trade-offs came into focus this month [November 2020], when Facebook engineers and data scientists posted the results of a series of experiments called “P(Bad for the World).”

The company had surveyed users about whether certain posts they had seen were “good for the world” or “bad for the world.” They found that high-reach posts — posts seen by many users — were more likely to be considered “bad for the world,” a finding that some employees said alarmed them.

So the team trained a machine-learning algorithm to predict posts that users would consider “bad for the world” and demote them in news feeds. In early tests, the new algorithm successfully reduced the visibility of objectionable content. But it also lowered the number of times users opened Facebook, an internal metric known as “sessions” that executives monitor closely.

“The results were good except that it led to a decrease in sessions, which motivated us to try a different approach,” according to a summary of the results, which was posted to Facebook’s internal network and reviewed by The Times.

The team then ran a second experiment, tweaking the algorithm so that a larger set of “bad for the world” content would be demoted less strongly. While that left more objectionable posts in users’ feeds, it did not reduce their sessions or time spent.

That change was ultimately approved. But other features employees developed before the election never were.

Roose, Isaac and Frenkel (2020), New York Times
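To make the mechanism concrete: the two experiments amount to scoring each post with a predicted probability P(bad for the world) and demoting its ranking score accordingly, more aggressively in the first experiment, more gently over a larger set in the second. The sketch below is a minimal illustration in Python, not Facebook's actual system; the Post fields, thresholds, and penalty factors are all assumed for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    base_score: float   # hypothetical engagement-based ranking score
    p_bad: float        # model's predicted P("bad for the world"), in [0, 1]

def demoted_score(post: Post, threshold: float, penalty: float) -> float:
    """Demote a post's ranking score when its predicted P(bad) exceeds
    a threshold. Both parameters are illustrative, not Facebook's values."""
    if post.p_bad >= threshold:
        return post.base_score * (1.0 - penalty)
    return post.base_score

def rank_feed(posts: list[Post], threshold: float, penalty: float) -> list[Post]:
    """Order a feed by demoted score, highest first."""
    return sorted(posts,
                  key=lambda p: demoted_score(p, threshold, penalty),
                  reverse=True)

feed = [
    Post("a", base_score=9.0, p_bad=0.85),
    Post("b", base_score=7.0, p_bad=0.40),
    Post("c", base_score=5.0, p_bad=0.10),
]

# First experiment (as described): demote high-P(bad) posts strongly.
strict = rank_feed(feed, threshold=0.8, penalty=0.7)

# Second experiment: demote a larger set (lower threshold) less strongly.
relaxed = rank_feed(feed, threshold=0.3, penalty=0.2)

print([p.post_id for p in strict])   # ['b', 'c', 'a'] — post "a" pushed down
print([p.post_id for p in relaxed])  # ['a', 'b', 'c'] — order barely changes
```

Under these toy numbers, the stricter rule buries the high-scoring objectionable post outright, while the relaxed rule touches more posts but barely reorders the feed, which mirrors the trade-off between demoting "bad for the world" content and preserving sessions described above.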

Scholars might ask whether the information system is designed as an open system that sweeps in new information, or whether the information that circulates becomes self-referential and self-sealing.

This might lead to a deeper look at the inquiring systems behind the creation and dissemination of knowledge, e.g. “Inquiring systems and asking the right question | Mitroff and Linstone (1993)”.

References

Kevin Roose, Mike Isaac and Sheera Frenkel | “Facebook Struggles to Balance Civility and Growth” | New York Times, November 24, 2020 at https://www.nytimes.com/2020/11/24/technology/facebook-election-misinformation.html (reprinted in the Toronto Star, December 19, 2020).

#algorithms, #facebook, #inquiring-system, #inquiry-system