Science editors raise new doubts on Meta’s claims it isn’t polarizing

Meta said it had been transparent with researchers about its actions during the time of the study. (Reuters)

Summary

The editorial warns that Facebook’s changes after the 2020 election may have swayed the conclusions of a prominent study.

Meta Platforms’ claims that Facebook doesn’t polarize Americans came under new doubt as the journal Science raised questions about a prominent research paper the tech giant has cited to support its position.

In an editorial Thursday, Science said that Meta’s emergency efforts to calm its platforms in the wake of the 2020 election may have swayed the conclusions of the paper, which the journal published in July 2023.

The editorial, titled “Context matters in social media,” was prompted by a letter that Science also published presenting new criticism of the paper. Because the study of Facebook’s algorithms relied on data provided by Meta while the company was undertaking extraordinary efforts to restrain incendiary political content, the letter’s authors argue that the paper may have overstated the case that social media algorithms didn’t contribute to political polarization.

Such criticisms of peer-reviewed research often appear below papers in academic journals, but Science’s editors felt their editorial was needed to more prominently caveat this original paper’s conclusions, said Holden Thorp, Science’s editor in chief.

“It was incumbent on us to come up with a way somehow that people who would come to the paper would know of these concerns,” Thorp said in an interview. While no correction was warranted, he said, “There’s an election coming up, and we care about people citing this paper.”

Meta said it had been transparent with researchers about its actions during the period of the study, and both the company and its research partners say Meta had no control over the Science paper’s conclusions. Meta characterized debates of the sort aired Thursday as part of the research process.

“Questions about the role of social media and society will be discussed for many decades to come,” said Meta spokesman Andy Stone. “We believe the data we’ve shared and will continue to share with researchers will contribute to these conversations in a meaningful and helpful way.”

The new criticisms are the latest example of tension over how Meta interacts with researchers seeking access to its data as they try to understand the effects of its platforms. The study at issue was one of several published in the summer of 2023 as part of a collaboration between Meta and scholars about social media’s effects on politics—an effort that spawned disagreement over the conclusions even before the results were published.

The study in question—titled “How do social media feed algorithms affect attitudes and behavior in an election campaign?”—involved an experiment in which Meta turned off its normal algorithmically curated stream of posts for a group of volunteers, showing them posts in strictly chronological order for 90 days. Researchers then looked at what users saw and surveyed them to detect changes in their political sentiments.

The resulting paper concluded that disabling algorithmic content sorting didn’t lower polarization or exposure to misinformation, but did reduce satisfaction with Facebook. While noting that the findings were strictly applicable only to Facebook’s algorithms “as they existed in the fall of 2020,” the researchers suggested their work cast doubt on the idea that social media could be a root cause of polarization.

Meta embraced this interpretation. Nick Clegg, president of global affairs, cited the research as debunking claims that Facebook and Instagram “serve people content that keeps them divided,” a position that Stone said Meta still holds.

Both Science and some of the paper’s authors called Meta’s statements an overreach at the time, but the journal’s new editorial said that the letter from critics published Thursday “further casts doubt on Facebook’s contentions.”

During part of the period when the study’s data was collected, the critical letter notes, Facebook temporarily altered its algorithms to reduce election-related misinformation and incitement. Known at Meta as “break the glass” measures, these algorithm changes restricted the spread of posts from users known to post misinformation, heightened precautions against pushing violent content and shifted toward favoring “higher quality” news outlets.

Such changes undermined the study’s ability to characterize the effects of Facebook’s algorithm under regular conditions, the critics said.

A rebuttal from the original paper’s authors disputed that Meta’s emergency changes fundamentally altered their conclusions, but said that the potential impact of the emergency algorithm changes should have received more emphasis.

“In conclusion, we appreciate the opportunity to reiterate the limitations of our paper,” the authors wrote.

Information about the “break the glass” measures was also publicly available before the paper’s publication, including in Wall Street Journal articles published in 2021.

Limitations of data-sharing partnerships with Meta have prompted some researchers to push for government-mandated access to social-media companies’ data.

“We’ve seen over and over that we can’t rely on voluntary efforts to get the sort of transparency that we want from large platforms,” said Brandon Silverman, the creator of CrowdTangle, a data-transparency tool that Meta operated before shutting it down last month.

Still, such data-sharing partnerships continue. A new one, to be run by the Center for Open Science, will study the effects of Instagram on user mental health—though Meta won’t share what content Instagram users post, share and consume, significantly restricting the project’s scope.

Brian Nosek, head of the center, said that some researchers involved in previous efforts had warned him about the difficulties of voluntary collaborations with Meta. He chose to pursue the coming Instagram mental health research project, he said, because of “relentless optimism” and a belief that research into the societal effects of social media was too important not to try.

“It’s still worth giving it a shot,” he said.
