Does Facebook Polarize Users? Meta Disagrees With Partners Over Research Conclusions


Company and researchers differ in how to interpret the findings from studies of the 2020 presidential election

Meta executive Nick Clegg says the studies undermine claims that the company’s algorithms ‘serve people content that keeps them divided.’

Photo: Kenzo Tribouillard/Agence France-Presse/Getty Images

By Jeff Horwitz
July 27, 2023 3:53 pm ET

A multiyear research collaboration between Meta Platforms and a group of scholars about social media’s effects on politics stumbled into a disagreement over the conclusions before the results were even announced.

Four peer-reviewed papers from the research effort were published in the journals Science and Nature on Thursday, the beginning of what is expected to be many more based on data that Meta provided to the independent researchers.

The research examined the role of Meta’s Facebook unit in politics during the 2020 election and analyzed questions such as whether the platform plays a role in polarization. Meta has long disputed claims, including from some of its own researchers, that it exacerbates divides in part through how it filters the information people see.

The studies covered in the first four papers had different focuses. Broadly, they showed the influence that Facebook holds over what information users consume on the platform, but found it isn’t clear that such content affects users’ political views and behavior.

An embargoed statement from Meta said the studies refute accusations that its platforms play a central role in polarizing users and inflaming political discourse. That prompted the company’s academic partners, officials at Science, and an independent observer brought in to oversee the research effort to say that Meta was overstating or mischaracterizing some of the findings.

Meta said it took issue with the title of Science’s print package on the findings, “Wired to Split,” which it said falsely characterized the work. Representatives of the publication said Meta and outside researchers had asked for a question mark to be added to the title to reflect uncertainty, but that the publication considers its presentation of the research to be fair.


Using data gathered over several months around the 2020 U.S. elections, one paper found that Facebook Pages and Groups had played an outsize role in spreading false news stories among American conservatives on the platform. 

Other studies found no clear effect on users’ political opinions or behavior when researchers experimented with changes to how Facebook delivers posts. Those experiments included temporarily restricting reshared posts in users’ feeds, which tend to be more political than posts recommended by Facebook’s algorithms, and temporarily showing users posts in chronological sequence rather than the algorithmically determined order Facebook usually uses.

Meta said that no other tech company has given researchers similar data access and that the studies added to a growing body of research refuting the idea that “key features of Meta’s platforms alone” cause harmful polarization. That statement came from a blog post by Nick Clegg, Meta’s global affairs president, which the company prepared to publish alongside the findings.

While the research won’t settle every debate about social media and democracy, Clegg wrote, the papers published Thursday undermine claims that the company’s algorithms “serve people content that keeps them divided.”

That characterization of the papers in Meta’s statement triggered disagreement. 

The leaders of the academic team, New York University professor Joshua Tucker and University of Texas at Austin professor Talia Stroud, said that while the studies demonstrated that the simple algorithm tweaks didn’t make test subjects less polarized, the papers contained caveats and potential explanations for why such limited alterations, conducted in the final months of the 2020 election, wouldn’t have changed users’ overall outlook on politics.

“The conclusions of these papers don’t support all of those statements,” said Stroud. Clegg’s comment is “not the statement we would make.”

Science warned Meta earlier this week that it would publicly dispute an assertion that the published studies should be read as largely exonerating Meta of a contributing role in societal divisions, said Meagan Phelan, who oversees the communication of Science’s findings.

“The findings of the research suggest Meta algorithms are an important part of what is keeping people divided,” Phelan told Meta’s communications team on Monday, according to an excerpt of her message she shared with The Wall Street Journal. She added that one of the studies found that “compared to liberals, politically conservative users were far more siloed in their news sources, driven in part by algorithmic processes, and especially apparent on Facebook’s Pages and Groups.”

In response, Meta slightly altered the wording of Clegg’s blog post, but stood by its overall characterization of the findings, saying disagreements over research interpretation are normal.

In a brief interview, Science editor Holden Thorp called Meta’s revisions to Clegg’s statement inconsequential. 

“Science is right to disagree,” said Michael Wagner, a political scientist at the University of Wisconsin-Madison’s School of Journalism jointly chosen by Meta and the outside researchers as an independent rapporteur to document the collaborative research process and evaluate its independence and rigor.

Wagner and others familiar with the research said they expect a number of additional papers from the effort to find significant cause for concern regarding how Meta’s platforms shape political discourse. 

Meta spent more than $20 million on its work facilitating the research, and its staffers worked with the researchers, but it agreed not to have control over what they said, according to Wagner. Funding for the researchers’ work came from other sources.

Wagner said “Meta demonstrated a strong commitment to rigorous social scientific research.” But he said Meta “set the agenda in ways that affected the overall independence of the researchers,” such as through decisions over workflow.

Ravi Iyer, a former Meta data scientist focused on polarization issues, said that past Meta research had also found evidence that Facebook’s mechanics sometimes favored incendiary content and conflict.

“I applaud the effort of everyone involved, but it is important to contextualize these experiments alongside the dozens of other similar studies that Meta has done on the effects of reshares,” Iyer, who now manages the University of Southern California’s Psychology of Technology Institute, said in an interview.

Write to Jeff Horwitz at [email protected]

