Experts Slam Social Media Platforms' Data Policies
Hearing: Researchers Liken Major Platforms to a 'Disinformation Black Box'
Cybersecurity and computer science experts testifying before Congress on Tuesday expressed concerns about their inability to access key social media data sets that could allow them to analyze and potentially help counter the spread of misinformation.
In a hearing titled "The Disinformation Black Box: Researching Social Media Data" before the House Committee on Science, Space, and Technology's Subcommittee on Investigations and Oversight, academic researchers from Northeastern University, New York University and the University of Illinois Urbana-Champaign told lawmakers that the business model for major social media platforms - which they described as "maximizing user engagement" - is flawed and may perpetuate the flow of misinformation.
Witnesses called on Congress to develop a regulatory regime that could mandate that social media platforms make their user data available to the research community.
Similarly, a data portability bill being considered in the House - the ACCESS Act of 2021 - would mandate that companies allow users to move their data across platforms, spurring competition and heightening data visibility.
A Facebook spokesperson declined to comment specifically on the hearing. And Twitter did not immediately respond to Information Security Media Group's request for comment.
Facebook has previously said it "aims to support the growth of scientific knowledge in the areas of misinformation, polarization, information quality, and social conflict on social media and social technology platforms." In announcing winners for a research competition around these subjects in June, Facebook's Head of Research, Pratiti Raychoudhury, said, "Our collaborations with researchers from all over the world are critical to advancing our understanding of how technology impacts people and society."
Striking a Balance
Rep. Bill Foster, D-Ill., chairman of the subcommittee, said in his opening statement that while some social media companies may want to keep "a veil" over their inner workings, they should not be shielded from "outside accounting of how their platforms may be endangering public health and safety."
Foster continued, "You must strike a balance between protecting user privacy and confidential business information, while also acknowledging that objective, independent research is necessary to understand how these platforms influence modern society."
Rep. Jay Obernolte, R-Calif., the subcommittee's ranking member, said, "On one hand, we want to be a society that honors the exercise of free speech, but that is fundamentally in tension with the idea that we also have an obligation to stop the spread of misinformation."
Fact vs. Fiction
Dr. Alan Mislove, professor and interim dean of Khoury College of Computer Sciences at Northeastern University, who is a member of the university's Cybersecurity and Privacy Institute, testified that social media platforms are "very hesitant" to release data and "have often only released aggregated coarse-grained data in the face of scandal and public backlash."
His key message: "Researchers need Congress to sign into law requirements for platforms to make data available. … We need Congress' help to enable researchers to have sufficient access to data and social media platforms and work to ensure that the benefits of these platforms do not come at a cost that is too high for society."
Laura Edelson, co-director of Cybersecurity for Democracy at New York University, a research initiative aimed at exposing online threats, said Twitter "is the only major social media platform that allows most researchers access to public data … albeit at a high financial cost."
She said Facebook's 2016 acquisition of social monitoring platform CrowdTangle "offered access to public Facebook data," but she called for expanded access, saying "very few researchers are allowed to [view] this tool."
"[Facebook's] own internal research has shown that the way they have built their algorithm disproportionately promotes misinformation and extreme content," Edelson told the committee. "To study these issues, all researchers need access to much more data than Facebook or most other platforms provide."
Dr. Kevin Leicht, professor in the University of Illinois Urbana-Champaign's Department of Sociology, who heads a multidisciplinary team studying social media misinformation, told lawmakers, "We know [misinformation is] not necessarily spread by nefarious individuals on the dark web, and we know what types of people are susceptible to consuming misinformation. We also know that combating [it] is harder the more [it is] repeated."
And Edelson continued, "I think we all want to get to a place where misinformation isn't prioritized and is not in the fast lane against factual content."
'Inherent, Systemic Problem'
Asked whether she believes Facebook chooses to promote potentially controversial content, Edelson told the committee, "I want to be clear about one thing. I don't think Facebook … has sat down and made the choice, 'We will promote this information.' [But] it has chosen to promote the most engaging content, and when its own internal research told it that the most engaging content was misinformation, or it was the most polarizing content, or hateful content, it didn't do anything about it."
Edelson told the subcommittee that it is "an inherent, systemic problem" when platforms are built around maximizing user engagement. "We probably need some regulation for this industry in the same way that we have regulation for pharmaceutical companies, or for banks," she said.
In a CNBC op-ed in May, Nick Clegg, Facebook's vice president of global affairs and former deputy prime minister of the U.K., advised Congress to "set out clear rules on data portability to better enable people to move their data between services and 'vote with their feet.'
"It could also create rules to govern how platforms should share data for the public good. As society grapples with how to address misinformation, harmful content and rising polarization, Facebook research could provide insights that help design evidence-based solutions. But to do that, there needs to be a clear regulatory framework for data research that preserves individual privacy."
And on the still-pending ACCESS Act of 2021, Rep. Jerry Nadler, D-N.Y., chairman of the House Judiciary Committee, previously said, "[It] gives the Federal Trade Commission new authority and enforcement tools to establish pro-competitive rules for interoperability and data portability online. ... Importantly, the ACCESS Act also protects user privacy and data security. The bill empowers users to determine how and with whom their data is shared."