October 11, 2024

From Disinformation to Publication: A Case Study in Academic Failure


In the realm of academic publishing, a 1,300-fold error should do more than raise eyebrows; it should sound alarms. Yet at the 14th ACM Web Science Conference 2022, sponsored by SIGWEB, just such a colossal miscalculation found its way into the proceedings, calling into question the very foundations of academic integrity in the digital age.

The journey of misinformation from a flawed report to peer-reviewed academic literature is a cautionary tale that underscores the vulnerabilities in our systems of knowledge production and dissemination. This article examines how the controversial "Disinformation Dozen" report, despite its significant methodological flaws, managed to infiltrate academic discourse and spawn a series of studies that further entrenched its dubious claims.

The Origin: The CCDH Report

On March 24, 2021, the Center for Countering Digital Hate (CCDH) released a 40-page report titled "The Disinformation Dozen," which claimed that just 12 individuals were responsible for 65% of anti-vaccine content on social media platforms.1 The report named:

  1. Dr. Joseph Mercola
  2. Robert F. Kennedy Jr.
  3. Ty and Charlene Bollinger
  4. Dr. Sherri Tenpenny
  5. Rizza Islam
  6. Dr. Rashid Buttar
  7. Erin Elizabeth
  8. Sayer Ji
  9. Dr. Kelly Brogan
  10. Dr. Christiane Northrup
  11. Dr. Ben Tapper
  12. Kevin Jenkins

The Flaws Emerge: Meta's Rebuttal

On August 18, 2021, Meta (formerly Facebook) released a statement disputing the CCDH's findings. According to Meta, there was no evidence to support the claim that the "Disinformation Dozen" were responsible for such a significant portion of anti-vaccine content on its platforms. In fact, Meta reported that these 12 individuals accounted for only about 0.05% of all views of vaccine-related content on Facebook, a figure that includes both accurate and inaccurate posts.2

This revelation exposed a staggering 1,300-fold discrepancy between the CCDH's assertion (65%) and the figure reported by Meta (roughly 0.05%), undermining the entire premise of the "Disinformation Dozen" narrative.
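
The 1,300 figure follows from dividing the two published percentages directly, a simplification the comparison above already makes, since the CCDH measured shares of shared content while Meta measured shares of views. A minimal arithmetic check:

```python
# Back-of-the-envelope check of the discrepancy factor, using only the two
# published figures. Dividing them directly is a simplification: the CCDH
# figure describes shared content, Meta's describes views.
ccdh_claimed_share = 0.65     # 65% of anti-vaccine content, per the CCDH report
meta_reported_share = 0.0005  # ~0.05% of vaccine-content views, per Meta

discrepancy = ccdh_claimed_share / meta_reported_share
print(f"Discrepancy factor: {discrepancy:,.0f}x")  # -> 1,300x
```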

Methodological Shortcomings

A closer examination of the CCDH report reveals several critical flaws in its methodology:

  1. Limited Sample Size: The report analyzed only 483 pieces of content over six weeks.
  2. Narrow Focus: The content was drawn from just 30 groups, some with as few as 2,500 members.
  3. Lack of Representativeness: Such a sample is not representative of the billions of posts about COVID-19 vaccines shared on Facebook and Instagram (see the illustrative sketch after this list).
  4. Unclear Criteria: The CCDH did not explain how it identified content as "anti-vax" or how it chose the groups included in its analysis.
  5. Disregard for Platform Efforts: The report failed to acknowledge the steps Meta had taken to combat what it called "vaccine misinformation," including removing over three dozen pages, groups, and Facebook or Instagram accounts linked to the "Disinformation Dozen."
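
To make points 1 through 3 concrete, the toy simulation below uses entirely invented numbers, drawn from neither the CCDH report nor Meta's data. It shows how a sample restricted to a handful of groups where a few accounts are over-represented will attribute a far larger share of content to those accounts than a platform-wide measurement would:

```python
import random

random.seed(42)

# Entirely hypothetical numbers, for illustration only.
TOTAL_POSTS = 1_000_000   # vaccine-related posts platform-wide
DOZEN_POSTS = 500         # posts by the 12 tracked accounts (0.05%)

# Platform-wide share attributable to the tracked accounts.
platform_share = DOZEN_POSTS / TOTAL_POSTS

# A narrow sample drawn only from a few groups where those accounts are
# heavily over-represented (here, half of the pool), mimicking a study that
# examines a few hundred items from a small set of hand-picked groups.
group_pool = ["dozen"] * 5_000 + ["other"] * 5_000
sample = random.sample(group_pool, 483)  # 483 items, as in the CCDH sample
sampled_share = sample.count("dozen") / len(sample)

print(f"Platform-wide share:    {platform_share:.2%}")   # 0.05%
print(f"Share in narrow sample: {sampled_share:.2%}")    # roughly 50%
```

The point is not the specific percentages but the mechanism: when the sampling frame is itself skewed toward the accounts under study, the resulting share estimate says little about the platform as a whole.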

From Flawed Report to Academic Paper: The Swiss Connection

Despite these glaring issues, the "Disinformation Dozen" narrative found its way into academic research. At the 14th ACM Web Science Conference 2022 (WebSci '22), sponsored by SIGWEB (Special Interest Group on Hypertext and the Web), a paper titled "The Disinformation Dozen: An Exploratory Analysis of Covid-19 Disinformation Proliferation on Twitter" was presented and published in the proceedings.3

This research was not conducted in isolation. It was funded by a grant from the Swiss National Science Foundation (SNSF), raising questions about the role of public funding in potentially flawed research.

Grant Details:

  • Grant number: 195707
  • Funding scheme: Spark
  • Call: Spark 2020 March
  • Approved amount: 96,450 CHF
  • Status: Completed
  • Research institution: University of Applied Sciences and Arts of Southern Switzerland (SUPSI)
  • Institute: NetLab ISIN - DTI SUPSI

Principal Investigator:

Silvia Giordano, Thermo-Fluid Dynamics Laboratory, Department of Innovative Technologies, SUPSI, Switzerland

The Ripple Effect: Proliferation in Academic Literature

The inclusion of this flawed research in a reputable academic conference has had far-reaching consequences. Numerous peer-reviewed studies have since cited and built upon the "Disinformation Dozen" narrative, creating a self-reinforcing cycle of misinformation within academic literature. Below is a selection of studies that have referenced or built upon this concept:

  • Ng, L., Kloo, I., Clark, S., & Carley, K. (2024). An exploratory analysis of COVID bot vs human disinformation dissemination stemming from the Disinformation Dozen on Telegram. Journal of Computational Social Science, 7(1), 695-720.
  • Sharma, S., Sharma, R., & Datta, A. (2024). (Mis)leading the COVID-19 Vaccination Discourse on Twitter: An Exploratory Study of Infodemic Around the Pandemic. IEEE Transactions on Computational Social Systems, 11(1), 352-362.
  • Bârgăoanu, A., Buturoiu, R., & Durach, F. (2024). Predictors of COVID-19 Vaccine Acceptance: The Role of Trust and the Influence of Social Media. Social Work in Public Health, 39(1), 20-35.
  • Ahmed, W., Önkal, D., Das, R., Krishnan, S., Olan, F., Hardey, M., & Fenton, A. (2024). Developing Techniques to Support Technological Solutions to Disinformation by Analyzing Four Conspiracy Networks During COVID-19. IEEE Transactions on Engineering Management, 71, 13327-13344.
  • Pierri, F., DeVerna, M., Yang, K., Axelrod, D., Bryden, J., & Menczer, F. (2023). One Year of COVID-19 Vaccine Misinformation on Twitter: Longitudinal Study. Journal of Medical Internet Research, 25, e42227.
  • Kareklas, I., Bhattacharya, D., Muehling, D., & Kisekka, V. (2023). Reexamining health messages in the political age: The politicization of the COVID‐19 pandemic and its detrimental effects on vaccine hesitancy. Journal of Consumer Affairs, 57(3), 1120-1150.
  • McKenzie, A., Avshman, E., Shegog, R., Savas, L., & Shay, L. (2024). Facebook's shared articles on HPV vaccination: analysis of persuasive strategies. BMC Public Health, 24(1).
  • Hassoun, A., Borenstein, G., Osborn, K., McAuliffe, J., & Goldberg, B. (2024). Sowing "seeds of doubt": Cottage industries of election and medical misinformation in Brazil and the United States. New Media & Society.
  • DeVerna, M., Aiyappa, R., Pacheco, D., Bryden, J., Menczer, F., & Guarino, S. (2024). Identifying and characterizing superspreaders of low-credibility content on Twitter. PLOS ONE, 19(5), e0302201.
  • Lyall, B., & Marple, P. (2024). Parliament, petitions and pandemic: Conspiracism in Australia's federal e‐petitions system, 2020-2021. Policy & Internet.
  • Panizza, F., Ronzani, P., Morisseau, T., Mattavelli, S., & Martini, C. (2023). How do online users respond to crowdsourced fact-checking?. Humanities and Social Sciences Communications, 10(1).
  • Ezzeddine, F., Ayoub, O., Giordano, S., Nogara, G., Sbeity, I., Ferrara, E., & Luceri, L. (2023). Exposing influence campaigns in the age of LLMs: a behavioral-based AI approach to detecting state-sponsored trolls. EPJ Data Science, 12(1).
  • Pierri, F., Luceri, L., Chen, E., & Ferrara, E. (2023). How does Twitter account moderation work? Dynamics of account creation and suspension on Twitter during major geopolitical events. EPJ Data Science, 12(1).
  • Rathje, S., Robertson, C., Brady, W., & Van Bavel, J. (2023). People Think That Social Media Platforms Do (but Should Not) Amplify Divisive Content. Perspectives on Psychological Science.
  • Gatta, V., Luceri, L., Fabbri, F., & Ferrara, E. (2023). The Interconnected Nature of Online Harm and Moderation. Proceedings of the 34th ACM Conference on Hypertext and Social Media, 1-10.
  • Schafer, J., Starbird, K., & Rosner, D. (2023). Participatory Design and Power in Misinformation, Disinformation, and Online Hate Research. Proceedings of the 2023 ACM Designing Interactive Systems Conference, 1724-1739.

This extensive yet non-exhaustive list demonstrates how deeply the flawed "Disinformation Dozen" concept has permeated various fields of research, from computational social science to public health and media studies.

Implications

The case of the "Disinformation Dozen" report and its subsequent proliferation through academic channels serves as a cautionary tale about the power of misinformation in the digital age. It underscores several critical issues:

  1. Methodological Rigor: The importance of robust and transparent research methodologies cannot be overstated, especially when dealing with contentious topics that may influence public policy.
  2. Peer Review Process: The fact that such a fundamentally flawed study made it through peer review and into the proceedings of a reputable conference raises questions about the efficacy of current academic vetting processes.
  3. Funding Accountability: The role of public funding bodies, such as the Swiss National Science Foundation, in ensuring the quality and accuracy of the research they support needs to be examined.
  4. Citation Echo Chambers: The way in which subsequent studies uncritically cited and built upon the flawed "Disinformation Dozen" concept highlights the dangers of citation bias and the potential for creating academic echo chambers.
  5. Interdisciplinary Challenges: The complex nature of studying online misinformation requires expertise from multiple disciplines. This case demonstrates the need for more robust interdisciplinary review processes.
  6. Academic Accountability: The lack of response from the authors to inquiries about their work raises concerns about accountability in academic research, particularly when errors or flaws are identified post-publication.
  7. Societal Impact: Given the potential influence of such research on public policy and discourse, there is a pressing need for mechanisms to quickly correct the academic record when significant errors are discovered.

Moving forward, several steps could be taken to address these issues:

  1. Enhance peer review processes, particularly for studies making significant claims about contentious issues.
  2. Implement stricter fact-checking protocols for cited sources in academic publications.
  3. Develop better mechanisms for post-publication critique and correction in academic literature.
  4. Foster a research culture that values quality and reproducibility over novelty and impact.
  5. Improve training for researchers and reviewers in identifying potential red flags in methodology and data interpretation.
  6. Encourage more open dialogue between researchers, critics, and the subjects of their research.

Conclusion: Steps Towards Accountability

While the proliferation of the "Disinformation Dozen" narrative through academic and media channels is concerning, there are signs of progress in addressing this issue. Some mainstream media outlets have shown a willingness to correct the record when presented with evidence of the CCDH report's flaws.

For instance, The Independent, a prominent UK-based news outlet, added the following update to their article on the "Disinformation Dozen" after being contacted by concerned parties:

"Update, 7 August 2024: five months after the publication of this report, Meta issued a statement that disputed the claims contained and the methodology used by its authors."4

This acknowledgment, while brief, represents a step in the right direction. It demonstrates that media organizations can be responsive to new information and are willing to update their reporting to reflect a more accurate picture of events.

Such actions by media outlets are crucial in the fight against misinformation. They serve as a reminder that the pursuit of truth is an ongoing process, and that responsible journalism involves not just reporting but also correcting and updating information as new evidence comes to light.

As we move forward, it is hoped that more media organizations and academic institutions will follow suit, critically examining the sources they rely on and being open to correction when errors are identified. This approach not only serves the interests of accuracy but also helps to rebuild trust in institutions that play a vital role in informing public discourse.

The journey from misinformation to correction is often long and challenging, but each step taken towards accountability and transparency is a victory for the integrity of public information. As we continue to navigate the complex landscape of digital information, such responsiveness offers a glimmer of hope for a more informed and discerning society.


References

1. Center for Countering Digital Hate, "The Disinformation Dozen: Why Platforms Must Act on Twelve Leading Online Anti-Vaxxers," March 2021, https://www.counterhate.com/disinformationdozen.

2. Monika Bickert, "How We're Taking Action Against Vaccine Misinformation Superspreaders," Meta, August 18, 2021, https://about.fb.com/news/2021/08/taking-action-against-vaccine-misinformation-superspreaders/.

3. Gianluca Nogara et al., "The Disinformation Dozen: An Exploratory Analysis of Covid-19 Disinformation Proliferation on Twitter," in WebSci '22: 14th ACM Web Science Conference 2022 (Barcelona, Spain: ACM, 2022).

4. Mayank Aggarwal, "Study names 12 most dangerous anti-vaxxers in America," The Independent, March 26, 2021, https://www.independent.co.uk/news/world/americas/us-politics/disinformation-dozen-study-anti-vaxxers-b1822308.html.
