
Tech & Digitalisation

Response to Full Fact: Consultation Response on Framework for Information Incidents


Briefing | 17th May 2021

In March 2021, Full Fact, a nonprofit organisation that fact-checks claims made by politicians, public institutions, and journalists, as well as viral content online, published a consultation document on its framework for responding to information incidents that occur during moments of crisis. This response was originally submitted on 13th May 2021.

Summary

In April 2021, Google’s video platform, YouTube, launched a new public service announcement series in partnership with the Vaccine Confidence Project and the London School of Hygiene & Tropical Medicine. For the series, YouTube pairs public health experts, such as the White House Chief Medical Advisor, Dr. Anthony Fauci, with online influencers from around the world, including music artists, comedians, and FarmTubers, to discuss Covid-19. The series intends to reach users in new ways and help address questions around Covid-19 vaccine safety and efficacy. In essence, the partnership leverages the existing social-media advertising infrastructure to launch a public-health influencer strategy to counter Covid-19 misinformation and vaccine hesitancy.

Alongside recent social-mobilisation initiatives such as YouTube's PSA series, social media platforms have quickly adapted and responded to Covid-19 misinformation, often changing their community standards to catch and remove false information. Governments, media, fact-checkers, academics, and civil society have also creatively responded to the vast Covid-19 misinformation environment, often in partnership with internet companies. But, as Full Fact writes, these emergency measures and initiatives have "revealed the need for greater discussion on principles, proportionality, and the use of evidence in responding to other types of information incidents in the future".

We strongly welcome Full Fact's framework as a necessary policy document to guide the effective management of the information environment. The effort to define information incidents and recommend proportionate responses is a good example of how to tackle complex internet regulation issues among stakeholders with varied interests. Drawing on our extensive research and consultations in the area of online harms and extremism, our feedback adds insight and recommendations for the effective management of misinformation and disinformation online.

We believe that while physical events and the accompanying news coverage have virtual aspects which affect the information environment, engineered and accidental virtual events in themselves create information environments that require equally robust, proportionate responses. Moreover, a meaningful response to any information incident will require useful definitions of online harms and extremist content, while effective real-time response will require regulatory frameworks for consistent procedural review and audit systems. In summary, our response argues that the framework should:

  1. Expand the information incident categories to include virtual events in their own right

  2. Consider extremist content on the web an ever-present threat to the information environment, and recognise that a meaningful strategy to tackle this event type requires international cooperation to define online harms and extremist content

  3. Recommend consistent procedural review and audit systems as a requirement for the effectiveness of the real-time urgency-of-response and collaboration guidelines for the online information environment

Virtual Events as a Category of Information Incidents

Q1. Is there anything missing, either as a category of information incident, or significant type of situation that might fall within a category? 

We believe that 'virtual events', both coordinated and accidental, should constitute an information incident category. In our paper, Social Media Futures: What is Brigading?, we make the case that coordinated online behaviours have a significant impact on the safety of the virtual and physical information environment. In a recent example of brigading, an online forum, WallStreetBets, became notorious for its role in the GameStop short squeeze, which caused over US$70 billion in losses for some U.S. firms and inevitably shifted the financial information environment in both the short and long term. Accidental virtual events, such as the rise and mainstreaming of NFTs, have similarly shifted the information environment for the creator economy (and, increasingly, the environment) through tokenizing ownership of digital assets.

Tackling Extremist Content Online Through International Collaboration

Q2. Thinking about efforts to combat misinformation in exceptional circumstances, are there any important challenges missing or challenges that you would characterise differently? 

To meaningfully combat misinformation in exceptional circumstances, international collaboration is required to define online harms and categorize/list extremist content and networks so that the urgency of response and collaboration guidelines within Full Fact’s framework may achieve their desired aim.  

Our paper, Tech Companies and the Response to White Supremacist Content Online, illustrates this clearly: following the devastating attack on Muslim worshippers in Christchurch, New Zealand, and the subsequent live streaming of the event on 15 March 2019, Facebook said it removed some 1.5 million videos of the attack in the first 24 hours. Though significant, this was the tip of the iceberg of related content online. While policymakers called on tech companies to do more, the consistent and relatively accurate systems that use machine learning and artificial intelligence to uncover Islamist terrorist content and online networks could not be similarly applied. These automated tools are based on the internationally recognised UN list of proscribed terrorist groups and people; no comparable list of groups, people, and networks exists for far-right and white supremacist terrorism.

While content moderation presents a strong case for international cooperation, the broader state of internet governance also threatens a safe, free, and open internet. The increasing and divergent restrictions in internet regulation, both across the world and throughout the internet stack, threaten the fundamental stability, interdependence, and openness of the internet in the long term. Our upcoming report argues that liberal democracies must take a more proactive role in shaping global internet infrastructure and standards in the near term, while completely rebooting the institutional model for the long term, if we are to maintain freedom of expression online.

The Need For Procedural and Audit Systems During Normal Business Times

Q3. Looking at the high-level aims, are there any missing aims and/or significant responses which should be included here? (Please state the number of the challenge you are referring to.) 

We also recommend that times of normal business activity (Level 1) require the use of procedural and audit systems to ensure a safe online information environment. We argue in our paper, Online Harms: Bring in the Auditors, that in the long term, a new independent tier of regulatory audit can provide effective regulatory scrutiny addressing the information asymmetry that exists when governments work with big tech. In this way, ongoing transparent reporting based on globally defined qualitative auditing standards can provide consistent proactive responses which will also benefit the online information environment tremendously when information incidents occur. 

Finally, in our article, A Shot in the Arm for Platform Integrity, we present key takeaways from our Globalism Study, polls conducted in collaboration with YouGov between July and August 2020. Our findings on the effect of social media on belief in Covid-19-related conspiracies further support our recommendations and Full Fact's approach: alongside strong transparency and procedural review and audit systems, governments should exercise their regulatory authority and convening power to encourage platforms to share the lessons and data they have gathered on misinformation with researchers and with each other to inform the response to information incidents, while continuing to develop robust social-mobilisation campaigns of their own.
