GFMD Policy & Advocacy Center



Academic Studies

This resource section draws heavily on the "Platform companies and news media" section of the reading list curated by the Reuters Institute for the Study of Journalism at the University of Oxford.


Last updated 2 years ago


The Market of Disinformation (Oxford Internet Institute)

The 2016 US elections continue to cast a long shadow over democratic processes around the world. More than 40 countries are pondering legislative responses (Bradshaw, Neudert, & Howard, 2018). Meanwhile, the tech platforms have made more than 125 announcements describing how, through self-regulation, they will solve the manipulation of their platforms by bad actors (Taylor, Walsh, & Bradshaw, 2018).

Among the more frequently referenced self-regulatory measures are changes to algorithms and the use of artificial intelligence (AI) to demote disinformation and junk news. We ask whether these changes took place, and if so, have they had the intended impact of reducing the spread of disinformation on social media platforms? To date, much of the policy debate has focused on paid-for advertising on the platforms, but what about the viral spread of unpaid, organic content? The 'black box' nature of today's most widely used platforms makes it difficult for researchers and journalists to understand how algorithmic changes might be affecting both legitimate political campaigning and disinformation. It is essential that any reform of electoral regulation or oversight in the UK is informed by an understanding of the techniques used in both the paid and the unpaid markets of disinformation.

The digital marketing industry can offer insights, albeit incomplete and heuristic in nature, into the impact of algorithmic changes. Social media marketing and search engine optimization (SEO) – that is, the practice of guessing, testing, and experimenting with algorithms so that searches for particular words appear higher in search results – are part of a multi-billion-dollar industry built upon understanding how these obscure technical systems rank, order, sort, and prioritize information. By interviewing professionals and reviewing reports from the digital marketing industry, we can gain insight into the impact that algorithmic changes might have had on the distribution of content online. The findings provide an additional evidence base that can inform the Oxford Technology and Elections Commission's project to identify potential regulatory reform of elections.

The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation (University of Oxford)

Over the past three years, we have monitored the global organization of social media manipulation by governments and political parties. Our 2019 report analyses the trends of computational propaganda and the evolving tools, capacities, strategies, and resources.

The Biology of Disinformation: Memes, Media Viruses, and Cultural Inoculation (Institute for the Future)

In this research effort, the authors focus instead on the greater shift to a memetic landscape—no matter the origins and sponsorship of particular memes—and the impact of memetic activity on the media, social, and political environment. What does the migration from broadcast propaganda to social and memetic propaganda do to the social organism and its resistance to manipulation? Can technological fixes and government regulations adequately address the problem of propaganda in a computational environment, or must we look at ways to promote a more resilient social fabric? In short, which is the most effective approach to restoring the integrity of public discourse in an age of weaponized memetics: better technological protections, or a more resistant social psyche?

State-sponsored Trolling: How Governments are Deploying Disinformation as Part of Broader Digital Harassment Campaigns (Institute for the Future)

In this paper, we examine the emergence of a new phenomenon: state-sponsored trolling. We define this phenomenon as the use by states of targeted online hate and harassment campaigns to intimidate and silence individuals critical of the state. There is evidence that governments around the world, leveraging the surveillance and hacking possibilities afforded by a new era of pervasive technology, are using new digital tactics to persecute perceived opponents at scale. These campaigns can take on the scale and speed of the modern internet with pinpoint personalization from troves of personal data afforded by cheap surveillance technologies and data brokers.

Regulating disinformation with artificial intelligence (European Parliamentary Research Service)

This study examines the consequences of the increasingly prevalent use of artificial intelligence (AI) in disinformation initiatives for freedom of expression, pluralism and the functioning of a democratic polity. The study examines the trade-offs in using automated technology to limit the spread of disinformation online. It presents options (from self-regulatory to legislative) to regulate automated content recognition (ACR) technologies in this context. Special attention is paid to the opportunities for the European Union as a whole to take the lead in setting the framework for designing these technologies in a way that enhances accountability and transparency and respects free speech. The present project reviews some of the key academic and policy ideas on technology and disinformation and highlights their relevance to European policy.

How Internet Platforms Are Combating Disinformation and Misinformation in the Age of COVID-19 (Open Technology Institute/New America)

The unprecedented spread of COVID-19 across the globe has sparked a significant new wave of misinformation and disinformation online. In a time when the public must be armed with the most accurate information to combat this pandemic, many internet platforms have developed policies and practices to combat misleading and inaccurate information related to the virus. This report provides an overview of how various internet platforms are individually addressing the rapid spread of COVID-19-related misinformation and disinformation on their services. While this report aims to be comprehensive, it is important to note that platforms' response efforts to the virus are rapidly changing and expanding, and as a result, this report may not encompass all efforts instituted by these companies. This report also offers recommendations on how these platforms can improve the efficacy of their efforts and provide greater transparency to their users and the public. Further, the report includes recommendations on how U.S. policymakers can encourage further accountability and support efforts to combat the spread of misinformation and disinformation during this time.

Getting it Right: Strategies for truth-telling in a time of misinformation and polarization (American Press Institute)

The report highlights:

  • Journalists need a new set of skills and strategies to operate in an information ecosystem infused with misinformation, to fend off attacks on their work as biased or "fake" and to reach polarized audiences.

  • There are a number of strategies for reporting on falsehoods without amplifying them. One is the "truth sandwich," which involves stating a true fact, then the falsehood, then the true fact again. Journalists also must know their "tipping point" — the point at which a story about false information becomes too big to ignore.

  • Journalists can respond to attacks on their credibility by being transparent in their reporting, quickly acknowledging and correcting mistakes, and avoiding a "war footing" with antagonistic public figures.

  • Reaching polarized audiences calls for better listening and creating more opportunities for journalists to get out in the field. It also calls for more complex, nuanced stories that avoid moral outrage phrases or keywords that contribute to groupishness.

"Fake News Influences Real News" – Study finds fact-checkers have little influence on online news media (Boston University)

A recent study by two BU professors sought to measure the extent to which fake news dictates part of the agenda for online media. More research is needed, they say, but their study showed that, taken together, all those stories tied up a remarkable amount of news media resources that could have been devoted to other important issues.

Everything in Moderation: An Analysis of How Internet Platforms Are Using Artificial Intelligence to Moderate User-Generated Content (Open Technology Institute/New America)

Internet platforms are increasingly adopting artificial intelligence and machine-learning tools in order to shape the content we see and engage with online. The use of algorithmic decision-making is becoming particularly prevalent in online content moderation, as companies attempt to comply with speech-related legal frameworks while also trying to promote safety, positive user experiences, and free expression on their platforms. This report is the first in a series of four reports that will explore different issues regarding how automated tools are being used by internet platforms to shape the content we see and influence how this content is delivered to us. The reports will focus on content moderation based on a platform's content policies, the ranking of content on newsfeeds and in search results, the optimization and targeting of advertisement delivery, and content recommendations to users based on their prior content consumption. These reports will also explore how internet platforms, policymakers, and researchers can better promote fairness, accountability, and transparency around these automated tools and decision-making practices.

Digital Planet 2017: How Competitiveness and Trust in Digital Economies Vary Across the World (The Fletcher School at Tufts University)

The DEI 2017 is a data-driven, holistic evaluation of the progress of the digital economy across 60 countries, combining more than 100 different indicators across four key drivers: Supply Conditions, Demand Conditions, Institutional Environment, and Innovation and Change. The resulting framework captures both the state and rate of digital evolution and identifies implications for investment, innovation, and policy priorities. DEI 2017 also highlights the evolving nature of the risks created by our continuing reliance on digital technology. Towards this end, the study covers a key question of "digital trust." The DEI 2017 incorporates a newly devised analysis of digital trust that takes into account the trustworthiness of the digital environment for each country; the quality of users' experience; attitudes towards key institutions and organizations; and users' behavior when they interact with the digital world. This subject is of great interest to all participants in the digital economy, given the concerns about the security of essential information, cyber-attacks, and consumers' apprehensions about digital systems and their reliability, digital companies and their growing dominance, and the leaders of digital companies.

The DEI framework segments the 60 countries into Stand Outs, Stall Outs, Break Outs and Watch Outs. Three countries are notable as standouts even within the Stand Out segment: Singapore, New Zealand, and the UAE. Each has a unique policy-led digital strategy and a narrative that may be considered by other nations as worthy of emulation or adoption. The Nordic countries and Switzerland are at the top of the DEI 2017 rankings. China, once again, tops the list of countries in terms of the pace of change in its digital evolution, or momentum.

Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation (Oxford University)

The manipulation of public opinion over social media platforms has emerged as a critical threat to public life. Around the world, a range of government agencies and political parties are exploiting social media platforms to spread junk news and disinformation, exercise censorship and control, and undermine trust in the media, public institutions, and science. At a time when news consumption is increasingly digital, artificial intelligence, big data analytics, and "black-box" algorithms are being leveraged to challenge truth and trust: the cornerstones of our democratic society. In 2017, the first Global Cyber Troops inventory shed light on the global organization of social media manipulation by government and political party actors. This 2018 report analyses the new trends of organized media manipulation, and the growing capacities, strategies and resources that support this phenomenon.