Consultation Report - 21 February 2023 - Paris

The second consultation was held during UNESCO’s Internet for Trust Conference in Paris on 21 February 2023, co-organised by GFMD and UCLA’s Institute for Technology, Law and Policy.

Key takeaways

  • In Ukraine and other conflict zones, journalism and media organisations face unique challenges when it comes to content moderation on social media platforms. Platforms appear to lack the capacity to deal with content moderation challenges promptly and effectively, and media organisations frequently experience delays and a lack of follow-up after flagging moderation issues; by the time platforms make decisions, the news items concerned are often already outdated.

  • Some platforms remove graphic content related to war and potential violations of human rights. Media outlets often practise self-censorship to avoid being banned or having their accounts suspended. There is a need to ensure that content about war crimes and human rights violations is not lost.

  • Navigating these challenges requires a collaborative effort between social media platforms, media organisations, and local actors to establish clear protocols and support for fair and effective content moderation practices. By understanding key actors and sources of information, companies can better respond to crises and ensure that valuable information is preserved.

  • To effectively address crises, companies should have a cross-functional team that can work across issues and areas, operate both externally and internally, and make decisions on both content issues and the future directions a crisis may take.

  • Certifying "content" can be challenging. Therefore, the recommended approach is to focus on the account level, e.g. the media outlet itself and its practices. The Ethical Journalism Network (EJN) and the Journalism Trust Initiative (JTI) recommend that media outlets undertake ethical audits as a self-assessment procedure to evaluate their ethical standards and governance. They also propose agreeing on a ready-to-use framework that can be adopted as soon as media outlets are in a crisis situation (or, even better, defined in advance). A network of nominating sources, such as national press freedom organisations or councils, should be used to nominate the media outlets to be covered by this emergency protocol. However, there was also acknowledgement that crises come in many different shapes and sizes, including prolonged, slow-burning ones, and this should be addressed in future iterations.

  • Participants advocated the development of an early warning system with a multistakeholder approach. This mechanism should be integrated into the UN Plan of Action on the Safety of Journalists and its national plans of action, as the problem directly affects press freedom and the safety of journalists.

  • The mechanism must be data- and evidence-driven, especially to support advocacy on social media accountability and to facilitate systematic, sustainable collaboration between civil society and tech platforms.

Issues identified

In Ukraine and other conflict zones, journalism and media organisations face unique challenges when it comes to content moderation on social media platforms. The Lviv Media Forum has experienced delays and a lack of follow-up from platforms after flagging moderation issues, indicating a need for clear mechanisms of cooperation and involvement of local actors in responding to content moderation issues.

Some platforms prioritise creating a "joyful, happy environment," leading to the removal of graphic content related to war and potential violations of human rights. Because platforms lack the capacity to deal with content moderation challenges promptly and effectively, media outlets often practise self-censorship to avoid being banned or having their accounts suspended. Several participants raised concerns about evidence preservation and speech suppression: it is necessary to ensure that information about war crimes and human rights violations is preserved.

Media organisations also face physical risks, cybersecurity threats, and legal threats linked to their online activities. Frequent trolling campaigns are designed to manipulate algorithmic systems into automatically removing or suppressing content. Trusted Partner programmes exist, but they are deployed unevenly and without transparency. A systematic approach is lacking, with issues often addressed on a case-by-case basis.

Navigating these challenges requires a collaborative effort between social media platforms, media organisations, and local actors to establish clear protocols and support for fair and effective content moderation practices during crises. Local communities recognised by international organisations can provide recommendations and validate that media organisations observe ethical and professional standards.

Lessons learned from existing initiatives

There are numerous mechanisms and initiatives in place to deal with content moderation in crises and emergencies. Many of them are focused on countering hate speech and terrorist content online: the Christchurch Call was launched by New Zealand and France; the Terrorist Content Analytics Platform (TCAP) crisis protocol policy employs an alert system, archives live streams and footage, and trains AI to detect violent terrorist content; and the Global Internet Forum to Counter Terrorism (GIFCT) was founded by tech platforms to prevent terrorists and violent extremists from exploiting digital platforms. Although these initiatives are narrow in focus and target specific types of content, it is still difficult to evaluate their effectiveness. There is a lack of information on their goals, metrics, and benchmarks for success. Claims of success are often based solely on quantitative data, such as the number of takedowns or actions taken against content, provided by companies without independent review or audit. The focus on quantitative measures raises questions about the logic behind these mechanisms and whether they truly address the underlying issues.

There are also crisis response plans in place at social media platforms such as Twitter, Google, YouTube, and Meta. Another initiative from the private sector is the Global Alliance for Responsible Media, which focuses on keeping advertising away from problematic content. From the civil society side, Access Now’s Digital Security Helpline provides rapid-response emergency assistance for online attacks (for more details, please consult Annex I).

Discussion

Emergency and crisis responses

Crises can be defined in various ways, and include natural disasters as well as armed conflicts. The distinction between a continuing war and a sudden invasion is that their peaks in communication issues occur at different times.

Issues arising from crises and emergencies are multifaceted, involving various actors in different locations who influence how information is shared and how people, organisations, and governments respond to it.

Many existing initiatives have focused on the content side and on automated moderation that aims to take the burden off individuals of reviewing and dealing with every single piece of content. Trusted flagger/partner programmes exist to leverage expertise from the local context.

To effectively address crises, companies should have a cross-functional team that can work across issues and areas, operate both externally and internally, and make decisions on both content issues and the future directions a crisis may take.

Additionally, in a crisis situation, the platform's priority is to understand who the key actors are and where information comes from, and to mitigate state-aligned information operations, since there is little that individual journalists or media outlets can do to counter these on their own. By understanding key actors and sources of information, companies can better respond to crises and ensure that valuable information is preserved.

A long-term framework is necessary to address crises, including protocols for the initial crisis response and for learning from the initial implementation. Such a framework needs to examine the systemic challenges of individual companies: what is already working, and where the problems are.

Identification of credible and trusted journalism actors

Sometimes, the regulation of hate speech and illegal content on platforms goes beyond the capacity of industry mechanisms because these issues stem from structural problems in society such as misogyny, homophobia, or racism.

To address this, the Ethical Journalism Network (EJN) recommends that media outlets undertake ethical audits as a self-assessment procedure to evaluate their ethical standards and governance. Smaller media outlets, such as fact-checking, non-profit, and younger newsrooms, are easier to work with, especially in relation to the self-assessment programme promoted by EJN. However, certifying "content" can be challenging; therefore, the approach is to focus on the media outlet itself and its practices.

During a crisis, it is difficult for media outlets to help formulate frameworks or mechanisms. To address this, Reporters Without Borders (RSF) and the Journalism Trust Initiative (JTI) propose defining a framework in the early stages of a crisis, or even before one occurs. All stakeholders involved should agree on the adoption of a ready-to-use framework as soon as media outlets find themselves in a crisis situation. A network of nominating sources, such as national press freedom organisations or councils, could be used to nominate the media outlets to be covered by this emergency protocol.

Establishing a voluntary multistakeholder mechanism

The Chilling, a report by ICFJ and UNESCO, reveals that online violence, which operates at the intersection of disinformation and other forms of hate speech, can be incredibly damaging due to its chilling effect. Among its recommendations, the study advocates for the development of an early warning system with a multistakeholder approach. This mechanism should be integrated into the UN Plan of Action on the Safety of Journalists and its national plans of action, as the problem directly affects press freedom and the safety of journalists.

The report also calls for the creation of a system that helps to predict, monitor, and ultimately prevent the escalation of online violence to offline harm. Shockingly, 20% of the women journalists interviewed for the report experienced offline attacks, abuse, and harassment that they believed had been seeded online.

Platforms are comfortable outsourcing risk assessments, and the monitoring of online violence escalation and the targeting of journalists, to civil society organisations. However, this poses many challenges and is not always efficient. Platforms should be asked to commit and invest more resources in this area.

The non-responsiveness of platforms is a problem globally. Data and cases need to be systematically gathered to showcase the scale and depth of the problem.

Governments need to be part of the conversation and should create regulatory frameworks to address this issue. The mechanism must be data and evidence-driven, especially to support advocacy on social media accountability and facilitate systematic, sustainable collaboration between civil society and tech platforms.

Background information

The Global Forum for Media Development (GFMD) is launching a multistakeholder mechanism, the Tech and Journalism Crisis and Emergency Mechanism (T&JM), to establish an escalation channel between tech platforms and independent and public interest media, and community and investigative journalism organisations. After the first workshop with local partners in Riga, GFMD and UCLA’s Institute for Technology, Law and Policy co-organised a consultation during UNESCO’s Internet for Trust Conference in Paris in February 2023. The consultation sought views from a wider range of partners on how the mechanism should be operationalised, how different actors should be involved, and what the expectations of such a mechanism were.

Based on the findings from the Riga workshop, the consultation emphasised the importance of local news and the need to protect local news organisations and journalists. They often lack a channel to communicate with platforms when their content or accounts are blocked or removed due to algorithmic decisions, false flagging, trolling, or cyber-attacks. GFMD has been hearing about these same issues from members and partners on a regular basis, particularly in countries with small markets that do not receive much commercial interest from the platforms. Local voices are finding it increasingly difficult to operate in an environment that can often be hostile to smaller players.

Annex I: Existing crisis/emergency response initiatives

  • Internet Governance Forum (IGF). Led/founded by: United Nations. The IGF is a global multistakeholder platform that facilitates the discussion of public policy issues pertaining to the Internet. While it is not a crisis/emergency response platform per se, it is a key venue for discussing and debating policies relating to the digital environment.

  • Global Network Initiative (GNI). Led/founded by: private sector, academia, CSOs. GNI aims to protect and advance freedom of expression and privacy rights in the ICT industry by setting a global standard for responsible company decision-making and by serving as a multistakeholder voice in the face of government restrictions and demands.

  • Global Internet Forum to Counter Terrorism (GIFCT). Led/founded by: private sector. GIFCT is an NGO designed to prevent terrorists and violent extremists from exploiting digital platforms. Founded by Facebook, Microsoft, Twitter, and YouTube in 2017, the Forum was established to foster technical collaboration among member companies, advance relevant research, and share knowledge with smaller platforms.

  • Christchurch Call and Christchurch Call Advisory Network (CCAN). Led/founded by: governments. Initiative started by New Zealand and France; it sets pledges for companies to sign up to and for civil society to join as supporters through the Advisory Network.

  • Terrorist Content Analytics Platform (TCAP) crisis protocol policy. Led/founded by: United Nations / governments. Tech Against Terrorism is an initiative launched and supported by the United Nations Counter-Terrorism Committee Executive Directorate (UN CTED), working with the global tech industry to tackle terrorist use of the internet whilst respecting human rights. With support from Public Safety Canada, Tech Against Terrorism launched TCAP to prevent the spread of violent terrorist content by flagging it to content moderation teams.

  • Internet referral units. Led/founded by: law enforcement. Units that refer content to platforms when it violates their terms of service. There is little oversight of, or ability to audit, the impact of those referrals.

  • Crisis protocols by social media platforms. Led/founded by: private sector. Twitter has a crisis misinformation policy in the context of armed conflict; Google's crisis response focuses on natural disasters, relying on AI-driven predictive modelling; YouTube runs crisis resource panels in partnership with verified service partners in specific countries or regions, alongside an open forum through which anyone can submit; Meta's crisis policy protocol aims at anticipating risks, also partnering with different stakeholders.

  • Access Now Digital Security Helpline. Led/founded by: civil society. Provides rapid-response emergency assistance for online attacks (see above).

  • Global Alliance for Responsible Media (GARM). Led/founded by: private sector. Initiative convened through the World Economic Forum in association with advertisers; it defines a set of sixteen categories of problematic content next to which advertisers do not want their ads to appear.

Annex II: Existing initiatives for identification of trusted journalism actors

  • United for News. Led by: Internews, WAN-IFRA, PubMatic, GroupM. A non-profit coalition of the global media industry and international brands.

  • Journalism Trust Initiative (JTI). Led by: RSF. An international standard for showcasing and promoting trustworthy journalism.

  • Global Media Registry. A social enterprise.

  • Ethical Journalism Network (EJN). A coalition of journalists, editors, press owners and media support groups.

  • NewsGuard. A journalism and technology tool that rates the credibility of news and information websites.

  • The Trust Project. An international consortium of news organisations building standards of transparency.

  • Global Disinformation Index (GDI). A tool that rates news outlets based on the "probability of disinformation".

  • Trust.txt. A machine-readable text file that news publishers add to their websites to signal their affiliations with other trusted news organisations.
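To make the trust.txt entry above more concrete, the sketch below shows how such a file might be fetched and parsed. This is a minimal illustration, not an official tool: the directive names (member=, belongto=, social=, and similar) are assumed from the published JournalList trust.txt specification, and all domains and entries shown are hypothetical.

```python
import urllib.request

# Directives assumed from the JournalList trust.txt specification;
# consult the official specification for the authoritative list.
KNOWN_DIRECTIVES = {"member", "belongto", "control", "controlledby",
                    "social", "vendor", "customer", "contact"}

def parse_trust_txt(text: str) -> dict[str, list[str]]:
    """Parse trust.txt content into a {directive: [values]} mapping."""
    entries: dict[str, list[str]] = {}
    for raw_line in text.splitlines():
        line = raw_line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        directive, _, value = line.partition("=")
        directive = directive.strip().lower()
        if directive in KNOWN_DIRECTIVES and value.strip():
            entries.setdefault(directive, []).append(value.strip())
    return entries

def fetch_trust_txt(domain: str) -> dict[str, list[str]]:
    """Fetch and parse https://<domain>/trust.txt for a publisher's domain."""
    url = f"https://{domain}/trust.txt"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return parse_trust_txt(resp.read().decode("utf-8", errors="replace"))

if __name__ == "__main__":
    # Hypothetical file contents for a fictional publisher.
    sample = """
    # trust.txt for example-local-news.org (invented example)
    belongto=https://example-press-council.org/
    member=https://example-media-association.org/
    social=https://twitter.com/example_outlet
    """
    print(parse_trust_txt(sample))
```

A verifier could use such a parser to check that affiliations are reciprocal, e.g. that a press council listed under belongto= lists the publisher back under member=, which is the basic trust signal the format is designed to provide.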

Annex III: Agenda of the Side-Event and participants list

Tech and Journalism Crisis and Emergency Mechanism: Side-Event at the UNESCO Internet for Trust Conference, 21 February 2023, 09:00

09:00 - 09:30

Registration and refreshments

09:30 - 09:50

Welcome and introduction of the T&JM initiative

Mira Milosevic (GFMD)

Courtney Radsch and Michael Karanicolas (ITLP)

09:50 - 10:20

Experiences of journalism and media organisations in Ukraine and the region

10:20 - 10:50

Crisis and emergency protocols, escalation channels, and mechanisms for identification and verification

  • What are the lessons learned from recognised international crisis protocols and processes?

  • What are existing avenues of collaboration with platforms and other private sector stakeholders?

  • What are the expectations for crisis/emergency protocols and communication and escalation channels? What are the possible actions that platforms could take with the mechanism in place during a crisis or emergency?

10:50 - 11:15

Coffee break

11:15 - 11:45

Processes and criteria for identification of credible and trusted journalism actors online

  • What external references, such as news integrity and trust initiatives, professional and ethical self-regulation bodies, and donor and funder audits, could be used to identify credible and trusted journalism actors online?

  • What should be the minimum requirements for demonstrating adherence to professional and ethical journalistic standards?

11:45 - 12:15

Key elements of establishing a voluntary multistakeholder mechanism

  • What stakeholders and communities will be brought together to establish this mechanism? What should the level and scope of participation be for different types of organisations?

  • How can local leadership and ownership be ensured?

  • What role could international organisations play in data collection, monitoring, and advocacy with tech platforms?

  • How can risk assessment and transparency best be integrated?

12:15 - 12:30

Next steps: conclusions and recommendations

Contributions by:

  • Olga Myrovych, Chief Executive Officer, Lviv Media Forum, Ukraine

  • Tetiana Avdeieiva, Project Manager on AI, Digital Security Lab, Ukraine

  • Courtney Radsch, Fellow UCLA, Institute for Technology, Law & Policy, US

  • Stephen Turner, former Director of EU Public Policy at Twitter, Belgium

  • Danica Ilic, Program Specialist, Ethical Journalism Network (EJN), UK

  • Thibaut Bruttin, Assistant Director General, Reporters Without Borders (RSF), France

  • Julie Posetti, Deputy Vice President and Global Director of Research, International Center for Journalists (ICFJ), US

  • Ruth Kronenburg, Executive Director, Free Press Unlimited (FPU), Netherlands

Attendees (Organisations)

  1. ARTICLE 19

  2. BBC Media Action

  3. Cafeyn

  4. Cardiff University

  5. Centro de Estudios en Libertad de Expresión y Acceso a la Información (CELE)

  6. Center for Media and Information Literacy

  7. Center for Law and Democracy

  8. CREOpoint

  9. Permanent Delegation of the Kingdom of Morocco to UNESCO

  10. Digital Security Lab

  11. EDRi

  12. Ethical Journalism Network

  13. European Endowment for Democracy

  14. Free Press Unlimited

  15. Ghana News Agency

  16. Global Forum for Media Development

  17. Global Media Registry

  18. Global Partners Digital

  19. Google

  20. Islamic World Educational, Scientific and Cultural Organization (ICESCO)

  21. International Center for Journalists (ICFJ)

  22. International Fund for Public Interest Media (IFPIM)

  23. International Media Support

  24. Institute for Rebooting Social Media, BKC, Harvard

  25. International Press Institute

  26. Internews

  27. Kaalmo Legal Aid Center

  28. Lviv Media Forum

  29. Maharat Foundation

  30. Media Council of Malawi

  31. METÏS INNODEV SAS

  32. Ministry of Information, Publicity and Broadcasting Services (Zimbabwe)

  33. Organization for Security and Co-operation in Europe (OSCE)

  34. Open Society Foundations (OSF)

  35. Meta Oversight Board

  36. Permanent Delegation of Türkiye to UNESCO

  37. RNW Media

  38. Reporters Without Borders (RSF)

  39. Surfshark

  40. Tech 4 Peace

  41. Thomson Reuters Foundation

  42. Youth Committee on UNESCO Media and Information Alliance
