Tech and Journalism Crisis and Emergency Mechanism (T&JM)

The Tech and Journalism Crisis and Emergency Mechanism (T&JM) has been established as a platform to monitor and report account-related incidents and ongoing issues affecting trusted and verified independent news media organisations on prominent tech platforms.

The dedicated page for the pilot project in Ukraine is accessible here.

The concept note is also available here.

About T&JM

The Tech and Journalism Crisis and Emergency Mechanism (T&JM) will act as a forum where all stakeholders (journalism organisations, civil society organisations, companies, and states) can discuss and develop recommendations, adopt crisis and emergency protocols, and share best practices in transparency reporting and in the account moderation of journalism and media organisations in the digital space. It will use a voluntary-compliance approach: all stakeholders sign up to a model that creates no legal obligations, and they voluntarily implement the T&JM’s decisions and recommendations.

The T&JM will be a multi-stakeholder process where representatives of the different interested groups come together to improve the practices of the sector. The regional mechanism should involve and be led by local and regional actors.

  • Cross-industry collaboration: Engage a leading group of local and regional organisations representing professional communities of ethical journalism organisations, digital and journalism experts, and academics, as well as technology companies, and facilitate multi-stakeholder conversations. This will ensure a multistakeholder approach and open partnerships, and avoid duplication of effort.

  • Solutions: Private sector companies, journalists, media and their professional communities, academics, experts, international multilateral organisations and civil society collaborate to share best practices as they develop and implement new solutions through a joint cross-industry mechanism.

  • Civil society engagement and empowerment: Empower journalism and media communities to participate in policy forums in order to support and promote journalism as a public good. The mechanism should involve and be led by local and regional actors. It will also link the initiative with international and regional players to collect data, advocate, monitor policies, and supervise the proper functioning of the mechanism.

  • Knowledge, data sharing and transparency: Work towards ensuring that no knowledge or insights are lost, and that experiences and lessons learned at all levels, from global to local, directly from the field as well as in academia, are collected and processed.

The problem

At the start of Russia's invasion of Ukraine in February 2022, major tech platforms lacked sufficient information about who the trustworthy local voices in the country and the region were. Although platforms had collaborated with international partners on numerous news trust initiatives and indicators, they lacked reliable mechanisms to identify professional and trusted local news sources in smaller markets. The Ukrainian Public Broadcaster and Euromaidan Press, for example, were neither verified on Twitter nor recognised by Facebook as news publishers, whereas Russia Today continued to earn advertising revenue on Facebook and YouTube until February 2022.

While technology companies have made significant efforts to combat the spread of dis- and misinformation, very few systems are in place to distinguish credible and trusted content creators, such as high-quality journalists and media organisations. At the time of SembraMedia's Inflection Point International report, 62% of the media organisations interviewed [1] were not verified on Twitter and 64% were not verified on Facebook. Overall, only 35% of the media organisations said they had a point person to speak with at the social platforms. Small and local media, investigative journalism organisations, journalists, NGOs, and other professional content creators often face prohibitive content moderation practices when reporting on current events and topics of public interest. “The Chilling: A global study of online violence against women journalists” found that 39% of the women journalists surveyed reported online violence to Facebook, 26% to Twitter, and 16% to Instagram. At least 20% of online gendered violence incidents also result in physical violence. Furthermore, many news outlets and journalists struggle to effectively protect their accounts, appeal bogus account suspensions, or quickly restore wrongly removed or restricted content.

Lacking such recognition, their content and accounts are often negatively affected by the platforms' current moderation systems and by malicious actors. Annex II gives examples of such restrictions. In addition, a recent series of articles by Forbidden Stories partners exposes the insidious black market for silencing journalists and their stories. Few easy-to-access procedures are in place to provide early warnings, urgent responses, or channels for communication in a crisis for trusted content and accounts. Without such a mechanism, attacks on journalists will persist and mis- and disinformation will continue to thrive [2].

On the other hand, some users of digital platforms enjoy privileges. Recommender systems, and the algorithms that drive content moderation, benefit those who have achieved privileged status or recognition. Facebook's XCheck programme has provided special treatment to celebrities, politicians, and other high-profile users, shielding millions of VIP users from the company's normal enforcement process. While XCheck grew to at least 5.8 million users by 2020, only selected media companies in the most lucrative markets are designated as news publishers on platforms.

Finally, the majority of content moderation efforts today focus on online speech-related harms and on algorithmic moderation of all content, with only sporadic measures addressing affirmative action, safety protection, and online recognition of credible actors and accounts. Maria Ressa explained it perfectly with a metaphor: “When we focus only on content moderation – it’s like there’s a polluted river. We take a glass. We scoop out the water. We clean up the water, and dump it back in the river. However, what we have to do is to go all the way to the factory polluting the river, shut it down, and then resuscitate the river.”

The proposed solution

Although some media outlets and journalism organisations have received recognition from the platforms, many others have been left behind. This has been especially true for small, independent, and investigative media outlets in nations and regions that aren't viewed as major tech markets. The invasion of Ukraine demonstrated that the current approach is not only ineffective but also potentially harmful to local media and journalists and overall information spaces online.

To address the current bottleneck approach and ensure that credible and professional voices continue to exist and operate freely in digital environments, GFMD proposes a multistakeholder process to establish the Tech and Journalism Crisis and Emergency Mechanism (T&JM), starting with the region of Ukraine and the neighbouring countries [3]. This initiative aims to strengthen content and account moderation systems by establishing an emergency and crisis mechanism [4] for journalists and media organisations and thus safeguard freedom of media and freedom of expression online. The project specifically targets small and medium-sized media, community and investigative journalism organisations and their professional communities.

The key objective of the Tech and Journalism Crisis and Emergency Mechanism (T&JM) is to engage journalism organisations, civil society, companies, academics, and experts to collaboratively develop:

  1. Processes and criteria for identification of credible and trusted journalism actors online, their communities and representative groups;

  2. Crisis and emergency protocols, case escalation criteria, and functioning escalation channels;

  3. Key elements and processes for establishing a voluntary multistakeholder and emergency and crisis mechanism.
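To make the first objective more concrete, the identification process described above could be imagined as checking an outlet's external references against an agreed list. The sketch below is purely illustrative: the reference names, the list itself, and the threshold of two signals are hypothetical assumptions, not criteria adopted by the T&JM process.

```python
from typing import Set

# Hypothetical external references (illustrative only, not T&JM criteria):
RECOGNISED_REFERENCES: Set[str] = {
    "press_council_member",        # professional/ethical self-regulation body
    "trust_initiative_certified",  # a news integrity or trust initiative
    "donor_audit_passed",          # donor or funder due-diligence audit
    "ethics_code_signatory",       # published adherence to an ethics code
}

def is_recognised(outlet_references: Set[str], minimum: int = 2) -> bool:
    """An outlet is provisionally recognised if it holds at least `minimum`
    independent external references from the agreed list."""
    return len(outlet_references & RECOGNISED_REFERENCES) >= minimum
```

In practice, the actual references and thresholds would be the output of the multistakeholder consultations, not fixed in advance; the point of the sketch is only that recognition can be grounded in verifiable external signals rather than ad hoc judgement.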

The process

The process of developing the T&JM mechanism is multistakeholder and consultative, with the mission of being locally driven and empowering local actors.

The first in a series of consultative meetings and gatherings to discuss the initiative's practical implementation was organised by the Global Forum for Media Development (GFMD) and the Baltic Centre for Media Excellence (BCME) in January 2023 in Riga, Latvia. It focused in particular on the actors and experiences of organisations working in Ukraine and the region, as well as media exiled because of the conflict. During the workshop, participants discussed how the initiative could work in practice: the definition of "recognition", the identification of trusted communities, and existing emergency/crisis protocols. The report and list of participants can be seen here.

A second consultation was held during UNESCO’s Internet for Trust Conference in Paris on 21 February 2023, co-organised by GFMD and UCLA’s Institute for Technology, Law and Policy, seeking views from a wider range of partners on how the mechanism should be operationalised, how players should be involved, and what the expectations of such a mechanism were. This report presents the key takeaways and findings from the consultation.

The roadmap

The draft project roadmap for developing the Tech and Journalism Crisis and Emergency Mechanism (T&JM) includes future consultations in smaller working groups to address the implementation of the pilot, including:

  1. Define key elements and processes for establishing a voluntary multistakeholder emergency and crisis mechanism.

    1. Define the purpose and goals of T&JM:

      • Define T&JM’s mission and vision and its intended outcomes.

    2. Determine the scope and composition of organisations setting up T&JM:

      • Define the scope of the T&JM, including the platforms that will be included and the stakeholders to be involved and their roles.

    3. Define and establish the governance framework:

      • Develop a governance framework that outlines the roles, responsibilities, and decision-making processes and criteria.

      • Develop transparency, accountability and risk assessment processes and reporting.

    4. Monitoring and evaluation of work of T&JM:

      • Establish metrics and KPIs for T&JM performance, and regularly monitor and evaluate the T&JM performance against these benchmarks.

    5. Optimisation of the mechanism:

      • Continuously refine and optimise the T&JM practices based on performance feedback, emerging trends, and changing needs.

  2. Processes and criteria for identification of credible and trusted journalism actors online, their communities and representative groups;

    1. Technical expertise and main elements of the mechanism:

      • Existing standards and information integrity approaches that could provide a framework or guidance;

      • Defining external references, such as news integrity and trust initiatives, professional and ethical self-regulation bodies, and donor and funder audits, that could be used to identify credible and trusted journalism actors online;

      • Defining processes and minimum criteria for demonstrating adherence to professional and ethical journalistic standards;

    2. Collection of data/evidence around the treatment of credible and trusted journalism actors online, their communities and representative groups:

      • Identify research to inform the efforts and guide future technical and policy decisions around the identification and relevant issues.

  3. Crisis and emergency protocols, case escalation criteria, and functioning escalation channels;

    1. Create policies and procedures for crisis and emergency escalation and management of cases:

      • Develop policies and procedures for crisis and emergency escalation and management of cases that align with the scope and governance framework.

    2. Develop case management and escalation and monitoring tools and technologies:

      • Establish escalation communication channels, procedures and responses from platforms;

      • Develop and implement tools and technologies that can facilitate T&JM work.

    3. Train T&JM members on policies, processes and procedures:

      • Provide training and support to members.
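The crisis and emergency protocol steps in the roadmap above (escalation criteria, case management, and escalation channels) could be sketched roughly as follows. Everything here is a hypothetical illustration: the severity criteria, the channel names, and the case fields are assumptions for the sake of the example, not protocols agreed through the T&JM process.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class Severity(Enum):
    ROUTINE = 1    # e.g. delayed verification request
    URGENT = 2     # e.g. wrongful content removal during breaking news
    EMERGENCY = 3  # e.g. account suspension of a trusted outlet in a crisis zone

@dataclass
class EscalationCase:
    outlet: str                # media organisation raising the case
    platform: str              # platform where the incident occurred
    description: str
    in_crisis_region: bool     # incident occurs in a declared crisis/emergency context
    account_suspended: bool
    opened_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def severity(self) -> Severity:
        # Hypothetical criteria: suspensions in crisis regions escalate highest.
        if self.in_crisis_region and self.account_suspended:
            return Severity.EMERGENCY
        if self.in_crisis_region or self.account_suspended:
            return Severity.URGENT
        return Severity.ROUTINE

def route(case: EscalationCase) -> str:
    """Pick an (illustrative) escalation channel based on case severity."""
    return {
        Severity.EMERGENCY: "24/7 platform emergency contact",
        Severity.URGENT: "dedicated platform escalation channel",
        Severity.ROUTINE: "standard reporting queue",
    }[case.severity()]
```

The design choice the sketch illustrates is the one the roadmap calls for: severity criteria and routing are explicit and agreed in advance, so that a suspension during a crisis reaches a fast channel automatically rather than through ad hoc, informal contacts.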

About GFMD’s work in this area

The Global Forum for Media Development’s (GFMD) core mandate is to promote policies, programmes, strategies, and opportunities for the work of the journalism and media support sector in order to enhance journalism as a public good. We are actively supporting and promoting reputable journalism and civil society organisations on digital platforms as part of our ongoing work to strengthen digital information ecosystems. We are collaborating with Twitter's and Meta's public policy teams to verify the accounts of journalists, news media, and journalism support organisations, particularly those in Ukraine and Eastern Europe. So far, this process has taken the form of ad hoc, informal communication between policy teams at tech platforms and civil society groups like GFMD.

Footnotes

[1] The first study was based on 100 interviews conducted in Argentina, Brazil, Colombia, and Mexico. For the 2021 report, in addition to Latin America, eight more countries were added; 49 digital media organisations from Ghana, Kenya, Nigeria, and South Africa were interviewed; and another 52 from Indonesia, Philippines, Malaysia, and Thailand.

[2] Recognising the problem, the European Commission proposed a regulation establishing a common framework for media services in the internal market, the European Media Freedom Act, which aims to address the sector's specificities by including a section on media service provision in the digital environment. Article 17.5 emphasises the need for more data on platform interference with news media content by requesting that platforms publicly disclose the number of times they imposed any restriction or suspension on news content creators.

[3] Initially focused on Ukraine and the neighbouring countries, this phase of the project is seen as a pilot for possible wider application in other regions.

[4] This mechanism aims to build on previous efforts such as Access Now’s Declaration of principles for content and platform governance in times of crisis, the Santa Clara Principles and others, in line with the international standards on freedom of expression and other fundamental rights.
