Consultation Report - 25 January 2023 - Riga

The first consultative meeting to discuss the initiative's practical implementation was organised by the Global Forum for Media Development (GFMD) and the Baltic Centre for Media Excellence (BCME).

Key takeaways

  • Similar issues regarding content and account moderation during the conflict have been experienced across different platforms. Language is a challenge in itself, one that also affects other regions using the Cyrillic or Arabic alphabets as well as other minority languages.

  • Communication channels with platforms are rarely and inconsistently accessible.

  • Platforms' responsiveness to content and account moderation issues has been arbitrary, and during crises, often inconsistent and inadequate.

  • Ad-hoc communication channels are insufficient, not standardised, and not available to everyone, and, when they are available, there is limited opportunity to follow up on issues raised with tech companies.

  • Joint advocacy is needed to reach out to tech platforms and to improve communication with them, but more importantly, to establish a system or protocol.

  • The development of standards and ethical guidelines is needed for credibility and effective communication with tech platforms and policymakers.

  • The mechanism should be scalable across different regions and within digital platforms, taking local contexts into consideration. It is also valuable to platforms, which gain data and new points of contact.

  • A systematised mechanism should be designed with transparency and accountability in mind, including regular follow-ups and updates on the outcomes after an issue is flagged or resolved.

Issues identified:

  • Lack of established communication channels and effective notice, appeal, and remedy mechanisms: Content from media outlets and journalists reporting from Ukraine and about events in Ukraine is often removed from platforms without a satisfactory response from official channels for complaints. Although each case is unique, journalism organisations usually had to find ways to contact platform representatives through informal channels, sometimes without success. Accounts or content were frequently restored after 10 days, but the content had already lost most of its value or the account's prominence had suffered. Ad-hoc communication channels are available only to a small number of players, and despite them, there’s little opportunity to follow up on the issues that have been raised with the tech companies.

  • GDPR and DMCA takedown notices are often used as a pretext to report journalism organisations to platforms and have their content removed: content has been taken down not only under internal policies on sensitive content, or blocked because it mentioned trigger words that algorithms identified as Russian propaganda, but also because of false copyright claims from third parties.

  • Self-censorship and the inability to post war- and violence-related content are becoming commonplace for social media editors trying to avoid removal of content or suspension of accounts: media outlets have to take precautions with what they post, sometimes rewriting the text or avoiding posting content that risks being blocked (especially photos or videos). Finding a balance between telling the story and getting the story published on the platforms can lead to self-censorship on certain occasions. Platforms’ moderation policies should not restrict the dissemination of newsworthy content, especially in countries at war (e.g., the Bucha pictures).

  • Russian media in exile faced an additional layer of challenges due to Russian restrictions on digital platforms, impacting the functioning of digital advertising. Furthermore, some media outlets were unable to register on platform apps or promote their content online. Russian independent media were facing double censorship: on one side, they were banned in Russia for being opposition media and on the other, they were banned in Europe for being Russian media.

  • Languages: Language is a challenge for COMO in itself, one that also affects other regions using the Cyrillic or Arabic alphabets as well as other minority languages.

Suggested solutions:

  • Joint advocacy: It is hard for media on their own to reach out to big tech platforms, and media often lack the capacity to organise themselves, especially in conflict areas. While some interactions at the beginning of the conflict were successful despite the chaotic situation, this needs to be scaled into regular and systematised communication with platforms.

  • Explore tech platforms' existing mechanisms and how they can be used to better channel cases raised by journalism organisations:

    • Crisis Policy Protocol: In advance of a crisis, develop a baseline understanding of risk factors in different countries, drawing on media freedom index scores, levels of press independence, democracy indicators, or similar measures to identify potential risk. Where content is over-enforced, it is important to continue conversations and to have a mechanism for identifying what has happened and whether the fault lies in the enforcement or in the policy itself.

    • Trusted partner programme: Establish relationships with predominantly local and regional civil society organisations around the world and ask these trusted partners to provide information about what is happening on the ground. The types of organisations included in the programme may be expanded.

    • Cross-Check programme: Listed organisations, individuals, or pages whose content is flagged receive a second, human review to check that it is not a case of over-enforcement. The programme has been under review over the last few months, and a similar programme is planned in which media organisations will be included for a second review. After the invasion of Ukraine, some media organisations were added to the programme. A conversation is needed about how the programme can be used to include independent public interest media, especially in small markets.

Systematised mechanism recommendations:

  • Potential actions/solutions by platforms:

    • Urgency: Platforms should be bound to react to major crises and emergencies by addressing over- and under-moderation issues affecting local media in conflict and crisis areas.

    • Name recognition: Platforms sometimes will not respond to local or regional media but will respond to international organisations, many of which have been contacting platforms in parallel. To avoid duplication, streamline communication, and ease the burden on all parties, we propose creating a regionally coordinated system or mechanism for verification of accounts, with Eastern Europe as the pilot region.

    • Numbers: Platforms, which deal with huge volumes of information, should prioritise systematic, transparent solutions rather than handling issues individually.

  • To give the mechanism some level of transparency and general agreement, consideration should be given to the development of a set of standards or ethical guidelines, designed by and building on existing frameworks in the community, to which news organisations could be required to adhere if they are to be included in the mechanism. Such standards and inclusion processes would likely improve the credibility of the mechanism, facilitating more effective engagement with, and responsiveness from, tech platforms and policymakers.

  • Scalability should be considered since media and journalists all across the world experience the same censorial issues noted during the session. These issues are exponentially worse in those regions using Cyrillic or Arabic alphabets as well as other under-resourced digital languages. However, local contexts are unique, and from previous initiatives it has been noted that it is necessary to realistically identify what is achievable for each specific context.

  • Accountability: Such a mechanism should ensure regular follow-ups and updates on the outcomes after an issue has been flagged or resolved by the platforms.

  • Database of organisations or communities?: The system should not decide what is true or not; rather, it could, for example, maintain an independent media database based on certain criteria, which is not very complex to implement and can be built on at the local level (a minimal illustrative sketch follows below).
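
For illustration only, the sketch below shows what one entry in such an independent media database might contain. The field names and inclusion criteria (adherence to an ethics code, editorial independence, verification by a local or regional partner) are assumptions made for the example; the actual schema and criteria would be defined by the community and local partners.

```python
# Hypothetical sketch only: field names and inclusion criteria are illustrative
# assumptions, not an agreed schema for the proposed mechanism.
from dataclasses import dataclass


@dataclass
class MediaOutletRecord:
    name: str
    country: str                       # local context matters for scalability
    languages: list[str]               # e.g. ["uk", "ru"]; relevant to moderation issues
    platform_accounts: dict[str, str]  # platform name -> account handle or URL
    adheres_to_ethics_code: bool       # signed up to an agreed standards/ethics code
    editorially_independent: bool      # assessed by local/regional partners
    verified_by: str                   # local/regional organisation vouching for the outlet
    notes: str = ""


def meets_inclusion_criteria(record: MediaOutletRecord) -> bool:
    """Illustrative inclusion check; real criteria would be set by the community."""
    return (
        record.adheres_to_ethics_code
        and record.editorially_independent
        and bool(record.verified_by)
    )
```

An escalation raised with a platform could then reference such a record, rather than each case having to be verified from scratch.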

Challenges ahead:

  • The creation of such a mechanism takes time, and meanwhile most of the urgent issues remain unaddressed. Instead of reinventing the wheel, existing systems should be built upon, including those coming from fact-checkers and the disinformation community. Information integrity should be seen as part of the solution to countering disinformation and countering violent extremism and terrorism online, and other such initiatives can inform the mechanism.

  • Tech companies might prefer informal structures to address these issues. However, such informal channels often lack transparency and accountability and benefit a small number of privileged players who have access to them. To overcome this challenge, the mechanism should balance transparency and accountability with being easily accessible to and navigable by tech platforms, local journalists, and media organisations, while mitigating the potential for abuse.

  • In many parts of the world, media organisations' experience of platforms' responsiveness differs almost arbitrarily. Inconsistent and inadequate responses are even more obvious during crises: companies often fail to respect human rights by adopting ineffective mitigation measures that adversely affect human rights protection. No general rule identifies which platforms are responsive and which are not, because responsiveness depends on the organisation and its context. While a systematised escalation channel could add value, it needs to be effective in practice by leading to increased responsiveness.

  • Tech companies operate in different ways and are incentivised in different ways.

  • International organisations also operate in different ways and often have different standards or criteria for defining who should be considered as journalists or media. The shift should not be only at the platforms’ level but also in the way civil society operates.

Next steps:

Define priorities:

Platforms’ moderation challenges (transparency, language, algorithms…) and escalation of issues; false flagging, trolling, and cyber-attacks.

Which organisations should be involved and how?

The mechanism should involve and be led by local and regional actors. International actors, on the other hand, are better placed to advocate with the global tech platforms for adoption of such a locally designed protocol and would play a supportive role, focusing on monitoring and advocating for the creation and proper functioning of the mechanism.

Define structure and main elements:

Risk assessment of this approach and type of mechanism; existing standards and information integrity approaches that could provide a framework or guidance.

Participants

  1. BBC Media Action

  2. Hromadske, Ukraine

  3. Article 19

  4. Reporters sans frontières

  5. UCLA

  6. Google

  7. The Fix

  8. Access Now

  9. Free Press Unlimited

  10. Open Society Foundations

  11. International Media Support

  12. DW Akademie

  13. Meta

  14. IREX

  15. Lviv Media Forum

  16. DT Global

