MMfD organises regional dialogue on gendered misinformation with journalists & women human rights defenders

Media Matters for Democracy (MMfD) — in collaboration with the Association for Progressive Communications (APC) and IFEX — organised its first regional consultation, titled “Building Collective Resilience: Women Journalists and Human Rights Defenders Against Gendered Misinformation”. Held in Bangkok, Thailand, from 4 to 6 September, the event brought together a diverse group of women journalists and women human rights defenders (WHRDs) from across South and Southeast Asia, including Pakistan, Myanmar, Indonesia, India, Malaysia, Thailand, and the Philippines. Each participant brought their own stories of struggle and resilience, contributing significantly to our collaborative efforts against gendered misinformation.

Day 1

The event opened with welcoming remarks in which the organisers highlighted the event's defining approach: diversity. In attendance were digital rights activists, with whom APC maintains regular interactions, and journalists who are routinely engaged by MMfD, alongside a number of WHRDs from other fields. The remarks echoed the need to unite different stakeholder groups to find collaborative solutions to the pervasive issue of gendered disinformation.

The discussion focused on how disinformation exploits pre-existing gender stereotypes to enable and fuel attacks on individuals, disproportionately affecting women journalists and human rights defenders. It highlighted the multifaceted challenges of gendered disinformation and the difficulties of dealing with powerful tech corporations, particularly in the Global South. Participants pointed out that fact-checking, though useful to an extent, may not always be effective, especially when misleading content has a high potential for virality. Policy-based solutions in regressive countries can also be problematic, as granting states the power to determine what constitutes disinformation can lead to abuse of authority and suppression of progressive ideas.

Following the opening remarks and a round of introductions, APC shed light on how the organisation has been addressing online violence. APC has spent more than a decade addressing online gender-based violence, advancing policies to safeguard women and gender-diverse individuals online, and fostering community responses to these issues. Through this work, APC recognises that disinformation often serves as a medium for spreading harmful narratives against women and gender-diverse individuals. MMfD also explained its role in the fight against gendered misinformation, which primarily involves identifying disinformation and assisting the media with relevant reporting.

In summary: Disinformation creates a cesspool of misinformation, which can directly harm individuals and vulnerable groups. The media often lacks an understanding of the impact and harm resulting from coordinated disinformation campaigns. As a result, many consume disinformation as truth, especially when it aligns with their political beliefs. The media needs to build the capacity to monitor, document, and report on these campaigns effectively.

A- Notes from the Field

The next session, titled “Still I Rise: Stories of Resilience against Gendered Disinformation and Online Misogyny”, focused on rampant online gender-based violence in the Philippines and the treatment of the transgender community in Pakistan. The focus then turned to the bigger picture: the coordination and financing behind disinformation campaigns, which are executed either to target specific individuals or to achieve certain political objectives. The discussion reiterated the importance of creating strong counter-narratives to challenge and neutralise disinformation. For instance, when a woman journalist in Pakistan becomes the target of a hate campaign, efforts are made to collaborate with media organisations to produce investigative stories. These stories dig deep into the origins of troll campaigns, identifying networks, tracing funding sources, and uncovering motives. While these counter-narratives may not reach as many people as the misinformation that necessitated them, they serve an educational purpose, helping the audience become more aware of hate campaigns on social media.

The discussion also focused on the role of mainstream media in the battle against disinformation. While the valuable contributions of smaller media startups were acknowledged, it was argued that conventional media, with its extensive reach, wields significant power in shaping narratives, yet still has a long way to go in covering gendered disinformation. To effectively harness this power, a three-fold strategy was proposed:

1- Actively engage with mainstream media outlets,

2- Equip and incentivise them to cover human rights and related topics,

3- Ensure their content is backed by solid evidence and data, making it robust and credible.

The discussion then turned to the media landscape in Pakistan, with participants from the journalistic community sharing instances of harassment at the hands of media owners, and how a woman journalist’s efforts to call out such harassment online can expose her to significant risk and threats of legal action. The focal points of the discussion included power dynamics: those in more influential positions can voice concerns and bring their stories of harassment to the public, while others without this privilege are forced to remain silent. The role of media owners who lack experience in journalism in perpetuating disinformation campaigns was also brought to light.

In summary: Gender-specific disinformation campaigns carry recurring themes of culture, race, and religion. While these campaigns may appear to revolve around these three frameworks, their core motivations are inherently political. They intensify around significant political events like elections, policy changes, and law enactments.

B- Algorithmic Amplification and the Disinformation Economy

The next session, “Algorithmic Amplification and the Disinformation Economy”, reflected on the evolving relationship between tech companies and civil society. The objective of this session was to understand why algorithmic amplification may work for certain perspectives or narratives, but not for others. The assumption that the digital space is an open marketplace of ideas was challenged. The complex relationship between algorithmic amplification and artificial intelligence (AI), where machine learning and deep learning play crucial roles, was also discussed.

In summary: Divisive content often elicits stronger user engagement than neutral or universally positive content, as it polarises opinions and drives more interaction. Platforms like YouTube rely heavily on their recommendation systems to drive user engagement, yet the principles behind algorithmic amplification remain largely opaque. This lack of transparency makes it difficult to prove causation, i.e., that exposure to certain content led to specific changes in views or beliefs, which in turn makes it harder to hold tech companies responsible in legal proceedings.
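To make this dynamic concrete, the sketch below is a minimal, purely illustrative toy example, not any platform's actual system: the posts and engagement scores are invented. It simply shows that if divisive posts reliably attract more interaction, a feed that ranks by predicted engagement will surface them above neutral content and fact-checks.

```python
# Illustrative toy example only: a feed that ranks posts by predicted engagement.
# The posts and engagement scores below are invented; real recommendation
# systems are far more complex and their internals are not publicly known.

posts = [
    {"id": 1, "text": "Neutral community notice", "predicted_engagement": 0.10},
    {"id": 2, "text": "Divisive attack on a woman journalist", "predicted_engagement": 0.85},
    {"id": 3, "text": "Fact-check of a viral claim", "predicted_engagement": 0.25},
]

def rank_feed(posts):
    """Sort posts so the highest predicted engagement appears first."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in rank_feed(posts):
    print(f'{post["predicted_engagement"]:.2f}  {post["text"]}')

# Because the divisive post attracts the most interaction, an engagement-driven
# ranker places it at the top, while the fact-check trails behind it.
```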

C- Listening in: Monitoring Social Media for Organized Disinformation

Next came the workshop, “Listening in: Monitoring Social Media for Organized Disinformation”, a demonstration of technical tools developed by MMfD to investigate disinformation. The objective behind this initiative is to equip journalists with the skills and tools to explore organised disinformation, enabling them to counteract coordinated and hateful narratives across digital platforms. Access to the toolkit has been limited to trusted newsrooms and individuals who share the same values. MMfD also plans to launch a “hate monitoring unit” soon to actively monitor hate speech against vulnerable groups in Pakistan.

The workshop also featured insights into the interconnected challenges of gendered disinformation and censorship: censorship often suppresses the voices of women by limiting their freedom to report, while disinformation thrives on these very limitations, propagating false narratives and harmful stereotypes that further marginalise and discredit women journalists and human rights defenders.

The session highlighted the need to investigate online hate campaigns and their impact on victims, both online and offline. The aim of digital investigations is multifaceted: raising awareness that online aggressors can be traced, amplifying the voices of victims, holding perpetrators accountable, advocating for change, supporting victims, and encouraging reform in organisational practices.

A 2021 study by MMfD on the effects of online harassment on women journalists in Pakistan was presented during the discussion. The research found that a significant 78% of women journalists felt compelled to practise self-censorship because of the online environment, and 75% cited fear of online harassment as a reason for their self-censorship. This behaviour had become so ingrained that 72% observed their colleagues adopting the same cautious approach.

Following the study, the toolkit, called “Trends Monitor”, was unveiled. Developed exclusively by MMfD, Trends Monitor can help extract publicly available data not only from Twitter, but also TikTok, Facebook, and, notably, WhatsApp.
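As a rough illustration of what keyword-based monitoring of public posts involves, the sketch below is a hypothetical example and not Trends Monitor's code, which has not been published; the watchlist, posts, and platform details are all placeholders. The idea is simply to filter a collection of public posts against a watchlist of campaign hashtags or abusive terms and tally the matches for later analysis.

```python
import re
from collections import Counter

# Hypothetical watchlist of campaign hashtags and abusive terms (placeholders).
WATCHLIST = ["#ExampleSmearTag", "traitor", "foreign agent"]

# Stand-in for public posts collected from a platform; in practice, collection
# depends on each platform's terms of service, APIs, and export tools.
public_posts = [
    {"user": "account_a", "text": "Great reporting on the flood relief effort."},
    {"user": "account_b", "text": "She is a traitor and a foreign agent #ExampleSmearTag"},
    {"user": "account_c", "text": "#ExampleSmearTag trending again tonight"},
]

def matches(text, terms):
    """Return the watchlist terms that appear in a post (case-insensitive)."""
    return [t for t in terms if re.search(re.escape(t), text, re.IGNORECASE)]

hits = Counter()
for post in public_posts:
    for term in matches(post["text"], WATCHLIST):
        hits[term] += 1

# Shows how often each watchlist term appears across the monitored posts,
# e.g. Counter({'#ExampleSmearTag': 2, 'traitor': 1, 'foreign agent': 1})
print(hits)
```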

Day 2

The second day featured a design thinking session aimed at creating communal solutions by bringing people together to accelerate the idea-generation process. The session included ice-breaking activities to help participants get to know each other better; the prompts were neither thematic nor expertise-based, and participants were encouraged to share their responses without any hesitation or discomfort.

During the first activity, participants were instructed to take a minute to write down the worst instances of gendered comments, trolling, hate speech, or sexist content they had encountered, whether directed at themselves or others. These comments captured the depth of the ugliness and harm typically found in malicious online behaviour. Participants were then asked to pen their feelings about the exercise and their perceptions of gendered misinformation and disinformation.

The session also included an activity for creating needs statements. The attendees were divided into four teams and instructed to craft specific problem statements, with each team writing four distinct ones. To inject some fun and creativity into the session, each team also devised a unique team title and logo within five minutes, which they would use throughout the day.

Day 3

The third day of the event started with a discussion on collaborative action for the safety of WHRDs and women journalists, preceded by a feedback round on the previous two days. The discussion primarily elaborated on UNESCO’s ongoing process of developing guidelines for regulating Big Tech, which includes the proposal of a separate regulator for social media platforms. There have been concerns, especially from the Asia-Pacific region, about the lack of deep engagement with human rights implications; the guidelines do not adequately address the concerns of people living under authoritarian regimes. The session aimed to collectively strategise on how to make these voices heard in processes that, despite public consultations, might not fully consider their perspectives.

Participants were familiarised with Digital Rights Monitor (DRM), a news website launched by MMfD exclusively for coverage of tech and corporate accountability developments around the world. It was noted, however, that while tech reporting in Pakistan may be improving, most of the content is recycled from international publications without in-depth analysis capturing local nuances and implications.

Then came a comprehensive overview of APC’s “Safety for Voices” (SfV) policy strategy, which focuses on the protection of human rights defenders and journalists. The primary aim is for partner organisations to work in ways that complement one another. The SfV project focuses on women human rights defenders in the Global South; it will not engage male human rights defenders or those from the Global North. The unique aspect of SfV is that it informs the advocacy agenda with voices from communities that are not traditionally represented in the digital rights space.

Closing: It is important to forge alliances, particularly with newsrooms, to counteract gendered disinformation. Joint efforts could be financially advantageous for newsrooms and could bring technical enhancements, such as AI applications and toolkits, that expand their reach. Mutual collaboration will benefit all stakeholders.