- CRC Weekly: Cyber-based Hostile influence campaigns 20th-26th October 2025
[Introduction]

Cyber-based hostile influence campaigns aim to influence target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks that enhance their effect. During the last week we observed, collected, and analyzed endpoints of information related to cyber-based hostile influence campaigns (including Cyfluence attacks). This week's report is a summary of what we regard as the main events.

[Contents]

[Introduction]
[Report Highlights]
[Weekly Review]
- Matryoshka Campaign Deploys Synthetic Media to Attack Journalism Credibility
- Russia Trains Local Journalists to Spread Pro-Kremlin Narratives in Africa
- Russia Pushes False Arctic Narrative to Mask Arctic Military Expansion
- Pro-Kremlin Actors Use AI and Data Collection to Target Ukraine-EU Relations
- Beijing Combines Cultural Diplomacy with AI-Driven Influence in Europe
- American Fugitive in Moscow Runs AI-Powered Pro-Kremlin Fake News Network
- Russia Engages in 'LLM Grooming' to Manipulate AI Chatbots
- Climate Action Hindered by Coordinated Disinformation and Greenwashing Campaigns
- EU-Funded 'Digital Detectives' Initiative Trains Uzbek Journalists to Counter Falsehoods
- Europe's Counter-Disinformation Efforts Face External Threats and Internal Resistance
[CRC Glossary]
[Download Report]

[Report Highlights]

An American fugitive in Moscow is behind a network of 141 fake news sites powered by an AI programmed to insert bizarre and irrelevant praise for Vladimir Putin into unrelated articles. - NewsGuard

Through a strategy dubbed "LLM grooming," Russia aims to manipulate AI chatbots by flooding the internet with pro-Kremlin content, effectively weaponizing the models to reproduce false narratives. - EUvsDisinfo

Posing as legitimate news agencies, covert Russian entities are expanding hybrid warfare in Africa by training local journalists and influencers to spread pro-Kremlin narratives.
- European Council on Foreign Relations

To mask its own aggressive military expansion, a Russian information operation inverts reality by accusing Canada and NATO of militarizing the Arctic. - DisinfoWatch

China's hybrid influence campaigns in Europe combine soft-power tactics through cultural and academic channels with advanced AI-driven digital operations. - Taipei Times

Recognizing that information manipulation by fossil fuel interests is a primary obstacle to progress, the COP30 climate summit will make public trust a central issue for the first time. - Global Witness

An EU-funded "Digital Detectives" project is building a nationwide network in Uzbekistan by training local experts to equip journalists and fact-checkers with advanced verification skills. - EEAS

[The Week In Review]

Matryoshka Campaign Deploys Synthetic Media to Attack Journalism Credibility

The Russian Matryoshka network is impersonating reputable media organizations to spread fabricated stories and undermine trust in Western journalism. A report from NewsGuard details how the hostile influence campaign uses AI-generated videos and fake social media accounts to circulate false claims about political scandals in Germany and France. The videos have falsely attributed quotes to NewsGuard executives and presented entirely invented events, such as Germany suing the organization for exposing war preparations. Matryoshka’s strategy mirrors the very information manipulation tactics it accuses others of employing. Its content relies on AI voice-overs, manipulated footage, and fictitious experts, all designed to exploit real-world controversies, like France's 2023 bedbug panic, to insert Russian narratives into public discourse. The operation highlights a sophisticated use of synthetic media to attack the credibility of established news and research entities.
Source: NewsGuard, Why Russia Puts Words in NewsGuard’s Mouth, Available Online: https://www.newsguardrealitycheck.com/p/why-russia-puts-words-in-newsguards

Russia Trains Local Journalists to Spread Pro-Kremlin Narratives in Africa

Russia has intensified its hybrid warfare tactics in Africa, employing information operations to influence public opinion and destabilize regional politics. The Kremlin established entities like the Africa Corps and the Africa Initiative to bolster its presence and spread pro-Russian narratives across the continent. These operations involve training local journalists, influencers, and activists to disseminate content in multiple languages, including English, French, Arabic, and regional languages like Hausa and Swahili. A report by the European Council on Foreign Relations (ECFR) notes that the Africa Initiative operates covertly, posing as a news agency while engaging in information manipulation. The ECFR highlights the need for a coordinated European response, suggesting current anti-disinformation policies are ineffective. Recommendations include investing in local media and using platforms like WhatsApp to counteract hostile narratives, as Europe risks ceding influence to Russia in Africa's information ecosystem.

Source: European Council on Foreign Relations, The bear and the bot farm: Countering Russian hybrid warfare in Africa, Available Online: https://ecfr.eu/publication/the-bear-and-the-bot-farm-countering-russian-hybrid-warfare-in-africa/#recommendations

Russia Pushes False Arctic Narrative to Mask Arctic Military Expansion

Russian state media is amplifying a narrative that Canada and NATO are promoting "war rhetoric" in the Arctic, while portraying Russia as a peaceful actor. This information operation inverts reality, as Russia has aggressively expanded its military infrastructure in the region since 2021, whereas recent Canadian measures are defensive.
The Kremlin uses tactics including selective omission, projection, and euphemism laundering to present its maximalist Arctic claims as benign while framing allied defensive actions as provocative. The campaign is amplified through Russian diplomatic channels, Telegram, and pro-Kremlin outlets, reflecting a broader strategic goal of weakening allied cohesion and chilling Canadian Arctic policy. A DisinfoWatch report notes that by framing Russia as restrained, the campaign seeks to normalize its jurisdictional ambitions and discourage deterrence investments, following a recurring Kremlin pattern of "peaceful Russia/militarizing NATO."

Source: DisinfoWatch, Russian MFA Accuses West and Canada of Militarizing The Arctic, Available Online: https://disinfowatch.org/disinfo/russian-mfa-accuses-west-and-canada-of-militarizing-the-arctic/

Pro-Kremlin Actors Use AI and Data Collection to Target Ukraine-EU Relations

Pro-Kremlin propagandists have intensified information operations aimed at undermining Ukraine-EU relations and demoralizing Ukrainians. According to a report by the Delegation of the European Union to Ukraine and the DARE Project, these campaigns use Telegram channels, Facebook groups, and fake news websites to spread false narratives. The fabricated stories include claims that the EU is "prolonging the war," accusations of aggressive policies toward Russia, and false stories about refugee conditions and child trade schemes. The report highlights that pro-Kremlin actors are using sophisticated strategies, including emotional manipulation, AI-generated visuals, and fake media outlets. Regional patterns revealed tailored falsehoods in Kherson, Donetsk, and Odesa, with claims about "combat moths" imported from the EU and the sale of cities to foreign interests. Some campaigns also collected personal data, illustrating a dual strategy of psychological influence and opportunistic exploitation.
Source: EEAS, Results of pro-Russian information manipulation and disinformation monitoring targeting Ukraine-EU relations during June – August 2025, Available Online: https://www.eeas.europa.eu/delegations/ukraine/results-pro-russian-information-manipulation-and-disinformation-monitoring-targeting-ukraine-eu_en

Beijing Combines Cultural Diplomacy with AI-Driven Influence in Europe

Concerns are growing over Beijing's disinformation and hybrid influence campaigns across Europe, even as some nations distance themselves diplomatically. A recent Italian Senate conference highlighted how China continues to exert pressure through psychological manipulation, propaganda, and economic coercion, despite Italy’s 2023 withdrawal from the Belt and Road Initiative. As published by the Taipei Times, Chinese influence persists through academic and cultural channels, including Confucius Institutes and the suppression of performances by groups critical of the Chinese Communist Party. The digital dimension of these operations leverages platforms like DeepSeek and AI-driven tools to manipulate public perception and amplify state-controlled messaging. This technological aspect has raised alarms among European governments, which now view China's use of AI and data tracking as a severe national security threat, prompting new measures to strengthen democratic resilience and curb foreign manipulation.

Source: Taipei Times, EU facing increased interference from China, Available Online: https://www.taipeitimes.com/News/editorials/archives/2025/10/26/2003787875

American Fugitive in Moscow Runs AI-Powered Pro-Kremlin Fake News Network

John Mark Dougan, a former Florida deputy now based in Moscow, has become a key figure in Russia's digital influence operations, using a self-trained generative AI system to create large volumes of fake news. An investigation from NewsGuard identifies Dougan as part of the pro-Kremlin influence group Storm-1516.
His recent campaign involves 141 French-language websites spreading Russian propaganda and false claims aimed at undermining Western democracies. A notable feature of the AI-generated articles is the consistent insertion of exaggerated and irrelevant praise for Russian President Vladimir Putin, regardless of the topic. Evidence from cybersecurity researchers suggests Dougan's AI is programmed with a pro-Russia, anti-West bias, even leaving behind visible AI prompts that instruct it on how to frame content. While Dougan denies responsibility, he has publicly boasted about receiving a Russian state honor for his "work in the information sphere."

Source: NewsGuard, Russian AI Sites Can’t Stop Gushing About Putin, Available Online: https://www.newsguardtech.com/special-reports/ai-driven-john-mark-dougan-pro-kremlin-disinformation-campaign/

Russia Engages in 'LLM Grooming' to Manipulate AI Chatbots

Russia has shifted its information warfare tactics to target artificial intelligence, deliberately manipulating large language models (LLMs) through a strategy known as "LLM grooming." This involves flooding the internet with millions of low-quality articles and content from pro-Kremlin websites, including the Pravda network, to ensure AI chatbots reproduce false narratives. The goal is to weaponize AI to spread misleading information, such as fabricated claims about Ukraine's President Zelenskyy. According to analysis by EUvsDisinfo, the campaigns involve multiple actors, including Russian state media, pro-Kremlin influencers, and offshoots of the Internet Research Agency. The broader significance lies in the Kremlin's ability to shape digital information ecosystems, erode trust in AI-generated knowledge, and amplify global security risks as automated disinformation becomes harder to detect and counter, threatening the integrity of online fact-finding.
Source: EUvsDisinfo, Large language models: the new battlefield of Russian information warfare, Available Online: https://euvsdisinfo.eu/large-language-models-the-new-battlefield-of-russian-information-warfare/

Climate Action Hindered by Coordinated Disinformation and Greenwashing Campaigns

Information manipulation has become one of the most significant obstacles to meaningful climate action, as fossil fuel companies and their allies use influence campaigns to cast doubt on climate science and delay policy responses. These tactics range from outright denial to more insidious strategies like greenwashing, where polluters portray themselves as environmentally responsible while expanding fossil fuel production. Social media algorithms amplify such content, rewarding polarization over accuracy. The growing recognition of this threat has pushed information integrity into the spotlight, with COP30 set to make public trust a central issue for the first time. A Global Witness article states that while informing people of the fossil fuel industry's deception can increase support for accountability, Big Tech's failure to curb falsehoods continues to erode public understanding. Experts now call for stronger oversight and education, arguing that defending information integrity is inseparable from defending the planet.

Source: Global Witness, What does information integrity have to do with climate?, Available Online: https://globalwitness.org/en/campaigns/digital-threats/what-does-information-integrity-have-to-do-with-climate/

EU-Funded 'Digital Detectives' Initiative Trains Uzbek Journalists to Counter Falsehoods

A new initiative in Uzbekistan, the "Digital Detectives" project, aims to strengthen the country's defenses against disinformation and promote media literacy.
Funded by the European Union and implemented by the Modern Journalism Development Centre, the project has launched its first Training of Trainers session in Tashkent to establish a nationwide network of experts. These trainers will assist journalists and fact-checkers across Uzbekistan in identifying and countering false information more effectively. As published by the EEAS, participants explored key fact-checking strategies, including promise tracking, detecting fake news, and utilizing digital verification tools such as the Wayback Machine. They also discussed the importance of storytelling as a method for strengthening credibility and public trust. By empowering local media professionals, the project represents a proactive effort to create a more resilient information environment and safeguard the public sphere against manipulation.

Source: EEAS, “Digital Detectives” Project Launches First Training of Trainers on Fact-Checking in Uzbekistan, Available Online: https://www.eeas.europa.eu/delegations/uzbekistan/%E2%80%9Cdigital-detectives%E2%80%9D-project-launches-first-training-trainers-fact-checking-uzbekistan_en

Europe's Counter-Disinformation Efforts Face External Threats and Internal Resistance

Europe's battle against information manipulation has reached a critical turning point, as new and complex challenges undermine progress. Foreign Information Manipulation and Interference (FIMI), fueled by geopolitical conflicts and hybrid warfare, continues to expand, while generative AI has lowered the barriers for malicious actors to produce large-scale propaganda. At the same time, the fight against disinformation is facing growing internal resistance, with some nationalist movements portraying counter-disinformation efforts as censorship, thereby weakening institutional trust.
A recent article from the EU Disinfo Lab notes that major digital platforms have also reversed some commitments to content moderation, allowing false narratives to spread more easily. This has created a dual threat from external state-backed propaganda and domestic disengagement. The report concludes that Europe's resilience depends on enforcing regulations, empowering civil society, and achieving strategic digital autonomy.

Source: EU Disinfo Lab, Documenting the setbacks: The new environment for counter-disinformation in Europe and Germany, Available Online: https://www.disinfo.eu/publications/documenting-the-setbacks-the-new-environment-for-counter-disinformation-in-europe-and-germany/

[CRC Glossary]

The nature and sophistication of the modern Information Environment will only continue to grow in complexity. Yet across academic publications, legal frameworks, policy debates, and public communications, the same concepts are often described in different ways, making collaboration, cooperation, and effective action more difficult. To ensure clarity and establish a consistent frame of reference, the CRC maintains a standard glossary to reduce ambiguity and promote terminological interoperability. Its scope encompasses foundational concepts as well as emerging terms relating to Hostile Influence and Cyfluence. As a collaborative project maintained with input from the community of experts, the CRC Glossary is intended to reflect professional consensus. We encourage you to engage with this initiative and welcome contributions via the CRC website.

[Download Report]
- Tonga Before the Election: Influence and the Information Space
Background

On 20 November 2025, Tongans will head to the polls to directly elect 17 representatives to the Legislative Assembly, while the country’s nobles choose another nine members. The final composition of parliament will include these and up to four additional seats determined through the established procedure. [i] As the only constitutional monarchy in the Pacific, Tonga blends democratic governance with deeply rooted traditional structures, where the monarchy retains significant influence over national affairs. Despite its small population of roughly 105,000, Tonga holds strategic importance in the South Pacific. [ii] It sits at the crossroads of a tense China-U.S. rivalry, with Australia and New Zealand playing a key role. Tonga’s strategic location and information environment make it an interesting case study for understanding information flows and cognitive resilience in small island democracies.

Influence Vectors

Tonga’s internal dynamics and international relations are shaped by a combination of financial dependence, migration trends, regional security cooperation, and diaspora engagement. At the same time, the country’s media landscape has largely shifted to the digital realm, [iii] where outlets face mounting challenges as social media increasingly dominates public discourse. This environment has made Tonga more vulnerable to information disorder, [iv] illustrated by incidents such as deepfake audio clips, [v] fabricated political letters, [vi] and COVID-19 conspiracy theories. Although these cases have largely been domestic and organic rather than coordinated foreign operations, they underscore the country’s vulnerability to information manipulation. Efforts to strengthen resilience are emerging, exemplified by local fact-checkers such as “Supa Mario”, [vii] who has gained attention for his debunking work, and by education programs supported by international partners like ASPI–ROI. [viii]
Nevertheless, systematic monitoring and institutional frameworks to counter information disorder remain scarce.

Economic and Development Assistance

Recently, the United States has reduced its direct presence in the Pacific, while Australia, Japan, and New Zealand remain Tonga’s primary security, development, and disaster-response partners. They maintain military and police cooperation programs that provide training, capacity-building, and regional security coordination.

Figure 1 – Development financing by partner, Courtesy of Lowy Institute [ix]

Meanwhile, China’s role is increasingly apparent: roughly two-thirds of Tonga’s foreign debt (≈USD 195 million) is owed to Beijing. Loan servicing consumes about 4% of GDP annually, [x] raising concerns about long-term strategic dependency. [xi] Chinese aid projects and infrastructure investments have increased visibly in the run-up to the 2025 elections, including a new agricultural agreement signed in October 2025. [xii]

Aid, Physical Support, and On-the-Ground Presence

Tonga’s 150th Constitution celebrations, held from 31 October to 4 November 2025, illustrated how external actors employ visible, on-the-ground engagement to assert presence. The Chinese Embassy sponsored the official fireworks display and supported the participation of over 300 members of the Chinese community in the float parade.

Figure 2 – Posts of the Chinese Embassy in Tonga, Courtesy of Facebook

Australia demonstrated its presence through the largest float parade, combining official and community representation to underline partnership and historical connection. Both governments extended these actions to digital platforms, where their embassies documented and circulated images, official statements, and hashtags. This online communication amplified the reach of their physical presence, turning local acts of participation into enduring digital signals of influence and engagement.
Figure 3 – Posts of the Australian Embassy in Tonga, Courtesy of Facebook

Migration and the Local Economy

In recent years, Chinese immigrants have transformed Tonga’s small business landscape. Although consumers benefit from lower prices and greater availability of goods, many local businesses struggle to compete with Chinese-owned shops. Public opinion is therefore divided, with some Tongans expressing concerns over the country’s financial sovereignty. [xiii]

Diaspora Influence

Tonga’s diaspora, which is larger [xiv] than its domestic population, plays an outsized role in shaping opinions back home. Communities in Australia, New Zealand, and the U.S. frequently engage in online debates about domestic politics, often injecting or amplifying narratives from afar. In contrast, external actors’ ability to leverage coordinated inauthentic behavior (CIB) is limited: Tonga’s tight-knit social networks and small population size make it harder to utilize sockpuppet accounts and operational assets effectively. In essence, diaspora-based involvement acts as a force multiplier in Tonga’s digital information ecosystem, primarily through Facebook, which reaches over 64% of the population. [xv]

Conclusion

Tonga’s 2025 elections will unfold in an information environment inherently different from that of European nations, where foreign information manipulation and interference (FIMI) activities have had a significant impact; notable examples include the recent elections in Czechia and Moldova, where interference has been attributed to Russia. Ahead of the upcoming election, there are a few key takeaways for stakeholders, particularly Cyfluence Defence practitioners. Although there is currently no evidence of ongoing coordinated FIMI efforts targeting the Pacific nation and its democratic processes, past misinformation incidents expose nascent vulnerabilities.
The limited analytical and monitoring capacity within Tonga’s media and civil society means potential influence activities could go undetected. Empowering local institutions, including independent investigative journalism, is crucial. Media literacy and cognitive resilience must be seen as strategic assets that are essential to safeguard trust in public institutions and electoral integrity, and to ensure societal cohesion.

[Footnotes:]

[i] Inter-Parliamentary Union (IPU), 2025. Tonga – Legislative Assembly (Fale Alea). [online] Available at: https://data.ipu.org/parliament/TO/TO-LC01/

[ii] Congressional Research Service, J. G. Tupuola, 2025. Tonga: Background and Issues for Congress. [online] pp. 1-2. Published 11 September 2025. Available at: https://www.congress.gov/crs_external_products/IF/PDF/IF12866/IF12866.3.pdf

[iii] ABC International Development, 2025. State of the Media: Tonga, 2025. [online] Published 4 March 2025. Available at: https://www.abc.net.au/abc-international-development/state-of-the-media-tonga-2025/105005712

[iv] ABC International Development, T. Kami Enoka & P. ’Ulikae’eva Havili, 2023. Tonga’s Star Fact-Checker Helps Fight COVID-19 Vaccine Misinformation and Government Corruption. [online] Published 14 March 2023; updated 16 March 2023. Available at: https://www.abc.net.au/abc-international-development/pacmas-tonga-fact-checking/102073118

[v] Australian Strategic Policy Institute (ASPI), B. Johnson, F. Fakafanua & S. Vikilani, 2024. As technology distorts information, Pacific governments and media must cooperate. [online] Published 17 July 2024. Available at: https://www.aspistrategist.org.au/as-technology-distorts-information-pacific-governments-and-media-must-cooperate/#:~:text=In%20Tonga%2C%20we%20have%20also,the%20reputation%20of%20those%20involved

[vi] Radio New Zealand (RNZ), 2017. Tonga police investigate letter claiming to be from PM. [online] Published 24 February 2017.
Available at: https://www.rnz.co.nz/international/pacific-news/325222/tonga-police-investigate-letter-claiming-to-be-from-pm

[vii] Ibid.

[viii] Royal Oceania Institute, 2024. Training Program for Tonga: “Disinformation: Government and Media Challenges”. [online] Published 8 May 2024. Available at: https://royaloceaniainstitute.org/2024/05/08/training-program-for-tonga-disinformation-government-and-media-challenges/

[ix] Lowy Institute, 2025. Tonga – Pacific Aid Map. [online] Available at: https://pacificaidmap.lowyinstitute.org/country/tonga/

[x] Congressional Research Service, J. G. Tupuola, 2025. Tonga: Background and Issues for Congress. [online] pp. 1-2. Published 11 September 2025. Available at: https://www.congress.gov/crs_external_products/IF/PDF/IF12866/IF12866.3.pdf

[xi] Pacific Media Network, A. Vailala, 2025. No debt forgiveness from China, analyst warns as Tonga faces repayment pressure. [online] Published 30 April 2025. Available at: https://pmn.co.nz/read/political/no-debt-forgiveness-from-china-analyst-warns-as-tonga-faces-repayment-pressure

[xii] Radio New Zealand (RNZ), C. Rovoi, 2025. Tonga bets on China deal to modernise farming ahead of general election. [online] Published 30 October 2025. Available at: https://www.rnz.co.nz/international/pacific-news/577307/tonga-bets-on-china-deal-to-modernise-farming-ahead-of-general-election

[xiii] Tonga Independent News, 2025. ‘Trust Is More Important Than Money’: Inside One Chinese Businessman’s Vision for Tonga. [online] Published 14 August 2025. Available at: https://tongaindependent.com/trust-is-more-important-than-money-inside-one-chinese-businessmans-vision-for-tonga/

[xiv] United Nations, 2022. The Kingdom of Tonga: National Voluntary GCM Review – Implementing the Global Compact for Safe, Orderly and Regular Migration. [online] Published 2022. Available at: https://www.un.org/sites/un2.un.org/files/imrf-tonga.pdf

[xv] DataReportal, 2024. Digital 2024: Tonga. [online] Published 2024.
Available at: https://datareportal.com/reports/digital-2024-tonga
- Information Warfare in the Early Stages of the Russia-Ukraine War
The prelude and opening stages of Russia's 2022 invasion of Ukraine marked one of history's most intense periods of hostile cyber and influence activity. Alongside conventional warfare, both states engaged in a sophisticated battle for influence, deploying digital propaganda, psychological operations, and cyberattacks. This study examines the conflict's information dimension from late 2021 to April 2022 via a novel analytical paradigm adapted from strategic marketing and audience segmentation. By focusing on who the target is, when they are susceptible, and how operations are executed, analysts can systematically map cyber, influence, and hybrid (Cyfluence) operations across time and audience, identifying strategic and operational intent as well as potential cardinal indicators of conflict escalation. Applying this analytical model to the early stages of the Russia-Ukraine information war provides valuable insights and strategic context from a pivotal moment in the evolution of hybrid warfare. The analysis breaks down the key events and examines and expands on their strategic and operational implications. The lessons drawn from this analysis are relevant for countries in the Southeast Asian and Indo-Pacific region as they grapple with the realization that they, too, may face a threat similar to the one confronting Ukraine. China, for example, is closely following Russia’s playbook, coordinating with Russian cyber-influence agencies, and has shown willingness to deploy its own advanced capabilities in the region. For European countries, while more familiar with Russian doctrines of hybrid warfare, the idea of a future hybrid conflict taking place in their backyard is more immediate. They too might benefit from a new analytical model to better predict, detect, and defend against future hybrid conflicts. [Full Report Below]
- Cyfluence: The Latest Frontier of Cognitive Warfare
The term 'Cyfluence' refers to the full spectrum of integrated cyber-influence operations that combine technical and informational tactics within a unified framework. It encompasses both cyberattacks conducted to shape perceptions or behavior and influence campaigns designed to facilitate or enhance cyber operations. In practice, Cyfluence represents the convergence of technical infiltration, sabotage, data exfiltration, information manipulation, and narrative campaigns - all embedded within mutually reinforcing, influence-centered kill chains. It is the comprehensive expression of how power is applied and projected across today’s interconnected information environments. In this primer, we present an updated definition of Cyfluence, reflecting the latest evolutions of the concept and the increasing convergence of the cyber and cognitive domains. [Download PDF Here]
- CRC Weekly: Cyber-based hostile influence campaigns 13th - 19th October 2025
[Introduction]

Cyber-based hostile influence campaigns aim to influence target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks that enhance their effect. During the last week we observed, collected, and analyzed endpoints of information related to cyber-based hostile influence campaigns (including Cyfluence attacks). This week's report is a summary of what we regard as the main events.

[Highlights]

A sophisticated narrative laundering operation was identified by tracing a fabricated news story's journey from a fringe website through Russian state media and AI-powered search results to the U.S. Congress. - NewsGuard's Reality Check

An investigation unmasks a sprawling pro-Kremlin influence network of 139 fake news websites in France, using AI-generated content and coordinated inauthentic behavior to manipulate public discourse. - NewsGuard

Taiwan reports a significant escalation in Chinese 'cyfluence' operations, where millions of daily cyber intrusions are strategically combined with AI-driven disinformation campaigns to undermine state security and public trust. - The Record

An analysis reveals how Chinese state and private actors are using sophisticated AI tools to generate fake social media profiles for influence operations targeting India's democracy. - NDTV

A detailed report outlines Iran's campaigns in Sweden, which combine traditional espionage with cyber operations like malware-laden apps and spear-phishing. - Eurasia Review

Testing of OpenAI's Sora confirms its potential for creating synthetic propaganda, successfully producing realistic videos that advanced false narratives in 80% of test cases. - NewsGuard

NATO's top information officer issues a stark warning that 'hybrid warfare has begun,' citing a combination of cyberattacks, disinformation campaigns, and physical disruptions. - Euronews

French officials express alarm over the growing 'porosity' between the U.S.
'MAGA sphere' and Kremlin-aligned influence channels. - Le Monde

[Weekly Review]
- From Fringe Site to US Congress: Anatomy of a Kremlin Narrative Laundering Operation
- AI-Powered Disinformation: Uncovering a Pro-Kremlin Network of 139 Fake French News Sites
- Estonian Politician Weaponizes Satire in Pro-Kremlin Influence Campaign
- Kremlin Deploys Disinformation to Foment Panic with 'Kyiv Evacuation' Hoax
- NATO Warns of China's Technologically Advanced FIMI Threat
- Taiwan Confronts Chinese 'Cyfluence' as Cyberattacks and Disinformation Surge
- Analysis: China's Use of AI and Private Firms Poses Influence Threat to India
- Iran's Hybrid Threat in Sweden Combines Cyber Espionage with Dissident Targeting
- Sora's Potential for Synthetic Propaganda Highlighted in New Analysis
- NATO Official: Hybrid Warfare Against Europe 'Has Already Begun'
- Investigation Reveals UK Far-Right Facebook Groups as 'Engine of Radicalization'
- French authorities fear mounting 'MAGA sphere' intrusions into domestic politics

From Fringe Site to US Congress: Anatomy of a Kremlin Narrative Laundering Operation

A fabricated story alleging corruption within Ukrainian President Volodymyr Zelensky's inner circle illustrates a textbook case of narrative laundering. A report by NewsGuard's Reality Check traces the claim's path from a fringe, pro-Russian Turkish website to amplification by Russian state media like TASS and Sputnik. The narrative gained a veneer of credibility after being republished by smaller websites and appearing on Microsoft's MSN news platform, despite a complete lack of evidence. The digital ecosystem played a crucial role in the operation's next phase, as screenshots and AI-generated summaries on Microsoft's Bing search engine facilitated the story's spread across social media. This hostile influence campaign achieved a significant milestone when U.S. Congresswoman Anna Paulina Luna shared the claim, citing MSN as her source.
Russian state outlets then completed the propaganda feedback loop by citing the American lawmaker's statements as external validation of the original falsehood, demonstrating how contrived narratives can be pushed into mainstream discourse to achieve strategic objectives. Source: NewsGuard's Reality Check, How Russia Laundered a Lie About Ukraine Through Congress, Available Online: https://www.newsguardrealitycheck.com/p/how-russia-laundered-a-lie-about-ukraine-through-congress Top Of Page AI-Powered Disinformation: Uncovering a Pro-Kremlin Network of 139 Fake French News Sites A network of 139 French-language websites with ties to Russia is disseminating false and misleading claims, often using AI-generated content to populate its pages. According to an article from NewsGuard, the operation is believed to be managed by John Mark Dougan, a former U.S. Marine who fled to Russia, with alleged support from Russian military intelligence (GRU). These fake websites were established between February and August 2025, using fabricated ownership details to masquerade as legitimate French media outlets. This coordinated inauthentic behavior is part of a broader Russian information operation, designated Storm-1516, which has also targeted the United States and Germany. The campaign’s tactics include impersonating real journalists and spreading fabricated narratives on high-profile topics to manipulate public discourse. The operation demonstrates an evolving approach to digital propaganda that leverages a distributed network of fake platforms to generate millions of views and influence public perception on key political issues.
Source: NewsGuard, NewsGuard Rates Network of 139 Fake French News Websites with Ties to the Kremlin, Available Online: https://www.newsguardtech.com/press/newsguard-rates-network-of-139-fake-french-news-websites-with-ties-to-the-kremlin/ Top Of Page Estonian Politician Weaponizes Satire in Pro-Kremlin Hostile Influence Campaign In Estonia, a pro-Kremlin politician has been repurposing satirical Russian content to spread malinformation among the nation's Russian-speaking population. A report from the Atlantic Council’s DFRLab identifies Genady Afanasyev, a candidate for the KOOS party, as the central actor in this hostile influence campaign. Afanasyev adapts stories from the Russian satirical outlet Panorama.pub by localizing them to Estonian contexts, altering names and institutions to make the fabricated stories appear as factual local news. This tactic exploits gaps in media literacy by mixing political messaging with humor to cultivate anti-government sentiment and normalize pro-Kremlin narratives. The content is primarily disseminated through KOOS-affiliated Facebook groups but also spreads across VKontakte (VK), TikTok, Telegram, and X, extending its reach within the target audience. The campaign highlights how foreign satirical content can be adapted into a targeted tool for domestic political influence, raising concerns about election integrity and the manipulation of specific linguistic communities. Source: DFRLab, Pro-Kremlin politician weaponizes satire to engage Russian population in Estonia ahead of local elections, Available Online: https://dfrlab.org/2025/10/16/pro-kremlin-politician-weaponizes-satire-to-engage-russian-population-in-estonia-ahead-of-local-elections/ Top Of Page Kremlin Deploys Disinformation to Foment Panic with 'Kyiv Evacuation' Hoax Pro-Kremlin channels have been circulating a disinformation narrative claiming the West is urging an evacuation of Kyiv due to blackouts caused by Russian strikes.
This information operation, detailed in an article by EUvsDisinfo, aims to exaggerate Ukraine's energy vulnerabilities and undermine public confidence in the Ukrainian government. By propagating these falsehoods through state-linked media and messaging platforms, the campaign seeks to distort perceptions of the conflict, reduce international support, and create the impression that Ukraine cannot withstand ongoing Russian attacks. In reality, neither Ukraine nor its allies have made any such calls for evacuation. Ukrainian authorities have maintained contingency plans since 2022 and continue to demonstrate resilience against energy disruptions. EU officials have reaffirmed their full support, mobilizing hundreds of millions of euros for energy aid and civil protection. The campaign exemplifies the Kremlin's persistent use of disinformation to generate fear and uncertainty, though international support for Ukraine remains strong. Source: EUvsDisinfo, DISINFO: The West calls on Ukraine to evacuate Kyiv amid blackouts, Available Online: https://euvsdisinfo.eu/report/the-west-calls-on-ukraine-to-evacuate-kyiv-amid-blackouts/ Top Of Page NATO Warns of China's Technologically Advanced FIMI Threat China has significantly intensified its disinformation campaigns against NATO members since the COVID-19 pandemic, employing strategies designed to destabilize and weaken Western countries. According to a NATO report published by the Global Influence Operations Report (GIOR), these operations leverage advanced technologies, social media platforms like TikTok, and cooperation with Russia to amplify pro-Chinese narratives. The campaigns aim to suppress criticism of the Chinese Communist Party and infiltrate local media ecosystems, substantially increasing the speed and reach of Beijing’s information operations.
The analysis emphasizes that these activities constitute a form of Foreign Information Manipulation and Interference (FIMI) that threatens Euro-Atlantic security, public trust in democratic institutions, and overall stability. By mapping key actors and tracing the tactical evolution of these campaigns, the report underscores the urgent need for coordinated countermeasures among allies to protect their populations, defend democratic processes, and mitigate the impact of Beijing's hostile influence activities. Source: Global Influence Operations Report, NATO Report on Chinese Disinformation Reveals Escalating Threats, Available Online: https://www.global-influence-ops.com/china-disinformation-nato-report-global-influence-operations/ Top Of Page Taiwan Confronts Chinese 'Cyfluence' as Cyberattacks and Disinformation Surge Taiwan's National Security Bureau (NSB) has reported a significant increase in cyberattacks and coordinated disinformation campaigns from China, aimed at undermining public trust and creating societal divisions. An article in The Record states that government networks faced an average of 2.8 million intrusions per day in 2025, a 17 percent annual increase targeting critical infrastructure. Beijing’s strategy represents a form of Cyfluence, combining these cyber intrusions with information warfare. The campaigns employ state media, an "online troll army" of fake users, and AI-generated content to spread fabricated narratives attacking the Taiwanese government and promoting pro-China messaging. The NSB report identified over 10,000 suspicious social media accounts distributing more than 1.5 million disinformation posts. This state-level strategy involves military, civilian, and private-sector hackers, with cybersecurity researchers linking activity to actors like TA415. These hybrid operations are designed to manipulate online discourse and shape public perception ahead of Taiwan's 2026 local elections.
Source: The Record, Taiwan reports surge in Chinese cyber activity and disinformation efforts, Available Online: https://therecord.media/taiwan-nsb-report-china-surge-cyberattacks-influence-operations Top Of Page Analysis: China's Use of AI and Private Firms Poses Influence Threat to India China is deploying sophisticated global influence operations that leverage disinformation, AI-generated content, and social media manipulation to polarize societies and exploit divisions within democratic systems. An opinion article published by NDTV highlights the use of Chinese state institutions and private entities like GoLaxy, which run campaigns using AI tools to generate realistic social media profiles and fabricate narratives targeting individuals in India, the U.S., and elsewhere. These operations also enlist academics, media figures, and influencers to amplify messaging and reach specific audiences. For India, the campaigns risk fueling domestic polarization, undermining democratic processes, and exerting strategic influence over regional geopolitics. The analysis emphasizes the need for India to develop proactive countermeasures, including AI-focused digital forensics, robust legal frameworks, and dedicated counterespionage strategies. As China continues to exploit the information environment, vigilance is required to protect India’s domestic stability and strategic interests. Source: NDTV, What Ashley Tellis 'Spying' Allegation Should Tell India About Chinese 'Influence Ops', Available Online: https://www.ndtv.com/opinion/what-ashley-tellis-arrest-should-tell-india-about-chinese-influence-ops-9473545 Top Of Page Iran's Hybrid Threat in Sweden Combines Cyber Espionage with Dissident Targeting The Islamic Republic of Iran has conducted extensive intelligence, cyber, and influence operations in Sweden targeting dissidents, Jewish communities, and Israeli interests.
A recent analysis in Eurasia Review details how these activities are part of a broader hostile campaign to advance Tehran's geopolitical objectives. The operations employ a range of tactics, including cyber espionage through malware-laden apps and spear-phishing campaigns, assassination plots, and the infiltration of academic institutions. Iran also exploits local criminal networks and religious institutions to carry out surveillance, intimidation, and influence activities aimed at silencing opposition and evading international sanctions. These operations reveal significant vulnerabilities in Sweden's cyber defenses and immigration vetting processes. By coordinating with Russia and leveraging criminal proxies, Iran’s activities threaten not only targeted communities but also the stability of Swedish society and regional security, prompting calls for more decisive countermeasures. Source: Eurasia Review, A Growing Security Threat: Iranian Intelligence Operations In Scandinavia (Part Two: Sweden) – Analysis, Available Online: https://www.eurasiareview.com/27092025-a-growing-security-threat-iranian-intelligence-operations-in-scandinavia-part-two-sweden-analysis/ Top Of Page Sora's Potential for Synthetic Propaganda Highlighted in New Analysis OpenAI's new text-to-video generator, Sora, produced realistic videos advancing false claims in 80% of test cases, including several narratives originating from Russian disinformation operations. A report from NewsGuard found that the tool allows users to create synthetic propaganda with minimal effort, enabling hostile actors to rapidly amplify misleading narratives. The analysis raises concerns about the proliferation of high-quality manipulated media and the erosion of trust in authentic content. While OpenAI has implemented guardrails such as watermarking and C2PA metadata, the investigation found these measures can be circumvented, allowing generated videos to appear authentic to unsuspecting viewers.
Sora’s accessibility and speed significantly lower the barrier for creating convincing fabricated content, which could be weaponized in large-scale information operations. The findings underscore the broader implications for media integrity and the challenge of countering AI-driven falsehoods in contested information environments. Source: NewsGuard, OpenAI’s Sora: When Seeing Should Not Be Believing, Available Online: https://www.newsguardtech.com/special-reports/sora-report/ Top Of Page NATO Official: Hybrid Warfare Against Europe 'Has Already Begun' Hybrid warfare, combining cyberattacks, disinformation campaigns, and physical disruptions, is already underway in Europe, with Russia suspected as a key actor. In an article from Euronews, NATO's first Chief Information Officer, Manfred Boudreaux-Dehmer, warned that recent incidents like unidentified drones forcing airport shutdowns are part of a broader strategy to disrupt daily life and weaken public morale. These non-kinetic tactics are designed to exploit digital and psychological vulnerabilities within NATO member states. Boudreaux-Dehmer noted that the Alliance is enhancing its cyber resilience through a new defense center in Belgium and increased coordination among its 32 members. He described the current environment as a constant technological and informational race between adversaries and defenders. The growing use of disinformation and other soft warfare methods highlights a strategic shift toward battles over public perception and trust, making collaboration with the private sector and academia critical for Alliance security.
Source: Euronews, Hybrid warfare has begun, senior NATO official tells Euronews, Available Online: https://www.euronews.com/2025/10/15/hybrid-warfare-has-begun-senior-nato-official-tells-euronews Top Of Page Investigation Reveals UK Far-Right Facebook Groups as 'Engine of Radicalization' A network of far-right Facebook groups in the United Kingdom is exposing hundreds of thousands of members to racist language, conspiracy theories, and extremist disinformation. An investigation by The Guardian describes these online spaces as an "engine of radicalization." The analysis of over 51,000 posts across three large public groups revealed the widespread promotion of anti-immigration tropes and dehumanizing rhetoric. A key finding is that these groups are often managed by older, otherwise ordinary Facebook users, who moderate content and disseminate disinformation across the network. This dynamic leverages peer-to-peer trust, making users more likely to perceive the content as credible compared to institutional sources. Experts warn that such online ecosystems, amplified by platform algorithms, can accelerate radicalization, a threat potentially magnified by emerging technologies like deepfakes and automated bots. Despite a review, Meta found the groups did not violate its policies, highlighting ongoing challenges in moderating extremist content at scale. Source: The Guardian, Far-right Facebook groups are engine of radicalisation in UK, data investigation suggests, Available Online: https://www.theguardian.com/world/2025/sep/28/far-right-facebook-groups-are-engine-of-radicalisation-in-uk-data-investigation-suggests Top Of Page French authorities fear mounting 'MAGA sphere' intrusions into domestic politics French authorities are increasingly concerned by the expanding influence of the American far-right "MAGA sphere" and its convergence with Russian disinformation networks targeting Europe.
Le Monde reports that this concern grew after Elon Musk amplified a claim by Telegram's founder that French intelligence attempted to censor certain accounts, an allegation officials viewed as pro-Russian propaganda. In response, France's Foreign Ministry launched an X account to counter such online falsehoods. A French official described the phenomenon as a "porosity" between U.S. far-right and Kremlin-aligned influence channels, noting that narratives on migration, freedom of expression, and the war in Ukraine spread rapidly across these ecosystems. The French government now views the MAGA-aligned media sphere, including outlets like Breitbart News and platforms like X, as a growing source of foreign information manipulation and interference that could be used to sway upcoming French elections. Source: Le Monde, French authorities fear mounting 'MAGA sphere' intrusions into domestic politics, Available Online: https://www.lemonde.fr/en/international/article/2025/10/14/french-authorities-fear-mounting-maga-sphere-intrusions-into-domestic-politics_6746437_4.html Top Of Page [CRC Glossary] The nature and sophistication of the modern Information Environment is projected to only continue to escalate in complexity. However, across academic publications, legal frameworks, policy debates, and public communications, the same concepts are often described in different ways, making collaboration, cooperation and effective action more difficult. To ensure clarity and establish a consistent frame of reference, the CRC is maintaining a standard glossary to reduce ambiguity and promote terminological interoperability. Its scope encompasses foundational concepts, as well as emerging terms relating to Hostile Influence and Cyfluence. As a collaborative project maintained with input from the community of experts, the CRC Glossary is intended to reflect professional consensus.
We encourage you to engage with this initiative and welcome contributions via the CRC website. Top Of Page [Download Report] Top Of Page
- CRC Spotlight: Ride-Hailing Apps as Vehicles of Foreign Solidarity and Potential Influence Operations
In August and September 2025, a series of civil and political upheavals, primarily in Asian countries, shocked regional observers and elites. Viral images featuring the ‘One Piece’ pirate flag, adopted as a symbol by protestors, made front-page news, and social media was soon flooded with cross-country messages of support and solidarity. Interestingly, a key characteristic of this wave of protests was the role played by popular ride-hailing and delivery apps as well as the ‘gig economy’ workers that rely on them. Platform users became central to the movement's core narratives, while being supplied in real time by supportive netizens. In this CRC Spotlight article, we examine the potential operational implications of this development: how commercial apps can serve as channels for on-the-ground support and how they might represent a new vector for Influence Operations. The platforms and their users are already vulnerable to exploitation, with active "Fraud-as-a-Service" networks using tactics like account takeover (ATO) and location spoofing for financial gain. Although this wave of protests appears to be organic, existing Tactics, Techniques, and Procedures (TTPs) could easily be repurposed from financial fraud to political interference, such as astroturfing support for unrest. This emerging threat is amplified by the difficulty in attribution, inherent to the spontaneous, grassroots nature of platform-based aid. With gig economy platforms becoming de facto civic infrastructure worldwide, their potential for malign socio-political exploitation is outpacing the regulatory frameworks needed to mitigate the risks. Read the full report below for in-depth analysis. [Download Full Report here]
- CRC Weekly: Cyber-based hostile influence campaigns 6th - 12th October 2025
[Introduction] Cyber-based hostile influence campaigns are aimed at influencing target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks which enhance their effect. During the last week we observed, collected and analyzed endpoints of information related to cyber-based hostile influence campaigns (including Cyfluence attacks). This week's report is a summary of what we regard as the main events. [Review highlights] Russia is weaponizing grief by using AI to create deepfake "resurrections" of fallen soldiers, turning personal tragedy into state propaganda. – CyberNews A Russian influence campaign generated 200,000 social media mentions overnight, creating "informational chaos" to deflect blame for a drone incursion. - Le Monde Chinese chatbots are being used for espionage, harvesting user data for microtargeted propaganda targeting sensitive groups like military personnel. - Politico A Chinese influence campaign using fake social media accounts and a pseudo-local media outlet to undermine the US-Philippine alliance was uncovered. – Reuters The UK’s new national security adviser met with a group that the U.S. State Department has labeled a "malign" part of Beijing’s foreign influence network. - The Telegraph An AI-enabled influence operation, synchronized with military strikes, used deepfake videos and impersonated media to incite revolt in Iran. - Citizen Lab Chinese and Russian state media launched coordinated campaigns to frame Taiwan's president as a provocateur, distorting his calls for deterrence. - DisinfoWatch The U.S. has dismantled key defenses like the Foreign Malign Influence Center, creating a vacuum exploited by adversaries. - The Washington Post TikTok’s algorithm has enabled manipulated videos and propaganda to spread rapidly across Africa, fueling pro-junta sentiment during recent coups.
- LSE [Week in Review] AI-Generated "Ghosts": Russia's New Front in Digital Propaganda The use of artificial intelligence in Russia to create propaganda from private grief is examined in an article from CyberNews. For a fee ranging from $35 to $60, families of deceased soldiers can commission AI-generated videos in which their loved ones appear to speak, embrace them, or ascend to heaven. These services, some of which reportedly handle hundreds of orders daily, produce deepfake clips that are then rapidly disseminated across Russian social media platforms, including Telegram and VKontakte. While these videos may provide a "balm effect" for grieving families, especially those unable to recover the bodies of soldiers, Ukrainian outlets like StopFake.org have warned against the manipulation of emotions inherent in such content. The practice represents a novel form of digital propaganda, turning personal mourning into a tool for reinforcing state narratives by creating a sanitized depiction of wartime loss. Source: CyberNews ‘Russian AI resurrection videos turn grief into propaganda’ Available Online Top Of Page How Russian Bot Networks Assaulted Czech Democracy Online During the October parliamentary elections in the Czech Republic, Russia engaged in coordinated disinformation campaigns aimed at interfering with the democratic process. A report by EUvsDisinfo details how networks of TikTok bot accounts and pro-Russian websites saturated Czech online spaces with propaganda. These operations sought to portray Vladimir Putin in a positive light, legitimize the war in Ukraine, and amplify anti-Western and anti-establishment narratives. Investigations by Czech media found that these propaganda sites published more articles daily than the country’s most established news outlets. After the election, Russian state-controlled media continued to push misleading narratives, falsely claiming the results indicated a rejection of the EU.
This digital interference campaign also included accusations from Kremlin-linked sources that the European Union was itself guilty of election interference, a common tactic of projecting blame onto adversaries. Source: EUvsDisinfo ‘For the Kremlin, elections are golden opportunities for interference’ Available Online Top Of Page A Digital Blitz: Russia Combines Drone and Information Attacks on Poland Following a Russian drone incursion into Polish airspace, the country was targeted by an unprecedented and coordinated disinformation attack, as detailed in an article published by Le Monde. The operation aimed to generate "informational chaos" by saturating social media algorithms with false narratives at a massive scale, resulting in up to 200,000 mentions in one night. Primarily driven by coordinated Russian and Belarusian accounts on platforms like X and Facebook, the campaign sought to divert blame by portraying the incident as a Ukrainian provocation designed to draw NATO into the conflict. Simultaneously, it characterized the Polish military and NATO as "ineffective and powerless." Experts view this incident as a significant escalation in Russia’s hybrid war, demonstrating a new phase of information warfare. The influence operation's reach extended to France, Germany, and Romania, highlighting its regional scope and its strategic goal of eroding European support for Ukraine. Source: Le Monde, ‘Poland hit by unprecedented disinformation attack following Russian drone incursion’ Available Online Top Of Page Chinese-Developed Chatbots Leave User Information Vulnerable to Exploitation China's substantial investment in artificial intelligence is fueling concerns that extend beyond economic competition into the realms of cyberwarfare, espionage, and disinformation. According to an article from Politico, Beijing’s integration of AI into state-linked hacking groups could amplify the scale and sophistication of cyberattacks on U.S. infrastructure.
In parallel, Chinese-made chatbots present espionage risks by harvesting user data, which could be weaponized for tailored disinformation campaigns targeting sensitive sectors such as first responders or military personnel. Research indicates that leading Chinese chatbots, including DeepSeek, Baidu’s Ernie, and Alibaba’s Qwen, consistently produce content that aligns with Beijing’s political narratives, subtly reinforcing state messaging. Such platforms pose a risk of shaping public opinion, particularly as affordable Chinese AI services become more widespread in developing nations, creating new vectors for digital influence. Source: Politico ‘Inside the Chinese AI threat to security’ Available Online Top Of Page Beijing's Shadow Campaign to Fracture US-Philippine Alliance A Chinese-funded Foreign Information Manipulation & Interference (FIMI) campaign in the Philippines was orchestrated to undermine local support for the country’s alliance with the United States. A Reuters investigation uncovered that the operation was managed by the marketing firm InfinitUs Marketing Solutions, which received direct funding from China’s embassy in Manila to "guide public opinion." The campaign utilized fake social media accounts posing as Filipinos to amplify pro-China and anti-American content, as well as a fabricated media outlet named Ni Hao Manila. These accounts spread misinformation regarding U.S. military cooperation, attacked Philippine lawmakers critical of China, and disseminated false narratives on other geopolitical issues. Philippine officials warned that such digital influence operations aim to make Manila "compliant" with Beijing’s strategic interests, highlighting the information war playing out in a region of significant geopolitical importance. Source: Reuters ‘How China waged an infowar against U.S.
interests in the Philippines’ Available Online Top Of Page UK Security Adviser’s Past Meetings with China Influence Group Raise Concerns Sir Keir Starmer’s new national security adviser, Jonathan Powell, is facing scrutiny over past meetings with a Chinese organization identified by U.S. intelligence as part of Beijing’s foreign influence network. A report by The Telegraph revealed that in March 2024, Powell met with the Chinese People’s Association for Friendship with Foreign Countries (CPAFFC), an organization the U.S. State Department has described as "malign." This group is linked to Chinese Communist Party efforts to co-opt global institutions and shape international narratives. U.S. officials have warned that CPAFFC and associated think tanks like the Grandview Institution are instrumental to China's "people-to-people" diplomacy, a strategy used to promote pro-Beijing messaging. Powell’s repeated visits to China and speaking engagements have fueled concerns that these exchanges may inadvertently legitimize entities associated with disinformation and political manipulation campaigns, coming at a time of heightened sensitivity over Chinese interference in the UK. Source: The Telegraph ‘Powell met ‘malign’ Chinese group before joining Starmer’s team’ Available Online Top Of Page AI-Augmented Influence Operation Targets Regime Change in Iran A covert network known as PRISONBREAK has been executing an AI-enabled influence operation targeting Iranian audiences with calls for revolt and fabricated media. An analysis from Citizen Lab details how the campaign utilized over 50 inauthentic profiles on X to distribute deepfake video content and impersonate media outlets, aiming to stoke domestic unrest. The operation's digital activities appear to have been tightly synchronized with kinetic military actions, such as the June 2025 Evin Prison bombing, employing tactics of narrative seeding and amplification in real time.
While definitive attribution is challenging, Citizen Lab assesses that the operator is most likely an Israeli government agency or a contractor, citing the advanced knowledge of military operations and coordinated narrative timing. This case highlights the evolving threat of AI-augmented disinformation in geopolitical conflicts, demonstrating how digital influence campaigns now operate alongside traditional warfare. Source: Citizen Lab ‘We Say You Want a Revolution: PRISONBREAK – An AI-Enabled Influence Operation Aimed at Overthrowing the Iranian Regime’ Available Online Top Of Page China and Russia Coordinate False Narratives Against Taiwan Chinese and Russian state media outlets have engaged in coordinated campaigns to distort the statements of Taiwanese President Lai Ching-te and portray Taiwan as a source of regional instability. According to DisinfoWatch, recent analysis shows that on October 8, 2025, China’s Global Times accused President Lai of "seeking independence through military means," a claim echoed by Russian state media. This narrative directly contradicted Lai’s actual remarks, which stressed deterrence and called on Beijing to renounce the use of force. The disinformation campaign also framed the People’s Liberation Army’s coercive military drills as a stabilizing measure. Furthermore, Beijing has manipulated international law by falsely equating its "One China" principle with UN Resolution 2758, which pertains to China’s UN seat but does not determine Taiwan’s sovereignty. These coordinated digital narratives represent a joint effort to isolate Taiwan and legitimize aggressive actions in the region. Source: DisinfoWatch ‘Converging False PRC–Russian Narratives Target Taiwan and President Lai’ Available Online Top Of Page United States Cedes Ground in the Global Information War The United States has effectively "disarmed" in the information war, leaving it vulnerable to foreign disinformation from Russia, China, and Iran. 
As stated by The Washington Post, the dismantling of key defenses, such as the Foreign Malign Influence Center, has created a vacuum that adversaries have exploited by spreading fabricated content, including AI-generated images and videos. Analysts at NewsGuard identified thousands of social media posts from state-backed media that aimed to deepen polarization by circulating conflicting lies. The impact is measurable, with surveys showing that a third of Americans believe at least one significant Russian falsehood about Ukraine. The article notes that Russian disinformation networks, like the Pravda Network, have seeded millions of false stories, some of which are now being used to "infect" large AI models that subsequently repeat these lies as fact, amplifying their reach and perceived credibility. Source: The Washington Post ‘How foreign nations are gaslighting Americans’ Available Online Top Of Page TikTok's Ascendance in Africa Reshapes Media with Misinformation Risks TikTok has rapidly become one of Africa’s most influential platforms for news consumption, bringing with it a significant surge in misinformation and political propaganda. A news piece by LSE describes how millions across the continent now rely on TikTok for information, while trust in traditional media outlets declines. The platform’s algorithms, designed to maximize engagement, enable manipulated videos and misleading content to achieve viral reach before they can be verified. This digital environment has had tangible real-world consequences, such as bolstering pro-junta sentiment during coups in Niger and Mali and fueling political division during elections in South Africa and Nigeria. While countermeasures are emerging, such as South Africa's partnership with TikTok’s election center and Ghana's fact-checking networks, the report concludes that combating disinformation on the platform will require stronger digital literacy, transparent moderation, and renewed investment in credible journalism. 
Source: LSE ‘TikTok is becoming Africa’s newsroom’ Available Online Top Of Page [Glossary] The nature and sophistication of the modern Information Environment is projected to continue to escalate in complexity. However, across academic publications, legal frameworks, policy debates, and public communications, the same concepts are often described in different ways, making collaboration, cooperation and effective action more difficult. To ensure clarity and establish a consistent frame of reference, the CRC is maintaining a standard glossary to reduce ambiguity and promote terminological interoperability. Its scope encompasses foundational concepts, as well as emerging terms relating to Hostile Influence and Cyfluence. As a collaborative project maintained with input from the community of experts, the CRC Glossary is intended to reflect professional consensus. We encourage you to engage with this initiative and welcome contributions via the CRC website. Top Of Page [Download Report]
- Dancing with Cyfluence – Travolta, Telegram & the Moldovan Leak
In this week’s follow-up, we return to Moldova, where the recent parliamentary elections once again underscored the country’s vulnerability in its political information space. As noted in our previous coverage of influence attempts surrounding the Moldovan vote (more information can be found [here]), competing narratives and external actors shaped much of the pre-election atmosphere. Against this backdrop, a remarkable incident occurred, one that appears with high probability to be linked to a suspected Russian influence campaign: a likely cyfluence counteroperation targeting the pro-Russian network of oligarch Ilan Shor and its affiliated organization, the Victorie Bloc. On 3 September, internal data from these structures appeared online, triggering a chain reaction that severely disrupted Shor’s political machinery and exposed the operational mechanics behind what is assessed to have been a foreign-directed influence apparatus. The leak represented one of the clearest intersections of cyber intrusion and influence strategy observed during this election cycle. i Who is Ilan Shor? Ilan Shor, a Moldovan businessman and politician, fled to Russia several years ago after facing extensive corruption charges. From exile, he remained politically active and established the Victorie Bloc in Moscow, a distinctly pro-Russian political platform aimed at regaining influence in Moldova through affiliated candidates. Shor is widely regarded as a symbolic figure of Moldova’s pro-Russian current: financially well-connected, politically ambitious, and closely tied to Kremlin-linked networks. The Data Leak On 3 September, reports surfaced that data from two Shor-affiliated companies, A7 and Anykey LLC, had been published. ii Figure 1 – Screenshot of the Folders of the Leaked Data The files first appeared on the encrypted cloud service ProtonDrive iii and were later disseminated via Telegram channels.
They contained internal communications, confidential financial records, and expenditure summaries for campaign activities. Particularly notable were chat logs in which Shor, using the codename “Travolta,” commented on operational issues. The materials also included lists of names, phone numbers, and addresses of individuals allegedly paid to organize protests or promote pro-Russian messaging. The documents revealed that the Victorie Bloc functioned not merely as a political organization, but as a structured, financed, iv and centrally coordinated influence network. Figure 2 – Leaked data: paid individuals, including names, tasks, and monthly payments v Indicators of a Cyfluence Counteroperation The following phase-based analysis outlines the structure and sequencing of the operation to illustrate how cyber-technical and influence-oriented components were combined. Breaking the event into three phases, intrusion, exposure, and amplification, allows for a clear understanding of how technical compromise evolved into a coordinated perception operation. We use this analytical framework to identify hybrid operations that merge cyber capabilities with psychological and narrative objectives. The incident occurred only days before Moldova’s parliamentary elections and displays key indicators of coordinated cyber and information activity. Data from entities linked to Ilan Shor and the Victorie Bloc were exfiltrated, publicly released, and then used to directly engage individuals named in the dataset. The timing and sequencing suggest the operation’s intent was not financial gain or espionage, but the disruption and delegitimization of a Russian-backed influence network. Cyber Intrusion and Data Exfiltration The first phase likely involved unauthorized access to internal systems of the Shor-affiliated companies A7 and Anykey LLC.
Significant volumes of data, including financial ledgers, payment records, and personally identifiable information, were exfiltrated and uploaded to ProtonDrive, an encrypted cloud-sharing platform. The material was subsequently distributed via Telegram channels and closed online groups, ensuring rapid dissemination while maintaining anonymity and non-attribution for the perpetrators. This stage established the technical foundation for the influence component that followed. Exposure and Doxxing Component In the second phase, the attackers deliberately released personal information, names, contact details, and payment histories of individuals associated with the Victorie Bloc. This elevated the incident from a typical hack-and-leak to a hybrid operation with doxxing characteristics. Immediately after publication, numerous individuals listed in the leak received direct messages stating: “The Victory Bloc is broken. You will no longer be paid. Your data is public. Russia has betrayed you.” vi The messages were designed to have a psychological impact. They combined exposure and intimidation to pressure individual supporters of the Victorie Bloc, undermine their trust in the organization’s leadership, and weaken the internal cohesion between coordinators, financiers, and field operatives. This targeted approach effectively amplified the disruptive impact of the data release. Narrative Amplification and Public Signaling The third phase focused on narrative shaping and institutional signaling. The leaked documents appeared to show direct financial and organizational connections to Russian actors, framing the Victorie Bloc as a foreign-directed influence structure. Media outlets and social channels picked up these narratives, turning a data breach into a strategic reputational and operational collapse. 
Authorities, including the Central Electoral Commission and CERT-GOV-MD, Moldova’s national cybersecurity agency, launched preliminary reviews to verify the authenticity of the materials and assess potential election interference. This official response further amplified the visibility and perceived legitimacy of the operation’s outcomes. Analytical Assessment The coordination of cyber intrusion, targeted disclosure, and psychological messaging aligns with the structure of a Cyfluence Counteroperation, an integrated activity designed to weaken or neutralize a hostile influence campaign through synchronized cyber and perception mechanisms. In this case, the campaign can be assessed with high confidence as successful, given the rapid breakdown of internal communications, loss of financial control, and subsequent reputational collapse of the targeted network. Together, these components placed significant pressure on participants, disrupted internal communication processes, and eroded the organization’s stability. Moreover, the operation publicly reframed the Victorie Bloc as a foreign-directed entity, sharply reducing its domestic legitimacy and public support, a decisive influence effect extending beyond the technical breach itself. Attribution and Context Attribution remains undetermined. The operation could plausibly have been conducted by regional hacktivist collectives seeking to counter Russian interference, or by a state-affiliated actor executing a preemptive countermeasure. Regardless of origin, the case illustrates a mature application of Cyfluence methodology, the deliberate integration of cyber intrusion, information exposure, and psychological leverage to disrupt an active influence campaign in real time. Outcome In the aftermath, communication within the Victorie Bloc collapsed, financial flows were interrupted, and several key figures publicly distanced themselves from the organization.
Public debate shifted away from the Bloc’s messaging and toward its exposure as a mechanism of Russian influence. The operation achieved dual objectives: operational neutralization and narrative delegitimization, significantly reducing the reach of a foreign-backed political campaign on the eve of the vote. [Footnotes:] [i] WhereIsRussia Today, 2025. Collapsing from the inside: Ilan Shor’s network crumbles amid data leak. [online] Published 24 September 2025. Available at: https://whereisrussia.today/feed/politics/ilan_shors_network_crumbles_amid_data_leak [ii] Moldova1, R. Lozinschi‑Hadei, 2025. Telegram leaks: Șor’s firms used to undermine Moldova’s democracy. [online] Published 3 September 2025. Available at: https://moldova1.md/p/56415/telegram-leaks-sor-s-firms-used-to-undermine-moldova-s-democracy [iii] Publicly accessible ProtonDrive link associated with the leak: https://drive.proton.me/urls/PAEYV2N61R#rxaNKy4NtPNL [iv] Elliptic, 2025. The A7 leaks: The role of crypto in Russian sanctions evasion and election interference. [online] Published 26 September 2025. Available at: https://www.elliptic.co/blog/the-a7-leaks-the-role-of-crypto-in-russian-sanctions-evasion-and-election-interference# [v] Source of the picture: WhereIsRussia Today, 2025. Collapsing from the inside: Ilan Shor’s network crumbles amid data leak. [online] Published 24 September 2025. Available at: https://whereisrussia.today/feed/politics/ilan_shors_network_crumbles_amid_data_leak [vi] Moldova1, R. Lozinschi‑Hadei, 2025. Telegram leaks: Șor’s firms used to undermine Moldova’s democracy. [online] Published 3 September 2025. Available at: https://moldova1.md/p/56415/telegram-leaks-sor-s-firms-used-to-undermine-moldova-s-democracy [vii] WhereIsRussia Today, 2025. Collapsing from the inside: Ilan Shor’s network crumbles amid data leak. [online] Published 24 September 2025. Available at: https://whereisrussia.today/feed/politics/ilan_shors_network_crumbles_amid_data_leak
- CRC Weekly: Cyber-based hostile influence campaigns 29th September - 05th October 2025
[Introduction] Cyber-based hostile influence campaigns are aimed at influencing target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks which enhance their effect. During the last week we observed, collected and analyzed endpoints of information related to cyber-based hostile influence campaigns (including Cyfluence attacks). This week’s report is a summary of what we regard as the main events. [Review highlights] Russia's foreign intelligence service (SVR) is now issuing public statements to amplify pro-Kremlin narratives, a significant shift in its operational tactics. - EUvsDisinfo A Russian-backed network created a fake news website using AI-generated videos and media impersonation to spread false narratives about the French president. - Le Monde Russia’s Ottawa embassy conducted an information campaign accusing Canada of covering up a fabricated nuclear incident allegedly perpetrated by Ukrainian forces. - DisinfoWatch Russia used social media influencers to promote a deceptive work-study program that lured African women into working in its military drone factories. - EUvsDisinfo The Kremlin is executing a multifaceted information campaign to deny and reframe its systematic abduction of over 20,000 Ukrainian children. - EUvsDisinfo A new report argues the 2025 Gaza flotilla was a coordinated information operation by groups with ties to Hamas, using humanitarianism to shape opinion. - Global Influence Operations Report The failure of Moscow's extensive interference efforts in Moldova highlights the declining impact of its information operations in countries it considers its near abroad.
- Atlantic Council An EU-led digital literacy camp is equipping youth in Bosnia and Herzegovina with critical thinking skills to identify and counter manipulated information. - EU Delegation to Bosnia and Herzegovina [Weekly Review] Russia’s Foreign Intelligence Service Adopts Public Role in Spreading False Narratives Russia’s foreign intelligence service (SVR), an agency that typically operates covertly, has recently become a public-facing vehicle for pro-Kremlin disinformation. According to an EUvsDisinfo analysis, the SVR has begun issuing official statements that amplify false narratives targeting NATO, the EU, and Western governments. This tactic marks a shift from the standard practice of circulating such claims through state media or deniable covert outlets. The SVR’s new role was prominent during Moldova’s September 2025 elections, where it baselessly accused the EU of planning a NATO-backed occupation following the decisive victory of a pro-EU party. The SVR has also spread disinformation in Serbia, alleging an EU-orchestrated “Maidan-style” coup, and in Georgia, where it claimed the U.S. and EU were plotting a “color revolution” while smearing NGOs with fabricated allegations. These actions represent a strategic change, leveraging the perceived authority of an intelligence agency to legitimize disinformation openly. Source: EUvsDisinfo, “The Shadowy SVR Openly Pushes Disinformation Narratives,” Available Online Top of Page Moldova’s Pro-EU Party Secures Victory Amidst Coordinated Cyberattacks Moldova’s pro-European Action and Solidarity Party (PAS) won a parliamentary majority despite a campaign of Russian interference and cyberattacks designed to destabilize the vote. A report from The Record detailed how authorities faced coordinated hoax bomb threats at polling stations and sustained cyberattacks on government infrastructure, including DDoS incidents targeting the Central Electoral Commission website and government cloud systems.
These operations, coupled with disinformation campaigns aimed at Moldovan voters abroad, sought to intimidate the electorate and suppress the diaspora vote. According to France 24, the Kremlin was identified as the central actor in the interference, with the Moldovan government accusing Moscow of spending hundreds of millions in "dirty money" on vote-buying and other destabilization efforts. While the attacks were blocked in real time without disrupting the voting process, analysts warned that the Kremlin could still attempt to bribe new members of parliament to undermine the formation of a stable pro-European government. Source: The Record, “Moldova’s Pro-EU Party Wins Election Amid Cyberattacks and Kremlin Interference,” Available Online Source: France 24, “Moldova's pro-EU party on course to win pivotal election mired in claims of Russian meddling,” Available Online Top of Page Russian-Backed Network Deploys AI and Impersonation in Disinformation Campaign A Russian-backed influence network known as Storm-1516 created a fake news website to impersonate French media outlets and spread pro-Kremlin disinformation. An article in Le Monde revealed that the site, called BrutInfo, mimicked the branding of Brut and Le Monde to publish false stories, including a fabricated claim that President Emmanuel Macron was building a €148 million bunker. This operation utilized AI-generated videos, such as a fake interview with a supposed construction worker, to add a veneer of credibility. The network’s tactics also include employing paid actors, plagiarizing legitimate articles, and placing propaganda in low-standard international media outlets that accept paid contributions. France’s disinformation watchdog, Viginum, reported that content from Storm-1516 is frequently amplified by a network of pro-Kremlin influencers and paid accounts, extending the reach of its digitally sophisticated disinformation campaigns.
Source: Le Monde, “A fake news website impersonates Le Monde and Brut,” Available Online Top of Page Russian State Actors Accuse Canada of Concealing Fabricated Nuclear Incident The Russian Embassy in Ottawa and the state news agency TASS initiated a disinformation campaign accusing Ukraine of shelling the Zaporizhzhia Nuclear Power Plant (ZNPP) and claiming Canada was covering up the supposed crime. A DisinfoWatch report details how the embassy’s official statements labeled Ukrainian President Volodymyr Zelensky a “maniacal terrorist” and asserted that the International Atomic Energy Agency (IAEA) was documenting Ukrainian provocations. This narrative, however, contradicts independent monitoring and recent IAEA updates, which confirmed military activity around the plant but did not assign blame, instead urging both sides to cease hostilities in the area. Russia's claims ignored evidence of potential sabotage by its own occupying forces and misrepresented the IAEA's neutral role. No credible evidence was found to support the accusation that Canada was involved in covering up a non-existent nuclear crime, with its official position remaining aligned with its allies. Source: DisinfoWatch, “Russian Embassy and TASS claim Canada is covering up non-existent Kiev nuclear crime,” Available Online Top of Page Russia Exploits Social Media Influencers for Deceptive Military Recruitment Russia has conducted a disinformation campaign across Africa that uses social media influencers to lure women into its war production industry under false pretenses. According to an article by EUvsDisinfo, the campaign promoted the “Alabuga Start” program, which was advertised on TikTok, Instagram, and YouTube as a work-study opportunity in fields like hospitality. In reality, recruits were sent to work in drone factories supporting Russia’s war in Ukraine, where they faced grueling conditions and health risks.
When Nigerian media exposed the scheme, Russian embassies and pro-Kremlin channels mounted a coordinated response, dismissing the reporting as “Western disinformation.” This counternarrative was amplified by pan-Africanist influencers, who reframed the story as a Western plot against Russia-Nigeria relations, thereby creating an illusion of widespread support for the program while obscuring the evidence of exploitation. Source: EUvsDisinfo, “From social media to weapon factories: how Russia traps African women in war production,” Available Online Top of Page Kremlin Pivots to Election Fraud Narratives After Failed Interference Following the victory of Moldova’s pro-EU party, the Kremlin and its media affiliates executed a rapid pivot in their disinformation strategy, shifting from pre-election accusations of corruption to post-election claims of widespread voter fraud. As reported by NewsGuard Reality Check, this strategy involved disseminating fabricated evidence across social media platforms like X and through state-owned outlets such as TASS. The campaign circulated deceptive videos, including one repurposed from Azerbaijan that falsely depicted ballot stuffing in Italy, in an attempt to delegitimize the election results. This effort, which showed signs of the Storm-1516 influence operation, ultimately failed to sway the outcome, demonstrating the limits of Russian influence and the resilience of Moldova's democratic institutions. In a separate but related effort, a DFRLab report identified a pro-Russian campaign codenamed "Matushka" that exploited Orthodox Christian beliefs to influence voters. The operation created a network of 67 channels on Telegram, TikTok, and other platforms, initially sharing religious content before pivoting to political messaging that framed European integration as a threat to the church.
This strategy aimed to mobilize a religious voter base by suggesting that voting for pro-Kremlin candidates was a religious duty to protect traditional values from "moral decay." Source: NewsGuard Reality Check, “Russians Cry Fraud After Failing to Sway Moldovan Election With Disinformation,” Available Online Source: DFRLab, “Targeting the faithful: Pro-Russia campaign engages Moldova’s Christian voters,” Available Online Top of Page Putin’s Valdai Speech Outlines a Global Disinformation Strategy At the Valdai Club, a Kremlin-controlled think tank, Russian President Vladimir Putin delivered a speech outlining a strategic disinformation campaign aimed at Western nations. A publication by DisinfoWatch analyzes how Putin and state media outlets are promoting a narrative that frames Russia as a moral "counterweight" to a decadent and declining Western liberal order. The core strategy involves driving a "culture-war wedge" by weaponizing issues like "gender terrorism" to generalize about systemic Western collapse and legitimize Moscow’s vision of a "polycentric," illiberal world. Specific disinformation tactics included inverting causality by labeling European rearmament a "provocation" and using fearmongering to deter military support for Ukraine. This coordinated information warfare campaign serves multiple goals: reassuring Russia’s domestic audience, encouraging sanctions fatigue among EU voters, and advancing Moscow’s revisionist foreign policy. Source: DisinfoWatch, “DisinfoDigest: Decoding Putin’s Valdai Speech,” Available Online Top of Page Kremlin FIMI Campaign Aims to Obscure Child Abduction War Crimes The Kremlin is leveraging a Foreign Information Manipulation and Interference (FIMI) campaign to obscure its systematic abduction of over 20,000 Ukrainian children, a policy that constitutes a war crime.
According to EUvsDisinfo, this operation relies on a three-pronged disinformation strategy: outright denial of the abductions, falsely reframing the kidnappings as humanitarian "evacuations," and claiming to facilitate family reunification while actively erasing the children’s identities through forced adoptions and citizenship changes. Key actors leading this effort include Russian President Vladimir Putin and his 'Commissioner for Children's Rights,' Maria Lvova-Belova, both of whom face arrest warrants from the International Criminal Court for their role in the unlawful deportations. In response, 38 countries, alongside the Council of Europe and the EU, have called for the children's immediate return, and an international coalition has been launched to address Russia's actions. Source: EUvsDisinfo, “At the 80th UNGA, Remember Russia’s War on Ukrainian Children,” Available Online Top of Page Gaza Flotilla Analyzed as Coordinated Information Operation The 2025 Global Sumud Flotilla, a maritime campaign challenging Israel’s blockade of Gaza, functioned as both a humanitarian initiative and a coordinated information operation driven by a network aligned with the Muslim Brotherhood. A report from the Global Influence Operations Report (GIOR) argues that while the flotilla was framed publicly as a humanitarian intervention, its key organizers—including Turkey’s İHH and the Freedom Flotilla Coalition—have long-standing ties to Hamas. According to the analysis, these groups leveraged humanitarian rhetoric to shape global opinion and legitimize their political activism. The report contends that the flotilla demonstrates a 15-year evolution of Gaza solidarity activism, which has transformed from grassroots convoys into a transnational influence ecosystem connecting NGOs with sympathetic states like Turkey, Qatar, and Malaysia.
This suggests that humanitarian activism can serve as a vehicle for ideological influence, blurring the line between civil solidarity and coordinated campaigns. Source: Global Influence Operations Report, “The Global Sumud Flotilla of 2025: Humanitarian Activism or Islamist Influence Operation?,” Available Online Top of Page Study Finds AI Misinformation Has Dual Effect on Media Trust Exposure to AI-generated misinformation reduces overall trust in media but can simultaneously increase engagement with credible news sources, according to a field experiment involving 17,000 readers. A study, published in TechXplore and conducted by researchers from multiple universities in partnership with German newspaper Süddeutsche Zeitung, presented readers with pairs of real and AI-generated images. The findings revealed this dual effect: while trust declined, readers who became aware of the difficulty in distinguishing real from fake content subsequently visited the newspaper's digital platforms more frequently and demonstrated better information retention. This effect was most pronounced among individuals with lower prior interest in politics. The implications suggest that while AI-driven misinformation threatens public trust, it also creates an opportunity for reputable media outlets to deepen audience engagement by educating them about the challenges of the modern information environment. Source: TechXplore, “Reader survey shows AI-driven misinformation can reduce trust, but increase engagement with credible news,” Available Online Top of Page AI-Driven Disinformation Accelerates Democratic Decay Across Africa Artificial intelligence is increasingly being deployed as a tool to destabilize democratic processes and support authoritarianism in Africa. An article from the LSE Africa at LSE blog highlights how AI-generated deepfakes and coordinated disinformation campaigns fueled polarization and public skepticism during Nigeria's 2023 elections.
In the Sahel region, AI-driven content, often linked to Russian-influenced networks, has been used to glorify military juntas and undermine calls for civilian governance. This trend is occurring in a context of declining public faith in democracy across the continent, with support for democratic rule having fallen by seven percentage points in the last decade. AI-fueled disinformation acts as a force multiplier for this democratic decay by accelerating the spread of false narratives, eroding trust in institutions, and overwhelming citizens' ability to discern fact from fabrication, underscoring the need for global governance frameworks. Source: LSE Africa at LSE blog, “In the age of artificial intelligence, democracy needs help,” Available Online Top of Page AI Weaponized to Threaten Democratic Processes and Critical Systems The increasing accessibility of artificial intelligence is enabling malicious actors to undermine elections, manipulate markets, and compromise critical systems. According to an article in TechXplore, AI-generated content like deepfakes and fake social media profiles has been used to spread disinformation and influence public opinion, leading to events such as the suspension of the 2024 Romanian presidential elections due to foreign interference. Beyond elections, AI systems trained on biased data have resulted in discriminatory outcomes in healthcare, while AI-generated fake news has been deployed to manipulate financial markets. The World Economic Forum has highlighted AI’s potential to disrupt geopolitical stability and national security. The adaptability of AI lowers the barrier for executing large-scale attacks, making it more difficult to safeguard critical infrastructure. Experts advocate for secure AI practices, robust regulation, and international cooperation to mitigate these risks and ensure AI is harnessed responsibly.
Source: TechXplore, “How AI poses a threat to national elections, health care and security,” Available Online Top of Page Comparative Study Examines Frameworks for Measuring Disinformation Impact To better understand and counter disinformation, it is crucial to accurately measure its effects, yet methodologies for doing so vary widely. In a comparative study, the organization EU DisinfoLab analyzed several frameworks used to assess the impact of disinformation, including the ABCDE Framework, the Disarm Framework, and the Impact-Risk Index. The analysis revealed that these frameworks adopt different approaches; some prioritize quantifying the reach of a disinformation campaign, while others focus on measuring the subsequent harm to public opinion and behavior. The study concludes that harmonizing these divergent methodologies is essential for developing a more comprehensive and standardized understanding of disinformation’s impact. Such work is critical for informing effective policy-making and counter-disinformation strategies, particularly as digital platforms and influence campaigns continue to grow in sophistication. The study calls for continued collaboration to refine these vital assessment tools. Source: EU DisinfoLab, “Decoding Disinformation Impact Frameworks and Indicators: a Comparative Study,” Available Online Top of Page Moldova’s Institutional Resilience Blunts Russian Election Interference Efforts Russia’s comprehensive campaign to interfere in Moldova's recent elections was ultimately unsuccessful due to the resilience of the country's institutions and electorate. An Atlantic Council article explains how the Kremlin deployed operatives and AI-generated fake accounts to saturate Moldovan social media with disinformation targeting President Maia Sandu and her pro-European party.
Despite the scale of this information operation, Moldovan authorities effectively countered the threat by uncovering illicit financing schemes and voter bribery efforts linked to the campaign. The Moldovan public demonstrated a strong commitment to democratic values by delivering decisive support for Sandu’s platform of European integration. The election outcome is seen as a significant indicator of Russia's declining influence in its near abroad, demonstrating that even well-resourced interference campaigns can be thwarted by vigilant institutions and an informed public. Source: Atlantic Council, “Putin’s Moldova election failure highlights Russia’s declining influence,” Available Online Top of Page EU Initiative Bolsters Youth Digital Literacy to Counter Disinformation An initiative in Bosnia and Herzegovina aims to equip young people with the skills necessary to navigate the digital information landscape and counter disinformation. The EU Delegation to Bosnia and Herzegovina reported on its second Media and Digital Literacy Camp, which gathered youth for workshops on critical thinking, fact-checking, and assessing source credibility. The program featured guidance from experts in academia and from fact-checking platforms such as Raskrinkavanje, with a focus on identifying manipulated information. This initiative addresses the growing challenge of disinformation by fostering a more informed and engaged citizenry. It aligns with the EU's broader commitment, outlined in its annual human rights and democracy reports, to promote media freedom and combat the spread of false information. Such educational programs are considered a crucial component in strengthening democratic processes and ensuring information integrity in the digital age. 
Source: EU Delegation to Bosnia and Herzegovina, “Media and Digital Literacy Camp: Enhancing critical thinking and digital skills among youth,” Available Online Top of Page [Glossary] The nature and sophistication of the modern Information Environment is projected to continue to escalate in complexity. Across academic publications, legal frameworks, policy debates, and public communications, the same concepts are often described in different ways, making collaboration, cooperation and effective action more difficult. To ensure clarity and establish a consistent frame of reference, the CRC is maintaining a standard glossary to reduce ambiguity and promote terminological interoperability. Its scope encompasses foundational concepts, as well as emerging terms. As a collaborative project maintained with input from the community of experts, the CRC Glossary is intended to reflect professional consensus. We encourage you to engage with this initiative and welcome contributions via the CRC website. Top of Page [Download Report]
- From Coup to Cult: The Transnational Construction of Power in West Africa’s Information Space -The Case of Burkina Faso
The Sahel region has emerged as a key setting for significant evolutions in cognitive warfare, where the contest for its information space, although underreported, has global impact and relevance. A new analysis by Tim Stark uses a case study of Burkina Faso under Captain Ibrahim Traoré to provide a deep dive into these dynamics. It details how West African influence campaigns exploit the region’s fertile ground for narrative warfare—an environment where traditional oral storytellers have morphed into digital influencers—through the use of synthetic propaganda and hybrid operations, all in the context of a struggle by foreign powers to fill the strategic vacuum left by departing Western nations. Traoré’s trajectory from coup leader to mythologized icon of Pan-African resistance illustrates a broader transformation in the global information environment, whereby authoritarian leaders in fragile states can now project narratives across borders to build legitimacy while reshaping perceptions abroad. Stark concludes this is more than simple regime consolidation; it is a durable, transnational mythmaking effort that achieves global resonance by linking local grievances to potent anti-imperialist rhetoric, infiltrating Western timelines and directly influencing democratic discourse. [ Download Full Report here ]









