CRC Weekly: Cyber-based hostile influence campaigns 1st - 7th September
- CRC

- Sep 13

[Introduction]
Cyber-based hostile influence campaigns aim to shape the perceptions of target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks that enhance their effect. During the first week of September 2025, we observed, collected, and analyzed data points related to cyber-based hostile influence campaigns (including Cyfluence attacks). The following report summarizes what we regard as the main events.
[Highlights]
The Telegram channel 'War on Fakes' was established one day before the 2022 invasion of Ukraine to preemptively push false narratives and deflect blame for Russian war crimes.
During the recent India-Pakistan conflict, the competing influence operations 'Hidden Charkha' and 'Khyber Defender' deployed over 1,200 inauthentic accounts in support of their respective governments.
Multiple distinct Russia-linked influence networks are converging their efforts to destabilize Moldova's elections and halt its pro-European trajectory.
The Moscow Patriarchate is using religious conferences in Africa to expand Russian influence in coordination with state intelligence and mercenary operatives.
The rate at which AI tools repeat false news claims nearly doubled in one year, from 18 percent to 35 percent.
The #trumpdead hashtag campaign on X generated over 35 million views within a four-day period, showcasing the narrative's rapid, high-volume spread.
The Cyfluence Research Center has relaunched the CRC Glossary, an initiative that aims to provide a lexicon of both foundational and emerging terms relating to Cyfluence.
[Weekly Review]
Influential Figures Exploit Minneapolis Shooting to Push Competing Narratives on X
According to Wired, the aftermath of the Minneapolis church shooting became a case study in how X’s platform architecture accelerates the spread of hostile influence. Immediately following the event, high-profile figures, including politicians and activists, exploited the information vacuum to disseminate competing and unsubstantiated narratives about the shooter's motives. These claims, which ranged from anti-Christian hate to white supremacy and leftist radicalization, quickly went viral. This rapid spread was facilitated by X’s weakened moderation and an algorithmic model optimized for engagement over factual accuracy.
The platform’s incentive structure rewarded sensational content, allowing false claims to outpace verified information. Even X’s own content summaries reportedly amplified details that fueled political narratives. While experts ultimately concluded the shooter’s motivations were likely nihilistic rather than ideological, the platform had already successfully converted the tragedy into a vehicle for political polarization and viral misinformation, demonstrating a significant vulnerability in the modern information ecosystem.
Source: Wired, David Gilbert, September 3, 2025. How Disinformation About the Minnesota Shooting Spread Like Wildfire on X. [online] Available at: https://www.wired.com/story/disinformation-minnesota-shooting-x/
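As a rough illustration of the dynamic Wired describes, the minimal Python sketch below ranks posts purely on engagement signals, with factual accuracy contributing nothing to the score. The posts, weights, and scoring function are hypothetical assumptions for illustration only, not X's actual ranking algorithm.

```python
# Toy model of engagement-optimized ranking (illustrative only; the
# weights and posts are hypothetical, not X's actual algorithm).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    reposts: int
    replies: int
    verified_claim: bool  # has the claim been independently verified?

def engagement_score(p: Post) -> float:
    # Reposts and replies are weighted heavily because they generate
    # further impressions; factual accuracy contributes nothing.
    return 1.0 * p.likes + 3.0 * p.reposts + 2.0 * p.replies

feed = [
    Post("Shooter was motivated by <ideology X>!", 9_000, 4_200, 3_100, False),
    Post("Officials: motive still unknown, inquiry ongoing.", 1_200, 150, 90, True),
]

# Sensational, unverified claims outrank sober, verified updates.
for p in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(p):>10,.0f}  verified={p.verified_claim}  {p.text}")
```

Because the score never consults `verified_claim`, sensational unverified posts structurally outrank sober corrections; any accuracy signal has to be bolted on after the fact, which is the gap the article points to.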
Institutional Mimicry: Russia Deploys Fake Fact-Checking Outfits to Launder Propaganda
EUvsDisinfo has reported that pro-Russian actors are actively corrupting the practice of fact-checking by creating bogus organizations to legitimize and disseminate state-sponsored propaganda. This tactic represents a continuation of the Kremlin's subversion of journalistic principles, which intensified following the 2022 full-scale invasion of Ukraine with initiatives like the 'War on Fakes' Telegram channel. The most recent and sophisticated effort is the 'Global Fact-Checking Network' (GFCN), a Kremlin-funded entity intentionally named to resemble the legitimate International Fact-Checking Network. Launched in April by Russian Foreign Ministry spokesperson Maria Zakharova, the GFCN is managed by sanctioned individuals previously involved in other influence operations. Its content is notably vacuous, avoiding any scrutiny of Russia while covertly inserting pro-Kremlin messaging and allusions to Western culture wars. Despite its formal structure, the operation currently exhibits negligible impact, with its social media channels attracting almost no organic viewership or followers, suggesting its pretenses have failed to gain traction.
Source: EUvsDisinfo, 2025. Fake fact-checking: when facts are fiction and falsehoods are facts. [online] Available at: https://euvsdisinfo.eu/fake-fact-checking-when-facts-are-fiction-and-falsehoods-are-facts/
Vilification Campaigns and Inauthentic News Deployed Against Moldovan Leadership
A publication from Recorded Future's Insikt Group covers how multiple Russia-linked influence operations are converging to destabilize Moldova's September elections and derail its accession to the European Union. Networks including Operation Overload, Operation Undercut, and the Foundation to Battle Injustice are executing parallel campaigns to vilify President Maia Sandu and the ruling PAS party. These efforts portray EU integration as economically disastrous while promoting alignment with the Kremlin's “Russkiy Mir” doctrine. The operations leverage a range of tactics, from laundering pro-Kremlin content via aggregator sites like Pravda Moldova to deploying covert social media pages linked to oligarch Ilan Shor. For the first time, Operation Undercut has been observed using TikTok to target Moldovan users with anti-government narratives. While these campaigns have not yet achieved substantial success in shaping public opinion, they heighten risks to media integrity and voter trust. The report also notes that a retreat in US counter-disinformation efforts has created a more permissive environment for these increasingly sophisticated Russian campaigns.
Source: Recorded Future, September 2025. Russian Influence Assets Converge on Moldovan Elections. [online] Available at: https://www.recordedfuture.com/research/russian-influence-assets-converge-on-moldovan-elections
Russia Deploys 'Failed State' and Falsified Data Narratives Against Canada
According to a report from DisinfoWatch, Russian Foreign Ministry spokesperson Maria Zakharova has signaled an escalation of information warfare against Canada. Through a Telegram post amplified by state media, Zakharova depicted Canada as a nation in a “deep crisis of values” that could cease to exist within a decade. Her commentary leveraged a combination of established Kremlin tactics, including the promotion of fringe separatism, culture-war tropes, and anti-LGBTQ narratives. The messaging relied on specific falsehoods, such as inflating support for separatism in Manitoba and misrepresenting Canadian policies on drug decriminalization and medical assistance in dying. These efforts align with documented Russian influence operation templates aimed at exploiting societal divisions, undermining support for Ukraine, and portraying liberal democracies as decadent and failing. The direct targeting of Canada suggests a renewed Kremlin focus on subverting the country's national unity and weakening its international alliances, signaling a new phase of hostile influence operations that Kremlin-aligned actors are expected to amplify.
Source: DisinfoWatch, September 5, 2025. Kremlin Spokeswoman Zakharova Takes Aim At Canada. [online] Available at: https://disinfowatch.org/kremlin-spokeswoman-zakharova-aims-to-divide-canada/
Moscow Patriarchate Pivots to Africa and Domestic Extremism Amid Declining Political Relevance
A Jamestown Foundation analysis details how the Moscow Patriarchate (ROC MP) is compensating for its declining influence in the post-Soviet space by increasing its strategic utility to the Kremlin. Despite significant losses, particularly in Ukraine, the church is successfully executing a pivot towards new domestic and foreign influence operations. Domestically, it promotes traditional values and has helped elevate Orthodoxy to a core component of Russian civic identity for 61 percent of the population. Abroad, the ROC MP is expanding its geopolitical reach into Africa, holding conferences for local clerics to cement Russian influence in coordination with state intelligence. It also projects soft power by asserting canonical authority over groups like the Orthodox Church in America. A directive for the church to engage with the extremist 'Russian Community' shows a high-risk strategy to co-opt radical nationalism, which may secure Patriarch Kirill's position but entangles the state more deeply with extremist elements.
Source: Jamestown Foundation, Paul Goble, 2025. Eurasia Daily Monitor. [online] Available at: https://jamestown.org/program/despite-losses-at-home-and-abroad-moscow-patriarchate-helps-kremlin-expand-influence/
Generative AI Falsehood Rate Doubles Amid Push for Real-Time Responsiveness
The rate at which leading generative AI tools repeat false information has nearly doubled in one year, an increase that undermines industry promises of safer systems. An audit by NewsGuard found that the failure rate for news-related prompts increased from 18 percent in 2024 to 35 percent in 2025. This degradation stems from a structural tradeoff where chatbots have integrated real-time web search capabilities. While this change eliminated query non-responses, it simultaneously exposed the models to a polluted online information ecosystem. Malign actors, including Russian disinformation operations, are actively exploiting this vulnerability. They are laundering falsehoods through low-engagement websites, social media, and AI-generated content farms, which the models fail to distinguish from credible outlets. The push to make AI tools more responsive and timely has inadvertently made them more susceptible to spreading propaganda, turning them into more effective conduits for hostile influence operations.
Source: NewsGuard, September 4, 2025. AI False Information Rate Nearly Doubles in One Year. [online] Available at: https://www.newsguardtech.com/ai-monitor/august-2025-ai-false-claim-monitor/
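The arithmetic behind NewsGuard's headline figures can be reproduced with a short sketch. The tallies below are hypothetical, chosen only to be consistent with the published rates (18 percent in 2024, 35 percent in 2025, with non-responses eliminated); NewsGuard's actual sample sizes and scoring rules may differ.

```python
# Hypothetical audit tallies illustrating the methodology: each prompt
# about a known false claim yields one of three outcomes.
audits = {
    "2024": {"repeat": 18, "debunk": 51, "non_response": 31},
    "2025": {"repeat": 35, "debunk": 65, "non_response": 0},
}

for year, tally in audits.items():
    total = sum(tally.values())
    fail_rate = tally["repeat"] / total          # repeated the falsehood
    decline_rate = tally["non_response"] / total # refused or hedged
    print(f"{year}: fail rate {fail_rate:.0%}, declined to answer {decline_rate:.0%}")
```

The comparison makes the structural tradeoff visible: once real-time web search eliminated non-responses, every prompt the models previously declined became a fresh opportunity to repeat a laundered falsehood.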
False Trump Health Rumors Garner Tens of Millions of Social Media Views
Liberal and anti-Trump social media accounts are executing a disinformation campaign alleging a severe decline in President Trump's health, including rumors of his death. This operation, analyzed by NewsGuard, relies on multiple pieces of fabricated or decontextualized evidence to construct its narrative. Key tactics include circulating a misleading map screenshot to suggest road closures at Walter Reed Medical Center and using an AI-enhanced photograph to create false visual evidence of a stroke. Actors also repurposed older media, such as a 2023 photo of an ambulance at the White House and an image of the flag at half-staff for a school shooting, to imply a current medical emergency. The campaign achieved significant reach, with one associated hashtag, #trumpdead, accumulating over 35 million views on X in four days. The events demonstrate how disparate, low-effort falsifications can be networked to create a pervasive and viral political narrative.
Source: NewsGuard, Sofia Rubinson, 2025. NewsGuard Reality Check. [online] Available at: https://www.newsguardrealitycheck.com/p/bogus-evidence-for-trumps-supposed
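For a sense of the velocity a figure like 35 million views in four days implies, the sketch below fits a constant exponential growth rate to the reported total. The seed size and the growth model are illustrative assumptions, not measured data from NewsGuard.

```python
import math

# Reported figure: ~35M views over 4 days (96 hours) for #trumpdead on X.
total_views = 35_000_000
hours = 4 * 24
seed_views = 1_000  # assumed initial exposure; not from the source

# Solve total = seed * exp(r * hours) for a constant hourly growth rate r.
r = math.log(total_views / seed_views) / hours
print(f"implied growth rate: {r:.1%} per hour")
print(f"implied doubling time: {math.log(2) / r:.1f} hours")
```

Under these assumptions the hashtag doubles roughly every six hours, far faster than any fact-checking cycle measured in days.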
Hidden Charkha and Khyber Defender: State-Aligned IO in South Asian Conflict
Two large, state-aligned influence networks, Hidden Charkha (pro-India) and Khyber Defender (pro-Pakistan), operated during the 2025 conflict between the two nations. A report by Recorded Future provides a blueprint for how symmetrical influence operations are deployed by adversarial, nuclear-armed states to control escalation and garner support during kinetic military actions. Both networks attempted to frame their respective nations as holding the moral high ground through technological and military superiority, thereby justifying tactical restraint. Key tactics included amplifying forged military documents, exaggerating the impact of cyberattacks, and impersonating media outlets. Despite the scale of these operations, they were assessed as almost certainly unsuccessful in shaping public opinion. Their failure was attributed to an inability to break out of patriotic echo chambers and a recurrent use of generative AI for visual content, which likely undermined their credibility. Still, their activities demonstrate how patriotic sentiment can align non-state actors with government objectives during wartime.
Source: Recorded Future, Insikt Group, September 2, 2025. Influence Operations and Conflict Escalation in South Asia. [online] Available at: https://www.recordedfuture.com/research/influence-operations-and-conflict-escalation-in-south-asia
UK Democracy Remains Vulnerable to Misinformation Amid Weak Election Reforms
According to an analysis by Full Fact, the UK government’s Elections Bill represents a missed opportunity, as its measures are insufficient to protect democratic processes from misinformation. While the bill contains some positive steps, such as increasing the fines the Electoral Commission can impose and requiring digital imprints on some campaign materials, it fails to match the scale of the threat. The proposed legislation needs significant upgrades to be effective. Key recommendations include amending the Online Safety Act to cover more categories of illegal election-related content and other material harmful to democracy. The bill should also incorporate robust media and political literacy initiatives, especially for younger voters, and establish stronger rules to deal with political deepfakes, including clear labeling requirements. Further proposals include creating a comprehensive digital library of political advertisements to enable public scrutiny and establishing an independent body to regulate non-broadcast political advertising. Without these more ambitious provisions, the bill will not achieve its stated objectives of safeguarding democracy and restoring political trust.
Source: Full Fact, September 1, 2025. Protecting our democracy from the harms of misinformation and disinformation. [online] Available at: https://fullfact.org/politics/protecting-our-democracy-from-the-harms-of-misinformation-and-disinformation/
[Takeaways]
This week we saw examples of how threat actors are increasingly forgoing direct persuasion in favor of tactics, such as exploiting AI vulnerabilities and mimicking trusted institutions, that passively degrade the information ecosystem. This approach suggests a strategic calculation: an ungovernable and untrustworthy information space is, in itself, a victory. By fostering an environment of radical doubt, malign actors can paralyze democratic decision-making and erode social cohesion without having to win a direct contest of ideas.
[CRC Glossary]
The Cyfluence Research Center has relaunched the CRC Glossary. This initiative aims to serve as a shared lexicon of both foundational and emerging terms that shape the field. To this end, the Glossary is designed to be a continually updated resource, with new entries added weekly. We see this as a collaborative project and strongly encourage input from the expert community. The goal is to reduce the problem of ambiguous or conflicting terminology, which can hinder collaborative work and effective communication with the general public.
We invite you to submit additions, changes, or corrections via the form on our website.


