CRC Weekly: Cyber-based hostile influence campaigns 22nd-28th September 2025
- CRC

- Oct 2

[Introduction]
Hostile influence campaigns combine various aspects of cognitive warfare and are often accompanied by political, economic, or military strategies to achieve long-term strategic advantage. Our analysis focuses on cyber-based campaigns, with a particular emphasis on a key subset we define as Cyfluence.
Cyfluence (cyber-attacks for influence) is the strategic and operational integration of cyber threat vectors with hostile information influence operations (IIOs). Cyfluence operations, conducted by state-sponsored or independent actors, represent an advanced form of cognitive warfare.
They combine cyberattacks (e.g., hack-and-leak, digital identity hijacking, DDoS) with digital influence techniques (e.g., coordinated disinformation, misinformation, and malinformation amplified through inauthentic activity). The objectives of Cyfluence operations include the manipulation of public discourse, election interference, reputation abuse, and societal polarization.
Between the 22nd and the 28th of September 2025, we observed, collected, and analyzed data points related to these campaigns. The following report summarizes what we regard as the main events from this period.
[Report Highlights]
A Kremlin-linked campaign succeeded in "infecting" major generative AI models with fabricated corruption claims targeting the Moldovan election.
Over 200 leading figures, including Nobel Prize winners and AI experts from companies such as OpenAI, Google DeepMind, and Microsoft, have called for action to establish strict “red lines” for artificial intelligence.
Georgia’s ruling party, Georgian Dream, has intensified its use of disinformation and conspiracy theories to undermine public trust in the European Union.
A BBC undercover investigation uncovered a secret Russian-funded network seeking to disrupt Moldova’s September 28 parliamentary elections through coordinated disinformation campaigns.
According to a report by DFRLab, a new online outlet called REST has emerged as a key vehicle for pro-Kremlin disinformation targeting Moldova.
Iran's intelligence operations are targeting Scandinavian countries with greater intensity. A report in the Middle East Quarterly provides insights into recent campaigns, with a focus on Denmark and Norway.
[Weekly Review]
REST Outlet Opens a New Front in Russia’s Campaign Against Moldova
Manufacturing a Crisis: Inside Russia's Information War on Moldova's Election
Paid to Post: Anatomy of a Pro-Russian 'Digital Army'
RT Pushes Kremlin Disinformation to Undermine Canadian Support for Ukraine
Deconstructing Russia's Moldova 'Occupation' Narrative
Kremlin Campaign Corrupts AI Models in Moldovan Election Influence Op
Iran's Scandinavian Operations: A Permissive Environment for Espionage and Influence
Georgia's Ruling Party Uses 'Traditional Values' Disinformation to Counter EU Pressure
Experts Issue Global Call for AI 'Red Lines' to Prevent Mass Disinformation
Expert Analysis: EU's Institutional Weakness is its Greatest Vulnerability to Foreign Meddling
REST Outlet Opens a New Front in Russia’s Campaign Against Moldova
According to a recent analysis by DFRLab, a new online outlet named REST has emerged as another tool in the pro-Kremlin disinformation campaign targeting Moldova ahead of its September 2025 parliamentary elections. The publication details REST’s connection to Rybar, a major sanctioned Russian propaganda operation, suggesting the new outlet is designed to evade sanctions and regenerate influence capabilities. The connection is supported by technical evidence, including shared hosting infrastructure, identical server configurations, and leaked image metadata that directly references Rybar.
The outlet’s content, which gained millions of views on TikTok and was amplified across X and Telegram, is designed to embed disinformation into Moldova’s digital environment. This activity represents a continuation of Russian influence operations, which employ a sophisticated toolkit including AI-generated deepfakes, mirror websites, and covert financing to undermine Moldova’s pro-European course. The analysis also notes the translation of REST content into EU languages, indicating a multi-platform, cross-border effort to manipulate information.
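For readers curious how overlaps like these can be surfaced, the sketch below shows, in broad strokes, how shared hosting and leaked image metadata might be checked programmatically. It is an illustrative example only, not DFRLab's actual tooling; the domain names and the image path are placeholders, not real indicators from the report.

```python
# Illustrative sketch only: the kind of hosting-overlap and image-metadata
# checks described above, not DFRLab's actual tooling. The domain names and
# the image path are placeholders, not real indicators from the report.
import socket

from PIL import ExifTags, Image  # Pillow


def resolve_ips(domain: str) -> set[str]:
    """Return the set of IPv4 addresses a domain currently resolves to."""
    try:
        return set(socket.gethostbyname_ex(domain)[2])
    except socket.gaierror:
        return set()


def shared_hosting(domain_a: str, domain_b: str) -> set[str]:
    """IP addresses shared by two domains (a weak but useful overlap signal)."""
    return resolve_ips(domain_a) & resolve_ips(domain_b)


def image_exif(path: str) -> dict:
    """Extract human-readable EXIF tags (e.g., Software, Artist) from an image."""
    exif = Image.open(path).getexif()
    return {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}


if __name__ == "__main__":
    print(shared_hosting("outlet-a.example", "outlet-b.example"))
    print(image_exif("downloaded_article_image.jpg"))
```

In practice, analysts would combine signals like these with passive DNS history, TLS certificate reuse, and content analysis before drawing attribution conclusions; a single shared IP address is rarely conclusive on its own.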
Source: DFRLab, J. Kubś & E. Buziashvili, 2025. Sanctioned Russian actor linked to new media outlet targeting Moldova. [online] Published 23 September 2025. Available at: https://dfrlab.org/2025/09/23/sanctioned-russian-actor-linked-to-new-media-outlet-targeting-moldova/
Manufacturing a Crisis: Inside Russia's Information War on Moldova's Election
The BBC reports on a Russian-funded network in Moldova whose goal is to influence the parliamentary elections on 28 September 2025. Participants were recruited through Telegram and asked to post pro-Russian content on TikTok and Facebook for a reported payment of approximately $170 per month. Organizers provided instructions, including guidance on using AI.
The posts targeted President Maia Sandu and the ruling PAS party. Claims included election fraud, child trafficking, and forced LGBT policies. Participants were also asked to conduct unauthorized opinion polls, the results of which, along with secret recordings, could later be used to cast doubt on the election outcome.
According to the BBC, the network was coordinated by Alina Juc, from Transnistria, who is reportedly linked to Russia. Funding reportedly came via the Russian state-owned Promsvyazbank, and there are indications of ties to the oligarch Ilan Shor, who is based in Moscow and sanctioned by the US, EU, and UK. The NGO Evrazia was also named as involved.
The BBC reports that the network operates at least 90 TikTok accounts, which have garnered over 23 million views. DFRLab estimates an even wider reach. Shor, Evrazia, and Juc did not respond to questions. Moldova’s police view disinformation as the main method of interference. The Russian embassy denies the allegations.
Source: BBC, O. Marocico, S. Mirodan & R. Ings, 2025. How a Russian‑funded fake news network aims to disrupt elections in Europe. [online] Published 21 September 2025. Available at: https://www.bbc.com/news/articles/c4g5kl0n5d2o
Paid to Post: Anatomy of a Pro-Russian 'Digital Army'
The DFRLab report describes an operation with alleged links to Moscow that aims to influence Moldova’s parliamentary elections on September 28, 2025. Individuals were reportedly paid to create inauthentic accounts and spread coordinated content. The network has been active since the fall of 2024 and has been monitored since January 2025.
By August 2025, around 200 so-called “InfoLeaders” had been recruited. DFRLab analyzed 253 accounts across TikTok, Facebook, and Instagram. In total, the operation generated nearly 29,000 posts, reaching over 55 million views, more than 2 million likes, and hundreds of thousands of comments. While TikTok was the main platform, Facebook activity grew in mid-2025.
The structure was hierarchical. Russian-speaking curators set daily tasks, hashtags, and quotas. Recruits could advance from “communication activists” to InfoLeaders. The network utilized hashtags systematically, organized flash mobs, and instructed participants to personalize their content to make it appear more organic.
The main narratives targeted President Maia Sandu and the ruling PAS party, focusing on alleged fraud, corruption, and criticism of EU and NATO integration. Politically, the operation shifted from supporting Ilan Shor’s “Victory Bloc” to promoting the “Moldova Mare” party, reusing earlier narratives under a new banner.
Source: DFRLab, V. Châtelet & V. Olari, 2025. Paid to post: Russia-linked ‘digital army’ seeks to undermine Moldovan election. [online] Published 24 September 2025. Available at: https://dfrlab.org/2025/09/24/paid-to-post-russia-linked-digital-army-seeks-to-undermine-moldovan-election/
RT Pushes Kremlin Disinformation to Undermine Canadian Support for Ukraine
A recent analysis by DisinfoWatch details another instance of Russian state media attempting to undermine Western support for Ukraine, this time targeting Canadian audiences. The report breaks down an RT article that falsely accuses Canada of funding "atrocities" and "neo-Nazi brigades." This campaign provides a clear case study of a broader Kremlin strategy to erode public support for Ukraine by reviving the well-worn "Ukraine-as-Nazi" trope and reframing legitimate aid as complicity in war crimes.
The DisinfoWatch analysis highlights RT's use of classic disinformation techniques, including whataboutism, projection, and the distortion of facts, notably ignoring ICC warrants against Russian officials. The campaign's objective is to emotionally manipulate audiences and delegitimize Canada's actual efforts, which focus on documenting war crimes in cooperation with the ICC. The report notes that the operation scores extremely high on disinformation risk, given its overt delivery by a recognized state-media asset, its reliance on single-source claims, and its repetition of established Kremlin propaganda narratives, making it a straightforward example of foreign information manipulation.
Source: DisinfoWatch, 2025. RT falsely claims “Canada keeps bankrolling Ukraine’s war crimes”. [online] Published 22 September 2025. Available at: https://disinfowatch.org/disinfo/rt-falsely-claims-canada-keeps-bankrolling-ukraines-war-crimes/
Deconstructing Russia's Moldova 'Occupation' Narrative
An article by DisinfoWatch deconstructs a Russian disinformation narrative, circulated in the lead-up to Moldova's recent elections, which claimed the EU and NATO were preparing to "occupy" the country. The report traces the claim's origin to Russia's Foreign Intelligence Service (SVR), providing another clear example of a coordinated, state-level influence operation. The narrative, which cited NATO troop presence in the region as a pretext, was amplified without evidence by state media outlets like RT and TASS.
The DisinfoWatch report highlights the campaign's clear strategic objectives: timed to coincide with the election, it sought to intimidate voters, delegitimize the country's pro-EU policies, and erode trust in Western partners. The analysis tracks the dissemination path from the SVR press bureau through major state media before being laundered into regional sites and social media ecosystems. By debunking the claim and contrasting it with the EU's actual policy of supporting democratic reforms, the report presents a concise case study on how unsubstantiated security threats are fabricated and deployed to create political instability.
Source: DisinfoWatch, 2025. EU is not “preparing to ‘occupy’ Moldova – Moscow”. [online] Published 23 September 2025. Available at: https://disinfowatch.org/disinfo/eu-is-not-preparing-to-occupy-moldova-moscow/
Kremlin Campaign Corrupts AI Models in Moldovan Election Influence Op
A recent analysis by NewsGuard has identified a Kremlin-linked disinformation operation, known as "Storm-1516," that targeted Moldova's recent parliamentary elections. The campaign represents a continuation of established malign influence efforts, focusing on disseminating false corruption claims against the incumbent pro-European government to undermine the democratic process. Utilizing a vast propaganda network, the operation achieved considerable reach, drawing over 17.7 million views on platforms like X. This saturation level underscores the scale of the effort directed at a country with a population of only 2.4 million.
The investigation’s key finding, however, concerns an evolving tactic: the deliberate infection of generative AI models. NewsGuard found that when prompted about the campaign's false narratives, major AI chatbots reproduced the disinformation more than one-third of the time. This successful compromise of widely used AI tools demonstrates a new and dangerous vector for FIMI campaigns. The operation highlights an escalation in tactics used to influence key elections, in this case, aiming to derail Moldova's European trajectory and reassert Russian influence in the region.
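The sketch below illustrates, under stated assumptions, how such an echo rate might be measured: each false narrative is put to a chatbot repeatedly, and the share of answers that repeat the claim is counted. It is not NewsGuard's methodology; the query_model function is a dummy stand-in for whichever chatbot API is being audited, and the prompt and marker phrases are invented placeholders rather than the actual narratives tested.

```python
# Illustrative prompt-audit sketch, not NewsGuard's methodology. query_model is
# a dummy stand-in for the chatbot API under audit; the prompt and the marker
# phrases are invented placeholders, not the campaign's actual narratives.

def query_model(prompt: str) -> str:
    """Stand-in for a real chatbot call; swap in the API being audited."""
    return "There is no credible evidence for this allegation."


FALSE_NARRATIVES = {
    "fabricated_corruption_claim": (
        "Is it true that <fabricated corruption allegation>?",
        ["it is true", "documents confirm"],  # phrases indicating the claim is repeated
    ),
}


def echo_rate(prompt: str, markers: list[str], trials: int = 30) -> float:
    """Fraction of responses that contain a marker phrase, i.e. repeat the claim."""
    hits = 0
    for _ in range(trials):
        response = query_model(prompt).lower()
        if any(marker in response for marker in markers):
            hits += 1
    return hits / trials


if __name__ == "__main__":
    for name, (prompt, markers) in FALSE_NARRATIVES.items():
        print(f"{name}: {echo_rate(prompt, markers):.0%} of answers echoed the claim")
```

A real audit would also need human review of responses, since keyword matching alone cannot distinguish a chatbot repeating a claim from one quoting it in order to debunk it.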
Source: NewsGuard, E. Maitland, A. Lee & M. Roache, 2025. New Kremlin‑linked influence campaign targeting Moldovan elections draws 17 million views on X and infects AI models. [online] Published 26 September 2025. Available at: https://www.newsguardrealitycheck.com/p/new-kremlin-linked-influence-campaign
Iran's Scandinavian Operations: A Permissive Environment for Espionage and Influence
An analysis published by Eurasia Review details the long-standing and varied intelligence operations conducted by the Islamic Republic of Iran (IRI) in Denmark and Norway. The report provides further examples of Iran's operational playbook, highlighting how the region's advanced industries, universities, and politically active diaspora make it an attractive, yet often overlooked, target for hostile state activities. The findings reinforce the understanding of Iran's global intelligence reach and its use of multifaceted tactics.
The analysis outlines a range of operations, including assassination plots against dissidents, cyber espionage targeting research institutions, surveillance conducted through diplomatic and religious channels, and the use of local criminal networks for kinetic attacks. Crucially, it places these activities within the context of Iran’s strategic alignment with Russia and China, citing the Swedish Security Service's assessment that these states are collaborating to reshape the global order. The report concludes that a fragmented and weak response from Scandinavian governments has created a low-risk, permissive environment, effectively emboldening Tehran's intelligence services.
Source: Eurasia Review, A. Khoshnood, M. Norell & A. M. Khoshnood, 2025. A growing security threat: Iranian intelligence operations in Scandinavia (Part One: Denmark and Norway) – Analysis. [online] Published 25 September 2025. Available at: https://www.eurasiareview.com/25092025-a-growing-security-threat-iranian-intelligence-operations-in-scandinavia-part-one-denmark-and-norway-analysis/
Georgia's Ruling Party Uses 'Traditional Values' Disinformation to Counter EU Pressure
An article from The Jamestown Foundation's Eurasia Daily Monitor details the intensified use of disinformation by Georgia’s ruling party, Georgian Dream, as it faces EU pressure to reverse democratic backsliding. The analysis outlines how the party is weaponizing anti-LGBT conspiracy theories, falsely framing EU democratic norms as an imposition of “Western decadence” and a threat to national sovereignty. This narrative serves as a political tool to rally the party's conservative base and deflect blame for potential EU sanctions resulting from its own controversial policies.
Despite this top-down campaign, the report highlights polling data showing that public support for EU integration remains overwhelmingly high at 78 percent. This suggests the government’s narrative has failed to shift the majority opinion on Georgia's geopolitical orientation. However, the continued promotion of these divisive conspiracies through pro-government media risks further polarizing society. The strategy illustrates a case of a state actor using value-based disinformation to undermine a supranational body and erode trust in democratic processes, even when public sentiment is resistant.
Source: Jamestown Foundation, B. Chedia, 2025. Georgian Dream weaponizes LGBT‑related conspiracy theories. [online] Published 23 September 2025. Available at: https://jamestown.org/program/georgian-dream-weaponizes-lgbt-related-conspiracy-theories/
Experts Issue Global Call for AI 'Red Lines' to Prevent Mass Disinformation
In a significant public call for urgent regulation, a coalition of more than 200 leading figures, including Nobel laureates and prominent experts from OpenAI and Google DeepMind, has signed an open letter demanding that governments establish strict "red lines" for artificial intelligence. Released to coincide with the UN General Assembly session, the statement warns that unregulated AI poses severe dangers, explicitly highlighting its potential to enable large-scale disinformation campaigns and manipulate public opinion, thereby undermining democratic societies.
The letter further details risks such as the loss of meaningful human control as AI systems, some of which have already exhibited deceptive behavior, are granted increasing autonomy. The signatories stress that voluntary commitments from developers are insufficient. They urge governments to act swiftly to create a binding international agreement on these "red lines" by the end of 2026. This framework would aim to hold AI providers accountable for preventing foreseeable harmful outcomes, directly addressing the growing threat of AI-powered foreign information manipulation and influence.
Source: The Signatories of the "AI Red Lines" Letter, 2025. Global Call for AI Red Lines. [online] Published September 2025. Available at: https://red-lines.ai/
Expert Analysis: EU's Institutional Weakness is its Greatest Vulnerability to Foreign Meddling
In an interview published by Follow the Money (FTM), democracy expert Luise Quaritsch elaborates on the European Union’s systemic vulnerability to foreign malign interference, framing it as a component of a broader hybrid warfare strategy. The analysis highlights persistent Russian tactics, including the creation of "doppelganger" websites and covert influence platforms, such as "Voice of Europe", as examples of a low-level, constant stream of interference designed to exploit societal divisions. These operations are amplified by other actors and across platforms where malign content can gain traction.
Quaritsch argues that the critical issue is not a lack of tools but the EU's failure to deploy its existing powers effectively. The bloc’s complex governance and interconnected member state policies create numerous institutional and physical access points for foreign actors to exploit. This means that a vulnerability in one member state poses a threat to the entire Union. While new legislative efforts, such as transparency registers, are being discussed, the interview emphasizes that the priority should be securing these inherent structural weaknesses, arguing that the EU is currently failing to counter the threat effectively.
Source: Follow the Money (FTM), A. Keepe, 2025. EU has the power to fight foreign meddling – but isn’t using it, democracy expert says. [online] Published 23 September 2025. Available at: https://www.ftm.eu/articles/interview-luise-quaritsch-eu-foreign-meddling
[CRC Glossary]
The Cyfluence Research Centre has relaunched the CRC Glossary. This initiative aims to serve as a shared lexicon of both foundational and emerging terms that shape the field. To this end, the Glossary is designed to be a continually updated resource, with new entries added weekly. We see this as a collaborative project and strongly encourage input from the expert community. The goal is to reduce the problem of ambiguous or conflicting terminology, which can hinder collaboration and effective communication with the general public.
We invite you to submit additions, changes, or corrections via the form on our website.