- CRC Weekly: Cyber-based hostile influence campaigns 6th - 12th October 2025
[Introduction]
Cyber-based hostile influence campaigns are aimed at influencing target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks which enhance their effect. During the last week we observed, collected and analyzed endpoints of information related to cyber-based hostile influence campaigns (including Cyfluence attacks). This week's report is a summary of what we regard as the main events.

[Review highlights]
Russia is weaponizing grief by using AI to create deepfake "resurrections" of fallen soldiers, turning personal tragedy into state propaganda. - CyberNews
A Russian influence campaign generated 200,000 social media mentions overnight, creating "informational chaos" to deflect blame for a drone incursion. - Le Monde
Chinese chatbots are being used for espionage, harvesting user data for microtargeted propaganda aimed at sensitive groups such as military personnel. - Politico
A Chinese influence campaign using fake social media accounts and a pseudo-local media outlet to undermine the US-Philippine alliance was uncovered. - Reuters
The UK's new national security adviser met with a group that the U.S. State Department has labeled a "malign" part of Beijing's foreign influence network. - The Telegraph
An AI-enabled influence operation, synchronized with military strikes, used deepfake videos and impersonated media to incite revolt in Iran. - Citizen Lab
Chinese and Russian state media launched coordinated campaigns to frame Taiwan's president as a provocateur, distorting his calls for deterrence. - DisinfoWatch
The U.S. has dismantled key defenses like the Foreign Malign Influence Center, creating a vacuum exploited by adversaries. - The Washington Post
TikTok's algorithm has enabled manipulated videos and propaganda to spread rapidly across Africa, fueling pro-junta sentiment during recent coups. - LSE

[Week in Review]

AI-Generated "Ghosts": Russia's New Front in Digital Propaganda
The use of artificial intelligence in Russia to create propaganda from private grief is examined in an article from CyberNews. For a fee ranging from $35 to $60, families of deceased soldiers can commission AI-generated videos in which their loved ones appear to speak, embrace them, or ascend to heaven. These services, some of which reportedly handle hundreds of orders daily, produce deepfake clips that are then rapidly disseminated across Russian social media platforms, including Telegram and VKontakte. While these videos may provide a "balm effect" for grieving families, especially those unable to recover the bodies of soldiers, Ukrainian outlets like StopFake.org have warned against the manipulation of emotions inherent in such content. The practice represents a novel form of digital propaganda, turning personal mourning into a tool for reinforcing state narratives by creating a sanitized depiction of wartime loss.
Source: CyberNews, "Russian AI resurrection videos turn grief into propaganda." Available Online

How Russian Bot Networks Assaulted Czech Democracy Online
During the October parliamentary elections in the Czech Republic, Russia engaged in coordinated disinformation campaigns aimed at interfering with the democratic process. A report by EUvsDisinfo details how networks of TikTok bot accounts and pro-Russian websites saturated Czech online spaces with propaganda.
These operations sought to portray Vladimir Putin in a positive light, legitimize the war in Ukraine, and amplify anti-Western and anti-establishment narratives. Investigations by Czech media found that these propaganda sites published more articles daily than the country's most established news outlets. After the election, Russian state-controlled media continued to push misleading narratives, falsely claiming the results indicated a rejection of the EU. This digital interference campaign also included accusations from Kremlin-linked sources that the European Union was itself guilty of election interference, a common tactic of projecting blame onto adversaries.
Source: EUvsDisinfo, "For the Kremlin, elections are golden opportunities for interference." Available Online

A Digital Blitz: Russia Combines a Drone Incursion with an Information Attack on Poland
Following a Russian drone incursion into Polish airspace, the country was targeted by an unprecedented and coordinated disinformation attack, as detailed in an article published by Le Monde. The operation aimed to generate "informational chaos" by saturating social media algorithms with false narratives at a massive scale, resulting in up to 200,000 mentions in one night. Primarily driven by coordinated Russian and Belarusian accounts on platforms like X and Facebook, the campaign sought to divert blame by portraying the incident as a Ukrainian provocation designed to draw NATO into the conflict. Simultaneously, it characterized the Polish military and NATO as "ineffective and powerless." Experts view this incident as a significant escalation in Russia's hybrid war, demonstrating a new phase of information warfare. The influence operation's reach extended to France, Germany, and Romania, highlighting its regional scope and its strategic goal of eroding European support for Ukraine.
Source: Le Monde, "Poland hit by unprecedented disinformation attack following Russian drone incursion." Available Online

Chinese-Developed Chatbots Leave User Information Vulnerable to Exploitation
China's substantial investment in artificial intelligence is fueling concerns that extend beyond economic competition into the realms of cyberwarfare, espionage, and disinformation. According to an article from Politico, Beijing's integration of AI into state-linked hacking groups could amplify the scale and sophistication of cyberattacks on U.S. infrastructure. In parallel, Chinese-made chatbots present espionage risks by harvesting user data, which could be weaponized for tailored disinformation campaigns targeting sensitive sectors such as first responders or military personnel. Research indicates that leading Chinese chatbots, including DeepSeek, Baidu's Ernie, and Alibaba's Qwen, consistently produce content that aligns with Beijing's political narratives, subtly reinforcing state messaging. Such platforms pose a risk of shaping public opinion, particularly as affordable Chinese AI services become more widespread in developing nations, creating new vectors for digital influence.
Source: Politico, "Inside the Chinese AI threat to security." Available Online

Beijing's Shadow Campaign to Fracture the US-Philippine Alliance
A Chinese-funded Foreign Information Manipulation & Interference (FIMI) campaign in the Philippines was orchestrated to undermine local support for the country's alliance with the United States.
A Reuters investigation uncovered that the operation was managed by the marketing firm InfinitUs Marketing Solutions, which received direct funding from China's embassy in Manila to "guide public opinion." The campaign utilized fake social media accounts posing as Filipinos to amplify pro-China and anti-American content, as well as a fabricated media outlet named Ni Hao Manila. These accounts spread misinformation regarding U.S. military cooperation, attacked Philippine lawmakers critical of China, and disseminated false narratives on other geopolitical issues. Philippine officials warned that such digital influence operations aim to make Manila "compliant" with Beijing's strategic interests, highlighting the information war playing out in a region of significant geopolitical importance.
Source: Reuters, "How China waged an infowar against U.S. interests in the Philippines." Available Online

UK Security Adviser's Past Meetings with Chinese Influence Group Raise Concerns
Sir Keir Starmer's new national security adviser, Jonathan Powell, is facing scrutiny over past meetings with a Chinese organization identified by U.S. intelligence as part of Beijing's foreign influence network. A report in The Telegraph revealed that in March 2024, Powell met with the Chinese People's Association for Friendship with Foreign Countries (CPAFFC), an organization the U.S. State Department has described as "malign." This group is linked to Chinese Communist Party efforts to co-opt global institutions and shape international narratives. U.S. officials have warned that CPAFFC and associated think tanks like the Grandview Institution are instrumental to China's "people-to-people" diplomacy, a strategy used to promote pro-Beijing messaging. Powell's repeated visits to China and speaking engagements have fueled concerns that these exchanges may inadvertently legitimize entities associated with disinformation and political manipulation campaigns, coming at a time of heightened sensitivity over Chinese interference in the UK.
Source: The Telegraph, "Powell met 'malign' Chinese group before joining Starmer's team." Available Online

AI-Augmented Influence Operation Targets Regime Change in Iran
A covert network known as PRISONBREAK has been executing an AI-enabled influence operation targeting Iranian audiences with calls for revolt and fabricated media. An analysis from Citizen Lab details how the campaign utilized over 50 inauthentic profiles on X to distribute deepfake video content and impersonate media outlets, aiming to stoke domestic unrest. The operation's digital activities appear to have been tightly synchronized with kinetic military actions, such as the June 2025 Evin Prison bombing, employing tactics of narrative seeding and amplification in real time. While definitive attribution is challenging, Citizen Lab assesses that the operator is most likely an Israeli government agency or a contractor, citing the advanced knowledge of military operations and coordinated narrative timing. This case highlights the evolving threat of AI-augmented disinformation in geopolitical conflicts, demonstrating how digital influence campaigns now operate alongside traditional warfare.
Source: Citizen Lab, "We Say You Want a Revolution: PRISONBREAK – An AI-Enabled Influence Operation Aimed at Overthrowing the Iranian Regime." Available Online

China and Russia Coordinate False Narratives Against Taiwan
Chinese and Russian state media outlets have engaged in coordinated campaigns to distort the statements of Taiwanese President Lai Ching-te and portray Taiwan as a source of regional instability. According to DisinfoWatch, recent analysis shows that on October 8, 2025, China's Global Times accused President Lai of "seeking independence through military means," a claim echoed by Russian state media. This narrative directly contradicted Lai's actual remarks, which stressed deterrence and called on Beijing to renounce the use of force. The disinformation campaign also framed the People's Liberation Army's coercive military drills as a stabilizing measure. Furthermore, Beijing has manipulated international law by falsely equating its "One China" principle with UN Resolution 2758, which pertains to China's UN seat but does not determine Taiwan's sovereignty. These coordinated digital narratives represent a joint effort to isolate Taiwan and legitimize aggressive actions in the region.
Source: DisinfoWatch, "Converging False PRC–Russian Narratives Target Taiwan and President Lai." Available Online

United States Cedes Ground in the Global Information War
The United States has effectively "disarmed" in the information war, leaving it vulnerable to foreign disinformation from Russia, China, and Iran. According to The Washington Post, the dismantling of key defenses, such as the Foreign Malign Influence Center, has created a vacuum that adversaries have exploited by spreading fabricated content, including AI-generated images and videos. Analysts at NewsGuard identified thousands of social media posts from state-backed media that aimed to deepen polarization by circulating conflicting lies. The impact is measurable, with surveys showing that a third of Americans believe at least one significant Russian falsehood about Ukraine. The article notes that Russian disinformation networks, like the Pravda Network, have seeded millions of false stories, some of which are now being used to "infect" large AI models that subsequently repeat these lies as fact, amplifying their reach and perceived credibility.
Source: The Washington Post, "How foreign nations are gaslighting Americans." Available Online

TikTok's Ascendance in Africa Reshapes Media, with Misinformation Risks
TikTok has rapidly become one of Africa's most influential platforms for news consumption, bringing with it a significant surge in misinformation and political propaganda. A news piece by LSE describes how millions across the continent now rely on TikTok for information, while trust in traditional media outlets declines. The platform's algorithms, designed to maximize engagement, enable manipulated videos and misleading content to achieve viral reach before they can be verified. This digital environment has had tangible real-world consequences, such as bolstering pro-junta sentiment during coups in Niger and Mali and fueling political division during elections in South Africa and Nigeria. While countermeasures are emerging, such as South Africa's partnership with TikTok's election center and Ghana's fact-checking networks, the report concludes that combating disinformation on the platform will require stronger digital literacy, transparent moderation, and renewed investment in credible journalism.
Source: LSE, "TikTok is becoming Africa's newsroom." Available Online

[Glossary]
The modern information environment is projected to grow only more complex and sophisticated. However, across academic publications, legal frameworks, policy debates, and public communications, the same concepts are often described in different ways, making collaboration, cooperation and effective action more difficult. To ensure clarity and establish a consistent frame of reference, the CRC maintains a standard glossary to reduce ambiguity and promote terminological interoperability. Its scope encompasses foundational concepts as well as emerging terms relating to Hostile Influence and Cyfluence. As a collaborative project maintained with input from the community of experts, the CRC Glossary is intended to reflect professional consensus. We encourage you to engage with this initiative and welcome contributions via the CRC website.

[Download Report]
- Dancing with Cyfluence – Travolta, Telegram & the Moldovan Leak
In this week's follow-up, we return to Moldova, where the recent parliamentary elections once again underscored the country's vulnerability in its political information space. As noted in our previous coverage of influence attempts surrounding the Moldovan vote (more information can be found [here]), competing narratives and external actors shaped much of the pre-election atmosphere. Against this backdrop, a remarkable incident occurred: what appears, with high probability, to have been a cyfluence counteroperation targeting the pro-Russian network of oligarch Ilan Shor and its affiliated organization, the Victorie Bloc, both tied to a suspected Russian influence campaign. On 3 September, internal data from these structures appeared online, triggering a chain reaction that severely disrupted Shor's political machinery and exposed the operational mechanics behind what is assessed to have been a foreign-directed influence apparatus. The leak represented one of the clearest intersections of cyber intrusion and influence strategy observed during this election cycle. [i]

Who is Ilan Shor?
Ilan Shor, a Moldovan businessman and politician, fled to Russia several years ago after facing extensive corruption charges. From exile, he remained politically active and established the Victorie Bloc in Moscow, a distinctly pro-Russian political platform aimed at regaining influence in Moldova through affiliated candidates. Shor is widely regarded as a symbolic figure of Moldova's pro-Russian current: financially well-connected, politically ambitious, and closely tied to Kremlin-linked networks.

The Data Leak
On 3 September, reports surfaced that data from two Shor-affiliated companies, A7 and Anykey LLC, had been published. [ii]

Figure 1 – Screenshot of the folders of the leaked data

The files first appeared on the encrypted cloud service ProtonDrive [iii] and were later disseminated via Telegram channels. They contained internal communications, confidential financial records, and expenditure summaries for campaign activities. Particularly notable were chat logs in which Shor, using the codename "Travolta," commented on operational issues. The materials also included lists of names, phone numbers, and addresses of individuals allegedly paid to organize protests or promote pro-Russian messaging. The documents revealed that the Victorie Bloc functioned not merely as a political organization, but as a structured, financed [iv], and centrally coordinated influence network.

Figure 2 – Leaked data: paid individuals, including names, tasks, and monthly payments [v]

Indicators of a Cyfluence Counteroperation
The following phase-based analysis outlines the structure and sequencing of the operation to illustrate how cyber-technical and influence-oriented components were combined. Breaking the event into three phases, intrusion, exposure, and amplification, allows for a clear understanding of how technical compromise evolved into a coordinated perception operation. We use this analytical framework to identify hybrid operations that merge cyber capabilities with psychological and narrative objectives. The incident occurred only days before Moldova's parliamentary elections and displays key indicators of coordinated cyber and information activity. Data from entities linked to Ilan Shor and the Victorie Bloc were exfiltrated, publicly released, and then used to directly engage individuals named in the dataset.
The timing and sequencing suggest the operation's intent was not financial gain or espionage, but the disruption and delegitimization of a Russian-backed influence network.

Cyber Intrusion and Data Exfiltration
The first phase likely involved unauthorized access to internal systems of the Shor-affiliated companies A7 and Anykey LLC. Significant volumes of data, including financial ledgers, payment records, and personally identifiable information, were exfiltrated and uploaded to ProtonDrive, an encrypted cloud-sharing platform. The material was subsequently distributed via Telegram channels and closed online groups, ensuring rapid dissemination while maintaining anonymity and non-attribution for the perpetrators. This stage established the technical foundation for the influence component that followed.

Exposure and Doxxing Component
In the second phase, the attackers deliberately released personal information: names, contact details, and payment histories of individuals associated with the Victorie Bloc. This elevated the incident from a typical hack-and-leak to a hybrid operation with doxxing characteristics. Immediately after publication, numerous individuals listed in the leak received direct messages stating: "The Victory Bloc is broken. You will no longer be paid. Your data is public. Russia has betrayed you." [vi] The messages were designed to have a psychological impact. They combined exposure and intimidation to pressure individual supporters of the Victorie Bloc, undermine their trust in the organization's leadership, and weaken the internal cohesion between coordinators, financiers, and field operatives. This targeted approach effectively amplified the disruptive impact of the data release.

Narrative Amplification and Public Signaling
The third phase focused on narrative shaping and institutional signaling. The leaked documents appeared to show direct financial and organizational connections to Russian actors, framing the Victorie Bloc as a foreign-directed influence structure. Media outlets and social channels picked up these narratives, turning a data breach into a strategic reputational and operational collapse. Authorities, including the Central Electoral Commission and CERT-GOV-MD, Moldova's national cybersecurity agency, launched preliminary reviews to verify the authenticity of the materials and assess potential election interference. This official response further amplified the visibility and perceived legitimacy of the operation's outcomes.

Analytical Assessment
The coordination of cyber intrusion, targeted disclosure, and psychological messaging aligns with the structure of a cyfluence counteroperation: an integrated activity designed to weaken or neutralize a hostile influence campaign through synchronized cyber and perception mechanisms. In this case, the campaign can be assessed with high confidence as successful, given the rapid breakdown of internal communications, loss of financial control, and subsequent reputational collapse of the targeted network. Together, these components placed significant pressure on participants, disrupted internal communication processes, and eroded the organization's stability. Moreover, the operation publicly reframed the Victorie Bloc as a foreign-directed entity, sharply reducing its domestic legitimacy and public support, a decisive influence effect extending beyond the technical breach itself.

Attribution and Context
Attribution remains undetermined.
The operation could plausibly have been conducted by regional hacktivist collectives seeking to counter Russian interference, or by a state-affiliated actor executing a preemptive countermeasure. Regardless of origin, the case illustrates a mature application of cyfluence methodology: the deliberate integration of cyber intrusion, information exposure, and psychological leverage to disrupt an active influence campaign in real time.

Outcome
In the aftermath, communication within the Victorie Bloc collapsed, financial flows were interrupted, and several key figures publicly distanced themselves from the organization. Public debate shifted away from the Bloc's messaging and toward its exposure as a mechanism of Russian influence. The operation achieved dual objectives, operational neutralization and narrative delegitimization, significantly reducing the reach of a foreign-backed political campaign on the eve of the vote.

[Footnotes:]
[i] WhereIsRussia Today, 2025. Collapsing from the inside: Ilan Shor's network crumbles amid data leak. [online] Published 24 September 2025. Available at: https://whereisrussia.today/feed/politics/ilan_shors_network_crumbles_amid_data_leak
[ii] Moldova1, R. Lozinschi-Hadei, 2025. Telegram leaks: Șor's firms used to undermine Moldova's democracy. [online] Published 3 September 2025. Available at: https://moldova1.md/p/56415/telegram-leaks-sor-s-firms-used-to-undermine-moldova-s-democracy
[iii] Publicly accessible ProtonDrive link associated with the leak: https://drive.proton.me/urls/PAEYV2N61R#rxaNKy4NtPNL
[iv] Elliptic, 2025. The A7 leaks: The role of crypto in Russian sanctions evasion and election interference. [online] Published 26 September 2025. Available at: https://www.elliptic.co/blog/the-a7-leaks-the-role-of-crypto-in-russian-sanctions-evasion-and-election-interference
[v] Source of the picture: WhereIsRussia Today, 2025. Collapsing from the inside: Ilan Shor's network crumbles amid data leak. [online] Published 24 September 2025. Available at: https://whereisrussia.today/feed/politics/ilan_shors_network_crumbles_amid_data_leak
[vi] Moldova1, R. Lozinschi-Hadei, 2025. Telegram leaks: Șor's firms used to undermine Moldova's democracy. [online] Published 3 September 2025. Available at: https://moldova1.md/p/56415/telegram-leaks-sor-s-firms-used-to-undermine-moldova-s-democracy
[vii] WhereIsRussia Today, 2025. Collapsing from the inside: Ilan Shor's network crumbles amid data leak. [online] Published 24 September 2025. Available at: https://whereisrussia.today/feed/politics/ilan_shors_network_crumbles_amid_data_leak
- CRC Weekly: Cyber-based hostile influence campaigns 29th September - 5th October 2025
[Introduction]
Cyber-based hostile influence campaigns are aimed at influencing target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks which enhance their effect. During the last week we observed, collected and analyzed endpoints of information related to cyber-based hostile influence campaigns (including Cyfluence attacks). This week's report is a summary of what we regard as the main events.

[Review highlights]
Russia's foreign intelligence service (SVR) is now issuing public statements to amplify pro-Kremlin narratives, a significant shift in its operational tactics. - EUvsDisinfo
A Russian-backed network created a fake news website using AI-generated videos and media impersonation to spread false narratives about the French president. - Le Monde
Russia's Ottawa embassy conducted an information campaign accusing Canada of covering up a fabricated nuclear incident allegedly perpetrated by Ukrainian forces. - DisinfoWatch
Russia used social media influencers to promote a deceptive work-study program that lured African women into working in its military drone factories. - EUvsDisinfo
The Kremlin is executing a multifaceted information campaign to deny and reframe its systematic abduction of over 20,000 Ukrainian children. - EUvsDisinfo
A new report argues the 2025 Gaza flotilla was a coordinated information operation by groups with ties to Hamas, using humanitarianism to shape opinion. - Global Influence Operations Report
The failure of Moscow's extensive interference efforts in Moldova highlights the declining impact of its information operations in countries it considers its near abroad. - Atlantic Council
An EU-led digital literacy camp is equipping youth in Bosnia and Herzegovina with critical thinking skills to identify and counter manipulated information. - EU Delegation to Bosnia and Herzegovina

[Weekly Review]

Russia's Foreign Intelligence Service Adopts Public Role in Spreading False Narratives
Russia's foreign intelligence service (SVR), an agency that typically operates covertly, has recently become a public-facing vehicle for pro-Kremlin disinformation. According to an EUvsDisinfo analysis, the SVR has begun issuing official statements that amplify false narratives targeting NATO, the EU, and Western governments. This tactic marks a shift from the standard practice of circulating such claims through state media or deniable covert outlets. The SVR's new role was prominent during Moldova's September 2025 elections, where it baselessly accused the EU of planning a NATO-backed occupation following the decisive victory of a pro-EU party. The SVR has also spread disinformation in Serbia, alleging an EU-orchestrated "Maidan-style" coup, and in Georgia, where it claimed the U.S. and EU were plotting a "color revolution" while smearing NGOs with fabricated allegations. These actions represent a strategic change, leveraging the perceived authority of an intelligence agency to legitimize disinformation openly.
Source: EUvsDisinfo, "The Shadowy SVR Openly Pushes Disinformation Narratives." Available Online

Moldova's Pro-EU Party Secures Victory Amidst Coordinated Cyberattacks
Moldova's pro-European Action and Solidarity Party (PAS) won a parliamentary majority despite a campaign of Russian interference and cyberattacks designed to destabilize the vote. A report from The Record detailed how authorities faced coordinated hoax bomb threats at polling stations and sustained cyberattacks on government infrastructure, including DDoS incidents targeting the Central Electoral Commission website and government cloud systems. These operations, coupled with disinformation campaigns aimed at Moldovan voters abroad, sought to intimidate the electorate and suppress the diaspora vote. According to France 24, the Kremlin was identified as the central actor in the interference, with the Moldovan government accusing Moscow of spending hundreds of millions in "dirty money" on vote-buying and other destabilization efforts. While the attacks were blocked in real time without disrupting the voting process, analysts warned that the Kremlin could still attempt to bribe new members of parliament to undermine the formation of a stable pro-European government.
Source: The Record, "Moldova's Pro-EU Party Wins Election Amid Cyberattacks and Kremlin Interference." Available Online
Source: France 24, "Moldova's pro-EU party on course to win pivotal election mired in claims of Russian meddling." Available Online

Russian-Backed Network Deploys AI and Impersonation in Disinformation Campaign
A Russian-backed influence network known as Storm-1516 created a fake news website to impersonate French media outlets and spread pro-Kremlin disinformation. An article in Le Monde revealed that the site, called BrutInfo, mimicked the branding of Brut and Le Monde to publish false stories, including a fabricated claim that President Emmanuel Macron was building a €148 million bunker. This operation utilized AI-generated videos, such as a fake interview with a supposed construction worker, to add a veneer of credibility. The network's tactics also include employing paid actors, plagiarizing legitimate articles, and placing propaganda in low-standard international media outlets that accept paid contributions. France's disinformation watchdog, Viginum, reported that content from Storm-1516 is frequently amplified by a network of pro-Kremlin influencers and paid accounts, extending the reach of its digitally sophisticated disinformation campaigns.
Source: Le Monde, "A fake news website impersonates Le Monde and Brut." Available Online

Russian State Actors Accuse Canada of Concealing Fabricated Nuclear Incident
The Russian Embassy in Ottawa and the state news agency TASS initiated a disinformation campaign accusing Ukraine of shelling the Zaporizhzhia Nuclear Power Plant (ZNPP) and claiming Canada was covering up the supposed crime. A DisinfoWatch report details how the embassy's official statements labeled Ukrainian President Volodymyr Zelensky a "maniacal terrorist" and asserted that the International Atomic Energy Agency (IAEA) was documenting Ukrainian provocations. This narrative, however, contradicts independent monitoring and recent IAEA updates, which confirmed military activity around the plant but did not assign blame, instead urging both sides to cease hostilities in the area.
Russia's claims ignored evidence of potential sabotage by its own occupying forces and misrepresented the IAEA's neutral role. No credible evidence was found to support the accusation that Canada was involved in covering up a non-existent nuclear crime; Canada's official position remains aligned with that of its allies.
Source: DisinfoWatch, "Russian Embassy and TASS claim Canada is covering up non-existent Kiev nuclear crime." Available Online

Russia Exploits Social Media Influencers for Deceptive Military Recruitment
Russia has conducted a disinformation campaign across Africa that uses social media influencers to lure women into its war production industry under false pretenses. According to an article by EUvsDisinfo, the campaign promoted the "Alabuga Start" program, which was advertised on TikTok, Instagram, and YouTube as a work-study opportunity in fields like hospitality. In reality, recruits were sent to work in drone factories supporting Russia's war in Ukraine, where they faced grueling conditions and health risks. When Nigerian media exposed the scheme, Russian embassies and pro-Kremlin channels mounted a coordinated response, dismissing the reporting as "Western disinformation." This counternarrative was amplified by pan-Africanist influencers, who reframed the story as a Western plot against Russia-Nigeria relations, thereby creating an illusion of widespread support for the program while obscuring the evidence of exploitation.
Source: EUvsDisinfo, "From social media to weapon factories: how Russia traps African women in war production." Available Online

Kremlin Pivots to Election Fraud Narratives After Failed Interference
Following the victory of Moldova's pro-EU party, the Kremlin and its media affiliates executed a rapid pivot in their disinformation strategy, shifting from pre-election accusations of corruption to post-election claims of widespread voter fraud. As reported by NewsGuard Reality Check, this strategy involved disseminating fabricated evidence across social media platforms like X and through state-owned outlets such as TASS. The campaign circulated deceptive videos, including one repurposed from Azerbaijan that falsely depicted ballot stuffing in Italy, in an attempt to delegitimize the election results. This effort, which showed signs of the Storm-1516 influence operation, ultimately failed to sway the outcome, demonstrating the limits of Russian influence and the resilience of Moldova's democratic institutions. In a separate but related effort, a DFRLab report identified a pro-Russian campaign codenamed "Matushka" that exploited Orthodox Christian beliefs to influence voters. The operation created a network of 67 channels on Telegram, TikTok, and other platforms, initially sharing religious content before pivoting to political messaging that framed European integration as a threat to the church. This strategy aimed to mobilize a religious voter base by suggesting that voting for pro-Kremlin candidates was a religious duty to protect traditional values from "moral decay."
Source: NewsGuard Reality Check, "Russians Cry Fraud After Failing to Sway Moldovan Election With Disinformation." Available Online
Source: DFRLab, "Targeting the faithful: Pro-Russia campaign engages Moldova's Christian voters." Available Online

Putin's Valdai Speech Outlines a Global Disinformation Strategy
At the Valdai Club, a Kremlin-controlled think tank, Russian President Vladimir Putin delivered a speech outlining a strategic disinformation campaign aimed at Western nations. A publication by DisinfoWatch analyzes how Putin and state media outlets are promoting a narrative that frames Russia as a moral "counterweight" to a decadent and declining Western liberal order. The core strategy involves driving a "culture-war wedge" by weaponizing issues like "gender terrorism" to generalize about systemic Western collapse and legitimize Moscow's vision of a "polycentric," illiberal world. Specific disinformation tactics included inverting causality by labeling European rearmament a "provocation" and using fearmongering to deter military support for Ukraine. This coordinated information warfare campaign serves multiple goals: reassuring Russia's domestic audience, encouraging sanctions fatigue among EU voters, and advancing Moscow's revisionist foreign policy.
Source: DisinfoWatch, "DisinfoDigest: Decoding Putin's Valdai Speech." Available Online

Kremlin FIMI Campaign Aims to Obscure Child Abduction War Crimes
The Kremlin is leveraging a Foreign Information Manipulation and Interference (FIMI) campaign to obscure its systematic abduction of over 20,000 Ukrainian children, a policy that constitutes a war crime. According to EUvsDisinfo, this operation relies on a three-pronged disinformation strategy: outright denial of the abductions, falsely reframing the kidnappings as humanitarian "evacuations," and claiming to facilitate family reunification while actively erasing the children's identities through forced adoptions and citizenship changes. Key actors leading this effort include Russian President Vladimir Putin and his 'Commissioner for Children's Rights,' Maria Lvova-Belova, both of whom face arrest warrants from the International Criminal Court for their role in the unlawful deportations. In response, 38 countries, alongside the Council of Europe and the EU, have called for the children's immediate return, and an international coalition has been launched to address Russia's actions.
Source: EUvsDisinfo, "At the 80th UNGA, Remember Russia's War on Ukrainian Children." Available Online

Gaza Flotilla Analyzed as Coordinated Information Operation
The 2025 Global Sumud Flotilla, a maritime campaign challenging Israel's blockade of Gaza, functioned as both a humanitarian initiative and a coordinated information operation driven by a network aligned with the Muslim Brotherhood. A report from the Global Influence Operations Report (GIOR) argues that while the flotilla was framed publicly as a humanitarian intervention, its key organizers, including Turkey's İHH and the Freedom Flotilla Coalition, have long-standing ties to Hamas. According to the analysis, these groups leveraged humanitarian rhetoric to shape global opinion and legitimize their political activism. The report contends that the flotilla demonstrates a 15-year evolution of Gaza solidarity activism, which has transformed from grassroots convoys into a transnational influence ecosystem connecting NGOs with sympathetic states like Turkey, Qatar, and Malaysia.
This suggests that humanitarian activism can serve as a vehicle for ideological influence, blurring the line between civil solidarity and coordinated campaigns.
Source: Global Influence Operations Report, "The Global Sumud Flotilla of 2025: Humanitarian Activism or Islamist Influence Operation?" Available Online

Study Finds AI Misinformation Has Dual Effect on Media Trust
Exposure to AI-generated misinformation reduces overall trust in media but can simultaneously increase engagement with credible news sources, according to a field experiment involving 17,000 readers. The study, reported by TechXplore and conducted by researchers from multiple universities in partnership with the German newspaper Süddeutsche Zeitung, presented readers with pairs of real and AI-generated images. The findings revealed this dual effect: while trust declined, readers who became aware of the difficulty of distinguishing real from fake content subsequently visited the newspaper's digital platforms more frequently and demonstrated better information retention. This effect was most pronounced among individuals with lower prior interest in politics. The implications suggest that while AI-driven misinformation threatens public trust, it also creates an opportunity for reputable media outlets to deepen audience engagement by educating them about the challenges of the modern information environment.
Source: TechXplore, "Reader survey shows AI-driven misinformation can reduce trust, but increase engagement with credible news." Available Online

AI-Driven Disinformation Accelerates Democratic Decay Across Africa
Artificial intelligence is increasingly being deployed as a tool to destabilize democratic processes and support authoritarianism in Africa. An article from LSE's Africa at LSE blog highlights how AI-generated deepfakes and coordinated disinformation campaigns fueled polarization and public skepticism during Nigeria's 2023 elections. In the Sahel region, AI-driven content, often linked to Russian-influenced networks, has been used to glorify military juntas and undermine calls for civilian governance. This trend is occurring in a context of declining public faith in democracy across the continent, with support for democratic rule having fallen by seven percentage points in the last decade. AI-fueled disinformation acts as a force multiplier for this democratic decay by accelerating the spread of false narratives, eroding trust in institutions, and overwhelming citizens' ability to discern fact from fabrication, underscoring the need for global governance frameworks.
Source: Africa at LSE blog, "In the age of artificial intelligence, democracy needs help." Available Online

AI Weaponized to Threaten Democratic Processes and Critical Systems
The increasing accessibility of artificial intelligence is enabling malicious actors to undermine elections, manipulate markets, and compromise critical systems. According to an article in TechXplore, AI-generated content like deepfakes and fake social media profiles has been used to spread disinformation and influence public opinion, leading to events such as the suspension of the 2024 Romanian presidential elections due to foreign interference. Beyond elections, AI systems trained on biased data have resulted in discriminatory outcomes in healthcare, while AI-generated fake news has been deployed to manipulate financial markets.
The World Economic Forum has highlighted AI's potential to disrupt geopolitical stability and national security. The adaptability of AI lowers the barrier to executing large-scale attacks, making it more difficult to safeguard critical infrastructure. Experts advocate for secure AI practices, robust regulation, and international cooperation to mitigate these risks and ensure AI is harnessed responsibly.
Source: TechXplore, "How AI poses a threat to national elections, health care and security." Available Online

Comparative Study Examines Frameworks for Measuring Disinformation Impact
To better understand and counter disinformation, it is crucial to accurately measure its effects, yet methodologies for doing so vary widely. In a comparative study, the organization EU DisinfoLab analyzed several frameworks used to assess the impact of disinformation, including the ABCDE Framework, the DISARM Framework, and the Impact-Risk Index. The analysis revealed that these frameworks adopt different approaches: some prioritize quantifying the reach of a disinformation campaign, while others focus on measuring the subsequent harm to public opinion and behavior. The study concludes that harmonizing these divergent methodologies is essential for developing a more comprehensive and standardized understanding of disinformation's impact. Such work is critical for informing effective policy-making and counter-disinformation strategies, particularly as digital platforms and influence campaigns continue to grow in sophistication. The study calls for continued collaboration to refine these vital assessment tools.
Source: EU DisinfoLab, "Decoding Disinformation Impact Frameworks and Indicators: a Comparative Study." Available Online

Moldova's Institutional Resilience Blunts Russian Election Interference Efforts
Russia's comprehensive campaign to interfere in Moldova's recent elections was ultimately unsuccessful due to the resilience of the country's institutions and electorate. An Atlantic Council article explains how the Kremlin deployed operatives and AI-generated fake accounts to saturate Moldovan social media with disinformation targeting President Maia Sandu and her pro-European party. Despite the scale of this information operation, Moldovan authorities effectively countered the threat by uncovering illicit financing schemes and voter bribery efforts linked to the campaign. The Moldovan public demonstrated a strong commitment to democratic values by delivering decisive support for Sandu's platform of European integration. The election outcome is seen as a significant indicator of Russia's declining influence in its near abroad, demonstrating that even well-resourced interference campaigns can be thwarted by vigilant institutions and an informed public.
Source: Atlantic Council, "Putin's Moldova election failure highlights Russia's declining influence." Available Online

EU Initiative Bolsters Youth Digital Literacy to Counter Disinformation
An initiative in Bosnia and Herzegovina aims to equip young people with the skills necessary to navigate the digital information landscape and counter disinformation. The EU Delegation to Bosnia and Herzegovina reported on its second Media and Digital Literacy Camp, which gathered youth for workshops on critical thinking, fact-checking, and assessing source credibility. The program featured guidance from experts in academia and from fact-checking platforms such as Raskrinkavanje, with a focus on identifying manipulated information.
This initiative addresses the growing challenge of disinformation by fostering a more informed and engaged citizenry. It aligns with the EU's broader commitment, outlined in its annual human rights and democracy reports, to promote media freedom and combat the spread of false information. Such educational programs are considered a crucial component in strengthening democratic processes and ensuring information integrity in the digital age.
Source: EU Delegation to Bosnia and Herzegovina, "Media and Digital Literacy Camp: Enhancing critical thinking and digital skills among youth." Available Online

[Glossary]
The modern information environment is projected to grow only more complex and sophisticated. Across academic publications, legal frameworks, policy debates, and public communications, the same concepts are often described in different ways, making collaboration, cooperation and effective action more difficult. To ensure clarity and establish a consistent frame of reference, the CRC maintains a standard glossary to reduce ambiguity and promote terminological interoperability. Its scope encompasses foundational concepts as well as emerging terms. As a collaborative project maintained with input from the community of experts, the CRC Glossary is intended to reflect professional consensus. We encourage you to engage with this initiative and welcome contributions via the CRC website.

[Download Report]
- From Coup to Cult: The Transnational Construction of Power in West Africa's Information Space - The Case of Burkina Faso
The Sahel region has emerged as a key setting for significant evolutions in cognitive warfare, where the contest for its information space, although underreported, has global impact and relevance. A new analysis by Tim Stark uses a case study of Burkina Faso under Captain Ibrahim Traoré to provide a deep dive into these dynamics. It details how West African influence campaigns exploit the region's fertile ground for narrative warfare—an environment where traditional oral storytellers have morphed into digital influencers—through the use of synthetic propaganda and hybrid operations, all in the context of a struggle by foreign powers to fill the strategic vacuum left by departing Western nations. Traoré's trajectory from coup leader to mythologized icon of Pan-African resistance illustrates a broader transformation in the global information environment, whereby authoritarian leaders in fragile states can now project narratives across borders to build legitimacy while reshaping perceptions abroad. Stark concludes that this is more than simple regime consolidation; it is a durable, transnational mythmaking effort that achieves global resonance by linking local grievances to potent anti-imperialist rhetoric, infiltrating Western timelines and directly influencing democratic discourse. [Download Full Report here]
- CRC Weekly: Cyber-based hostile influence campaigns 22nd-28th September 2025
[Introduction]
Hostile influence campaigns combine various aspects of cognitive warfare and are often accompanied by political, economic, or military strategies to achieve long-term strategic advantage. Our analysis focuses on cyber-based campaigns, with a particular emphasis on a key subset we define as Cyfluence. Cyfluence (cyber-attacks for influence) is the strategic and operational integration of cyber threat vectors with hostile information influence operations (IIOs). Cyfluence operations, conducted by state-sponsored or independent actors, represent an advanced form of cognitive warfare. They combine cyberattacks (e.g., hack-and-leak, digital identity hijacking, DDoS) with digital influence techniques (e.g., coordinated disinformation, misinformation, and malinformation amplified through inauthentic activity). The objectives of Cyfluence operations include the manipulation of public discourse, election interference, reputation abuse, and societal polarization. From the 22nd to the 28th of September 2025, we observed, collected, and analyzed endpoints of information related to these campaigns. The following report is a summary of what we regard as the main events from this period.

[Report Highlights]
A Kremlin-linked campaign succeeded in "infecting" major generative AI models with fabricated corruption claims targeting the Moldovan election.
Over 200 leading figures, including Nobel Prize winners and AI experts from companies such as OpenAI, Google DeepMind, and Microsoft, have called for action to establish strict "red lines" for artificial intelligence.
Georgia's ruling party, Georgian Dream, has intensified its use of disinformation and conspiracy theories to undermine public trust in the European Union.
A BBC undercover investigation uncovered a secret Russian-funded network working to disrupt Moldova's September 28 parliamentary elections through coordinated disinformation campaigns.
According to a report by DFRLab, a new online outlet called REST has emerged as a key pro-Kremlin disinformation outlet.
Iran's intelligence operations are targeting Scandinavian countries with greater intensity. A report by the Middle East Quarterly provides insights into their recent campaigns, with a focus on Denmark and Norway.

[Weekly Review]
REST Outlet Opens a New Front in Russia's Campaign Against Moldova
Manufacturing a Crisis: Inside Russia's Information War on Moldova's Election
Paid to Post: Anatomy of a Pro-Russian 'Digital Army'
RT Pushes Kremlin Disinformation to Undermine Canadian Support for Ukraine
Deconstructing Russia's Moldova 'Occupation' Narrative
Kremlin Campaign Corrupts AI Models in Moldovan Election Influence Op
Iran's Scandinavian Operations: A Permissive Environment for Espionage and Influence
Georgia's Ruling Party Uses 'Traditional Values' Disinformation to Counter EU Pressure
Experts Issue Global Call for AI 'Red Lines' to Prevent Mass Disinformation
Expert Analysis: EU's Institutional Weakness is its Greatest Vulnerability to Foreign Meddling

REST Outlet Opens a New Front in Russia's Campaign Against Moldova
According to a recent analysis by the DFRLab, a new online outlet named REST has emerged as another tool in the pro-Kremlin disinformation campaign targeting Moldova ahead of its September 2025 parliamentary elections. The publication details REST's connection to Rybar, a major sanctioned Russian propaganda operation, suggesting the new outlet is designed to evade sanctions and regenerate influence capabilities.
The connection is supported by technical evidence, including shared hosting infrastructure, identical server configurations, and leaked image metadata that directly references Rybar (a minimal illustration of this kind of metadata check appears after this section). The outlet's content, which gained millions of views on TikTok and was amplified across X and Telegram, is designed to embed disinformation into Moldova's digital environment. This activity represents a continuation of Russian influence operations, which employ a sophisticated toolkit including AI-generated deepfakes, mirror websites, and covert financing to undermine Moldova's pro-European course. The analysis also notes the translation of REST content into EU languages, indicating a multi-platform, cross-border effort to manipulate information.
Source: DFRLab, J. Kubś & E. Buziashvili, 2025. Sanctioned Russian actor linked to new media outlet targeting Moldova. [online] Published 23 September 2025. Available at: https://dfrlab.org/2025/09/23/sanctioned-russian-actor-linked-to-new-media-outlet-targeting-moldova/
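For illustration, image metadata of the kind DFRLab cites can be surfaced with standard EXIF tooling. The snippet below is a minimal sketch in Python using the Pillow library; the sample filename and the "rybar" marker string are our own illustrative assumptions, not artifacts or tooling from the actual investigation.

```python
# Minimal EXIF-metadata check (illustrative sketch; the filename and the
# "rybar" marker are hypothetical, not artifacts from the DFRLab report).
from PIL import Image
from PIL.ExifTags import TAGS

def dump_exif(path: str) -> dict:
    """Return a {tag_name: value} mapping of the EXIF fields in one image."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

for path in ["rest_article_header.jpg"]:  # hypothetical sample file
    for name, value in dump_exif(path).items():
        # Fields such as Artist, Software, or Copyright often retain editor
        # or organization names left behind by a publishing pipeline.
        if "rybar" in str(value).lower():
            print(f"{path}: {name} = {value!r}  <-- possible attribution marker")
```

In practice, a single metadata string is only a lead; investigators corroborate it with infrastructure evidence, such as the shared hosting and identical server configurations noted above, before drawing attribution conclusions.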
Manufacturing a Crisis: Inside Russia's Information War on Moldova's Election
The BBC reports on a Russian-funded network in Moldova whose goal was to influence the parliamentary elections on 28 September 2025. Participants were recruited through Telegram and asked to post pro-Russian content on TikTok and Facebook, for a reported payment of approximately $170 per month. Organisers gave instructions, including guidance on using AI. The posts targeted President Maia Sandu and the ruling PAS party, with claims including election fraud, child trafficking and forced LGBT policies. Participants were also asked to conduct unauthorized opinion polls; the results, along with secret recordings, could later be used to cast doubt on the election outcome. According to the BBC, the network was coordinated by Alina Juc from Transnistria, who is reportedly linked to Russia. Funding reportedly came via the Russian state-owned Promsvyazbank, and there are also indications of ties to the oligarch Ilan Shor, who is based in Moscow and sanctioned by the US, EU, and UK. The NGO Evrazia was also named as involved. The BBC reports that the network operates at least 90 TikTok accounts, which have garnered over 23 million views; DFRLab estimates an even wider reach. Shor, Evrazia, and Juc did not respond to questions. Moldova's police view disinformation as the main method of interference, while the Russian embassy denies the allegations.
Source: BBC, O. Marocico, S. Mirodan & R. Ings, 2025. How a Russian-funded fake news network aims to disrupt elections in Europe. [online] Published 21 September 2025. Available at: https://www.bbc.com/news/articles/c4g5kl0n5d2o

Paid to Post: Anatomy of a Pro-Russian 'Digital Army'
The DFRLab report describes an operation with alleged links to Moscow that aims to influence Moldova's parliamentary elections on September 28, 2025. Individuals were reportedly paid to create inauthentic accounts and spread coordinated content. The network has been active since the fall of 2024 and has been monitored since January 2025. By August 2025, around 200 so-called "InfoLeaders" had been recruited. DFRLab analyzed 253 accounts across TikTok, Facebook, and Instagram. In total, the operation generated nearly 29,000 posts, reaching over 55 million views, more than 2 million likes, and hundreds of thousands of comments. While TikTok was the main platform, Facebook activity grew in mid-2025. The structure was hierarchical: Russian-speaking curators set daily tasks, hashtags, and quotas, and recruits could advance from "communication activists" to InfoLeaders. The network utilized hashtags systematically, organized flash mobs, and instructed participants to personalize their content to make it appear more organic. The main narratives targeted President Maia Sandu and the ruling PAS party, focusing on alleged fraud, corruption, and criticism of EU and NATO integration. Politically, the operation shifted from supporting Ilan Shor's "Victory Bloc" to promoting the "Moldova Mare" party, reusing earlier narratives under a new banner.
Source: DFRLab, V. Châtelet & V. Olari, 2025. Paid to post: Russia-linked 'digital army' seeks to undermine Moldovan election. [online] Published 24 September 2025. Available at: https://dfrlab.org/2025/09/24/paid-to-post-russia-linked-digital-army-seeks-to-undermine-moldovan-election/

RT Pushes Kremlin Disinformation to Undermine Canadian Support for Ukraine
A recent analysis by DisinfoWatch details another instance of Russian state media attempting to undermine Western support for Ukraine, this time targeting Canadian audiences. The report breaks down an RT article that falsely accuses Canada of funding "atrocities" and "neo-Nazi brigades." This campaign provides a clear case study of a broader Kremlin strategy to erode public support for Ukraine by reviving the well-worn "Ukraine-as-Nazi" trope and reframing legitimate aid as complicity in war crimes. The DisinfoWatch analysis highlights RT's use of classic disinformation techniques, including whataboutism, projection, and the distortion of facts, notably ignoring ICC warrants against Russian officials. The campaign's objective is to emotionally manipulate audiences and delegitimize Canada's actual efforts, which focus on documenting war crimes in cooperation with the ICC. The report notes that the operation scores extremely high on disinformation risk, given its overt delivery by a recognized state-media asset, its reliance on single-source claims, and its repetition of established Kremlin propaganda narratives, making it a straightforward example of foreign information manipulation.
Source: DisinfoWatch, 2025. RT falsely claims "Canada keeps bankrolling Ukraine's war crimes". [online] Published 22 September 2025. Available at: https://disinfowatch.org/disinfo/rt-falsely-claims-canada-keeps-bankrolling-ukraines-war-crimes/

Deconstructing Russia's Moldova 'Occupation' Narrative
An article by DisinfoWatch deconstructs a Russian disinformation narrative, circulated in the lead-up to Moldova's recent elections, which claimed the EU and NATO were preparing to "occupy" the country. The report traces the claim's origin to Russia's Foreign Intelligence Service (SVR), providing another clear example of a coordinated, state-level influence operation. The narrative, which cited NATO troop presence in the region as a pretext, was amplified without evidence by state media outlets like RT and TASS. The DisinfoWatch report highlights the campaign's clear strategic objectives: timed to coincide with the election, it sought to intimidate voters, delegitimize the country's pro-EU policies, and erode trust in Western partners. The analysis tracks the dissemination path from the SVR press bureau through major state media before being laundered into regional sites and social media ecosystems.
By debunking the claim and contrasting it with the EU's actual policy of supporting democratic reforms, the report presents a concise case study on how unsubstantiated security threats are fabricated and deployed to create political instability. Source: DisinfoWatch, 2025. EU is not “preparing to ‘occupy’ Moldova – Moscow”. [online] Published 23 September 2025. Available at: https://disinfowatch.org/disinfo/eu-is-not-preparing-to-occupy-moldova-moscow/ Top of Page Kremlin Campaign Corrupts AI Models in Moldovan Election Influence Op A recent analysis by NewsGuard has identified a Kremlin-linked disinformation operation, known as "Storm-1516," that targeted Moldova's recent parliamentary elections. The campaign represents a continuation of established malign influence efforts, focusing on disseminating false corruption claims against the incumbent pro-European government to undermine the democratic process. Utilizing a vast propaganda network, the operation achieved considerable reach, drawing over 17.7 million views on platforms like X. This saturation level underscores the scale of the effort directed at a country with a population of only 2.4 million. The investigation’s key finding, however, concerns an evolving tactic: the deliberate infection of generative AI models. NewsGuard found that when prompted about the campaign's false narratives, major AI chatbots reproduced the disinformation more than one-third of the time. This successful compromise of widely used AI tools demonstrates a new and dangerous vector for FIMI campaigns. The operation highlights an escalation in tactics used to influence key elections, in this case, aiming to derail Moldova's European trajectory and reassert Russian influence in the region. Source: NewsGuard, E. Maitland, A. Lee & M. Roache, 2025. New Kremlin‑linked influence campaign targeting Moldovan elections draws 17 million views on X and infects AI models. [online] Published 26 September 2025. Available at: https://www.newsguardrealitycheck.com/p/new-kremlin-linked-influence-campaign Top of Page Iran's Scandinavian Operations: A Permissive Environment for Espionage and Influence An analysis published by Eurasia Review details the long-standing and varied intelligence operations conducted by the Islamic Republic of Iran (IRI) in Denmark and Norway. The report provides further examples of Iran's operational playbook, highlighting how the region's advanced industries, universities, and politically active diaspora make it an attractive, yet often overlooked, target for hostile state activities. The findings reinforce the understanding of Iran's global intelligence reach and its use of multifaceted tactics. The analysis outlines a range of operations, including assassination plots against dissidents, cyber espionage targeting research institutions, surveillance conducted through diplomatic and religious channels, and the use of local criminal networks for kinetic attacks. Crucially, it places these activities within the context of Iran’s strategic alignment with Russia and China, citing the Swedish Security Service's assessment that these states are collaborating to reshape the global order. The report concludes that a fragmented and weak response from Scandinavian governments has created a low-risk, permissive environment, effectively emboldening Tehran's intelligence services. Source: Eurasia Review, A. Khoshnood, M. Norell & A. M. Khoshnood, 2025.
A growing security threat: Iranian intelligence operations in Scandinavia (Part One: Denmark and Norway) – Analysis. [online] Published 25 September 2025. Available at: https://www.eurasiareview.com/25092025-a-growing-security-threat-iranian-intelligence-operations-in-scandinavia-part-one-denmark-and-norway-analysis/ Top of Page Georgia's Ruling Party Uses 'Traditional Values' Disinformation to Counter EU Pressure An article from The Jamestown Foundation's Eurasia Daily Monitor details the intensified use of disinformation by Georgia’s ruling party, Georgian Dream, as it faces EU pressure to reverse democratic backsliding. The analysis outlines how the party is weaponizing anti-LGBT conspiracy theories, falsely framing EU democratic norms as an imposition of “Western decadence” and a threat to national sovereignty. This narrative serves as a political tool to rally the party's conservative base and deflect blame for potential EU sanctions resulting from its own controversial policies. Despite this top-down campaign, the report highlights polling data showing that public support for EU integration remains overwhelmingly high at 78 percent. This suggests the government’s narrative has failed to shift the majority opinion on Georgia's geopolitical orientation. However, the continued promotion of these divisive conspiracies through pro-government media risks further polarizing society. The strategy illustrates a case of a state actor using value-based disinformation to undermine a supranational body and erode trust in democratic processes, even when public sentiment is resistant. Source: Jamestown Foundation, B. Chedia, 2025. Georgian Dream weaponizes LGBT‑related conspiracy theories. [online] Published 23 September 2025. Available at: https://jamestown.org/program/georgian-dream-weaponizes-lgbt-related-conspiracy-theories/ Top of Page Experts Issue Global Call for AI 'Red Lines' to Prevent Mass Disinformation In a significant public call for urgent regulation, a coalition of over 200 leading figures, including Nobel laureates and prominent experts from OpenAI and Google DeepMind, has signed an open letter demanding that governments establish strict "red lines" for artificial intelligence. Released to coincide with the UN General Assembly session, the statement warns that unregulated AI poses severe dangers, explicitly highlighting its potential to enable large-scale disinformation campaigns and manipulate public opinion, thereby undermining democratic societies. The letter further details risks such as the loss of meaningful human control as AI systems, some of which have already exhibited deceptive behavior, are granted increasing autonomy. The signatories stress that voluntary commitments from developers are insufficient. They urge governments to act swiftly to create a binding international agreement on these "red lines" by the end of 2026. This framework would aim to hold AI providers accountable for preventing foreseeable harmful outcomes, directly addressing the growing threat of AI-powered foreign information manipulation and influence. Source: The Signatories of the "AI Red Lines" Letter, 2025. Global Call for AI Red Lines. [online] Published September 2025.
Available at: https://red-lines.ai/ Top of Page Expert Analysis: EU's Institutional Weakness is its Greatest Vulnerability to Foreign Meddling In an interview published by Follow the Money (FTM), democracy expert Luise Quaritsch elaborates on the European Union’s systemic vulnerability to foreign malign interference, framing it as a component of a broader hybrid warfare strategy. The analysis highlights persistent Russian tactics, including the creation of "doppelganger" websites and covert influence platforms like "Voice of Europe", as examples of a low-level, constant stream of interference designed to exploit societal divisions. These operations are amplified by other actors and across platforms where malign content can gain traction. Quaritsch argues that the critical issue is not a lack of tools but the EU's failure to deploy its existing powers effectively. The bloc’s complex governance and interconnected member state policies create numerous institutional and physical access points for foreign actors to exploit. This means that a vulnerability in one member state poses a threat to the entire Union. While new legislative efforts, such as transparency registers, are being discussed, the interview emphasizes that the priority should be securing these inherent structural weaknesses, arguing that the EU is currently failing to counter the threat effectively. Source: Follow the Money (FTM), A. Keepe, 2025. EU has the power to fight foreign meddling – but isn’t using it, democracy expert says. [online] Published 23 September 2025. Available at: https://www.ftm.eu/articles/interview-luise-quaritsch-eu-foreign-meddling Top of Page [CRC Glossary] The Cyfluence Research Centre has relaunched the CRC Glossary. This initiative aims to serve as a shared lexicon of both foundational and emerging terms that shape the field. To this end, the Glossary is designed to be a continually updated resource, with new entries added weekly. We see this as a collaborative project and strongly encourage input from the expert community. The goal is to reduce the problem of ambiguous or conflicting terminology that can hinder collaborative work and effective communication with the general public. We invite you to submit additions, changes, or corrections via the form on our website. [Download]
- Influence in Czechia: Digital Battles Ahead of the 2025 Elections
On 3–4 October 2025, the Czech Republic will hold parliamentary elections. Since Russia’s invasion of Ukraine, the Czech government has supplied weapons, training, and financial support to Kyiv. President Petr Pavel has consistently argued for continued backing. Czechia is also an important EU economy, closely tied to European supply chains in industry and energy. A change in government could affect both its Ukraine policy and its role within the EU. This election follows other recent cases in which foreign information manipulation and interference (FIMI) played a significant role. In Romania, digital campaigns contributed to the annulment of the presidential vote (for a deep-dive analysis, see our report here). In Moldova, pro-EU parties won the parliamentary elections in September 2025, despite significant interference (for more information, see our blog here). Now, it is Czechia’s turn to face similar challenges to its democratic processes and discourse. The Czechia Country Election Risk Assessment (CERA) i provides a detailed examination of how hostile influence networks operate within the Czech information space, encompassing coordinated Telegram ecosystems, disinformation portals, and financing structures. It also identifies structural vulnerabilities such as low trust in institutions, susceptibility to conspiracy narratives, and gaps in regulation. Taken together, these findings give one of the clearest pictures of the pressures shaping the 2025 elections. The report can be found here. Political Context The contest is dominated by three main actors: The populist ANO movement of Andrej Babiš. The governing conservative coalition SPOLU, led by Prime Minister Petr Fiala. The far-right SPD of Tomio Okamura. Yet the situation is more complex. Smaller political forces, from the Pirates to protest parties like Stačilo! or the Motorists, exist on the fringes. This fragmentation is likely to complicate coalition-building and raises the stakes for every percentage point that digital influence campaigns might shift. ii External Influence Networks Russia remains the central external actor. For years, Moscow has invested in disinformation, cyber operations, and covert funding. Following the EU's ban on channels like Sputnik in 2022, activity shifted to the digital domain. Telegram channels, such as neČT24, distribute translated Kremlin content daily, while the Pravda network aggregates posts from more than 7,000 channels into Czech debates. iii Parallel structures, such as Voice of Europe, operated from Prague with Russian financing. China is less visible but still relevant, particularly through TikTok. Just days before the election, investigators uncovered around 300 fake TikTok accounts spreading pro-Russian synthetic propaganda. These profiles generated millions of views weekly, surpassing the combined reach of the official accounts of leading Czech politicians. iv Figure 1 – Potentially inauthentic TikTok accounts, identified by The Center for Online Risk Research Campaigns and Platforms The Czech information environment is hybrid. Traditional outlets, such as Seznam Zprávy or public service broadcasting, enjoy high levels of trust, but alongside them, an ecosystem of problematic portals and Telegram channels operates. From Parlamentní listy to fringe groups, narratives are orchestrated and mutually amplified. Digital mobilization often spills into physical action: protests under the Stačilo! banner channel narratives first spread on Telegram directly into the streets.
Figure 2 – Sources of news, courtesy of FDEI project v Narratives and Their National Resonance Hostile influence campaigns (HICs) in Czechia revolve around dominant narratives: electoral fraud, delegitimization of security institutions, anti-Ukraine frames, and anti-EU/anti-Western frames. vi Their resonance derives from deep-rooted domestic fault lines. Mistrust of electoral integrity runs deep: over half of Czech citizens believe the government could manipulate election results. vii Anti-Ukraine narratives play on war fatigue and economic hardship, while many consider support for Kyiv excessive. Anti-EU narratives also resonate strongly: 54% of the population views EU decisions critically, making claims of an alleged “Brussels dictate” highly effective. viii These narratives are not simply imported; they exploit existing anxieties, reinforcing them until they erode trust in the country’s democratic trajectory. Impact Assessment The impact of HICs is less about measurable vote shifts than about long-term erosion. The CERA report highlights three risks: the normalization of mistrust, since when over half of citizens believe fraud is possible, the legitimacy of any future government is undermined; the demobilization of pro-European voters, who are discouraged from participating by repeated claims of corruption and stolen elections; and the amplification of social and political fragmentation, through which smaller protest parties benefit disproportionately from digital influence efforts and push themselves into mainstream debates. Together, these dynamics create an electoral environment in which populist and pro-Russian forces gain strength without a single ballot being hacked. ix Figure 3 – Key Issues Shaping Voter Sentiment in the Czech Republic, courtesy of FDEI project x Responses and Limitations Authorities have sought to push back: the Ministry of the Interior has launched public information campaigns, the BIS intelligence service monitors disinformation networks, and cooperation with TikTok has been initiated. xi Yet structural deficits remain. The Digital Services Act (DSA), which obliges platforms to monitor manipulative content, ensure algorithmic transparency, and remove harmful material swiftly, has been in force at the EU level since 2024, but the Czech Republic has been slow to transpose and implement the framework nationally. xii As a result, a critical tool for curbing FIMI remains blunt. Election authorities face similar limits: their resources are designed for physical ballot management, not real-time counter-disinformation. Coordination across agencies is often fragmented, with warnings issued in parallel rather than centrally. Conclusion The Czech parliamentary elections are more than a domestic event. They are another link in a chain of growing friction between domestic forces within the EU and rival geopolitical powers. Digital influence campaigns aim to weaken pro-European actors, empower populist currents, and challenge Czechia’s Western orientation. Resilience in the information space is therefore crucial. Platforms must be held accountable, opaque Telegram networks cannot remain blind spots, and state institutions need a coordinated strategic communication (StratCom) approach. Clear rules on political financing are also essential to prevent covert external funding. The recent elections in Romania, Moldova, and the Czech Republic confirm that digital information manipulation is now an inherent feature of European electoral politics.
Europe’s response in building resilience and implementing countermeasures will determine whether democratic trust can withstand the mounting pressure. [Footnotes:] [i] FIMI Response Team (FRT‑24), Debunk.org, EU DisinfoLab, GLOBSEC, Institute for Strategic Dialogue (ISD), 2025. Czechia: Country election risk assessment. [online] Available at: https://fimi-isac.org/wp-content/uploads/2025/09/FRT-24_Czechia-Country-Election-Risk-Assessment-CERA_FINAL.pdf [ii] Ibid. pp. 9-10 [iii] Ibid. pp. 28-32 [iv] Radio Prague International, Jakub Ferenčík, 2025. Russian propaganda is spreading on Czech TikTok ahead of elections. [online] Published 30 September 2025. Available at: https://english.radio.cz/russian-propaganda-spreading-czech-tiktok-ahead-elections-8864264 [v] FIMI Response Team (FRT‑24), Debunk.org, EU DisinfoLab, GLOBSEC, Institute for Strategic Dialogue (ISD), 2025. Czechia: Country election risk assessment. [online] p. 15. Available at: https://fimi-isac.org/wp-content/uploads/2025/09/FRT-24_Czechia-Country-Election-Risk-Assessment-CERA_FINAL.pdf [vi] Ibid. pp. 20-24 [vii] Ibid. p. 21 [viii] Ibid. p. 19 [ix] Ibid. pp. 18-22 [x] Ibid. p. 14 [xi] Radio Prague International, Jakub Ferenčík, 2025. Russian propaganda is spreading on Czech TikTok ahead of elections. [online] Published 30 September 2025. Available at: https://english.radio.cz/russian-propaganda-spreading-czech-tiktok-ahead-elections-8864264 [xii] FIMI Response Team (FRT‑24), Debunk.org, EU DisinfoLab, GLOBSEC, Institute for Strategic Dialogue (ISD), 2025. Czechia: Country election risk assessment. [online] pp. 46-47. Available at: https://fimi-isac.org/wp-content/uploads/2025/09/FRT-24_Czechia-Country-Election-Risk-Assessment-CERA_FINAL.pdf
- CRC Spotlight: Exposing Digital Hostile Influence with Honeypots
A recent X poll regarding a water crisis in Iran displayed voting irregularities indicative of Coordinated Inauthentic Behaviour (CIB) attributed to regime-backed actors. In this CRC Spotlight, we use the incident as a case study to explore a new perspective on counter-Foreign Information Manipulation & Interference (FIMI) tactics. We examine how interactive online content, such as polls on controversial topics, can provide defenders and researchers alike with an intelligence windfall: by baiting threat actors into action, knowledge of their Tactics, Techniques, and Procedures (TTPs) can be gathered and leveraged as part of a defensive strategy. While there is no indication this poll was a deliberate trap, it does suggest that further study is warranted on how concepts such as an ‘Influence Honeypot’ could be incorporated into existing defensive frameworks, such as DISARM Blue. [Download Full Report here]
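To make the underlying signal concrete, the sketch below illustrates one simple way a defender might flag the kind of voting irregularity the poll displayed. It is a minimal illustration under stated assumptions, not the method used in the report: it presumes a defender has sampled a poll's cumulative vote total at fixed intervals (hypothetical numbers, since X does not expose poll data in this form), and it flags intervals whose vote influx deviates sharply from the quiet baseline.

```python
# Minimal sketch: flagging burst-like voting consistent with coordinated
# inauthentic behaviour (CIB). All numbers are hypothetical; this assumes
# a defender sampling a poll's displayed vote count at fixed intervals.
from statistics import mean, stdev

# Cumulative vote counts, sampled (say) every 10 minutes.
samples = [120, 180, 240, 310, 380, 2900, 5600, 5660, 5720, 5770]

# Per-interval influx: the first difference of the cumulative series.
deltas = [b - a for a, b in zip(samples, samples[1:])]

# Build a "quiet" baseline from the lower half of the observed deltas,
# so the bursts themselves do not inflate the reference statistics.
baseline = sorted(deltas)[: len(deltas) // 2]
mu, sigma = mean(baseline), stdev(baseline)

for i, d in enumerate(deltas, start=1):
    z = (d - mu) / sigma if sigma else 0.0
    flag = "  <-- anomalous burst" if z > 4 else ""  # threshold is illustrative
    print(f"interval {i:2d}: +{d:5d} votes (z={z:6.1f}){flag}")
```

In practice, timing anomalies of this kind would be weighed alongside account-level signals, such as creation dates, posting cadence, and repeated phrasing, before any activity is attributed to CIB.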
- Influence in Moldova: Coordinated Campaigns Ahead of Critical Elections
On 28 September 2025, Moldova will hold parliamentary elections. For a country of just 2.5 million people, the stakes are unusually high. The election will determine whether Moldova continues on its path toward the European Union or whether Moscow succeeds in reasserting its influence. This contest is no longer fought solely at the ballot box, but increasingly across digital arenas where political majorities are shaped and reshaped. The elections have also drawn attention from the EU itself, reflecting their broader significance for European security. On 10 September 2025, the European Parliament adopted a resolution condemning Russian hybrid interference in Moldova and calling for strengthened EU support to safeguard the electoral process. Commissioner Kos reinforced this stance in a speech announcing the readiness of EU Hybrid Rapid Response Teams to assist Moldova’s counter-FIMI efforts. With only a few days left before the vote, a recent report, Moldova: Country Election Risk Assessment (CERA), has gained particular relevance. Compiled by analysts from several organizations (DFRLab, Alliance4Europe, Debunk.org, and EU DisinfoLab) as part of the FIMI Defenders for Election Integrity Project (FDEI), this comprehensive assessment outlines the current political landscape, identifies key actors, and explores the various influence operations affecting Moldova’s information space. Given the significant impact of digital manipulation in the final stretch before elections, as showcased so clearly by the recent Romanian elections, the Moldova CERA report warrants close attention. This blog does not attempt to summarize all 80+ pages but highlights the central operations, structures, and narratives that may prove decisive in the upcoming vote. Political Context Moldova’s domestic politics are sharply divided. President Maia Sandu and her pro-European PAS party are pushing firmly toward EU integration, while pro-Russian forces gather around fugitive oligarch Ilan Șor, who has long been associated with covert financing and orchestrated protest actions. i Between these poles stand smaller parties that present themselves as pro-European yet often amplify narratives originating from Kremlin-linked channels. ii This polarized environment provides fertile ground for external influence. Figure 1: Foreign actors operating in Moldova’s information landscape, courtesy of FDEI iii External Influence Networks The FDEI report, alongside others, shows that Russian-linked structures deliberately target Moldova’s information ecosystem. Operation Matryoshka produces highly polished videos designed to resemble neutral think tank analysis, but which consistently frame EU membership as a threat. Distribution primarily runs through Telegram, X, and Bluesky, targeting Russian- and Romanian-speaking communities within Moldova. iv The Pravda network, also known as “Portal Kombat,” functions as a redistribution hub. Content from Russian state media and Kremlin-affiliated Telegram channels is translated and republished across dozens of websites posing as local outlets. Activity peaks coincide with sensitive moments such as U.S. Secretary of State Antony Blinken’s visit or the second round of presidential elections. v Storm-1516, a Russian digital influence network, follows an imitation strategy. It builds cloned domains of legitimate news outlets and fills them with fabricated stories. One Moldovan case accused Sandu of embezzling foreign aid, citing Mayor Ion Ceban as its purported source.
The article was entirely false but looked authentic, complete with stolen bylines, and circulated widely on Telegram. vi Independent assessments, including VIGINUM's, note the tactical overlap with Russia’s “Doppelgänger” operation but classify Storm-1516 as a separate network. Anonymous Telegram channels serve as a primary gateway for Kremlin narratives to reach Moldova. They aggregate, synchronize, and amplify manipulated content from the above networks, ensuring alignment with political events and pushing narratives into public debate. vii Campaigns and Platforms These networks converge in concrete campaigns. Following the ban on several pro-Russian TV stations, Moldova24 (MD24) emerged, hosted in Russia and backed by at least 16 mirror domains. It spreads content simultaneously across TikTok, Telegram, Instagram, and YouTube. The U.S. platform Cameo was also exploited: purchased celebrity greetings were re-captioned to suggest calls for Sandu’s resignation, then circulated on Moldovan Telegram and Facebook channels as supposed “Western voices”. viii The Șor network illustrates the link between digital and physical mobilization. Under the hashtag #STOPUE, Telegram bots recruited referendum opponents. Participants uploaded ID documents and were paid to share content or join protests, with transactions routed through sanctioned Russian banks and the MIR payment system. This model was expanded via the Taito app, where protesters registered, signed contracts, and received up to $3,000 per month, four times Moldova’s average salary. These funds sustained the so-called “tent protests,” which appeared spontaneous but were in fact coordinated and financed. ix Artificial intelligence is also part of the toolkit. A deepfake video depicted Electoral Commission chair Angelica Caraman allegedly admitting to foreign interference. It spread on Telegram and was later amplified by Russian Foreign Ministry spokesperson Maria Zakharova, illustrating how anonymous digital manipulation can merge with official diplomacy. x Narratives and Their Dynamics The report identifies consistent narrative lines. At the meta-level, anti-EU, anti-West, anti-establishment, and pro-Russian frames dominate. xi Beneath them operate sub-narratives: that EU membership erodes sovereignty, xii that NATO and the EU bring war and chaos, that Sandu is corrupt and incompetent, that democracy is hollow, and that elections are rigged anyway. xiii The impact does not stem from single stories but from cumulative reinforcement across platforms and formats. xiv Figure 2: Meta-Narratives and Sub-Narratives, courtesy of FDEI xv Assessing the Impact Whether such campaigns ultimately shift votes is difficult to prove. The report is cautious, stressing the limited measurability of direct effects. Yet it warns by comparison: in Romania, similar combinations of disinformation, covert financing, and orchestrated protests contributed to elections being annulled. xvi The risk in Moldova lies less in one decisive fake than in the steady erosion of trust and the demobilization of pro-European voters. Figure 3: Meta-Narratives and Sub-Narratives, courtesy of FDEI xvii Responses and Their Limits Moldovan authorities have started to respond. In 2025, the electoral commission refused to register the pro-Russian “Victory” bloc after tracing its funding to Șor’s structures. The Supreme Security Council now designates electoral interference and illicit financing as national security threats, while foreign activists linked to destabilization are denied entry.
External initiatives have also arrived, including programs such as M-MIIND, introduced in 2024 to reinforce independent media and experiment with approaches to countering foreign information manipulation and interference (FIMI). xviii At the same time, Moldova’s institutions lack the resources to counter complex digital campaigns in real time, a weakness highlighted by the report. xix And this is not unique to Moldova: many Western European countries face the same challenge, struggling to match the speed and scale of hostile influence campaigns (HICs). Conclusion Moldova’s election serves as another reminder that foreign interference has become an integral part of modern geopolitics rather than an episodic disruption. The country is not unique in facing these tactics, but its small size, polarized politics, and proximity to the EU–Russia fault line make it a prime case. The precedent of Romania, where similar methods of disinformation, covert financing, and orchestrated protests contributed to the annulment of the presidential election, shows how fragile electoral integrity can be when external manipulation intersects with domestic fragmentation. A CRC report on Romania’s annulled elections provides a detailed analysis of the influence efforts that fueled that crisis; in the context of the upcoming Moldovan vote, it offers valuable lessons on dynamics that external interference now threatens to reproduce. What the Moldova: Country Election Risk Assessment adds to that analysis is a systematic mapping of the evolving threat environment. By going beyond isolated incidents, it identifies the networks, narratives, and financial flows that drive hostile influence, and shows how digital propaganda, mobilization, and covert funding reinforce each other. This makes the report valuable not only for understanding Moldova but as a reference point for analyzing how HICs evolve across Europe. [Footnotes:] [i] FDEI for election integrity (FIMI‑ISAC), Digital Forensic Research Lab (DFRLab), Alliance4Europe, Debunk.org, EU DisinfoLab, 2025. Country report Moldova: risk assessment (Jan 2025 – Jan 2027). [online] pp. 62-64. Available at: https://fimi-isac.org/wp-content/uploads/2025/09/Country-Report-Moldova-Risk-Assessment.pdf [ii] Ibid. pp. 8-13 [iii] Ibid. p. 21 [iv] Ibid. p. 20 [v] Ibid. p. 20 [vi] Ibid. pp. 23-24 [vii] Ibid. p. 21 [viii] Ibid. p. 23 [ix] Ibid. pp. 62-64 [x] Ibid. p. 24 [xi] Ibid. pp. 31-42 [xii] Ibid. p. 45 [xiii] Ibid. pp. 45-52 [xiv] Ibid. pp. 56-59 [xv] Ibid. p. 30 [xvi] Ibid. p. 27 [xvii] Ibid. p. 33 [xviii] Ibid. pp. 70-72 [xix] Ibid. pp. 72, 82
- CRC Weekly: Cyber-based hostile influence campaigns 8th - 14th September
[Introduction] Cyber-based hostile influence campaigns are aimed at influencing target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks which enhance their effect. During the last week we observed, collected and analyzed endpoints of information related to cyber-based hostile influence campaigns (including Cyfluence attacks). The following report is a summary of what we regard as the main events. [Report Highlights] Sophisticated AI-generated images and videos left 73 percent of surveyed Americans believing or unsure about false Trump-Epstein narratives, showcasing synthetic media’s power. - NewsGuard Leaked documents challenge the perception of China’s digital control as purely ideological, revealing market forces drive its global surveillance and influence industry. - Wired Political violence following Charlie Kirk’s killing creates fertile ground for Russia and China to intensify disinformation and destabilize American institutions. - FIMI-ISAC Moldova’s weak media regulations allow banned pro-Russian outlets to persist, creating vulnerabilities exploited by foreign information manipulation and interference. - FIMI-ISAC Social media users exploit video editing glitches to falsely claim AI generation, propagating conspiracy theories around Trump’s recent address. - NewsGuard Moscow is systematically weaponizing international forums, proclaiming peace and human rights while actively undermining them through aggressive actions and pervasive disinformation campaigns. - EUvsDisinfo Weak platform moderation combined with anonymous channels allows foreign actors to weaponize Telegram for election disruption and societal division across EU member states. - EUvsDisinfo [Weekly Review] Foreign and Domestic Actors Weaponize Disinformation in Philippine Midterms Telegram’s Strategic Underreporting Challenges EU Digital Governance and Information Integrity Russia and China Target Czechia’s Elections with Advanced Disinformation Campaigns Political Violence Amplifies U.S. Vulnerability to Foreign Disinformation Operations Moldova’s Vulnerable Information Landscape Faces Escalating Foreign Influence Operations Rapid Disinformation Spread: False AI Claims Distort Trump’s Post-Assassination Address AI Chatbots Amplify Misinformation During Breaking News Crises NewsGuard Index: Persistent American Vulnerability to Digital Misinformation Continues Leaked Files Expose China’s Global Export of Disinformation and Surveillance Capabilities Kremlin’s UNGA Performance: A Masterclass in Deceptive Peacemaking and Reality Distortion Disinformation Blunts Accountability: Russia’s Pattern of Denying War Crimes Medvedev, Dmitriev Exploit Kirk’s Killing to Blame Left, Undermine Kyiv Support Foreign and Domestic Actors Weaponize Disinformation in Philippine Midterms Doublethink Lab has released its analysis exploring how foreign information manipulation and disinformation are deeply embedded in the Philippines’ electoral landscape, significantly impacting its political environment and democratic integrity. Chinese-linked influence operations, deploying PR firms and “keyboard warrior” networks, actively amplified pro-Duterte narratives and undermined trust in democratic processes during the 2025 midterm elections. These sophisticated campaigns utilize AI-generated images, viral conspiracy theories, and coordinated social media activity across platforms like X, Facebook, and TikTok.
Operations blend domestic political messaging, such as the #BringPRRDHome hashtag, with pro-PRC content related to the West Philippine Sea, strategically discrediting certain candidates while supporting others. The pervasive nature of FIMI extends to foreign interference cases, exemplified by the disqualified mayor Alice Guo, a Chinese national allegedly linked to criminal activities and strategic locations. Election monitors emphasize the severe threat these manipulations pose to democracy and foreign policy, advocating for stronger safeguards and collaborative counter-FIMI strategies. Initiatives like the Shadow FIN network and the “Bayanihan” volunteer model demonstrate a convergent, multi-stakeholder approach to build resilience against hostile information operations and secure the digital ecosystem, ensuring informed democratic participation amidst evolving threats. Source: Doublethink Lab, September 2025, available online at: https://medium.com/doublethinklab/bayanihan-for-eleksyon2025-philippine-midterms-monitoring-263ce456cb97 Top of Page Russia and China Target Czechia’s Elections with Advanced Disinformation Campaigns A report by FIMI-ISAC reveals that Czechia’s 2025 parliamentary elections face significant Foreign Information Manipulation and Interference (FIMI) risks, primarily from Russia and China. These actors exploit polarizing issues such as the war in Ukraine, energy security, migration, and EU relations, aiming to deepen social divides and erode trust in democratic institutions. Problematic outlets, including neČT24, Telegram-based ecosystems, and foreign-sponsored platforms like Voice of Europe, amplify divisive narratives, disseminating anti-refugee rhetoric and claims of electoral fraud, some of which feature synthetic audio and AI-generated content targeting President Petr Pavel. While Czechia possesses resilient electoral infrastructure, its information space remains vulnerable due to delayed Digital Services Act implementation, limited state capacity to analyze malign influence, polarized political discourse, and domestic actors amplifying foreign narratives. Countering these evolving threats requires a comprehensive, whole-of-society response, emphasizing closer cooperation across state institutions, civil society, independent media, and EU-level mechanisms, alongside continuous monitoring, proactive risk communication, and investment in institutional capabilities. This situation underscores the persistent challenge of safeguarding democratic integrity against sophisticated digital hostile influence operations. Source: FIMI-ISAC, 2025. Czechia: Country Election Risk Assessment (CERA). Available online at: https://fimi-isac.org/wp-content/uploads/2025/09/FRT-24_Czechia-Country-Election-Risk-Assessment-CERA_FINAL.pdf Top of Page Political Violence Amplifies U.S. Vulnerability to Foreign Disinformation Operations The killing of conservative commentator Charlie Kirk in Utah exposes deepening U.S. political violence and social discord, creating opportunities for foreign adversaries like Russia and China to exploit societal rifts. Newsweek reports that these nations are accused of leveraging such divisions through disinformation campaigns to inflame tensions and undermine American governance. The article explains how this incident intensifies focus on political violence and misinformation, providing foreign actors with a fresh flashpoint to manipulate public perception and destabilize the political landscape.
Utah Governor Spencer Cox warned that bots from Russia and China actively encourage violence and misinformation, highlighting the intersection of domestic unrest, social media amplification, and foreign exploitation. China has previously exploited U.S. social crises, including the January 6 Capitol riot, by amplifying divisive narratives and using networks like “Spamouflage” to impersonate voters and spread discord, increasingly with AI-generated content. Similarly, Russia’s 2016 election interference campaigns employed fake personas and troll farms to exacerbate racial, ideological, and cultural divisions, tactics that have persisted. Modern technology, including AI, enables rapid spread of these false narratives across platforms like X, Facebook, and Telegram, targeting polarized audiences. Kirk’s murder underscores how moments of unrest are utilized to weaken U.S. cohesion and credibility globally. Source: Newsweek, Amir Daftari, Sep 2025, available online at: https://www.newsweek.com/charlie-kirk-china-russia-oppourtunity-us-division-2128734 Top of Page Moldova’s Vulnerable Information Landscape Faces Escalating Foreign Influence Operations FIMI-ISAC’s Country Election Risk Assessment identifies significant threats to Moldova’s September 2025 parliamentary elections, primarily from Russian-led hybrid operations involving extensive disinformation and foreign information manipulation. These efforts aim to derail Moldova’s pro-EU path by spreading anti-EU, anti-Western, and anti-government narratives, often through the PENA model, labeling President Maia Sandu as a Western puppet, and normalizing electoral fraud. The cyfluence landscape is increasingly complex, utilizing AI-generated deepfakes, forged documents, and sophisticated cross-platform campaigns for amplification. Moldova’s media environment, marked by weak regulations and the resurgence of banned pro-Russian outlets via mirror sites and social media, exacerbates these vulnerabilities. Covert financing, bots, and encrypted messaging applications further facilitate protest organization and propaganda dissemination, complicating attribution. The report assesses the overall risk to electoral integrity as medium to high, predicting intensified activity as election day approaches, and urges proactive measures including inter-agency coordination, digital platform partnerships, civil society monitoring, and robust public communication to safeguard democratic processes. Source: FIMI-ISAC, 2025. Country Report Moldova: Risk Assessment. Available online at: https://fimi-isac.org/wp-content/uploads/2025/09/Country-Report-Moldova-Risk-Assessment.pdf Top of Page Rapid Disinformation Spread: False AI Claims Distort Trump’s Post-Assassination Address NewsGuard definitively debunks widespread social media claims alleging President Donald Trump’s Oval Office address, delivered after conservative commentator Charlie Kirk’s assassination, was an AI-generated deepfake. Following Kirk’s Sept. 10 killing, Trump posted a speech on Truth Social, prompting anti-Trump users to highlight unnatural hand movements and background leaf shifts as signs of artificial intelligence. These false assertions rapidly propagated, with some suggesting the purported AI indicated Trump’s poor health or even implicated his administration in Kirk’s death, potentially as a diversion from the Jeffrey Epstein case.
However, analysis by AI detection software like Hive and cybersecurity experts from GetReal Labs found no evidence of AI generation in either the video or audio. The observed irregularities are attributed to a common video editing technique known as a “morph cut,” designed to seamlessly join segments and remove verbal errors, which misinformed users misinterpreted as AI glitches. This incident critically illustrates how visually ambiguous digital content can be weaponized as disinformation, quickly disseminated to spread politically charged conspiracy theories and erode public trust, directly impacting the Cyfluence landscape. Source: NewsGuard, Sep 11, 2025, Available Online at: https://www.newsguardrealitycheck.com/p/trumps-address-on-charlie-kirk-is Top of Page AI Chatbots Amplify Misinformation During Breaking News Crises AI-generated ‘fact-checks’ are actively spreading falsehoods and fueling confusion during breaking news events, exemplified by the Charlie Kirk assassination. NewsGuard reports that as social media users sought clarity on the Sept. 10 incident, AI chatbots like Perplexity and Grok issued contradictory or outright inaccurate information, including denying Kirk’s death, fabricating a suspect, and falsely linking him to the Myrotvorets blacklist. This amplification of confusion occurs amidst major tech companies scaling back human fact-checkers, leaving a vacuum that AI, incapable of real-time human-like verification, fills with confident but erroneous responses. Furthermore, the accessibility of generative AI facilitates the ‘liar’s dividend,’ enabling users to baselessly label authentic footage as fabricated, thus casting doubt on legitimate content. Despite repeated examples of these tools confidently repeating falsehoods, many users continue to treat AI systems as reliable sources during crises, posing a significant challenge to information integrity and exacerbating the hostile influence landscape. Source: NewsGuard Reality Check, McKenzie Sadeghi, Sep 11, 2025, Online at: https://www.newsguardrealitycheck.com/p/after-kirk-assassination-ai-fact Top of Page NewsGuard Index: Persistent American Vulnerability to Digital Misinformation Continues The latest Reality Gap Index from NewsGuard reveals that nearly two-thirds of Americans (64 percent) believed at least one of August 2025’s top false online claims, mirroring July’s high rate. This ongoing measurement highlights a significant vulnerability to digital hostile influence, with AI-generated media playing a pivotal role. A large share of Americans, 73 percent, believed or were uncertain about AI-fabricated images and videos falsely depicting Donald Trump and Jeffrey Epstein with underage girls. This underscores the potent and deceptive nature of synthetic media in shaping public perception. Other significant falsehoods included an inaccurate claim that President Trump had declared martial law in Washington D.C., and a narrative concerning $100 million allegedly missing from a California wildfire charity, which left many respondents uncertain. The index, based on a YouGov survey of 1,000 Americans, underscores the persistent challenge of online misinformation and its deep penetration into mainstream public belief, indicating a critical landscape for cyfluence operations where fabricated content can readily sow discord and confusion.
Source: NewsGuard, Samantha Tanner, Sep 09, 2025, Available online at: https://www.newsguardrealitycheck.com/p/nearly-two-thirds-of-americans-believe Top of Page Leaked Files Expose China’s Global Export of Disinformation and Surveillance Capabilities Leaked documents from Chinese firms Geedge Networks and GoLaxy expose a significant commercialization of censorship, surveillance, and propaganda technologies, challenging the traditional view of China’s digital control. A recent article from Wired explains how Geedge offers a ‘Great Firewall as a service’ to nations like Kazakhstan, Pakistan, Ethiopia, and Myanmar, enabling governments to monitor, intercept, and manipulate internet traffic. Concurrently, GoLaxy leverages AI for extensive social media data collection, political mapping, and pushing targeted narratives through fabricated accounts. Its clients include the Chinese Communist Party, government, and military, with internal documents boasting capabilities akin to Cambridge Analytica in shaping discourse around sensitive topics such as Taiwan, Hong Kong, and U.S. elections. Researchers highlight that these revelations demonstrate market forces actively shaping digital authoritarianism, with companies competing for contracts and setting sales targets. This commercialization extends beyond mere censorship into active disinformation, as targeted propaganda, synthetic profiles, and narrative manipulation are openly marketed to government clients. The findings underscore a concerning global proliferation of sophisticated hostile influence capabilities, driven by profit motives within China’s tech sector. Source: Wired, Z. Yang & L. Matsakis, Sep 11, 2025, Available online at: https://www.wired.com/story/made-in-china-how-chinas-surveillance-industry-actually-works/ Top of Page Kremlin’s UNGA Performance: A Masterclass in Deceptive Peacemaking and Reality Distortion EUvsDisinfo reveals the profound chasm between Moscow’s pronouncements at the United Nations General Assembly and its hostile actions, underscoring a sophisticated, global disinformation campaign. While Russia champions peace, development, and human rights at UNGA, its actions consistently subvert these ideals, deploying narratives of projection, denial, and distraction. The Kremlin’s “peace” proposals are, in reality, demands for Ukrainian surrender, masked by false accusations against Kyiv and the West for prolonging conflict. Concurrently, Moscow propagates the falsehood that Western sanctions harm Europe more, despite mounting evidence of Russia’s stagnating civilian economy and severe budget strain due to war production. Russia further attempts to position itself as a protector of the “Global South” against Western “bullying,” even as its documented interventions from Syria to the Sahel reveal a pattern of destabilization and state capture. Disinformation tactics extend to fabricating claims of Ukrainian chemical weapons use, while credible reports confirm Russia’s own deployment of such agents. Most disturbingly, Russia denies the forced deportation of over 20,000 Ukrainian children, a confirmed war crime linked to its demographic crisis, portraying these abductions as mere evacuations. This systematic deceit makes a mockery of international principles, forming a core component of Russia’s hostile influence operations.
Source: EUvsDisinfo, September 12, 2025, available online: https://euvsdisinfo.eu/please-mind-the-gap-moscows-words-at-unga-vs-deeds-on-the-ground/ Top of Page Disinformation Blunts Accountability: Russia’s Pattern of Denying War Crimes Research by DisinfoWatch shows that the Kremlin falsely accused Ukraine of fabricating mass casualty figures following a Russian glide-bomb strike in Donetsk. Within hours of the September 9 incident, RT published denials from an unnamed Russian Defense Ministry source, claiming the story originated with President Zelensky and was amplified by ‘Ukrainian propaganda.’ This narrative asserted ‘no strikes in the area’ and that the crater did not match an aerial bomb impact, fitting an agenda to discredit Kyiv’s care for Donbas residents. However, the attack is independently and extensively documented. Reuters, AP News, and the Los Angeles Times published on-scene reporting and imagery confirming the casualties. The UN Humanitarian Coordinator for Ukraine issued an official condemnation, directly contradicting the claim that the story began with a single politician. Ukrainian officials beyond Zelensky also reported the strike, while Kyiv’s provision of frontline pensions is well-documented, countering RT’s insinuations. This incident exemplifies a textbook Kremlin denial strategy, which attacks messengers and injects pseudo-forensics to muddy clear evidence. Such a tactic aims to blunt outrage and accountability for suspected war-crime incidents against civilians, forming a critical component of Russia’s cyfluence operations, even as Moscow claims not to target civilians. Source: DisinfoWatch, Sep 9, 2025, Available online: https://disinfowatch.org/disinfo/kremlin-falsely-accuses-ukraine-of-fabricating-mass-strike-casualties/ Top of Page Medvedev, Dmitriev Exploit Kirk’s Killing to Blame Left, Undermine Kyiv Support Newsweek reports that Russian officials, including former president Dmitry Medvedev and Kremlin negotiator Kirill Dmitriev, have exploited the assassination of conservative activist Charlie Kirk to exacerbate U.S. political divisions and advance anti-Ukraine narratives. These actions represent a clear cyber hostile influence operation, leveraging a domestic tragedy to sow discord and undermine Western support for Kyiv. Kirk, known for his anti-NATO stance and criticism of Ukrainian President Zelensky—whom he once called a ‘puppet of the CIA’—provided fertile ground for this disinformation. Medvedev specifically blamed ‘left-wing liberal scum who support Banderite Kyiv’ for the murder, falsely associating Ukraine with Nazi sympathies, while Dmitriev amplified content celebrating Kirk’s death and echoing sentiments like ‘The Left is the party of murder.’ This exploitation aims to falsely link Ukraine supporters with violence and pressure right-wing Americans to withdraw their backing for Kyiv, aligning with broader Kremlin propaganda. Mark Shanahan, a U.S. politics expert, noted this incident highlights how America’s already hyper-polarized political landscape offers ripe opportunities for foreign adversaries to amplify internal conflicts and destabilize discourse.
Source: Newsweek, Sep 11, 2025, Available online at: https://www.newsweek.com/kirk-killing-medvedev-maga-2128048 Top of Page Telegram’s Strategic Underreporting Challenges EU Digital Governance and Information Integrity An informative EUvsDisinfo article highlights Telegram’s emergence as a primary conduit for hostile digital influence campaigns across Europe, directly challenging the EU’s information space and digital governance frameworks. Since its 2013 founding, Telegram has rapidly expanded, boasting 1 billion users globally by 2025, driven by its multi-purpose functionality, perceived security, and minimal content moderation. These factors, coupled with co-founder Pavel Durov’s “freedom of speech” branding, make it attractive to malicious actors. Critically, Telegram appears to strategically underreport its EU user base to evade designation as a Very Large Online Platform (VLOP) under the EU’s Digital Services Act, sidestepping stringent content moderation and accountability measures. Case studies from Spain, Germany, France, and Poland illustrate Telegram’s use for pivoting from anti-vaccination narratives to pro-Kremlin disinformation, disrupting elections, and amplifying content banned elsewhere. Ukraine’s experience serves as a stark warning: Telegram’s unregulated expansion has normalized anonymous channels as primary news sources, enabling Russian actors to conduct pervasive influence operations and foster societal division. The EU must heed these lessons, implementing robust regulation and transparency to safeguard democratic values from Telegram’s corrosive influence. Source: EUvsDisinfo, P. Burdiak, O. Monastyrskyi & O. Tretyakov-Grodzevych, September 08, 2025, Available Online: https://euvsdisinfo.eu/eus-telegram-dilemma-the-rise-of-unchecked-influence/ Top of Page [Takeaways] This week's reporting underscores the persistent and evolving nature of Russian and Chinese information operations, which continue to target democratic vulnerabilities globally. The established strategy remains twofold: sustained campaigns to degrade electoral integrity in nations like the Philippines, Moldova, and Czechia, alongside the opportunistic weaponization of domestic crises in the U.S. to deepen polarization. Generative AI remains a key force multiplier, its utility extending beyond creating synthetic content to actively degrading the information commons through the “liar’s dividend” and the misinforming output of AI chatbots. This hostile activity is enabled by under-regulated platforms and amplified by a now-established strategic trend: the commercialization of digital authoritarianism. The continued export of influence-as-a-service by Chinese firms ensures that the capabilities to erode democratic cohesion are becoming more accessible, solidifying a long-term, systemic challenge to open societies. [CRC Glossary] The Cyfluence Research Centre has relaunched the CRC Glossary. This initiative aims to serve as a shared lexicon of both foundational and emerging terms that shape the field. To this end, the Glossary is designed to be a continually updated resource, with new entries added weekly. We see this as a collaborative project and strongly encourage input from the expert community. The goal is to reduce the problem of ambiguous or conflicting terminology that can hinder collaborative work and effective communication with the general public. We invite you to submit additions, changes, or corrections via the form on our website. [Download]
- Stark Industries Solutions: A Threat Activity Enabler (TAE) in Focus
This blog builds on the new Insikt Group report [i] on Stark Industries Solutions to examine how hosting providers can serve as Threat Activity Enablers (TAEs) [ii] in hostile cyber and influence operations. The case of Stark Industries illustrates how infrastructure providers, often presenting themselves as legitimate businesses, become indispensable to the delivery of disinformation, cyberattacks, and other hybrid threats. Stark Industries Solutions Ltd., incorporated in the United Kingdom in February 2022, was founded by Iurie and Ivan Neculiti. Both have a long history in the hosting sector, with Ivan previously linked to Morenehost Ltd., an offshore service exposed in the Pandora Papers database (see ICIJ Offshore Leaks database [iii]). Stark operated as a “white label” [iv] brand for PQ.Hosting [v], offering Virtual Private Servers (VPS), proxy, and Virtual Private Network (VPN) services while concealing the true operators. [vi] Over time, its networks were repeatedly observed in connection with Distributed Denial-of-Service (DDoS) attacks, financially motivated actors such as FIN7 [vii], and, importantly, infrastructure supporting pro-Russian information manipulation operations, including the Doppelgänger or “Recent Reliable News” (RRN) network [viii] (find more information about Doppelgänger in the CRC article and blog section). In these contexts, Stark’s role was not to generate content but to provide the resilient infrastructure that made such campaigns scalable and durable. On 20 May 2025, the Council of the European Union sanctioned Stark Industries Solutions Ltd., together with its CEO and owner, for enabling Russian state-sponsored cyber operations, explicitly citing their role in information manipulation, interference, and cyberattacks. [ix] The move followed media exposure: on 8 May 2025, the Moldovan service of Radio Free Europe/Radio Liberty reported on leaked sanction lists that named the Neculiti brothers, [x] and the central newsroom of RFE/RL confirmed the forthcoming designations on 9 May [xi]. Timeline of events observed by Insikt Group, Courtesy of Recorded Future [xii] The Insikt report concludes that Stark anticipated the sanctions and deliberately restructured its operations. In April 2025, Russian infrastructure was already being migrated to UFO Hosting LLC [xiii], a Moscow-based Internet Service Provider (ISP) registered under ASN AS33993 [xiv]. Domains such as [bill-migration-db.stark-industries.solutions] and [russia.stark-industries.solutions] resolved through UFO-announced IP space before the EU’s action. [xv] When the sanctions came on 20 May, Stark was formally listed in the EU’s Official Journal. [xvi] Nine days later, on 29 May, PQ.Hosting announced a full rebrand as THE.Hosting, presenting Dutch entity WorkTitans B.V. as the new corporate vehicle. By 24 June, a new ASN, AS209847, had been created to consolidate the rebrand. [xvii] Company details of WorkTitans B.V., Courtesy of Recorded Future [xviii] The RIPE database [xix] showed that maintainer [xx] records across PQ Hosting Plus, UFO Hosting, and THE.Hosting all shared the same identifiers tied to Russian operator Dmitrii Miasnikov. [xxi] This demonstrated operational continuity behind the cosmetic changes. For analysts, this case illustrates the importance of domain and network analysis in understanding influence operations. Narratives and content can shift rapidly, but infrastructure leaves durable traces.
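For readers who want to see what such durable traces look like in practice, the sketch below shows one minimal way to check for maintainer overlaps. It is an illustration rather than the Insikt Group's tooling: the endpoint and JSON layout reflect the public RIPE Database REST interface as commonly documented, both of which should be verified against current RIPE NCC documentation, and the two ASNs from the text are used purely as example query keys.

```python
# Minimal sketch: surfacing maintainer (mnt-by) overlaps across RIPE
# aut-num objects. Endpoint and JSON layout follow the public RIPE
# Database REST interface; verify against current documentation.
from collections import Counter
import requests

RIPE_DB = "https://rest.db.ripe.net/ripe"

def mnt_by(asn: str) -> set[str]:
    """Return the mnt-by attribute values of an aut-num object."""
    resp = requests.get(
        f"{RIPE_DB}/aut-num/{asn}",
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    attrs = resp.json()["objects"]["object"][0]["attributes"]["attribute"]
    return {a["value"] for a in attrs if a["name"] == "mnt-by"}

# ASNs taken from the discussion above, used here as example keys.
asns = ["AS33993", "AS209847"]
maintainers = {asn: mnt_by(asn) for asn in asns}
for asn, mnts in maintainers.items():
    print(asn, "->", sorted(mnts))

# A maintainer appearing under more than one ASN is a durable trace
# suggesting common operational control behind different brands.
counts = Counter(m for mnts in maintainers.values() for m in mnts)
print("shared maintainers:", [m for m, n in counts.items() if n > 1])
```

The same logic extends to routing data: services such as RIPEstat expose historical announcement records, which is how prefix moves between ASNs, like those behind the PQ.Hosting-to-THE.Hosting rebrand, leave observable traces.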
Tracking RIPE records [xxii], ASN histories (to observe continuity despite rebrands), prefix transfers [xxiii], and maintainer overlaps [xxiv] in this way makes it possible to follow disinformation infrastructure even when brands and jurisdictions change. The Insikt report provides concrete examples: leaked sanction lists triggered asset transfers observable in RIPE, domains resolved through UFO Hosting while shielded by DDoS-protection services, and operator fingerprints remained visible across multiple shells. The full Insikt Group report is recommended reading for practitioners. It offers a detailed account of how a sanctioned TAE adapted with minimal disruption. The Stark case is a reminder that sanctioning entities involved in hostile information operations is necessary but not sufficient; without infrastructure-focused monitoring and multilateral coordination, such actors will continue to sustain malign campaigns under new names. [Footnotes] [i] Recorded Future, Insikt Group, 2025. One step ahead: Stark Industries Solutions preempts EU sanctions. [online] Published 27 August 2025. Available at: https://assets.recordedfuture.com/insikt-report-pdfs/2025/cta-2025-0827.pdf [ii] A Threat Activity Enabler (TAE) is a company or service provider whose infrastructure, such as hosting, VPNs, or proxy networks, is repeatedly used to support malicious cyber or influence operations. TAEs may not conduct attacks or disinformation themselves but provide the technical backbone that allows hostile actors to operate at scale. Because they sit in a gray zone between legitimate business and illicit use, TAEs are difficult to disrupt and often adapt quickly to sanctions or law enforcement actions, source: Recorded Future, Insikt Group, 2025. One step ahead: Stark Industries Solutions preempts EU sanctions. [online] Published 27 August 2025. Available at: https://assets.recordedfuture.com/insikt-report-pdfs/2025/cta-2025-0827.pdf [iii] International Consortium of Investigative Journalists, n.d. Offshore Leaks database: Morenehost Ltd (Node 240120865). [online] Available at: https://offshoreleaks.icij.org/nodes/240120865 ; Recorded Future, Insikt Group, 2025. One step ahead: Stark Industries Solutions preempts EU sanctions. [online] p.3, Published 27 August 2025. Available at: https://assets.recordedfuture.com/insikt-report-pdfs/2025/cta-2025-0827.pdf [iv] The term “white label” refers to a reseller brand without its own infrastructure. [v] “PQ Hosting is a Moldova-based hosting provider founded in 2019 by Ivan Neculiti. The company offers VPS/VDS, dedicated servers, VPN, and DNS services in over 35 countries, serving more than 100,000 clients”, Source: PQ Hosting, n.d. PQ Hosting: services, global reach, and infrastructure. [online] Available at: https://pq.hosting (checked 12 September 2025). [vi] KrebsOnSecurity, 2024. Stark Industries Solutions: An Iron Hammer in the Cloud. [online] Published 23 May 2024. Available at: https://krebsonsecurity.com/2024/05/stark-industries-solutions-an-iron-hammer-in-the-cloud/ [vii] FIN7 (also known as the “Carbanak Group”) is a Russian-speaking cybercrime organization active since at least 2015, targeting U.S. and international retail and restaurant chains. The group is best known for deploying malware on point-of-sale systems to steal millions of payment card records. According to the FBI, FIN7’s campaigns caused billions of dollars in losses to businesses and consumers, source: FBI, 2018. How cyber crime group FIN7 attacked and stole data from hundreds of U.S. companies.
[online] Published 1 August 2018. Available at: https://www.fbi.gov/contact-us/field-offices/seattle/news/stories/how-cyber-crime-group-fin7-attacked-and-stole-data-from-hundreds-of-us-companies

[viii] KrebsOnSecurity, 2024. The Stark Truth Behind the Resurgence of Russia’s FIN7. [online] Published 10 July 2024. Available at: https://krebsonsecurity.com/2024/07/the-stark-truth-behind-the-resurgence-of-russias-fin7/

[ix] European Union, 2025. Council Decision (CFSP) 2025/966 of 20 May 2025 amending Decision (CFSP) 2024/2643 concerning restrictive measures in view of Russia’s destabilising activities. ST/5953/2025/INIT. [online] Published 20 May 2025. Available at: https://eur-lex.europa.eu/eli/dec/2025/966/oj/en

[x] Europa Liberă Moldova, 2025. UE pregătește sancțiuni contra a doi frați de la Bender, acuzați că luptă în războiul hibrid al Rusiei împotriva Europei [The EU prepares sanctions against two brothers from Bender, accused of fighting in Russia’s hybrid war against Europe]. [online] Published 8 May 2025. Available at: https://moldova.europalibera.org/a/ue-pregateste-sanctiuni-contra-a-doi-frati-de-la-bender-acuzati-ca-lupta-in-razboiul-hibrid-al-rusiei-impotriva-europei/33407343.html

[xi] RFE/RL, Rikard Jozwiak, 2025. The EU’s latest sanctions package against Russia might be its weakest yet. [online] Published 9 May 2025. Available at: https://www.rferl.org/a/eu-russia-sanctions-package-ukraine-hungary-/33409397.html

[xii] Recorded Future, Insikt Group, 2025. One step ahead: Stark Industries Solutions preempts EU sanctions. [online] Published 27 August 2025. Available at: https://assets.recordedfuture.com/insikt-report-pdfs/2025/cta-2025-0827.pdf

[xiii] IPinfo, n.d. UFO Hosting LLC (AS33993) details. [online] Available at: https://ipinfo.io/AS33993

[xiv] ASN stands for Autonomous System Number, a unique identifier for a network that participates independently in global internet routing; following ASN histories allows researchers to see when companies rebrand but continue using the same underlying infrastructure.

[xv] Recorded Future, Insikt Group, 2025. One step ahead: Stark Industries Solutions preempts EU sanctions. [online] pp. 10-11, Published 27 August 2025. Available at: https://assets.recordedfuture.com/insikt-report-pdfs/2025/cta-2025-0827.pdf

[xvi] European Union, 2025. Council Decision (CFSP) 2025/966 of 20 May 2025 amending Decision (CFSP) 2024/2643 concerning restrictive measures in view of Russia’s destabilising activities. ST/5953/2025/INIT. [online] Published 20 May 2025. Available at: https://eur-lex.europa.eu/eli/dec/2025/966/oj/en

[xvii] RIPE NCC, n.d. RIPE database record for AS209847. [online] Available at: https://apps.db.ripe.net/db-web-ui/query?bflag=false&dflag=false&rflag=true&searchtext=AS209847&source=RIPE

[xviii] Recorded Future, Insikt Group, 2025. One step ahead: Stark Industries Solutions preempts EU sanctions. [online] Published 27 August 2025. Available at: https://assets.recordedfuture.com/insikt-report-pdfs/2025/cta-2025-0827.pdf

[xix] RIPE stands for Réseaux IP Européens; the RIPE NCC is the regional internet registry for Europe and records who controls IP addresses and networks.

[xx] A maintainer in the RIPE database is the technical contact responsible for managing IP resources; if multiple companies use the same maintainer entries, it strongly suggests they are controlled by the same actors.

[xxi] Recorded Future, Insikt Group, 2025. One step ahead: Stark Industries Solutions preempts EU sanctions. [online] p. 17, Published 27 August 2025.
Available at: https://assets.recordedfuture.com/insikt-report-pdfs/2025/cta-2025-0827.pdf

[xxii] RIPE records are public database entries showing who controls IP address blocks and networks.

[xxiii] A prefix is a block of IP addresses; a transfer is the movement of that block from one provider to another. Such transfers often indicate attempts to mask operational continuity.

[xxiv] Maintainer overlaps are shared maintainer entries (technical contacts) across nominally separate companies; they reveal common operators behind distinct brands.