
Weekly Report: Cyber-based influence campaigns, 28th April – 4th May 2025

  • Writer: CRC
  • May 9
  • 16 min read






[Report Highlights]





> TABLE OF CONTENTS <


HOSTILE INFLUENCE CAMPAIGNS - STATE ACTORS
  • [Russia]

  • [China]

AI-RELATED ARTICLES
GENERAL REPORTS
FRAMEWORKS TO COUNTER DISINFORMATION
STATE ACTORS

[Russia]

Russia Expanding Disinformation Tactics in Africa

A recent EU vs. Disinfo article analyzes Russia’s foreign information manipulation and interference (FIMI) approach in Africa, described as a hybrid disinformation strategy that combines overt state media, covert networks, and local actors. Global outlets like TASS and RIA Novosti provide baseline narratives, while regional branches such as RT Africa and Sputnik Afrique adapt content specifically for African audiences. Russian embassies amplify official messaging, especially in South Africa and Kenya.


Covert actors like the African Initiative, linked to Russian intelligence services, and the "Portal Kombat" (Pravda ecosystem) network are key in disseminating pro-Kremlin narratives. These entities operate through seemingly local websites and use automated republication to saturate regional information spaces with synchronized messaging.
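
To make this pattern concrete, the following minimal sketch (ours, not from the EU vs. Disinfo article) shows one way analysts surface automated republication: clustering near-identical articles that appear on several domains within minutes of each other. All domains, timestamps, and thresholds are hypothetical.

```python
# Illustrative sketch: flag clusters of near-identical articles published on
# many domains within a short window, a weak but common signal of
# synchronized, automated republication. All data below is hypothetical.
from collections import defaultdict
from datetime import datetime
import re

posts = [
    # (domain, published_at, article text) -- invented examples
    ("news-example-one.africa",  "2025-04-29T10:02:00", "Western sanctions hurt ordinary Africans, experts say."),
    ("daily-example-two.info",   "2025-04-29T10:03:10", "Western sanctions hurt ordinary Africans, experts say!"),
    ("portal-example-three.net", "2025-04-29T10:04:45", "Western sanctions hurt ordinary africans, experts say."),
    ("independent-site.org",     "2025-04-30T18:20:00", "Local elections announced for the autumn."),
]

def normalize(text: str) -> str:
    """Lowercase and strip punctuation so trivial edits still match."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()

clusters = defaultdict(list)
for domain, ts, text in posts:
    clusters[normalize(text)].append((domain, datetime.fromisoformat(ts)))

for text, items in clusters.items():
    domains = {d for d, _ in items}
    times = [t for _, t in items]
    spread_min = (max(times) - min(times)).total_seconds() / 60
    # Heuristic threshold (our assumption): the same story on 3+ domains
    # within 15 minutes is worth a closer look.
    if len(domains) >= 3 and spread_min <= 15:
        print(f"Possible coordinated cluster ({len(domains)} domains, "
              f"{spread_min:.0f} min spread): {text[:50]}...")
```

In practice, such clustering is only a starting point; shared infrastructure, registration data, and content provenance are needed before attributing a network.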


Russia uses a two-way information-laundering strategy in Africa: narratives are first localized through the African Initiative and its amplifiers, then recycled into Russian state media to create an illusion of independent validation. In addition, offline tools such as events, media training, and partnerships with local groups (e.g., the African Initiative Association in Burkina Faso) reinforce these efforts on the ground.


The narratives focus on portraying the West, particularly France and the U.S., as exploitative and destabilizing, while framing Russia as a trustworthy partner and defender of African values. Familiar anti-Western narratives are tailored to local contexts, casting Russia as a natural ally. This strategy is not short-term propaganda but a persistent effort to reshape Africa’s information ecosystem in Russia’s favor.


Sources: 

Kremlin Uses Ovechkin’s Record for Influence Campaigns


As published by the Jamestown Foundation, Russian ice hockey star Alexander Ovechkin, forward for the Washington Capitals and the NHL’s all-time top scorer, is strategically used by the Kremlin as part of its digital influence efforts. While his athletic success is celebrated globally, the Russian state capitalizes on his popularity through online platforms such as the Kremlin’s official website, Telegram, and state-run media to advance nationalist messaging. Ovechkin’s public support for President Vladimir Putin, including his creation of the “Team Putin” campaign, and his refusal to denounce Russia’s invasion of Ukraine make him useful in shaping pro-regime narratives. His curated image, shared through Instagram and YouTube, and his appearances on Kremlin-affiliated TV depict him as a symbol of patriotic strength and resilience. These digital channels allow the regime to export influence beyond its borders, especially to Russian-speaking communities abroad. The Kremlin’s framing of Ovechkin’s record as a “triumph of sports soft power”—a term cited from domestic sources—illustrates how individual athletic success is woven into broader digital influence efforts designed to reinforce loyalty, distract from international isolation, and promote a unified national identity.


Source:  

Russian Disinformation Campaign Targets Moldova’s Pro-EU President


NewsGuard reports that Maia Sandu, Moldova's pro-European President, has once again become the target of Russian disinformation efforts—this time in the context of the upcoming parliamentary elections in September 2025. During her re-election in 2024, Moldovan authorities reported a Russian hostile influence campaign (HIC) in support of a pro-Kremlin candidate.


The current campaign is driven by the Russian operation “Matryoshka,” known for producing seemingly authentic video forgeries and imitating credible media to spread false content. Since the election announcement, pro-Russian channels have circulated fake materials accusing Sandu of corruption and personal misconduct. One prominent example is a fabricated Vogue feature claiming Sandu is the most expensively dressed president in the world. Authentic images were used but falsely attributed to luxury brands with invented prices. According to NewsGuard, the clothing shown does not belong to the luxury segment but comes from a mid-range brand. Other forgeries include a manipulated Economist cover and a supposed BBC video accusing Sandu of embezzling public funds to support an alleged mistress. So far, seven such fabricated posts have been identified on pro-Kremlin Telegram and X channels.


Unlike previous campaigns focused on election manipulation, the current wave targets personal defamation, aiming to undermine Sandu’s integrity, delegitimize her pro-Western stance, and deepen political polarization in Moldova.


Source:  

Historical Revisionism at the Core of Kremlin Disinformation


EU vs. Disinfo reports that the Kremlin is expected to intensify disinformation campaigns targeting European countries in the lead-up to Russia’s Victory Day parade on May 9. These efforts increasingly rely on labeling critics as “Nazis” or “neo-Nazis.” The overuse of the term “Nazi” has become a rhetorical tool to discredit opponents and rewrite history. Isolated incidents are exaggerated and framed as alleged state-level glorification of Nazi criminals. At the same time, Ukrainian commemorative initiatives related to World War II are portrayed as disrespectful or revisionist. This tactic serves as a political pretext to justify Russia’s war of aggression against Ukraine and to delegitimize Western support for Kyiv.


Russian state media have long distorted the history of World War II. They promote the narrative that Russia single-handedly defeated Nazism, while portraying Western democracies and former Soviet republics—including Ukraine—as Nazi collaborators. Historical facts, such as the participation of six million Ukrainians in the fight against Hitler, are deliberately omitted.


These narratives are repeatedly circulated through influence campaigns. In TV broadcasts, AI-generated images, and social media posts, EU and NATO leaders are depicted in Nazi uniforms or as grotesque caricatures. Kremlin-aligned figures like Vladimir Solovyov, Margarita Simonyan, and former President Dmitry Medvedev use radical rhetoric and extremist language to amplify these messages. Outlets like RT and Sputnik spread this content globally and in multiple languages.


Source:  

[China]

Suspected China-Linked Influence Campaign Targets Exile Activists in UK

 

An investigation by The Guardian, in collaboration with the anti-racism organization Hope Not Hate, has uncovered evidence of a suspected disinformation campaign targeting pro-democracy Hong Kong activists living in the United Kingdom. Following violent unrest in the UK in 2024, far-right channels on social media began inciting violence against asylum seekers; the focus soon shifted to prominent Hong Kong activists in exile. More than 150 posts from 29 accounts on platforms such as X and Telegram called for acts of violence against these individuals. The posts used derogatory language and spread false claims, accusing the activists of unlawfully supporting refugees and of anti-national activity.


These events may be part of an online influence operation linked to Chinese state actors. Many posts were written in broken English and appeared during Chinese working hours. Some contained Chinese characters, typography, or references to figures associated with the Chinese government. In many cases, the activists were also doxxed — their home addresses and schedules were published. While no direct link to the well-known Spamouflage Dragon network could be confirmed, the patterns and methods closely resemble established state-backed disinformation efforts from China.
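
As a rough illustration of the “Chinese working hours” signal mentioned above, the sketch below converts an account’s posting timestamps to China Standard Time and measures how many fall inside a typical workday. This is a minimal sketch under our own assumptions, not the investigators’ actual tooling, and the data is invented.

```python
# Illustrative sketch: what share of an account's posts fall inside a given
# timezone's working hours? Timestamps below are hypothetical; real analyses
# rely on thousands of posts, not four.
from datetime import datetime, timezone, timedelta

CST = timezone(timedelta(hours=8))  # China Standard Time, UTC+8

post_times_utc = [  # hypothetical UTC posting timestamps for one account
    datetime(2025, 4, 28, 1, 15, tzinfo=timezone.utc),
    datetime(2025, 4, 28, 3, 40, tzinfo=timezone.utc),
    datetime(2025, 4, 29, 6, 5,  tzinfo=timezone.utc),
    datetime(2025, 4, 29, 22, 30, tzinfo=timezone.utc),
]

def working_hours_share(times, tz, start=9, end=18):
    """Fraction of posts made between `start` and `end` local time in `tz`."""
    local_hours = [t.astimezone(tz).hour for t in times]
    return sum(start <= h < end for h in local_hours) / len(local_hours)

share = working_hours_share(post_times_utc, CST)
print(f"{share:.0%} of posts fall within 09:00-18:00 China Standard Time")
# A high share is only a weak attribution signal; it must be combined with
# other evidence such as language artifacts, shared infrastructure, and
# content reuse across accounts.
```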


The campaign may form part of the so-called “transnational repression strategy,” aimed at silencing dissent beyond China’s borders. Many of the targeted exiles had already faced arrest warrants, intimidation of family members, and bounties in Hong Kong. Chinese officials deny the allegations, but international cybersecurity analysts and law enforcement agencies warn that Beijing’s strategy of fear and manipulation is advancing.

Source:  

AI-Related Articles


AI Media Advances and Disinformation Tactics


DFRLab takes an in-depth look at how generative AI is reshaping the dynamics of disinformation campaigns. Modern tools such as diffusion models, generative adversarial networks (GANs), and multimodal systems enable the creation of highly realistic synthetic media. While GANs typically output finished images with limited user control, newer systems allow users to define image content with far greater precision.


Telltale signs like asymmetrical jewelry or warped backgrounds are increasingly rare. Modern generative systems can accurately render symmetrical objects, such as glasses, or remove them entirely. With text-based prompts, users can adjust even the smallest visual elements. Additionally, many of these systems tend to favor idealized aesthetics and cinematic visuals, which enhance the believability of the output. As a result, traditional detection methods are becoming less effective.
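
To give one concrete example of such a traditional method, the sketch below implements a simple frequency-domain check; research on earlier GAN architectures found that upsampling layers left excess high-frequency energy in the Fourier spectrum of generated images. This technique is our illustrative choice, not one named by DFRLab, and, consistent with the article’s point, it is increasingly unreliable against modern models.

```python
# Illustrative sketch of a "traditional" spectral check for GAN artifacts:
# measure how much of an image's spectral energy sits outside a centered
# low-frequency disc. A toy example, not a production detector.
import numpy as np

def high_freq_energy_ratio(gray: np.ndarray, cutoff: float = 0.25) -> float:
    """Share of 2D-FFT power outside a centered low-frequency disc."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
    h, w = spectrum.shape
    yy, xx = np.ogrid[:h, :w]
    radius = np.hypot(yy - h / 2, xx - w / 2)
    low = radius <= cutoff * min(h, w) / 2
    return float(spectrum[~low].sum() / spectrum.sum())

# Hypothetical usage: compare a suspect image against a baseline built from
# known camera photos; an outlying ratio is, at best, a weak signal.
rng = np.random.default_rng(0)
suspect = rng.random((256, 256))  # random noise stands in for an image here
print(f"high-frequency energy ratio: {high_freq_energy_ratio(suspect):.2f}")
```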


These technologies are already being used in global disinformation efforts. DFRLab highlights the use of AI-generated profile images in pro-UAE campaigns during the COP28 summit and deepfake attacks in Brazil’s 2024 elections. Multimodal AI generates content with realistic settings, branding, and contextual cues that strengthen credibility. The shift from crude forgeries to refined AI-generated visuals accelerates the reach and effectiveness of hostile influence campaigns, making them significantly more challenging for researchers and the public to detect.


Source:  

GENERAL REPORTS


Polling Misinformation May Mislead Australian Voters 

ABC News examines various types of surveys and how they may contribute to the spread of disinformation.


One example involves Labor’s claim that Peter Dutton was voted the worst health minister. The claim is based on an online survey conducted a decade ago with only 1,077 respondents drawn from a medical publication’s readership. The survey lacked scientific methodology and excluded subsequent health ministers, yet its results are being presented as representative of the entire medical community.


Some surveys may selectively present or omit results. For instance, Clive Palmer, founder and leader of the United Australia Party (UAP), claimed Australians supported "Trump-like" policies, yet the actual results showed more people opposed them than supported them. Additionally, some polls use leading questions to influence public opinion, as seen in a survey praising an independent MP before asking respondents for their voting preference.


The most manipulative form of polling is "push polling," where questions are designed to plant negative impressions rather than gather unbiased data. Some voters reported receiving biased surveys about independent candidates funded by "Climate 200," with the survey being abruptly cut off if they supported other parties.


Source:  

Hostile Influence Threatening the Integrity of Kosovo’s 2025 Election 

The BIRN Kosovo report on the 2025 parliamentary election highlights significant hostile external influence, especially from Russia and Serbia. Russian state-funded media outlets such as Sputnik Serbia and RT Balkan played an essential role in disseminating disinformation that undermined the credibility of both the election and Prime Minister Albin Kurti. These narratives falsely suggested that the West, particularly the US, was supporting Kurti to instigate an inter-ethnic conflict in Kosovo. Serbian media outlets echoed these messages, further inflaming tensions.


Chinese state-controlled media also contributed, though to a lesser extent, spreading narratives critical of NATO and the West. These efforts aimed to destabilize Kosovo's political landscape and its relations with Western institutions.


Social media played a central role in amplifying these disinformation efforts. Political actors, sometimes with external backing, used platforms like Facebook, Instagram, and Twitter to bypass traditional media and directly influence public opinion. Anonymous accounts and bots were widely used to amplify misleading content.


Ultimately, the lack of substantive policy debate, the unregulated use of disinformation, and opaque campaign financing severely compromised the election's informational integrity. BIRN’s findings stressed the urgent need for more vigorous enforcement of media laws, the regulation of AI in political discourse, and strategic institutional responses to foreign and domestic disinformation.

Source:  

Viral Disinformation Marks First 100 Days of Trump’s Second Term

The NewsGuard report on Donald Trump's second presidency identifies 20 viral false claims that garnered over 134 million views and nearly 4 million likes in the first 100 days of his administration. Many of these falsehoods reflect hostile influence, particularly through Russian disinformation campaigns. The Kremlin’s “Matryoshka” campaign spread fake videos linking Trump to unfounded claims, such as imposing tariffs on uninhabited islands and banning pro-Ukrainian slogans. These narratives were designed to undermine trust in Trump and the U.S. while stoking geopolitical tensions.


Additionally, pro-Trump users spread false claims presenting Trump as a supporter of the middle class, such as the assertion that he had abolished taxes on tips and overtime. This misinformation circulated primarily on platforms like TikTok and X to influence political perceptions. Satirical websites like “America’s Last Line of Defense” also deliberately published false information that readers did not recognize as parody, further exacerbating political polarization. These campaigns illustrate how hostile influence from foreign and domestic sources was used to manipulate public opinion and distort political narratives during this period.


Source:  

Social Media’s Influence on the 2025 Canadian Election 


The DFRLab report on the 2025 Canadian federal election shows that the campaign was marked by disinformation and hostile foreign influence. Meta’s news blackout under Bill C-18 created an information vacuum, contributing to the spread of hyperpartisan and misleading content. With Meta’s fact-checking programs ended, false narratives flourished on platforms like Facebook and Instagram. AI-generated content and deepfakes, particularly those targeting candidates such as Liberal leader Mark Carney, were also widespread.


China was identified as a significant foreign actor, using platforms like WeChat to spread manipulated content aimed at the Chinese-Canadian community. Although domestic actors were dominant in the information landscape, these foreign influence campaigns contributed to distorting the political discourse. The Conservative Party further politicized the issue of foreign interference during the campaign.


The lack of transparency on platforms hindered efforts to trace the origins of these disinformation campaigns. While the disinformation did not directly alter the election outcome, it contributed to political polarization and eroded public trust in the electoral process. The combination of domestic and foreign disinformation presents a growing challenge to the integrity of democratic processes, particularly in the digital age.


Source: 

Study Highlights the Impact of Misinformation on the 2025 Australian Election

According to a report by The Conversation, a national study in Australia found that at least two-thirds of Australians had already encountered false or misleading election content during the campaign's early stages. This content included distorted claims about candidates and policies, misinformation about voting procedures, and baseless allegations of election rigging. Some respondents had difficulty distinguishing fact from fiction, which led to increased uncertainty and impaired informed decision-making.


The most common misinformation concerned core policy issues such as Medicare, housing, and climate change, and was often associated with figures like Donald Trump, Clive Palmer, and major political parties. Both social media and established news sources were flagged as channels for this misleading content. While exposure to disinformation does not always change opinions, it can erode trust in democratic processes, especially among voters more susceptible to such misinformation. The study also drew parallels with global trends, including the false narratives that circulated after the 2020 US elections and undermined trust in democratic institutions.


The Australian Electoral Commission (AEC) has taken steps to combat disinformation, including a disinformation register, media partnerships, and public education initiatives. Most Australians recognize the severity of the issue and support proactive measures to counter disinformation. The study emphasizes that maintaining trust in democracy requires clear information and cooperation between institutions, the media, and voters.


Source:  

Appendix - Frameworks to Counter Disinformation

EU Allocates Nearly €5 Million to Strengthen Resilience to Disinformation

The European Commission has announced the allocation of nearly €5 million in funding to combat disinformation and strengthen media literacy across the EU. The initiative comprises two calls for proposals. The first call, worth €3.15 million, focuses on detecting harmful information manipulation and on understanding how disinformation affects citizens, while developing strategies to strengthen societal resilience. The second call, valued at €1.6 million, supports independent fact-checkers by promoting verified content through creative media formats and partnerships with influencers, podcasters, and media outlets.


Source:  


GLOSSARY


Information Operations

Hybrid Warfare

Cyber Warfare

Cyfluence Attack

Soft Warfare

CIB

FIMI

Hostile Influence Campaign (HIC)

Digital Impact on Discourse (DID)

Misinformation

Disinformation

Inauthentic Behavior

Fake users

Unidentified users

Sockpuppet accounts

Bots

Repurposed accounts

Fake website

Deep Assets

Real platforms

Astroturfing

Cyberbullying


DISCLAIMER


Copyright and License of Product 

This report (the "Product") is the property of Cyfluence Research Center gGmbH ("Cyfluence") and is protected by German and international copyright laws. The User is granted a limited, non-transferable license to use the Product solely for internal purposes. Reproduction, redistribution, or disclosure of the Product, in whole or in part, without prior written consent from Cyfluence is strictly prohibited. All copyright, trademark, and proprietary notices must be maintained.


Disclaimer of Warranties

The Product is provided "as is" without warranties of any kind, express or implied, including but not limited to warranties of merchantability or fitness for a particular purpose. Although Cyfluence takes reasonable measures to screen for viruses and harmful code, it cannot guarantee the Product is free from such risks.


Accuracy of Information 

The information in the Product has been obtained from sources believed to be reliable. However, Cyfluence does not guarantee the information's accuracy, completeness, or adequacy. The User assumes full responsibility for how they use and interpret the Product. Cyfluence is not liable for errors or omissions; opinions may change without notice.


Limitation of Liability

To the fullest extent permitted by law, Cyfluence shall not be liable for any direct, indirect, incidental, or consequential damages, including lost profits or data, arising from the use of or inability to use the Product, even if advised of such possibilities. Liability for intent or gross negligence remains unaffected under German law.


Indemnification

The User agrees to indemnify and hold harmless Cyfluence, its affiliates, licensors, and employees from any claims or damages arising from the User’s use of the Product or violation of these terms.


Third-Party Rights

The provisions regarding Disclaimer of Warranties, Limitation of Liability, and Indemnification extend to Cyfluence, its affiliates, licensors, and their agents, who have the right to enforce these terms.


Governing Law and Jurisdiction 

This Agreement is governed by German law, and any disputes shall be resolved exclusively in the courts of Berlin. If any provision is found invalid, the remaining terms remain in full effect.


