CRC Weekly: Cyber-based hostile influence campaigns 25th – 31st August
- Michael Bayliss Hack
- Sep 4

[Introduction]
Cyber-based hostile influence campaigns aim to influence target audiences by promoting information and/or disinformation over the internet, sometimes combined with cyber-attacks that enhance their effect. Between the 25th and the 31st of August 2025, we observed, collected, and analyzed data points related to cyber-based hostile influence campaigns (including Cyfluence attacks). The following report summarizes what we regard as the main events.
This week's events highlight the continued diversification of actors and tactics in the information domain. Persistent Russian campaigns targeting European states occurred alongside reported US-linked activities, including a covert domestic influencer network and an alleged hostile operation in Greenland. Concurrently, state responses were relatively assertive, evidenced by Indonesia’s heightened regulatory pressure on tech platforms and Denmark's formal diplomatic summons of the US Ambassador.
[Contents]
[Report Highlights]
A pro-Russia network attacked Romania’s electoral integrity and promoted anti-EU sentiment across 8,514 posts that amassed at least 16 million views. - DFRLab
US-linked operatives allegedly compiled lists of sympathetic locals and sought negative narratives to undermine Danish authority in Greenland. - ABC News
An undisclosed political group is reportedly offering influencers up to $8,000 per month in exchange for promoting Democratic-aligned partisan messaging in the US. - WIRED
A forged screenshot of a credible news publication was circulated by conservative users to lend journalistic authority to the claim that the UK is banning the English flag. - NewsGuard
Indonesia is shifting content moderation responsibility to platforms by demanding proactive removal of disinformation to prevent social unrest. - Reuters
Poland’s absence of a permanent Digital Services Coordinator creates a critical regulatory vacuum, undermining effective enforcement of the EU Digital Services Act. - FIMI-ISAC
[Weekly Review]
Fabricated Flag Ban Narrative Exploits UK Patriotism Debate to Fuel Outrage
According to NewsGuard, conservative and nationalist social media users are circulating a fabricated screenshot of a Daily Mail article to falsely claim the U.K. government is considering a ban on the English flag. This disinformation leverages a real, recent controversy involving the Birmingham City Council's removal of flags from lampposts for safety reasons, a move which had already ignited a debate over patriotism and national identity. The campaign's core tactic is the use of a convincing but non-existent article, purportedly written by a real political editor and featuring Prime Minister Keir Starmer, to lend the claim false credibility. This method has proven effective, with one post receiving over 850,000 views and generating significant hostile engagement. The Daily Mail has officially confirmed the article is fake, and no credible news outlets or government sources have suggested any such policy is under consideration. The incident highlights the use of forged media to exploit existing societal tensions for political purposes.
Source: NewsGuard, Mascha Wolf, 2025, NewsGuard Reality Check, [online] Available at: https://www.newsguardrealitycheck.com/p/uk-banning-english-flag-fake-news
Lavrov Interview Showcases Russia’s Standardized Disinformation and FIMI Playbook
An analysis from EUvsDisinfo deconstructs a recent interview given by Russian Foreign Minister Sergey Lavrov, identifying it as a showcase of the Kremlin's standardized FIMI playbook. The piece systematically dismantles several core Russian narratives designed to manipulate Western audiences. These include the false pretense of being open to dialogue while pressing for surrender, the baseless claim of President Zelenskyy’s illegitimacy, and the lie that military strikes avoid civilian infrastructure. The analysis also refutes the foundational justifications for the invasion, such as the pretext of protecting Russian speakers from a fabricated genocide and the distortion of the Budapest Memorandum. These narratives collectively serve a strategy of denial, deflection, and distortion, aimed at rewriting history and justifying a war of aggression. The Kremlin's manipulation of legal language and international agreements is presented as a cynical tool to legitimize its military actions and pursue imperial ambitions while casting Ukraine as unreasonable.
Source: EUvsDisinfo, 2025, Russia’s narrative manipulation, [online] Available at: https://euvsdisinfo.eu/russias-narrative-manipulation/
Russia's SVR and Georgian Dream Execute Coordinated Anti-Western Disinformation Operations
A report from EUvsDisinfo details how Russia's foreign intelligence service (SVR) and Georgia's ruling party, Georgian Dream, are conducting a coordinated disinformation campaign to undermine the country's Western partners. The campaign aims to erode public trust in the US, EU, and UK by portraying them as destabilizing forces orchestrating a "color revolution." A distinct operational pattern involves the SVR releasing specific accusations, which are then swiftly amplified by pro-government media outlets like TV Imedi and POSTV and echoed by high-level Georgian officials. This synchronized messaging has systematically targeted different Western actors over time, beginning with the US before shifting focus to the EU and later the UK. These actions, often supported by fabricated or unverified video evidence, represent a deliberate strategy to discredit domestic civil society, derail Georgia's Euro-Atlantic integration, and maintain the nation's position within Moscow's sphere of influence.
Source: EUvsDisinfo, 2025, Russian scripts, Georgian voices: How disinformation targets the country’s Western allies: the US, EU, and UK in Georgia, [online] Available at: https://euvsdisinfo.eu/russian-scripts-georgian-voices-how-disinformation-targets-the-countrys-western-allies-the-us-eu-and-uk-in-georgia/
Polish Elections Withstand Foreign Influence, But Systemic Vulnerabilities Persist
A FIMI-ISAC research paper on the 2025 Polish presidential election concludes that while foreign information manipulation from Russia and Belarus posed a persistent threat, its overall impact was constrained by civil society resilience and the limited sophistication of some campaigns. Known operations like Doppelganger, Operation Overload, and the Pravda Network disseminated anti-EU, anti-Ukrainian, and anti-establishment narratives, often amplifying far-right candidates by portraying them as defenders of national sovereignty. The threat landscape was notable for its consistency with previous elections, although domestic political actors were observed adopting similar manipulative tactics, such as fabricating personas and spreading false claims. Significant systemic weaknesses persist, including vulnerabilities on platforms like X, Meta, and TikTok that are exploited for coordinated inauthentic activity. A critical vulnerability identified is Poland’s lack of a permanent Digital Services Coordinator, creating a regulatory vacuum. The report recommends strengthening platform accountability under the DSA and establishing permanent cross-sector coordination to safeguard Poland's democratic processes.
Source: FIMI-ISAC, 2025, Foreign Information Manipulation and Interference (FIMI) during the 2025 Polish presidential elections, [online] Available at: https://fimi-isac.org/wp-content/uploads/2025/08/FDEI-POLISH-ELECTION-COUNTRY-REPORT-2025-2.pdf
Denmark Summons US Envoy Amid Greenland Influence Campaign Allegations
A report by ABC News indicates Denmark has summoned a senior US diplomat following allegations of a covert American influence campaign in Greenland. The operation, reportedly conducted by at least three individuals with connections to US President Donald Trump, is believed to be aimed at weakening the relationship between Greenland and Denmark from within. Alleged tactics include compiling lists of US-friendly Greenlanders, identifying individuals opposed to Trump, and soliciting locals for information that could portray Denmark negatively in American media. These activities align with stated US interests in the strategically significant, resource-rich territory. In response, Denmark’s Foreign Minister deemed any interference unacceptable. The Danish Security and Intelligence Service further noted that Greenland is a target for influence campaigns designed to exploit or fabricate divisions, confirming it has increased its operational presence in the region. The incident underscores the growing geopolitical contestation in the Arctic, where influence operations are an emerging vector of statecraft.
Source: ABC News, 2025, Denmark summons US envoy over suspected influence operations in Greenland, [online] Available at: https://www.abc.net.au/news/2025-08-28/denmark-summons-us-envoy-people-carrying-influence-in-greenland/105705686
Pro-Russia Network Targets Romanian Election with Anti-Sandu Disinformation Campaign
Analysis from the Digital Forensic Research Lab (DFRLab) illustrates how a coordinated pro-Russia network of at least 215 accounts on Facebook, TikTok, and Instagram has been conducting a hostile influence campaign since December 2024. The operation sought to undermine Romania's presidential election by accusing Moldovan President Maia Sandu of electoral interference. The network initially supported one far-right candidate before pivoting to another, George Simion, after the first was barred from running. Operators deployed a range of tactics, including the use of generative AI for content and profile pictures, hijacked accounts, and coordinated hashtags in Russian and Romanian. Key narratives were anti-Sandu, anti-EU, and pro-Russian, with specific themes accusing Moldova of dragging Romania into conflict. With over 8,500 posts generating at least 16 million views, the campaign demonstrates a systemic effort to exploit platform vulnerabilities. The findings also reveal deficiencies in platform transparency, as many accounts operated below follower thresholds required for inclusion in public research datasets, potentially obscuring the campaign's full scale.
Source: Digital Forensic Research Lab (DFRLab), Valentin Châtelet, 2025, Cross-platform campaign accuses Moldova’s Sandu of meddling in Romanian elections, [online] Available at: https://dfrlab.org/2025/08/26/cross-platform-campaign-accuses-moldovas-sandu-of-meddling-in-romanian-elections/
Democratic-Aligned Dark Money Group Covertly Pays Influencers for Coordinated Messaging
According to an investigation by WIRED, a dark money organization is secretly funding prominent Democratic-aligned influencers to promote party narratives online. This initiative involves payments of up to $8,000 per month, contingent upon the influencers concealing the funding source and adhering to specific content restrictions. The operation signifies a notable evolution in domestic influence tactics, leveraging the parasocial trust and perceived authenticity of social media creators to conduct coordinated messaging campaigns. By requiring secrecy and imposing content controls, the effort intentionally blurs the line between genuine grassroots support and undisclosed paid promotion. This model effectively creates a network of astroturfed political messaging that appears organic to unwitting audiences. The use of such covert funding mechanisms within the domestic political landscape presents a significant challenge for platform transparency and the integrity of online discourse, mirroring strategies often associated with state-linked information operations.
Source: WIRED, Taylor Lorenz, 2025, A Dark Money Group Is Secretly Funding High-Profile Democratic Influencers, [online] Available at: https://www.wired.com/story/dark-money-group-secret-funding-democrat-influencers/
Indonesia Threatens Platforms with Fines and Expulsion Over Harmful Content
Reuters reports that the Indonesian government summoned representatives from Meta, TikTok, and other platforms and demanded that they proactively moderate harmful content. This move signifies a strategic shift, placing the onus on platforms to remove disinformation without waiting for government requests. The directive is a direct response to online campaigns that have successfully fueled public anger and real-world protests. Specific examples of this hostile influence include a deepfake video of the finance minister and mislabeled footage of past riots used to incite unrest. Notably, TikTok videos were reportedly used to mobilize youth for demonstrations, resulting in clashes and arrests. Jakarta is backing the demand with significant penalties for non-compliance, which range from fines and temporary suspension to the complete revocation of a platform's registration. The government's objective is to mitigate what it calls the "chaos" caused by inaccurate information and to protect national stability. Meetings with X and YouTube are also planned as part of this broader regulatory push.
Source: Reuters, 2025, Indonesia urges TikTok, Meta to act against harmful online content, [online] Available at: https://www.reuters.com/business/media-telecom/indonesia-urges-tiktok-meta-act-against-harmful-online-content-2025-08-27/
[Takeaways]
Indonesia’s move to impose direct liability on platforms, juxtaposed with Poland's persistent regulatory gaps, foreshadows an increasingly fragmented landscape for digital governance. This divergence creates a dual challenge: it imposes complex, country-specific compliance burdens on platforms while offering influence operators strategic havens, allowing them to exploit the jurisdictions of least resistance.


