
Op-Ed: A New Era of Russian Information Warfare Against Germany



Disclaimer: A German exploration of the topic can be found in Digital Threads Unmasked, the joint newsletter by FORTITUDE and PENEMUE.


Today is April 10th, 2024, and 776 days have passed since Russia's full-scale invasion of Ukraine began. Ukraine lacks essential resources, such as sufficient ammunition, while support from the West is increasingly crumbling. Possible AfD ("Alternative for Germany") electoral victories in German states, the rise of populism in the European elections, and much more threaten Western unity. Against this backdrop of uncertainty, Germany finds itself at the forefront of a new and more menacing phase of Russian information warfare. This campaign is not merely an extension of past efforts but a significantly more aggressive, manipulative, and harmful strategy aimed at undermining Germany's societal fabric. It seeks not just to weaken the country's support for Ukraine but to sow discord and instability within its borders.


What has happened in Germany in recent weeks?


In recent weeks, Germany has witnessed a series of incidents that underscore the sophisticated nature of Russia's information operations. While most of these activities aim to undermine Germany's support for Ukraine, Russia has weaponized a range of issues and topics to unsettle, destabilize, and divide the population.


A notable instance of such tactics is the revelation by the Federal Foreign Office of a sprawling Russian disinformation campaign on X. This operation involved over 50,000 fake accounts and generated more than a million tweets in German, propagating the narrative that the German government's support for Ukraine is at the expense of its own citizens. The campaign employed sophisticated methods, such as using reputable media sources with slightly altered domain names to lend credibility to false claims.


Another event that garnered significant attention, echoing the espionage tactics of the Cold War era, is the "Taurus Wiretapping Affair" that emerged in early March 2024. In February 2024, a web conference of the German Bundeswehr was intercepted in which high-ranking Air Force officers and the Inspector of the Air Force discussed the technical and political implications of a possible delivery of Taurus cruise missiles to Ukraine. This sensitive information was presumably recorded by a Russian intelligence service and disseminated in early March 2024 for destabilization and propaganda purposes on Russian channels, such as the state-run propaganda outlet RT.


A third incident, indirectly linked to Germany, is the Czech authorities' exposure of a Russian-funded disinformation network that operated via the Prague-based "Voice of Europe" website. The Czech daily "Denik N" reported that the site hosted statements from politicians calling for an end to EU aid to Ukraine and that European politicians involved with the network, including politicians from Germany, received Russian funds. After uncovering the operation, Czech officials imposed sanctions on the website's operators. In a related statement, German Interior Minister Nancy Faeser accused key figures from the AfD, including the politicians Maximilian Krah and Petr Bystron, who had appeared on "Voice of Europe," of participating in the Russian disinformation network.


The Russian Disinformation Playbook


While the strategies and mechanisms of these Russian disinformation attacks follow certain patterns, some of which can be traced back to the USSR, it is crucial to acknowledge that what we see is merely the tip of an iceberg. The daily maneuvers, covert operations designed to coax segments of the German population into propaganda's embrace, often pass under the radar. And just as true: if the German public is not accurately informed about these methods and intentions, the damage to elections, public discourse, and public safety can be enormous.


However, the Kremlin's objectives behind these actions are as strategic as they are deliberately vague:


  1. Exploiting societal fears and divisions: Russia aims to use any potential for division and controversy in German politics and society to its advantage. The question of delivering Taurus to Ukraine, for example, and Chancellor Scholz's refusal to supply the cruise missile led to a highly controversial public debate in Germany. Russian propaganda seized in particular on the Chancellor's concern that delivering Taurus might imply involvement in the war and spread the disinformation narrative that Germany was already planning to enter the war. This method of destabilization is designed to deepen existing fissures in society and to exacerbate fears as well as political and social tensions. By stirring up dissatisfaction and exploiting controversies, Russia deliberately weakens societal cohesion.

  2. Leveraging multipliers and engaging influencers: Russian propaganda consistently seeks to engage sympathizers within German society, such as the AfD or the "Alliance Sahra Wagenknecht" (Bündnis Sahra Wagenknecht, BSW), to propagate Russian narratives, as observed in the instances outlined above.



Disclaimer: The Bundeswehr generals' assessment of a potential arms delivery does not constitute an entry into the war, as a BSW politician portrayed it in line with the Russian propaganda narrative.


This amplification of Russian narratives may occur subtly or overtly, spanning from propagandistic interpretations of historical events and developments to current debates on defense or social security. Moreover, influencers have for years been systematically engaged to showcase the purported benefits of life under Russia's authoritarian regime, brazenly endorsing Russian war propaganda, including the denial of war crimes, in contrast to the allegedly "value-neutral" and "lenient" Western systems. A prime example is Alina Lipp.


  3. Discrediting and weakening the alliance: Another goal is to drive a wedge between the key European and transatlantic allies, especially regarding support for Ukraine. By strategically leaking sensitive information, backing anti-EU and anti-US politicians, and propagating misleading narratives against Ukrainians and the Ukrainian government, Russia aims to erode European unity and solidarity and undermine existing alliances. In the Taurus case, the goal is to depict Germany as a security risk and foster mistrust among alliance partners concerning the confidential treatment of intelligence information. In the other cases mentioned, the campaign seeks to sow doubts about the legitimacy of NATO and the EU, question Ukraine's sovereignty and democratic evolution, reject Western legal principles, and create an overall atmosphere of uncertainty, skepticism, irritation, and the perception that Russia holds the moral high ground in history.


Implementing the Democracy Playbook


While a single newsletter edition is far from sufficient to outline effective strategies against disinformation and for the preservation of our democracy, it is still important to convey methods of combating disinformation to a wider audience. Researchers Jon Bateman and Dean Jackson from the Carnegie Endowment for International Peace have created an overview of measures at different levels of action in their Policy Guide “Countering Disinformation Effectively”, evaluating them in terms of scientific assessment, effectiveness, and scalability.


Although tools such as media-literacy promotion, fact-checking, and platform regulation are now widely discussed in society, it is important to highlight two aspects that are often associated with uncertainty.


Counter-messaging strategies: Now more than ever, in situations like the full-scale Russian invasion of Ukraine, it is crucial to unequivocally identify the aggressor, proactively unveil the methods of hybrid warfare, and convey clear, unified messages highlighting the advantages of democracy and the rule of law, especially when these principles are increasingly viewed with skepticism. In the case of German support for Ukraine, for example, it is important to emphasize the benefits of this support: first, protecting the Ukrainian population from the loss of their homeland, from oppression, and from death; and second, protecting Germany and the EU against Russian aggression on our own territory. The importance of creating communication campaigns that do more than just present facts becomes increasingly evident. Moreover, it is crucial not to repeat harmful narratives, and thereby amplify them, but rather to identify the emotional elements of disinformation and counteract those emotions with positive messages. Proactive, constructive, and positive campaigns should aim to resonate on a narrative and psychological level and promote an environment of understanding and positive resonance. Along with established AI tools that help tailor messages to target groups in a data-based and increasingly granular manner, the Messaging Wheel method can aid in the development of effective messages, as elaborated in the Public Arena Playbook.



Effective Prebunking: Even though reactive intervention after the fact remains important, in the context of disinformation it is equally crucial to prevent false narratives from taking hold in the first place. This calls for proactivity, focusing on measures that preemptively address challenges before they manifest: so-called prebunking.


Prebunking works by exposing audiences to a diluted form of misleading arguments, thus building their immunity against more persuasive misleading narratives they might encounter in the future. By shedding light on common tactics used in disinformation campaigns, such as emotional manipulation, misleading headlines, or the misuse of statistics, prebunking efforts are designed to equip the public with the skills needed to critically evaluate the information they encounter. This proactive method not only guards the audience against false narratives but also empowers them to make informed decisions based on accurate content. The effectiveness of prebunking hinges on a deep understanding of the target audience, of the potential disinformation they may be exposed to, and of the most effective channels for reaching them. Given the sophistication of disinformation, especially in the era of rapidly developing generative AI, implementing prebunking strategies has become an indispensable part of a comprehensive approach to maintaining the integrity of information in the public sphere. A prime example of prebunking in action can be seen in Central and Eastern Europe, where Google has launched an initiative to counter disinformation with a series of videos crafted to demystify the mechanics behind Russian disinformation.


And the future? Is it all dark?



While many of us, beyond the "big cases" on the political stage described at the outset, can relate to the scene from the Google spot, we should not succumb to negativity or paralysis. The more effectively harmful actors harness advanced technologies like generative AI, the more we should unite, form alliances, conduct research, advance our AI literacy, and test techniques in the fight against disinformation. As highlighted before: just as important as the effort to debunk or prebunk a negative narrative is the development and, crucially, the dissemination of a positive narrative about the rule of law, freedom, and democracy. At the dinner table, in schools, on sports fields, at workplaces, and wherever else we can. Let's unite and harness synergies in the fight against disinformation.


About the author


Linus Siebert is Co-Founder of FORTITUDE, Germany's first consultancy against disinformation campaigns, and Consultant at Cosmonauts & Kings. With his expertise in crisis communication and strategy development against disinformation, he advises clients from both the public and private sectors on how to protect themselves from harmful narratives. Subscribe now to his bi-weekly newsletter, Digital Threads Unmasked, where he analyzes relevant cases of disinformation and presents effective counter-strategies!

We are proud that Linus was an intern at "Media Voice" in 2022.



Disclaimer: The content of this Op-Ed is the sole responsibility of the author and does not necessarily reflect the views of Media Voice.
