Disinformation, a weapon of war

Organization: Ministère des Armées / Published: March 01, 2025

Disinformation is a true weapon of destabilisation, upsetting the balance of power between states. Some countries, like Russia, no longer hesitate to use it to influence public opinion to gain a strategic advantage. Through several bodies France has been organising itself in the face of these threats of a different kind.

Third roadblock of demonstrators at the exit of the town of Tera, Niger, in 2021 © Thomas Paudeleux/ECPAD/Défense

18 November 2021, in the small town of Kaya, in northern Burkina Faso. Demonstrators had barricaded the road: piled-up tyres and branches blocked the passage of a supply convoy for Operation Barkhane. French army vehicles bound for the Gao base in Mali were pelted with stones and had to turn back. The demonstrators were convinced that the soldiers were transporting weapons to deliver to the jihadists. The deteriorating security situation had opened the way to a proliferation of false information on social media, exploiting the population's anger towards the French armed forces.

These violent demonstrations illustrate the challenge of the battle of perceptions in a sensitive strategic context. Disinformation, i.e. the dissemination of inaccurate information with the intent to mislead and cause harm, has become a weapon in its own right. What for? To destabilise an adversary, isolate it and weaken its capacity to act. Disinformation has always existed in wartime: every conflict has been accompanied by its share of propaganda campaigns designed to sap the enemy's morale and gain an advantage on the ground. But the digital revolution has drastically changed the practice. Widespread access to the internet now allows every user to overcome time and geographical boundaries, and information spreads almost instantaneously, particularly through the constant use of social media across the globe.

Social media have revolutionised our relationship with information. "To some extent, they are the factory of opinion," underlines Marc-Antoine Brillant, head of the service for vigilance and protection against foreign digital interference. Indeed, 46% of French people get their daily news from social media or influencers1. The problem lies in the economic logic of digital platforms, which trap users in "filter bubbles" to maximise advertising revenue. Confined to a digital environment that matches only their tastes, users spend more time on social media - a logic that is not conducive to reliable and diversified information.

Faced with these new digital habits, items of "infox" - the French term for fake news - specifically designed to influence public opinion are circulating at a frantic pace. 88 per cent of them are spread via social media, mainly X and TikTok. "This success can be explained by the fact that they appeal to emotions, and that sells," notes Carole, a psychologist at the Joint Psychological Operations Centre. These fake news items seek to generate engagement - a like, a comment or a share - from users to increase their visibility. The risk? Reposting a fake news item without realising the information is manipulated, and without any intent to cause harm: this is misinformation.

1 According to a poll by the newspaper La Croix - Verian - La Poste on French confidence in the media, published in January 2025.


Technological breakthrough

"Threats are reinforced by the arrival of new technologies, which make informational manoeuvres formidable," says Céline Marangé, a researcher at the Strategic Research Institute at the French War College. Bots1 and troll farms2, which have become increasingly easy to detect, have been outpaced by artificial intelligence (AI). Since then, the production and falsification of content has been industrialised on a massive scale: realistic, credible content is produced in record time and at limited cost, then used to run a multitude of fake accounts on social media, giving the appearance of almost human activity. Coordinated publications across several platforms can saturate public debate: they can artificially make an issue visible to polarise public opinion or, conversely, divert attention from a news item.

This was the case a few days after the Hamas attack of 7 October 2023, when a multitude of internet users shared photos of blue Stars of David tagged in the streets of the 10th arrondissement of Paris. These publications, which went viral, stirred up intense political tensions, even though they came from the Reliable Recent News network, made up of 1,095 bots on the X platform. With a total of 2,589 publications, the network pushed the controversy in every direction. Behind these inauthentic profiles were several Russian individuals and companies.

AI can also be used to create deepfakes - manipulated audio or video recordings that are virtually undetectable. "One day, it will be possible to imitate the voice of a commander giving such and such an order, or to make a leader say things meant to fan the flames," emphasises Céline Marangé. Distinguishing the real from the fake is proving increasingly difficult. Five thousand New Hampshire voters have already paid the price. During the American Democratic primaries in January 2024, they received a telephone call from candidate Joe Biden. The message? Don't go to the polls, because "voting on Tuesday will only help the Republicans get Donald Trump re-elected". Yet the voice on the line was a deepfake, created to disrupt the elections.

1 Abbreviation for robot. Autonomous software that performs automated, repetitive and predefined tasks.

2 Individuals seeking to create controversy on the internet. The term "troll farm" is used when the practice is industrialised.

The information space, an extension of the territory

While disinformation used to be confined largely to conflicts, it is now taking hold in peacetime. "Competitors use it to weaken an opponent's ability to react and resist," says Céline Marangé - and they do so without resorting to force. "Part of the Taiwanese population has indicated that it would not fight in the event of a Chinese invasion because, according to them, the Chinese seem friendly on the TikTok application," says General Jean-Michel Meunier, head of the Strategic Anticipation and Orientation (ASO) unit at the Joint Services Staff. This strategy exploits the vulnerabilities of Western democracies: strong political polarisation, the rise of populism and growing mistrust of elites and democratic institutions.

Russia is aware of these vulnerabilities and works hard to exploit them. "The country's authorities consider the information space, i.e. the space where information circulates in digital form, as an extension of their physical territory," points out Kevin Limonier, lecturer at the French Institute of Geopolitics (University of Paris 8). France has become a target, particularly in French-speaking Africa, where the French armed forces have faced disinformation campaigns for the past ten years. More recently, it is French support for Ukraine that is in the crosshairs of the Russian authorities. In early 2023, the Reliable Recent News network disseminated a stream of pro-Russian content by impersonating the websites of traditional and government media: it created replicas of the websites of the newspaper Le Monde and of the Ministry of Europe and Foreign Affairs to publish articles hostile to Ukraine. The armed forces also fell prey to this practice, called typosquatting: on 15 March 2024, a fake recruitment portal using the graphic identity of the French army's website invited visitors to sign up to fight in Ukraine, even though France has no troops deployed there.
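Typosquatting relies on domains that differ from a legitimate address by only a character or two. As a purely illustrative sketch - the domain list, threshold and function below are invented for this example, not a description of any actual tooling - one simple way to flag look-alike domains is a string-similarity test:

```python
from difflib import SequenceMatcher

# Hypothetical list of legitimate domains to protect (illustrative only).
OFFICIAL_DOMAINS = ["lemonde.fr", "diplomatie.gouv.fr", "sengager.fr"]

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1]; 1.0 means identical strings."""
    return SequenceMatcher(None, a, b).ratio()

def flag_typosquat(candidate: str, threshold: float = 0.8) -> list[str]:
    """Return official domains the candidate closely imitates without matching."""
    return [d for d in OFFICIAL_DOMAINS
            if d != candidate and similarity(candidate, d) >= threshold]

# A one-letter swap on a newspaper domain is flagged; the real domain is not.
print(flag_typosquat("lernonde.fr"))
print(flag_typosquat("lemonde.fr"))
```

Real detection pipelines are far more elaborate (homoglyphs, new-registration feeds, certificate transparency logs), but the underlying idea - measuring closeness to a protected name - is the same.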

Russia has stepped up its fake news about the presence of French mercenaries fighting alongside Ukrainians. In January 2024, the Russian Ministry of Defence claimed to have struck a building in Kharkiv, Ukraine, allegedly used as a "temporary deployment area by mercenaries, most of whom were French citizens". The Kremlin thus seeks to "weaken our national resilience wherever it can. It sows doubts about our institutions and, in the long run, tries to instil anxiety in the population", summarises Kevin Limonier.

Russia is not the only one to take an aggressive approach. Azerbaijan, for instance, has proven hostile towards France, specifically targeting France's overseas territories to fuel and exploit pro-independence movements. This was notably the case in New Caledonia in 2024, when Azerbaijani campaigns amplified and orchestrated the riots that disrupted the archipelago.

Winning the war of influence

France takes these threats in the information space seriously. In December 2022, the updated National Strategic Review elevated influence - the fact of having an effect on attitudes and behaviour by shaping perceptions1 - to the rank of sixth strategic function2. The aim of this change in status is to defend France's values, promote its commitments and respond to attacks against its interests, particularly in the information field.

For the armed forces, influence has become a prerequisite for any overseas deployment. For General Thierry Burkhard, Chief of the Defence Staff, this means "winning the war before the war". The ASO unit of the Joint Services Staff was created in 2022 to structure a chain operating in the field of influence and information warfare. This includes developing doctrine and thinking on equipment, its use and the legal framework for action. Above all, the ASO unit ensures that influence is properly considered in the design of operations. "No logistics convoy leaves any more without its information risk being studied," explains General Jean-Michel Meunier, head of the ASO unit, referring to the convoy blocked in Burkina Faso and Niger in 2021. "We think about how our actions will be interpreted by the population, we analyse whether the informational terrain is a minefield, and we plan evidence-based resources, such as image sensors, that could be used to expose false information."

The unit also carries out prebunking actions (anticipation manoeuvres in the information field, editor's note). For example, in 2023 a French Navy warship was due to come alongside in the port of an African country. Rumours were circulating about the crew's motives: many locals suspected that French legionnaires would disembark to invade the country. "We invited influencers with a large audience in the country to come on board and make the crossing with us," explains General Meunier. "They then posted about life on board on social media. The false information disappeared, and no demonstrations were reported when the ship came alongside."

Setting the record straight

In this chain of influence, actions in the digital domain are led by the Cyber Defence Command (Comcyber). This unit conducts influence cyberwarfare. In practical terms, this means detecting, characterising and countering informational attacks in cyberspace, in support of military operations and always outside national territory. Cyber fighters monitor digital platforms to detect informational manoeuvres. "This is an uphill task, given the sheer volume of content", notes General Aymeric Bonnemaison, cyber defence commander. The next step is characterisation, because "a personal opinion that is hostile to us is not necessarily an informational attack", he continues. For it to be so, it must be inauthentic, coordinated and must use amplification systems, such as troll farms.
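The characterisation criteria described above - inauthenticity, coordination, amplification - lend themselves to simple heuristics. As a purely illustrative sketch (the account names, posts and thresholds are invented, not an account of Comcyber's methods), one basic coordination signal is many distinct accounts publishing identical text within a short time window:

```python
from collections import defaultdict
from datetime import datetime

# Invented sample posts: (account, timestamp, text).
posts = [
    ("acct_a", datetime(2024, 3, 1, 12, 0), "France out now"),
    ("acct_b", datetime(2024, 3, 1, 12, 2), "France out now"),
    ("acct_c", datetime(2024, 3, 1, 12, 3), "France out now"),
    ("acct_d", datetime(2024, 3, 2, 9, 0), "Unrelated message"),
]

def coordinated_clusters(posts, window_minutes=30, min_accounts=3):
    """Group identical texts; flag those posted by many accounts in a short window."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[text].append((ts, account))
    flagged = {}
    for text, items in by_text.items():
        items.sort()  # chronological order
        span = (items[-1][0] - items[0][0]).total_seconds() / 60
        accounts = {a for _, a in items}
        if len(accounts) >= min_accounts and span <= window_minutes:
            flagged[text] = sorted(accounts)
    return flagged

print(coordinated_clusters(posts))
```

A lone hostile opinion, like the "Unrelated message" above, is never flagged - which mirrors the distinction drawn between personal opinion and informational attack.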

Comcyber can, where necessary, respond to narratives hostile to the interests of the French armed forces, in strict compliance with the law and outside national territory: this is known as restoring the truth. The case of the Gossi mass grave is a striking example. On 19 April 2022, the French military had just handed over the Gossi base to the Malian forces when, in the following days, images of a mass grave allegedly present on the base were posted on Twitter and the French armed forces were accused of war crimes. But French surveillance drones had filmed Wagner's men building this fake mass grave. As soon as the false information spread on social media, the cyber fighters retaliated with the surveillance images to expose the attack. By waiting until the last moment to release their video, the French soldiers thwarted the enemy's disinformation manoeuvre.

1 Definition of the Joint Centre for Concepts, Doctrines and Experiments.

2 Alongside deterrence, prevention, protection, intervention, knowledge and anticipation.

Comcyber strives to win and maintain superiority in the informational field. © Jean-Christophe Mantrant/Etat-major des armées/Défense

Taking action allows us to counter, denounce or expose our competitors. These actions are only carried out in support of military operations and within a strict framework: "No lies, no coercion and no perfidy. We're not going to hide behind the Red Cross, for example," adds General Bonnemaison. "Our adversaries don't limit themselves. We must therefore be more imaginative."

Digital investigations

All the influence actions of the Ministry of the Armed Forces are coordinated with the Ministry of Europe and Foreign Affairs, influence and diplomacy being two closely related concepts. Decisions to denounce a disinformation manoeuvre are taken jointly, as "this is tantamount to taking action against a state," explains Christophe Lemoine, spokesman for the Quai d'Orsay. The two ministries meet twice a week within the disinformation task force, in conjunction with Viginum, the service for vigilance and protection against foreign digital interference. What for? To draw up an exhaustive assessment of threats in the information field and propose countermeasures.

Viginum is France's shield for detecting and characterising foreign digital interference, the digital component of information manipulation. Created in 2021, the service reports to the Prime Minister1 and carries out open-source investigations online. These investigations, which can be published as reports, are then used by all public institutions as well as by civil society.

In February 2024, a few months before the European Parliament elections, Viginum analysts identified a French-language news website, Pravda.fr, with a pro-Russian editorial line. "From this URL, the service uncovered an ecosystem of 193 news websites mainly targeting European audiences. We named it Portal Kombat," explains Marc-Antoine Brillant, head of Viginum. "This vast propaganda machine perfectly exploited the cultural particularities of each targeted country, as well as the topics of each public debate." On the basis of this analysis, the French authorities decided to reveal the campaign. "We suspected that this ecosystem would gain momentum in the run-up to the European elections," says the head of Viginum.

Publicly revealing an information manipulation campaign has its virtues: it impedes the adversary and imposes a cost on them. "We send a clear message to the opposing operators: we know what you are doing," explains Marc-Antoine Brillant. The revelations also warn the general public, raising awareness that information manipulation campaigns exist and that anyone can fall victim to them.

Raising public awareness is one of the keys to fighting disinformation. It has become essential to encourage every citizen, when in doubt about an item of news, to question it, cross-check sources and verify facts. During the Artificial Intelligence Action Summit, organised by the Élysée Palace on 10 and 11 February 2025, Viginum announced that it had developed two tools for the benefit of civil society to help detect certain processes used in digital interference. These AI-based tools range from the detection of duplicate textual content to a "meta-detector of artificial content," says Marc-Antoine Brillant with satisfaction. Although artificial intelligence can be a threat when misused, it is also a weapon against the manipulation of information.
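Duplicate-content detection of the kind mentioned above is commonly approached in open-source tooling - this is a generic sketch, not a description of Viginum's actual tools - by comparing texts through character n-gram overlap. Two near-identical articles share most of their n-grams; unrelated articles share very few:

```python
def shingles(text: str, n: int = 3) -> set:
    """Character n-grams of a whitespace-normalised, lowercased text."""
    t = " ".join(text.lower().split())
    return {t[i:i + n] for i in range(len(t) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Set overlap in [0, 1]; near-duplicates score close to 1."""
    return len(a & b) / len(a | b) if a | b else 1.0

# Invented example texts.
original = "The armed forces committed abuses, reports say."
copy     = "The armed forces committed abuses, report says."
other    = "Local elections will be held next spring."

print(jaccard(shingles(original), shingles(copy)))   # high: near-duplicate
print(jaccard(shingles(original), shingles(other)))  # low: unrelated
```

At the scale of a 193-site ecosystem, production systems replace exact set comparison with sketching techniques such as MinHash, but the similarity principle is the same.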

1 More precisely, it reports to the General Secretariat for Defence and National Security.

Viginum detects the massive propagation of inaccurate or misleading content about Kanak demonstrators © Delphine Mayeur/AFP

