In early 2019, a handful of soldiers attempted a coup in Gabon. The Gabonese domestic situation had been conflictual since the 2016 presidential election, and it worsened after Ali Bongo suffered a stroke and decided to recover in Morocco. At the beginning of December 2018, La Lettre du Continent also revealed the arrest of Stephan Privat, a former French non-commissioned officer who had entered Gabon accompanied by a team of ten men, some of them presented as former members of the DGSE. Unaware of the reason for this squad's presence in Libreville, the Director of Intelligence of the Presidency, Frédéric Bongo, detained Privat for 48 hours to question him, as the other members of the group had already left Gabon. Finally, on December 31, Ali Bongo presented his New Year's wishes to the Gabonese people in a video shot in Morocco: some actors suspected that this video was a deepfake, and several observers have since argued that the coup attempted on January 7, 2019 by Lieutenant Kelly Ondo Obiang was the consequence of this suspicion of a deepfake.

Thus, the description of a Washington Post video states: "quand une adresse vidéo de Bongo est apparue fin décembre 2018, certains pensaient que c'était un deepfake. Une semaine plus tard, des soldats ont tenté un coup d'État" ("when a video address by Bongo appeared in late December 2018, some thought it was a deepfake. A week later, soldiers attempted a coup").

The Financial Times quotes a specialist from Deeptrace, a startup specialising in the issue of deepfakes: "“It just looked odd: the eyes didn’t move properly, the head didn’t move in a natural way — the immediate kind of response was that this is a deepfake,” says Ajder. A week after the video was released, junior officers attempted a coup d’état, which was quickly crushed."

MIT Technology Review quotes the opinion of Internet Without Borders: "As uncovered by the digital rights organisation Internet Without Borders, many people, thinking Bongo looked off in the footage, immediately suspected it was a deepfake—a piece of media forged or altered with the help of AI. The belief fueled their suspicions that the government was hiding something. One week later, the military launched an unsuccessful coup, citing the video as part of the motivation".

Similarly, SiecleDigital refers to the Deeptrace report and claims: "Far from calming the rumor, the video instead revived speculation, as political opponents denounced a deepfake. The affair is far from anecdotal: a few days later, the supposedly fake video was used to justify a (failed) coup attempt".

In a March 2019 newsletter, Henry Ajder (also author of the Deeptrace report on deepfakes) states: "An official government video of the Gabonese President Ali Bongo was branded a deepfake by his political opposition, resulting in rumours that he had died, and an attempted military coup".

All of these analyses stem from three sources:

A first article by Ali Breland, for MotherJones: The Bizarre and Terrifying Case of the “Deepfake” Video that Helped Bring an African Nation to the Brink, which describes how the Gabonese opposition regarded Ali Bongo's video as a deepfake, and states: "One week after the video’s release, Gabon’s military attempted an ultimately unsuccessful coup—the country’s first since 1964—citing the video’s oddness as proof something was amiss with the president". Ali Breland says it was Julie Owono, an Internet Without Borders official, who reported the suspicion of a deepfake to MotherJones. He also quotes an expert from Deeptrace, who felt that the video was probably not a deepfake, but could not be sure.

Second source: Deeptrace, which published a report on deepfakes in September 2019: The State of Deepfakes, Landscape, Threats, and Impacts. Citing Ali Breland's article as a reference, this report states: "However, Bongo’s unusual appearance in the video led many on social media, including Gabonese politician Bruno Moubamba, to declare that the video was a deepfake, confirming their suspicion that the government was covering up Bongo’s ill health or death. A week after the video’s release amid growing unrest, members of Gabon’s military launched an attempted coup against the government. In a video announcing the coup, the military mentioned the video’s odd appearance as proof that something was wrong with the President".

Third source, at the origin of the other sources: Internet Without Borders, which, after having published a press release describing the video as a trigger for the coup attempt, stated in a column published by Jeune Afrique: "We flagged the case of this video to Mother Jones magazine, a publication specialising in new technologies and their impact on societies. They thus produced an article on the risk posed by deepfakes in countries where institutions are quite fragile and not sufficiently armed to respond to these new digital threats .... It was after the broadcast of this contentious speech that the first coup attempt since 1964 in Gabon took place on January 7, 2019 in Libreville. To justify their act, the putschists stated in particular that they had not been convinced by the presidential New Year video".

Beyond these speculations, careful viewing of the video of Kelly Ondo Obiang's intervention during the coup attempt shows that not only did the putschists never consider Ali Bongo's video to be a deepfake, but they correctly analysed it as an authentic video showing a sick man, and this is precisely what justified the attempted coup. The video clearly reveals several after-effects affecting Ali Bongo: an ophthalmic problem, difficulty speaking, and paralysis of the upper right limb. For the putschists, it is therefore the degraded state of Ali Bongo's health, highlighted by the video, that prompted them to act, since they considered that he appeared to be physically unfit for the presidential function.

However, the development of the use of deepfakes, and the inevitable technological progress which will soon make them difficult to detect, will have a considerable impact on the genesis of conflictualities. Conflictuality is defined by second-order cindynical models as the propensity of a situation to degenerate into conflict. Second order means here that we take into account the fact that the actors observed in a situation are also observers: each actor has its own perception of the actors in the real situation, i.e. its perspective, and each has a personal estimate of what the actors should be like for that situation to be ideal, i.e. its prospective. The differences, or disparities, between these perspectives, and the differences, or divergences, between these prospectives, are the factors of conflictuality.
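The perspective/prospective framing can be sketched numerically. In this purely hypothetical illustration (the actor names, attributes, scores, and metric are assumptions for exposition, not part of any cindynical standard), each actor's perspective and prospective are vectors of scores over situation attributes; the mean pairwise gap between perspectives measures disparity, the same gap between prospectives measures divergence, and their average gives a crude conflictuality indicator.

```python
# Hypothetical sketch of second-order conflictuality: each actor holds a
# perspective (how it perceives the situation) and a prospective (how it
# thinks the situation should be). All names and numbers are illustrative.

def pairwise_gap(views):
    """Mean absolute difference between every pair of actors' score vectors."""
    actors = list(views)
    total, pairs = 0.0, 0
    for i in range(len(actors)):
        for j in range(i + 1, len(actors)):
            a, b = views[actors[i]], views[actors[j]]
            total += sum(abs(x - y) for x, y in zip(a, b)) / len(a)
            pairs += 1
    return total / pairs if pairs else 0.0

# Scores in [0, 1] on two assumed attributes:
# (president's fitness for office, legitimacy of the current power).
perspectives = {"government": [0.9, 0.9], "army": [0.2, 0.4], "opposition": [0.1, 0.2]}
prospectives = {"government": [0.9, 0.9], "army": [0.9, 0.2], "opposition": [0.9, 0.1]}

disparity = pairwise_gap(perspectives)    # gaps between perceptions
divergence = pairwise_gap(prospectives)   # gaps between ideals
conflictuality = (disparity + divergence) / 2
print(round(conflictuality, 3))           # prints 0.383
```

A disinformation campaign built on deepfakes would act on the perspective vectors: by altering what some actors believe they are seeing, it widens disparities even when the underlying situation is unchanged, and the indicator rises accordingly.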

The emergence of deepfakes will disrupt the construction of actors' perspectives, which will in turn affect the construction of their prospectives and intensify the genesis of conflictualities. Deepfakes could in particular be used by entities, state or non-state actors, whose objective is to manipulate public opinion, for example to destabilise a country by exacerbating its internal tensions. It is therefore urgent to invest not only in deepfake detection capabilities, but also in raising awareness and training people in order to make them more resilient to the evolution of online disinformation practices, in particular through the dissemination of a genuine culture of conflictuality reduction.