Demystifying the Fog of War: Establishing Deterrence in the Post-Truth Era – The Case of the India-Pakistan Conflict

[Image: India and Pakistan flags pointing toward a central missile and warning sign, set against a smoky, fog-filled background.]
The fog of war deepens as India and Pakistan navigate deterrence, perception, and power in a truth-challenged geopolitical landscape.

For the past six years, relations between the two old adversaries have been on edge. Pakistan and India have engaged in recurring punitive actions since 2019. Early that year, the Pulwama-Balakot episode, followed by the abrogation of Article 370 and the lockdown of Jammu and Kashmir, marked a rapid and tense escalation that deteriorated relations between the two nuclear-armed neighbors. Fast forward to May 2025: the conflict not only resurfaced but also evolved. The air force combat of 2019 was replaced by drone warfare, state-run news was replaced by social media, and objective truth itself became a casualty.

The waves of the recent India-Pakistan confrontation reached not only the Line of Control but also the digital spaces of ordinary users. Social media became the battlefield where information and cyber warriors held their defensive and offensive positions, and the tools of perception, misinformation, and narrative dominance dictated strategic outcomes online. By the end of the four-day confrontation, the leaders of both countries claimed victory, and both populations believed their side had triumphed, celebrating their returning armies, while the rest of the world was left parsing a fog of war.

The credit for this fog goes to the politics of post-truth. Post-truth acts as a significant yet malicious force in shaping the warfare of the twenty-first century. It influences the credibility of a fact by manufacturing it, and thereby distorts communication between rival states. Fundamentally, deterrence relies on three components: capability, credibility, and communication.

Effective deterrence means that the deterring state has successfully conveyed its capabilities and resolve to its adversary. Since the perception of capability and credibility is increasingly complicated in a post-truth environment, the establishment of deterrence becomes questionable. In other words, when objective facts are less persuasive than emotional appeals and personal dogmas, credible deterrence cannot be maintained.

This post-truth environment affects the communication component the most. Traditional deterrence relies on the clear communication of threats and capabilities, but in the post-truth era the deliberate distortion of reality and manipulation of facts and beliefs can either strengthen or undermine those threats. Threats that are very real may be dismissed or exaggerated, making it difficult for strategists and policymakers on both sides to gauge their intensity. This blurring of truth and falsehood affects the interpretation of deterrent signals and therefore compromises the assessment of credibility.

During the Cold War, credibility was anchored in the doctrine of Mutually Assured Destruction (MAD), which rested on each side's willingness to retaliate in the event of a nuclear attack. If one side doubted the other's willingness to retaliate, deterrence failed. Now, decades after the end of the Cold War, the challenges to establishing credible deterrence have changed.

Information and propaganda themselves have transformed from state-controlled lies (Pravda-style censorship) to more democratized lies in the form of WhatsApp forwards, AI-generated videos, and algorithm-driven echo chambers that consolidate established perceptions far faster than governments can counter them. For instance, from the Indian perspective, Operation Sindoor, a series of precision strikes against alleged militant seminaries in mainland Pakistan, was broadcast with carefully curated footage, some real, some doctored. Pakistan's counter-narrative on social media revolved around drone footage of downed Indian UAVs and dogfights claimed to have taken place inside Indian territory. This misinformation-fueled exaggeration turned partial truths into absolute ones for millions, with hashtags like #PehelgamVengence and #IndiaStrikesBack trending. The result? Two entirely different wars were being fought in the minds of Indians and Pakistanis.

Algorithmic manipulation was used as a tool to ensure that both nations remained in distinct bubbles. Through these highly controlled and effective echo chambers, millions of people on both sides of the border saw only their own country as victorious. Digital platforms played an intense role in escalating the conflict behind the screen. Applications like Telegram and WhatsApp became hubs for real-time, often unverified war updates, while Twitter (now X) amplified confirmation bias by algorithmically boosting hashtags from both sides. Deepfake videos of captured pilots, video-game footage passed off as jets, and "leaked" documents further muddied the waters. All of this produced a conflict in which people's beliefs mattered more than facts, and deterrence became a game of viral one-upmanship.

It is an open secret that states propagate lies to gain domestic support and export their narrative, but these lies can backfire when it comes to assessing threats from the enemy. To deter an opponent, states must work to rebuild credibility in an era of lies. One route is cooperating with third parties for the verification of facts: the UN or neutral states could deploy real-time monitoring technology (satellites, cyber forensics) for fact-checking. Yet the most crucial step is advancing digital literacy. Literacy campaigns should teach the public to use both technology and their own judgment to spot deepfakes and bot networks. Finally, responsibility also lies with social media owners and application developers to build transparent processes for demoting unverified content.

As for securing deterrence itself, new dimensions must be considered alongside the traditional pillars (nuclear weapons, red lines). Cyber-deterrence, for example, could punish digital deception with proportional cyber responses. To prevent the bullets of misinformation from sparking an unintended escalation, information ceasefires, that is, agreements to halt propaganda during crises, could be negotiated between conflicting governments.

To debunk the illusion of victory claimed by both sides, no mathematical or Pythagorean principle is required to calculate the strategic gains; a closer look at the events of those four days suffices. Ultimately, the recent conflict became one in which states fought to control perception, and perception is the currency of power in the post-truth world.

As George Orwell wrote in 1984, "Who controls the past controls the future. Who controls the present controls the past." In the post-truth world, wars are won not just on the battlefield but in the minds of the billions scrolling behind their screens. Deterrence can no longer rely solely on missiles and tanks; it must now account for deepfakes, viral lies, and algorithmic warfare. If we fail to adapt and to discipline trending narratives into a logical assessment of wins and losses, the next conflict may not start with a gunshot but with a tweet.

Author

  • Umaima

    The author is a graduate of International Relations and Politics from Quaid-i-Azam University, Islamabad. She is currently working as a researcher at the Strategic Vision Institute, Islamabad.