A few weeks ago I listened to a BBC documentary about a Russian military strategy called Maskirovka. Essentially this is the art of disinformation, of engineering truth to distract or mislead your enemies. Clearly produced in the wake of events in Ukraine and the assassination of Boris Nemtsov, the conceit of the piece was that this was a worrying development in a major global power’s approach to war.
At first I questioned what the issue was. Misdirection is surely a legitimate tool in war; it is a legitimate tool in any strategic contest. It is common, too. Indeed, as Lucy Ash says, it is not even exclusive to Russia. But as I reflected further, I felt less at ease. If subterfuge is acceptable in war, then a tight definition of “war” is required. Without one, a policy of lying can operate in the grey, fuzzy areas between war and peace.
As I considered the potential issues, I was reminded of Wittgenstein’s language-game theory. In this context, language-game theory would assert that while you are at war, language adopts a unique set of rules – in the “game” of war, “truth” is not always true. The problem is that governments define these rules, and they can do so without their citizens knowing. In fact they must do so, because if their opponents know the contexts in which truth may be manipulated, the strategy loses its potency. Because of the imbalance of power between a government and its citizens, and because the strategy necessarily involves deception, it becomes impossible to know what is truth and what is fiction – whether you are playing the game or not. Essentially, once you legitimise a strategy of misinformation without parameterising its usage, you create a situation where “truth” in any context becomes fallible, or subject to interpretation.
Furthermore, it seems that in today’s society a lot of state-sponsored (violent) action occurs in the space between war and peace. Consider acts of espionage (Western governments’ policy of Extraordinary Rendition, for example) or certain acts of foreign policy (such as claims about Iraq’s WMDs). These all involve elements of subterfuge, disinformation and misdirection, employed to achieve a specific objective in response to a perceived threat.
Finally, because national security and foreign policy are so closely intertwined with matters of state, policies like Maskirovka arguably affect the wider democratic process (consider the political decisions made in the wake of the invasion of Iraq, for example). The point is, we make democratic decisions on the basis of the information we are fed, but we cannot honestly say whether that information is correct. The very basis of democracy is challenged once you set a precedent that truth can be defined by those in power.
The most obvious solution, as I see it, is a free press and international oversight of news. But since the press must operate within national borders, and governments fund the oversight bodies, I do not truly see this as the answer. The alternative would be to more tightly control policies that propagate disinformation, but this would surely be rejected because of its impact on national security.
So what’s the real answer? What do you do if you can’t trust a government you have democratically elected?
It all reminds me of George Orwell’s prescient novel, 1984. You could pick half-a-hundred quotes from the book that apply, but I chose this one: “And if all others accepted the lie which the Party imposed – if all records told the same tale – then the lie passed into history and became truth. ‘Who controls the past,’ ran the Party slogan, ‘controls the future: who controls the present controls the past.’”