The conquest of the authoritarian character in the age of simulation

I recently came across a film about conventional combat between American and Russian submarines. In it, I encountered the term "decoy" for the first time. Decoys are dummies launched by a submarine under attack; they mislead the enemy torpedo into mistaking the dummy for the real target. The torpedo then collides with the decoy and explodes, while the attacked submarine remains unharmed.

"Decoys," I found interesting for three reasons. First, the concept of using a decoy to divert an aggressor's attention and energy.

Second, the idea that by now not only warring parties but most people own such a decoy, in the form of a virtual self on social platforms on the internet. And third, the question of what happens to a society that has agreed to communicate from decoy to decoy.

How do decoys communicate?

If, for example, a virtual decoy is damaged by another, we treat this almost as real damage, as if the person to whom the decoy belongs had suffered a black eye or worse from a shitstorm. This reminds me of the old-fashioned concept of honor. Honor is a pre-modern construct that had to be defended by all means necessary. From the slap with the loosely held glove to the duel, from the rapier to the monster cannon Dicke Berta, everything was acceptable for disciplining the violator of one's honor and served as a respectable excuse to assert oneself aggressively.

A look at the present shows that we have not come far in that respect. To defend our virtual digital selves, we too resort to increasingly drastic means. In the extreme, internet mobs gather for massive brawls like the one on Berlin's Alexanderplatz. Usually, however, the decoy on the internet is protected, upgraded and avenged as if it were our physical body, often with legal vehemence. At the same time, powerful players strive for sovereignty over it: mostly with information or disinformation, often with simple seduction, demagogy or provocation.

So, in the digital world, parallel to our seemingly peaceful everyday life, a war is fought between and around the decoys, one that leaves those affected, under constant fire, unsettled. For it is not a fortress wall that is being blown to pieces here, but the border between knowledge and superstition, between conspiracy theory and reality. Those who are not completely confused must ask themselves in the end:

Am I still allowed to believe what I know?

In the event of war, this everyday level of digital aggression would form the foundation of cyberwar. Targeted information and disinformation, intended to serve tactical and strategic goals, would thus hit recipients who have ceased to believe in the validity of information because they are disoriented or hold partly irrational convictions, especially towards information that does not fit their worldview. What does this mean for a cyber warfare that wants to reach such recipients?

Of course, there are qualitative codes that can be used to target internet tribes specifically, or one can rely on quantitative superiority and bombard the system with core messages. Cynics like Putin, Orban, Erdogan or Xi Jinping act as dividers: in order to pursue their enrichment goals undisturbed, they openly or covertly drive the blocs, already alienated from one another by conventional media, against each other.

To kill a man, to kidnap a woman.

However, separation and alienation are questionable means for an actor in cyberspace who is committed to credibility and philanthropy, which I understand the Western democracies to be, even in the event of war. The situation such an actor encounters in cyberspace resembles one described by the anthropologist Jared Diamond:

In the mountainous country of New Guinea, there was a different tribe in almost every valley. Hundreds of valleys, hundreds of tribes - almost all hostile towards each other. If a member of one tribe was discovered near the territory of another, the archaic logic went, he could be killed, because he could only have two things in mind: killing members of the resident tribe or kidnapping women.

In cyberspace, we seem to be returning to this tribal logic more and more, with tribal boundaries replaced by the sensibilities of individual or collective narcissism. In a sense, these boundaries are even more arbitrary than those of the New Guinean tribes, who were at least concerned with life, death and chances of reproduction rather than with opinion leadership or vanity.

These narcissistic sensibilities - whether individual or collective - are increasingly unfolding in a digitally simulated parallel world that, despite all the cross-border similarities, is still strongly shaped by two opposing trends. In the virtual space of cyberwar, two mentalities oppose each other.

To patronize or to protect.

Let's just call them the two systems of infantilization: the more paternalistic one, which provides guardianship and strict care in exchange for obedience (China, Russia, etc.), and our more maternalistic one, which cares for and protects the infantile ego and offers participation, but often no longer even dares to demand active participation.

In coming cyber conflicts, these two methods of infantilizing citizens will clash. In the digital warfare of democratic states, which is about mental conquest rather than destruction, it will be crucial to understand and properly address the paternalistic ego of the citizen of authoritarian states. Pure promises of consumption and freedom, as in the Cold War, no longer make much of a difference here. We must take a more differentiated approach. But how?

How do I convince an opponent who does not believe me? Non-propagandistic strategies.

How do I bring out the real in a world of simulations? I suggest non-propagandistic thinking. Instead of insulting the intelligence and inflaming the emotions of recipients, as old-style propaganda often does, their intelligence and emotions should be acknowledged, strengthened and challenged fairly. The potentially dissident recipient, accustomed to authoritarian arbitrariness, must experience through attitude, tone and content that the democratic channel he is dealing with is a self-confident but fair counterpart.

The longing for trustworthy narratives.

There are several starting points for this, of which I will mention only a few as examples: the recipient may be trapped in a tunnel of fear or emotional isolation. Understanding this is a first step towards guiding him out of that constriction. His stress level is likely to be high.

To relieve it, he searches for familiar points of reference, content, persons of integrity or even idols. Repelled by untrustworthy leaders, the recipient looks for new narratives and concepts that promise support and that he can follow. Such recipients often need not only reinforcement themselves, but also assistance in helping others. The feeling of being able to become a hero or heroine for others also has a motivating effect.

So we should turn recipients into guided allies by sharing our own ideas and addressing them decisively and at eye level. With these and other measures, we gradually change the recipients' perspective on us until enough credibility has been established to make our own agenda visible and push it through. In other words, after the process outlined above, recipients are once again more willing to believe what they know.

Emotional anchor points in the authoritarian ego.

Non-propagandistic strategies in a digital environment shaped by mistrust and fear thus stabilize recipients, in the hands of a self-confident communicator, rather than destabilizing them. In this way, such strategies create new emotional anchor points in the authoritarian counterpart, on which the mental conquest by humane and democracy-friendly content can build in the next step.
