- Internet: Medium For Communication, Medium For Narrative Control
- Part 2 — The Actors and Incentives
- Section 3 — State Actors: PSYOP, Narrative Warfare, And Weaponized Tech
Table Of Content

- Introduction
- Part 1: The Artifacts And Spaces
  In this part we'll describe the important artifacts and places. Going over these essential, but basic, pieces is mandatory to understand how they come into play as tools.
- Part 2: The Actors and Incentives
  In this part we'll go over how the previous elements are put to work by the different actors, who these actors are, what their incentives are, and the new dynamics at play.
- Part 3: Biases & Self
  In this part we'll try to understand why we are prone to manipulation, why it works so effectively or not on us, and who is subject to it.
- Part 4: The Big Picture
  In this part we'll put forward the reasons why we should care about what is happening in the online sphere, why it's important to pay attention to it, and the effects it could have at the scale of societies and individuals. This part will attempt to give the bigger picture of the situation.
- Part 5: Adapting
  In this concluding part we'll go over the multiple solutions that have been proposed or tried to counter the negative aspects of the internet.
- Conclusion & Bibliography
- Nothing New — Censorship and Propaganda
- Psychology as a Weapon
- Information Society = Information War & Weaponized Tech
- Internal: Population Control, Data and Surveillance
- Computational Propaganda
- External: Cyber Wars
- Social Media As Battleground — State Sponsored Trolling
- Memes as Vectors of Propaganda
Since ancient times, nations have tried to write history from their own point of view. As they say, history is written by the victors. Today, the speed of the internet allows rewriting the narrative in real time, and state actors will certainly take advantage of this. Broadly, there are two generations of state information-control practices: an information-scarcity approach, that is, censorship, and an information-abundance approach, in which speech itself becomes the censorial weapon. Both require monitoring the population to properly adapt the tactic in use.
State surveillance is nothing new either, but the era of information abundance makes it possible to amass data on a colossal scale.
Historically, nations have used propaganda and surveillance campaigns internally, to control their population, and externally, to sway the opinion of other countries, steer revolts, and pursue other favorable outcomes.
Today, these dynamics have moved to cyberspace. The internet has been part of all contemporary conflicts as a force acting on public opinion. Ultimately, the entity that controls the media, the discussion environment, and the social networking sites controls the narrative.
Before diving into this new era of war, let’s review how psychology can be used as a weapon.
Psychology has always been used for military gain but gained more traction with the concept of total war. This type of warfare includes civilian-associated resources and infrastructure as legitimate targets, mobilizes all the resources of society to fight the war, and prioritizes it over all other needs: an unrestricted war. Having a way to sway civilians is thus a must-have in the fighting arsenal. Psychological warfare alone isn’t sufficient, however; it needs to be complemented with simultaneous diplomatic, military, and political tactics. Psychological war takes the full environmental scope into consideration.
A psychological war consists of psychological operations, known in contracted form as PSYOPs: planned psychological activities designed to influence attitudes and behavior toward a political or military objective, evoking a planned reaction from the people.
This is a vague definition that covers everything we’ve seen in the previous section on influence, propaganda, and persuasion. When used as weapons, these tactics are backed by important state incentives, making them more powerful, and arguably more successful, than when they are employed by other entities.
In a total war the target can be individuals, civilians, organizations, groups, or governments. The audience needs to be well delineated and studied so that the message is appropriately formed, as we’ve discussed beforehand. There are four audiences to a message:
- The ultimate one: the real target.
- The intermediate one: those likely to receive the message who could be part of the target audience.
- The apparent one: an audience that seemingly appears to be the target but isn’t the real intended one.
- The unintended one: those the planner didn’t intend to reach but who still received the message.
Receptivity, as we’ve kept saying, depends on many cultural and environmental factors. However, in a military setting and with government sponsorship, vulnerabilities can be artificially created through kinetic means (bombs and guns), non-lethal biological weapons affecting the human psyche and mood, and physical threats. The full environmental scope comes into play.
These operations employ the schemes we’ve seen, such as anchors, hooks, and imageries. They are delivered through different methods and have a clear aim. The objective is either positive, to reinforce friendly behavior and feelings; negative, to destroy morale, weaken the adversary, and create dissonance and disaffection; or to gain support from the uncommitted and undecided.
When faced with a new weapon, states have to develop defensive systems. These include conducting PSYOPs on their own population, censorship, and counterpropaganda, although some countries have laws that forbid applying psychological operations on domestic ground, against their own citizens.
Nonetheless, most countries have integrated into their military structure distinct branches specialized in operations related to the information and psychological sector.
Here is a generic definition of the role of such a unit:
The integrated employment of the core capabilities of electronic warfare, computer network operations, psychological operations, military deception, and operations security, in concert with specified supporting and related capabilities, to influence, disrupt, corrupt or usurp adversarial human and automated decision-making while protecting our own.
When deployed, these well-trained units can apply their tactics at multiple levels.
- Strategic: applies to anything outside the military area, even in peacetime, to prepare the global information environment.
- Operational: applies to a joint military operation, to keep the national objective consistent with the objective of the partnering coalition.
- Tactical: applies to on-the-ground operations, especially when facing an opposing force, to support the actions of a tactical commander.
What’s important for these units is to be able to measure the effectiveness of their operations (MOE). In the military, if it can’t be measured, it’s no good. The internet offers clear metrics and is the perfect ground for that.
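To make the idea concrete, here is a minimal sketch in Python of how such a measure of effectiveness could be computed from platform engagement metrics. The snapshot fields and all figures are hypothetical, invented purely for illustration; real MOE frameworks track many more signals.

```python
# Hypothetical sketch: a crude measure of effectiveness (MOE) for an
# online influence campaign, comparing engagement before and after.
# All field names and numbers are invented for illustration.

from dataclasses import dataclass


@dataclass
class EngagementSnapshot:
    impressions: int   # times the message was displayed
    shares: int        # times it was re-posted
    sentiment: float   # mean sentiment toward the theme, in [-1, 1]


def measure_of_effectiveness(before: EngagementSnapshot,
                             after: EngagementSnapshot) -> dict:
    """Relative change in each metric: positive means the campaign moved it."""
    return {
        "impression_lift": (after.impressions - before.impressions) / before.impressions,
        "share_lift": (after.shares - before.shares) / before.shares,
        "sentiment_shift": after.sentiment - before.sentiment,
    }


if __name__ == "__main__":
    before = EngagementSnapshot(impressions=10_000, shares=120, sentiment=-0.10)
    after = EngagementSnapshot(impressions=45_000, shares=900, sentiment=0.25)
    print(measure_of_effectiveness(before, after))
```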
The ethics of these methods are debatable. Some say it’s an arms race and that they are forced to apply them, that the development is unavoidable. Others say they offer a way to reduce a “significant loss of life”. As with anything, there’s a lot of subjectivity involved.
In an information society, information is the center of the new-generation war, and thus these operational units are brought to the forefront. Information superiority is a must to win. Everyone fights for their narrative.
This new-generation war is often referred to as the sixth generation: the aim is to destroy the adversary’s economic potential and to be the dominant information holder, rather than take part in the physical battlespace.
Like the information society, this type of war is network-centric, taking into consideration the inter-relations between bubbles and how affecting one could lead to a quick victory. This requires information superiority to plan in advance, conceal, and prepare the multiple aspects that will jointly create favorable ground for the goal of the operation: to easily make a nation bend to your intentions.
Informational, moral, psychological, ideological, diplomatic, economic, and other means are all required. Practically, this means using mass media, religious organizations, cultural institutions, NGOs, scholars, etc. In today’s world this translates into using the right influence methods, and influencers, on social media.
The data available on the internet, and particularly on social media, can be gathered, stored, and linked, to finally become armor or a weapon in the hands of a state entity. It is a tool that can be used both defensively and offensively.
This intelligence gathering often takes the form of social media surveillance. The information can be used to identify possible bad actors or individuals vulnerable to a PSYOP campaign.
On the defensive side, a prominent explicit example of intel gathering is how the USA now asks visa applicants to submit five years of social media handles for selected platforms.
However, intel gathering probably happens most of the time unbeknownst to the data subjects. This is what has been revealed by internal whistle-blowers, and by more explicit state requirements, such as ISPs in some countries keeping internet connection logs for at least two months.
Additionally, we need to remember that social media platforms are “independent” businesses. As such, they are bound by the laws of the geographic areas in which they want to operate. Consequently, governments can pressure them through the legal route, forcing them to act a certain way in order to keep operating within their jurisdiction. Or they can simply buy access to that data from data brokers.
For example, it’s not uncommon for platforms to have to obey an order to censor certain messages. Sometimes users are notified, sometimes not, according to the transparency rules of the platform and the legislation under which it operates. This is the simplest method to get rid of dissidents and keep order in a country. However, among the citizens of information societies, censorship isn’t looked upon well. The people crave honesty and individuality, while rejecting authority or association with greater organizations like nations and their wealth, as we’ve seen in the previous section.
On the offensive side, micro-targeting, that is, using a conjunction of very specific attributes such as political leaning and interests to target individuals, can be used to amplify a message and provide measurable metrics when performing a psychological operation, as sketched below. We’ll come back to this topic in a bit.
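As a rough illustration of the mechanics, the sketch below selects an audience by a conjunction of attributes. Everything here is hypothetical: the profile fields, values, and thresholds are invented, not drawn from any real platform’s targeting tools.

```python
# Hypothetical sketch of micro-targeting: selecting individuals by a
# conjunction of very specific attributes. All profiles and attribute
# names are invented for illustration.

profiles = [
    {"id": 1, "leaning": "far-left", "interests": {"economy", "unions"}, "age": 34},
    {"id": 2, "leaning": "center", "interests": {"sports"}, "age": 51},
    {"id": 3, "leaning": "far-left", "interests": {"unions", "housing"}, "age": 27},
]

def micro_target(profiles, leaning, required_interests, age_range):
    """Return only the profiles matching every criterion at once."""
    lo, hi = age_range
    return [
        p for p in profiles
        if p["leaning"] == leaning
        and required_interests <= p["interests"]   # subset test
        and lo <= p["age"] <= hi
    ]

# The conjunction of attributes narrows 3 profiles down to 2.
audience = micro_target(profiles, "far-left", {"unions"}, (18, 40))
print([p["id"] for p in audience])  # -> [1, 3]
```

The narrower the conjunction, the smaller and more receptive the audience, and the easier it becomes to measure whether the tailored message moved it.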
There is no need for forceful actions when speech itself can be used as a weapon. This is what computational propaganda is about: using social media political discussions to sway opinions, especially since social media is the go-to place to discuss these things. This is known as platform weaponization. It can be used internally, to thwart the opposition, or externally, to influence the decision-making of other states.
The act itself can be state-executed (by the relevant military branch), state-coordinated, state-incited, or state-leveraged and endorsed.
Computational propaganda relies on the anonymity provided by the platforms, which favors black propaganda and lends credibility, making messages seem more genuine. States can thus deny involvement while inciting or leveraging activities happening on social media for their own gain. Without attribution, it is hard to know whether an opponent is attacking and how to defend against such attacks.
The attacks can be conducted with bots, automated scripts that scrape and spread content; with trolls, online accounts that deliberately target individuals to trigger and harass them; with PR, public relations through news outlets and others; and with memes, which we’ve covered previously.
Bots, and in particular social bots, give the ability to create fake online voices, and so to fabricate a public opinion. They are one of the weapons of choice in a cyber warfare of who can shout their narrative the loudest. Online, popularity is measured in “likes”, “votes”, “friends”, “followers”, and other counters that can all be faked.
The advantage that algorithmic campaigns provide is the ability to manipulate what people see, to temporarily boost the visibility of a message, and to bypass the social media “democratic” process; the toy model below shows how cheap such a boost can be.
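The following toy model, with an invented scoring rule that simply counts votes, hints at how cheaply coordinated fake engagement can reorder a popularity-ranked feed; real ranking algorithms are far more elaborate and opaque.

```python
# Toy sketch: how coordinated bot votes can temporarily boost a message's
# visibility in a popularity-ranked feed. The scoring rule is invented;
# real platforms use many more (and hidden) ranking signals.

posts = {
    "organic news story": 480,     # genuine votes
    "niche propaganda post": 35,   # genuine votes
}

def ranked(posts):
    """Order posts by vote count, highest first."""
    return sorted(posts, key=posts.get, reverse=True)

print("before:", ranked(posts))

# A botnet of 500 accounts each casts one fake vote on the target post.
BOT_ACCOUNTS = 500
posts["niche propaganda post"] += BOT_ACCOUNTS

print("after: ", ranked(posts))
# The propaganda post now outranks the organic story: the platform's
# "democratic" visibility process has been bypassed at trivial cost.
```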
Social bots are accounts that impersonate people of a certain demographic,
mimicking social interactions on the platforms. These can be used to
create a sense of consensus where there is none, or generate doubt on
a subject that is supposed to be widely accepted in the target audience.
With visibility also comes awareness. This can be used by political parties to obtain votes and to look more popular than others, more seductive and attractive.
Another use for such bots is something called astroturfing, fighting for a “turf”. This is about creating the impression of a grassroots movement in favor of or against something, be it an event, a person, or an idea.
This visibility can be used to choke off debate, smothering adversaries by making issues seem one-sided and controlling the bulk of the conversation: a sort of artificially created social censorship. Trolls can be used to the same effect, as we’ll see.
Alternatively, algorithmic amplification can be used to muddy political issues and create chaos and instability in the discourse, local or foreign, generating uncertainty, distrust, and division. This is absolutely destructive when combined with on-the-ground actions while discord and confusion reign in the population, and devastating when combined with economic pressure.
Not all types of content prosper on social media, as we’ve seen before. We’ve learned that outrage and politics resonate well, the things that clash with our cultural insecurities. That is why bot-led political campaigns tend to come from the most radical parties; it is the most appropriate medium for them.
Apart from these, any type of negative content spreads like wildfire. This is useful to amplify a sentiment of dissatisfaction and disagreement.
In retrospect, social media certainly has some part to play in the process of radicalization, however none of the messages there would resonate if they didn’t reflect some issues that were already present in us. As usual, it’s an exaggeration of our own ambiguous cultural code.
This is exactly what some state entities use to attempt to influence elections in other countries. Like bots generating discord and chaos around a certain topic, micro-targeting can be used to personalize messages toward hyper-partisans. In turn, these selected partisans will be pushed deeper into their narrative, radicalized, polarized, shifting the Overton window. Furthermore, this radicalization gives them bigger exposure and sets the ground for instability in a population. This can be an attractive outcome in coordination with other types of actions.
These hyper-partisans can be converted into trolls, online accounts that
deliberately target and harass particular individuals. They can either
be paid/sponsored real people or bots, either aware or unaware of the
tacit agenda, with or without explicit instructions, and either real
partisans or fabricated ones.
This is frequently used along with black PR: campaigns that deliberately engage in disinformation and harassment against a perceived opponent, used to intimidate and silence individuals.
There are countless examples of this tactic being used by states, either
deliberately launched by them, or the states harnessing an already
ongoing attack. These include, but are not limited to, making death
and rape threats, amplifying vitriolic attacks, making accusations of
treason or collusion, disseminating libelous disinformation, spreading
doctored images and memes, and sowing acrimonious sexism.
The targets of these attacks are usually journalists, activists, human rights defenders, and vocal members of an opposing ideology. These types of attacks are used everywhere in the world, some for capitalistic gains, others for state gains.
Yet another efficient way for states to impose their ideas on people’s minds is through the news networks and their headlines. They can employ the typical colored language, emotive stories, imageries, hooks, and anchors we’ve seen before to shape the narrative.
Promotional culture is also an important aspect of pushing a narrative
through the headlines.
Aside from these, nations can rely on the visibility principle, once again, to promote the stories they want. This can be done by voting on social media with bots, or by infiltrating the news network and trading up the chain. News websites can be constructed from scratch and filled with bogus content, which then relies on the spreading mechanism and citizen journalism to gain credibility and attention, until it finally emerges in the mainstream news media consumed by the desired target audience.
Moreover, trading up the chain can be achieved by using fringe social media platforms, which are prone to influence other online ecosystems. This works very well when the message is based on a kernel of truth but given a spin. Even when more reputable outlets only debunk the story, they still give the content a position in mainstream news.
All of these are extremely hard to defend against, a David versus Goliath affair, and so states prefer to defend through a good offense. Counterpropaganda, part of counter-insurgency (COIN), is hard to practice on social media. However, one particular vector of information that can be used for COIN is the meme, the new favorite shell for political messages.
Governments are getting more and more interested in memes, especially as a useful method to compact narrative and culture into a transportable container.
All modern, arguably post-modern, PSYOPs involve the propagation of
memes on social media. Meme wars are an inherent part of political life.
We’ve amply discussed them in a previous section.
They are the embodiment of the competition over narrative, creating coherent constellations of meanings and stories. It’s easy to overlook them because they are a common means of expression and often use humor, but that is exactly what makes them the perfect craft for information warfare.
What appears to be a prank or trope soon turns into an influential narrative campaign spread by bots and trolls. Remember that memes aren’t limited to their internet format; they go beyond it, the internet meme being only the appropriate envelope they take to move on that medium.
As such, like everything we’ve seen in this section, they can be used defensively, offensively, and to sway people on the fence, to recruit them or drive them away. They are used to shape both local and international sentiment.
Many countries employ, or discuss employing, memetics in their military sector. This consists of separate units in a center specialized in this type of operation, a sub-branch of the usual information and psychological operations but tailored to social media. These meme warfare centers, referred to as meme farms when described by opponents, would include interdisciplinary experts such as cultural anthropologists, computer scientists, economists, and linguists.
There are many open discussions about which countries actually employ such units, since being openly associated with them would reduce their power, but we find implicit indicators in the wild. Most countries and international associations have them in sight, take them seriously, or are actually armed with them already. Nowadays, any side of a conflict will use them; that includes all NATO nations, Russia, Hong Kong, China, Armenia, Azerbaijan, Palestine, Israel, India, Pakistan, etc.
However, this is an unconventional means and so it is tricky to put in place, or to let citizens know it is in place. It is hard to include in a military structure. Memes are abstract concepts; they might sound unconvincing and bring obstacles related to finance (a lack of investment), culture (the mindset needed to grasp new-generation warfare), legalities (the ethics around using them), and bureaucracy (who should be in command).
Additionally, there needs to be scientific proof of their efficiency to justify the investment. Efficiency should be testable and measurable, which isn’t straightforward when a change in attitude doesn’t necessarily correlate with a change in behavior.
If a nation lacks the framework, mindset, resources, knowledge, and tools, it leaves the advantage to others that can embrace this new paradigm. The threat is apparent, yet it wasn’t until 2017 that the EU and NATO jointly established a center to counter “hybrid threats”. Its purpose is more analytical than proactive; for them, at least in the public eye, the best form of defence is awareness.
Theoretically, a meme, like a virus, can be addressed in three ways: infect, inoculate, and treat; that is, transmit a message, prevent or minimize its spread, and contain it. The epidemic analogy can even be made quantitative, as in the sketch below.
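Pushing the viral analogy further, here is a minimal SIR-style epidemic sketch in which “inoculate” halves the transmission rate and “treat” doubles the recovery rate. The model and all parameter values are assumptions for illustration, not a validated model of meme spread.

```python
# Minimal SIR-style sketch of the meme-as-virus analogy.
# S: susceptible, I: "infected" (carriers of the meme), R: recovered.
# "Inoculate" lowers beta (transmission); "treat" raises gamma (recovery).
# All parameter values are arbitrary, for illustration only.

def simulate(beta, gamma, s=0.99, i=0.01, r=0.0, days=90, dt=1.0):
    """Crude Euler integration of the SIR equations over `days` steps."""
    for _ in range(days):
        new_infections = beta * s * i * dt
        recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
    return s, i, r

# Unchecked spread vs. inoculated (halved beta) vs. treated (doubled gamma).
for label, beta, gamma in [("infect", 0.30, 0.05),
                           ("inoculate", 0.15, 0.05),
                           ("treat", 0.30, 0.10)]:
    s, i, r = simulate(beta, gamma)
    print(f"{label:9s} ever-infected = {1 - s:.0%}")
```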
A quarantine doesn’t really work in the virtual world, though, and combining it with the neutralization of infected memeoids by killing them in the real world might have the opposite effect. History has shown that this will more likely validate the ideology of the meme.
Detection plays an important role in the defence mechanism, allowing a counter-message to be launched. In practice, this has been attempted for counter-radicalization, a form of counter-insurgency, with mixed results.
An invisible enemy is an invincible enemy; that is why identifying, cataloguing, and tracing should be done beforehand (a crude detection heuristic is sketched below). This can be almost impossible when the opponent, be it foreign or local, hides within the civilian population of netizens. Hence the focus on a “hybrid threats” center.
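In practice, detection often starts from crude behavioral heuristics. The sketch below flags accounts by posting rate and content repetition; both thresholds are invented for illustration, and real bot detection combines far more signals (network structure, timing patterns, content provenance).

```python
# Hypothetical sketch of bot detection via simple behavioral heuristics:
# an abnormally high posting rate and highly repetitive content.
# Thresholds are invented; real detection combines many more signals.

from collections import Counter

def looks_automated(posts_per_day: float, messages: list[str],
                    rate_threshold: float = 100.0,
                    repetition_threshold: float = 0.8) -> bool:
    """Flag an account whose volume or repetitiveness is implausibly high."""
    if posts_per_day > rate_threshold:
        return True
    if messages:
        # Share of the account's output taken up by its single most common message.
        most_common_count = Counter(messages).most_common(1)[0][1]
        if most_common_count / len(messages) > repetition_threshold:
            return True
    return False

print(looks_automated(240.0, ["vote X"] * 50))      # True: high rate, pure repetition
print(looks_automated(12.0, ["a", "b", "c", "a"]))  # False: plausibly human
```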
In all cases, displacing and overwriting dangerous pathogenic memes
is normally done by replacing them with more benign ones. These
reactionary memes have different degrees of applicability, actionability,
operationalization, and success.
The fruitful reactionary memes have one thing in common: they use humor, irony, and sarcasm to delegitimize an opponent’s message by ridiculing and mocking it. This is similar to propaganda techniques used in WWII cartoons.
“No one is above satire.”
This concludes our review of how state actors employ the internet narrative as part of their overall information and psychological operations. We’ve first seen how this is nothing new, how psychology has been used as a weapon for quite a while now, both defensively and offensively. Then we’ve looked at how data is at the center of wars now that we’ve moved to an information society. Next, we’ve seen how this data can be part of the surveillance campaigns of states. Later, we’ve examined computational propaganda, how algorithms dictate our world and how, consequently, they can be used by nations. Finally, we’ve dived into social media, bots, trolls, news control, and memes, all communication vectors that rely on gaining visibility and credibility. Speech as a method of censorship, speech as a method of propaganda, speech as a weapon.
References
- Psychological Warfare
- Psychological operations (United States) (PSYOP)
- Psychological Operations: The Need to Understand the Psychological Plane of Warfare
- Countering Disinformation: Russia’s Infowar in Ukraine
- Algorithms, bots, and political communication in the US 2016 election: The challenge of automated political communication for election law and administration
- The Nature and Content of a New-Generation War
- Memetic warfare (Wikipedia)
- DEFENCE STRATEGIC COMMUNICATIONS Winter 2015 - The official journal of the NATO Strategic Communications Centre of Excellence - IT’S TIME TO EMBRACE MEMETIC WARFARE - Jeff Giesea
- Memes: Persuasive Political Warfare
- STATE-SPONSORED TROLLING How Governments Are Deploying Disinformation as Part of Broader Digital Harassment Campaigns
- How memes are becoming the new frontier of information warfare
- All Hail the Meme, The New King of Political Communication
- TUTORIAL: MILITARY MEMETICS — Dr. Robert Finkelstein
- How memes got weaponized: A short history
- Social Media and Social Problems: A Complex Link
- Is your government requesting user data from tech giants?
- Muslims reel over a prayer app that sold user data: ‘A betrayal from within our own community’
- Find Out Where Apple, Facebook, Google, Twitter and Other Tech Giants Are Sending Your Data
- The network of fake foreign media
- MIPB April-June 2010 issue - Memetic Warfare: The Future of War by First Lieutenant Brian J. Hancock
- Exploring the Utility of Memes for U.S. Government Influence Campaigns
- MEMETICS — A GROWTH INDUSTRY IN US MILITARY OPERATIONS
- US to require would-be immigrants to turn over social media handles
- Big Data useful for mass surveillance (Wikipedia)
- ‘Weaponized Ad Technology’: Facebook’s Moneymaker Gets a Critical Eye
- Total war (Wikipedia)
- Mass surveillance in the United Kingdom
- Sheryl Sandberg and Top Facebook Execs Silenced an Enemy of Turkey to Prevent a Hit to the Company’s Business
- India Targets Climate Activists With the Help of Big Tech
- This is what happens when ICE asks Google for your user information
- Disinformation and ‘fake news’: Interim Report
- Google’s top security teams unilaterally shut down a counterterrorism operation
- Apple’s cooperation with authoritarian governments
- Honest Ads Act — USA
Attributions: Geheime Figuren der Rosenkreuzer, Altona, 1785