Adam-Troian-Propaganda-2022
Propaganda.
Jais Adam-Troian, PhD1
Abstract
Although pervasive in our history and modern ecology, propaganda has yet to be formally
described by social psychological science. The field has extensively documented the principles
of persuasion and social influence, but no integrative synthesis has been attempted to describe
the mechanisms of propagandist persuasion. In this paper, we propose to formalize a Fitness-
Validation Model (FVM) of propagandist persuasion. This synthetic model assumes that, in
essence, influence is proportional to the degree of fit between message characteristics (social
content, prescriptive content, descriptive content, framing) and recipients' characteristics (group identity, attitudes, knowledge, cognitive makeup). Fitness on these four characteristics then shapes
perceptions of thought validity regarding the message, and ultimately behavior change. Thus, we
propose that propaganda attempts to maximize fitness between message and recipient
characteristics and self-validation in the direction of the message. From the FVM, we then derive
five main pathways for propagandist persuasion and dissuasion, which we label the 5D: Deceive
social intuition, Divert resistant attitudes, Disrupt information processing, Decoy reasoning and
Disturb meta-cognition. The 5D are mutually inclusive and can be seen as the “building blocks”
of real-world propaganda. We discuss the theoretical implications of the FVM and conclude the
model should be used to craft more effective liberal democratic (counter)propaganda.
Keywords: propaganda, misinformation, psychological operations, fitness-validation, persuasion
Conflict of Interests: The author declares no conflict of interest that may have affected this work.
1 North-Atlantic Fellas Organization, NAFO Headquarters, Langley Falls, Virginia, USA
“Do not try to suppress or control an opponent unnaturally. Let attackers come any way they
like and then blend with them.”― Morihei Ueshiba, The Art of Peace.
“The supreme art of war is to subdue the enemy without fighting.” ― Sun Tzu, The Art of
War
Introduction
Propaganda can be broadly defined as the use of persuasion techniques by an agent with the aim
of modifying a recipient’s behavior in a targeted direction. Since its inception more than a
century ago, social psychology has extensively focused on the study of behavior and attitude
change, which, in essence, leads to applications for propaganda. As an illustration, the early
studies on social influence (Lewin, 1947) and norm formation (Sherif, 1936), cognitive
dissonance (Festinger, 1957), social representations (Moscovici, 1981) and social identity
(Tajfel, 1974) were all born out of experience with war, genocide, and dictatorship. In the US,
the applications of these theories directly fed State propaganda to civilians (e.g. easing meat
rationing during WWII; Lewin, 1943), Cold War counter-narratives against Soviet
disinformation (Hovland et al., 1949) and civil rights campaigns to fight racism against African-Americans.
To this day, social psychologists still focus on attitude formation (Jones et al., 2010),
attitude change (Bohner & Dickel, 2011) and its moderators (e.g. personality traits, Haugtvedt &
Petty, 1992), as well as on the study of robust ways to predict and change related behaviors
(Schultz et al., 2018). Whether these attitudes pertain to the environment (Van Lange et al.,
2018), health (Van Bavel et al., 2020), intergroup relations (Halperin & Schori‐Eyal, 2020) or
ideology (Jost, 2017), social psychological research is almost invariably justified with the aim to
help understand and change attitudes or behaviors (that is, to craft better propaganda) to tackle
important societal issues such as global warming, vaccine hesitancy, racism, or violent
extremism. In fact, most of the prescribed solutions to these societal issues rely on applying
better, more effective techniques to foster change in a desired direction (e.g. attitudes towards vaccination). Yet despite this wealth of psychological research, an operational model of propaganda is still lacking in the field, which
impedes applied and theoretical progress. In this paper, we propose a first attempt to build such a model, the Fitness-Validation Model (FVM), by drawing on up-to-date advances in the field of persuasion, namely Self-Validation Theory (Briñol & Petty,
2022), and by integrating advances in the fields of behavior change, conspiracy beliefs,
disinformation, motivated cognition, social influence, and identity. The FVM then allows us to
identify five derivative strategies of propagandist persuasion (5D) for applied purposes.
Propaganda as an object of investigation has been around for decades within the social sciences.
Early works on the topic sprung up in the 1920s following the rise of modern marketing methods
and far-left and far-right ideologies making use of propaganda to their political advantage
(Lasswell, 1927; Bernays, 1942; Vöhringer, 2011). Propaganda was extensively studied from a
historical and political perspective, focusing on describing the materials used by and motivations
of propagandists, as well as the socio-political contexts that led to their emergence and the effects attributed to them.
In psychology too, attempts to explicitly define, model, and classify propaganda were
made in the wake of WWII. For instance, researchers tried to classify propaganda depending on
its origin and goals, thereby differentiating democratic from authoritarian forms of propaganda
(Bartlett, 1940). Other theoretical work tried to approach propaganda by analyzing its target,
namely public opinion (see for instance Herman & Chomsky, 2010). Finally, another line of
research on propaganda emerged from “sociological” social psychology (Oishi et al., 2009), this
time focusing on its dissemination and spread through rumors and collective beliefs (Beauvois,
2007; Rouquette, 1996). In parallel, social psychologists have developed several models and
theories which are relevant to propaganda without including propaganda per se. These pertain to
persuasion and processes of attitude change more broadly speaking (Briñol & Petty, 2022), the
psychology of fake news (Pennycook & Rand, 2021), the inner workings of cognitive and
reasoning biases (Haselton et al., 2015), the determinants of conspiracy beliefs (Hornsey et al.,
2022), the cognitive and motivational underpinnings of ideologies (Zmigrod et al., 2021) and the
mechanisms of motivated cognition (Ryan et al., 2021), including partisanship and social identity effects.
The contrast between the two approaches is stark. On one hand, the intuition that
propaganda is a phenomenon of scientific relevance has led to “macro” theories with ecological
validity but low internal validity and usefulness for application. These theories often involve
unclear mechanisms, nebulous concepts and lie disconnected from modern social psychological
research. On the other hand, the experimental documentation of the cognitive mechanisms of
persuasion generated a wealth of “micro” models and empirical discoveries with high internal
validity but poor ecological validity. Disconnected from one another, these mechanisms often
yield small effects when applied and fail to mimic the features of real-world propaganda (e.g.
accuracy-nudge, Roozenbeek et al., 2021). Therefore, a model bridging the gap between these two levels of analysis is needed (see figure 1).
By proposing the FVM, we wish to offer a basis for working towards a true science of
social influence, ideology, fake news, and conspiracy beliefs. We also aim to align the model
with early propositions to explicitly consider links between cognitive and sociological
phenomena in social psychology theories (see McGuire, 1973). Finally, the model is proposed to
address more recent calls for psychologists to focus on theorizing to overcome the field's current theoretical fragmentation.
Figure 1.
Situating a level of analysis for a social psychological model of propaganda.
2. Modeling propagandist persuasion: outlining the FVM
As per the introduction, we decided to adopt a consensual and broad definition of propaganda as
“the use of persuasion techniques by an agent with the aim of modifying a recipient’s behavior
in a targeted direction.” The term “techniques” admits any kind of intervention, ranging from
educational interventions to mass online advertisement. An “agent” can be any motivated actor, from a single individual to a State. Likewise, the recipient of propaganda need not be a vast entity such as the public opinion: any individual or group of individuals can be targeted.
Importantly, this definition assumes that ultimately, all propaganda aims at behavior
change (or prevention): attitude change is invariably instrumental in this regard. For instance, the
current “morale boosting” propaganda of the Ukrainian State targets attitudes towards the war, but its ultimate aim is to sustain resistance behavior. Conversely, the propaganda of authoritarian states (e.g. China, Russia, Turkey) seeks to prevent regime-challenging behavior,
such as tract distribution, online petition signing and protesting. Finally, the notion of “targeted direction” distinguishes propaganda from education, which aims at providing citizens with the least biased knowledge available on a given topic so they may exert their own judgment.
This broad definition avoids excessive focus on less relevant characteristics. For instance,
classifying propaganda based on its informational content (true or false, emotional, or rational;
Biddle, 1931) could miss critical aspects of the phenomenon, since much propaganda does make
use of true information framed in a persuasive way (Altay et al., 2021). Similarly, definitions of propaganda centered on specific channels (e.g. mass media exposure), specific actors (e.g. parties or States), or specific recipients (e.g. public opinion; Herman &
Chomsky, 2010) could dismiss relevant empirical similarities between different forms of
propaganda. Finally, a distinction of propaganda based on its overt or covert nature (e.g. white
and black propaganda, see Linebarger, 1954) would be too reductive and may not fit modern
empirical examples of psychological operations, which can involve multiple parallel strategies
and actors (both covert and overt, online and offline) at the service of a single goal. Now that we
have an operational definition of propaganda, let us turn to the constitutive elements of the FVM.
The idea behind the FVM is to make use of the most up-to-date science of attitude and behavior
change and synthesize it into basic elements or parameters. This synthesis operates primarily on
two social psychological concepts, which sum up a substantial part of the literature and structure
our integrative model across social psychological subfields. The first, and perhaps most central, is the concept of Fitness.
Across models of persuasion and attitude change, messages tend to be separated from
other components, such as their source (Eagly & Chaiken, 1984; Briñol & Petty, 2009) or the
way they are framed (e.g. as explicit losses or gains, Grazzini et al., 2018). Here, we propose to
combine all these elements into basic dimensions of a message. We propose that messages can
vary along four basic dimensions: social content (social identity cues, including source), prescriptive content (behavioral prescriptions), descriptive content (factual information), and format (message framing). Similarly, social psychological research tends to separate the fields
investigating how group identity or partisanship shapes information processing (Abrams &
Hogg, 1990; Ditto et al., 2019a), how counter-attitudinal messages generate reactance (Belanger
et al., 2020, Proulx et al., 2012), how information shapes attitudes (Wood et al., 2019) and how
messages framed to fit the recipient's psychological makeup generate more persuasion (e.g. use of tailored framing). These literatures nonetheless describe complementary facets of the same influence processes. Hence, we propose that, just like for messages, recipients can be classified
along four basic dimensions of influence: social identity, information (factual knowledge),
attitudes (like-dislike stances on different issues), cognitive makeup (emotions, personality traits,
risk aversion…).
We propose that upon exposure to a message, a comparative fit test is engaged, involving both conscious and less conscious mechanisms. Fitness
will increase the message’s effectiveness, while discrepancies will do the opposite. This
comparative process occurs simultaneously across four domains, between the four above-
mentioned characteristics of message and recipient. We label these Social fit, Attitudinal fit, Informational fit, and Cognitive fit (see figure 2).
Social fit is the degree of fit between the social content of a message and the social
identity of the recipient. In line with social influence (Cialdini, 2006) and social identity research
(Hogg & Smith, 2007), Social fit is maximized when the social elements of a message resonate
with the social identity of the target: its content indicates a majority position shared by
recipient’s ingroup (Schultz et al., 2018), its source is perceived to be ingroup or allied (Ditto et
al., 2019b), and the intended purpose behind the message is perceived to be benevolent towards
the recipient’s ingroup (i.e. positive imputation, see Prost et al., 2022). Although persuasion
research points at various relevant social characteristics (e.g. source expertise and credibility;
von Hohenberg & Guess, 2022), we propose that these are all consequences of Social fit.
In general, messages from ingroup sources tend to be more persuasive (Haslam et al.,
1996; Wyer, 2010). Individuals will distrust experts on the basis of ingroup-outgroup
distinctions, or if they perceive malevolent intentions towards their ingroup (i.e. conspiracy
beliefs, Bertin et al., 2020). These social preferences are so strong that they can come at a high
price for the recipient. Among those who followed the wrong prescription because of Social fit
during the COVID-19 pandemic, some paid with their life (i.e. US Republicans, Gollwitzer et al., 2020).
Figure 2.
The Fitness process and four domains of application.
Attitudinal fit is the degree of fit between the prescriptive content of a message and the
attitudes of the recipient. In line with decades of research on reactance, polarization (Axelrod et
al., 2021) and attitude change (Festinger, 1957; Harmon-Jones, 2019), Attitudinal fit is obtained
when the prescriptions of a message match the recipient’s attitudes. Conversely, research points
at so-called “backfire effects” when recipients are exposed to counter-attitudinal messages (Bail et al., 2018).
Thus, Attitudinal fit can also be achieved by not overtly displaying a counter-attitudinal
prescription or by framing the message in a way that logically extends the recipient’s attitude to
an in fine counter-attitudinal conclusion (i.e. paradoxical thinking, see Hameiri et al., 2019). As an illustration, a pro-immigration message targeting economically conservative recipients would achieve greater Attitudinal fit by making the case that welcoming more immigrants fosters economic growth.
We also propose a parallel process of Informational fit, which is the fitness index
between the descriptive content of a message and the recipient’s level of knowledge.
Informational fit is maximized when the information delivered by the message is consistent with
a recipient’s factual knowledge. However, unlike in the case of attitudes, information tends to be
more malleable, and recipients generally display low levels of factual knowledge on many
different topics outside their area of expertise (especially activists, Rosling et al., 2018).
Consequently, research shows that information which does not strictly align with one’s attitudes
may still be integrated without triggering reactance (Mercier, 2016; Ecker et al., 2020; Wood et
al., 2019).
Finally, Cognitive fit is the degree of fit between the format of a message and the
cognitive makeup of the recipient. It draws upon the numerous research findings covering
framing effects (see Steiger & Kühberger, 2018), models of persuasion (Albarracin & Shavitt,
2018), as well as the emotional, personality and cognitive moderators of persuasion (e.g. Need
for Cognition, Petty et al., 2009; Need for Closure, Webster & Kruglanski, 1994). Hence,
Cognitive fit can be achieved in many ways. For instance, research shows that when the
emotional tone of the message matches recipients’ emotional state (see Lammers & Baldwin,
2018), they can be led to adopt initially counter-attitudinal positions. Likewise, matching the
complexity or quality of a message to the recipients’ cognitive sophistication can improve central
(in-depth) processing and thus persuasion (Griffith et al., 2018). Conversely, breaking down a
prescription into simple steps may create a greater sense of clarity and ease, which in turn
facilitates behavioral achievement (Gollwitzer & Sheeran, 2006). Framing can also attempt to fit
to the recipient's personality, for instance by feeding threatening information to individuals who score high in neuroticism.
Having defined Fitness and its subcomponents, we will now turn to the second key element of
the FVM: Validation. The concept of Validation initially stems from the most up-to-date theory
of persuasion and attitude change: Self-Validation Theory (Briñol & Petty, 2022). In short, the
theory states that perceived thought validity is the main mediator of persuasive effects.
Individuals will change their attitudes if they perceive their thoughts about the message to be
valid (Petty et al., 2002; Briñol & Petty, 2009). This meta-cognitive aspect of persuasion
research is of crucial relevance for modelling propaganda. In fact, much of the effectiveness of
real-world propaganda comes from its capacity to instill doubt or to foster certainty (i.e. metacognitive states) depending on its objectives (see Charon & Vilmer, 2021). Accordingly, we propose that the persuasive power of Fitness lies in its capacity to generate more positive (or less negative) cognitions about the message among
recipients. This proposition is also consistent with research showing that attitude certainty and
confidence in one’s judgement is often a better predictor of behavior than attitudinal valence
(see Tormala & Rucker, 2018 for a recent review). Accordingly, the FVM posits that Fitness
effects on behavior should be mediated by self-validation (see figure 3). In other words, the
effect of Fitness in the four domains of the model on attitude change should be due to its ability
to foster increased perceptions of thought validity among the recipient. The higher the total
Fitness across domains, the more confident the recipient should be in their own positive
appreciation of the message. The concept of Validation also allows us to introduce the viability of dissuasion strategies for contexts in which persuasion to incite desired behaviors through validation is not possible (as when targeting highly resistant recipients).
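As a minimal formal sketch, the mediation chain described above can be written as follows. The additive combination and the weights w are simplifying assumptions on our part; the FVM itself specifies only the direction of these relations:

```latex
% Total Fitness: combination across the four domains
% (additive form and weights w are simplifying assumptions)
F_{\text{total}} = w_{s}F_{\text{social}} + w_{a}F_{\text{att}}
                 + w_{i}F_{\text{info}} + w_{c}F_{\text{cog}}

% Perceived thought validity increases with total Fitness
V = f(F_{\text{total}}), \qquad f' > 0

% Behavior change is driven by Validation, not by Fitness directly
\Delta B \propto V
```

In words: Fitness in the four domains raises perceived thought validity, and behavior change follows from validity rather than from Fitness directly, which is why strategies targeting Validation itself (see the 5D below) can operate independently of message content.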
Figure 3.
Overview of the Fitness-Validation Model (FVM).
As we have seen, the FVM comprises five parameters: four types of Fitness, and a mediating
mechanism of meta-cognitive Validation. From these, we can directly derive five main pathways
through which propaganda can generate persuasion (or dissuasion), and which we have labeled
the five Ds, or 5D. Each pathway runs through one parameter of the FVM: Deceive taps into Social fit, Divert into Attitudinal fit, Disrupt into Informational fit, Decoy into Cognitive fit, and Disturb
targets self-validation. The 5D can be combined to maximize persuasive effects (see figure 4).
The first strategy, to Deceive, consists in misleading recipients' social intuition by mischaracterizing the message as emanating from an ingroup (i.e. a safe and benevolent) source, or by depicting it as widely approved among the recipient's ingroup. Deception allows propagandists to bypass partisan and intergroup biases, thus improving the Social fit of a
message. To Deceive the recipient, a propagandist can hide the initial source of the message and
use a mediating source to disseminate the message instead. One way to do so is through online
coordinated operations using bots or paid “trolls” (i.e. astroturfing, Keller et al., 2020). Another
strategy among industry lobbyists is to use intermediary organizations such as charities or NGOs
who seem unrelated but are partially or fully funded by the source company (De Bruycker &
Beyers, 2019).
Perceived social norms indicating the ingroup's majority position on an issue can be powerful vectors of behavior change (Schultz et al., 2018), especially
since their influence goes unnoticed by the recipient (see Nolan et al., 2008). As an illustration,
the 2022 illegal Russian referenda in occupied Ukrainian territories depicted implausibly high
figures in favor of annexation, between 87% and 93%. According to the FVM, these referenda served a double deceitful propaganda purpose. First, they could have generated perceptions of false consensus among the occupied population, deterring them from resisting occupation (Bauman & Geher, 2002). Second, they could have perpetuated pluralistic ignorance among outside observers (i.e. non-Ukrainian populations), and decreased support for helping the occupied populations.
In addition to norm salience, a message can also Deceive a recipient by tapping into self-categorization processes (Leonardelli & Toh, 2015). This can be achieved by using cues for specific group identities to
generate Social fit. In fact, a defining feature of populist rhetoric is the reliance on social identity
cues such as “the people” versus “the elite”, or “natives” versus “immigrants” (Bos et al., 2020).
Yet, cueing identities is just one of two ways to use social categorization. In fact, recent
successful propaganda efforts such as the initial video sparking the Yellow Vests protests in
France (Mahfud & Adam-Troian, 2021) or the North Atlantic Fella Organization online
campaign (NAFO, support for Ukraine, Taylor et al., 2022) all make use of completely novel
identities. These identities were crafted with the help of elements already familiar to the recipients: a yellow vest present in every French car (a legal obligation) to incite protesting against increases in gas taxes (“driver” identity), and a dog picture drawn from internet meme culture to rally online support for Ukraine (“fella” identity).
Conversely, if the context is not favorable to improving Social fit between a message and
a recipient, deception can be achieved by degrading Social fit between an alternative message and the recipient. Narratives depicting alternative sources as hostile against the recipient's ingroup are particularly suited for this purpose. For instance, in the 1980s,
the KGB fabricated and disseminated narratives accusing the US of having manufactured AIDS as a biological weapon. These narratives, designed to stir anti-US government resentment among Western populations, were so successful that, thirty years
later, 15% of African Americans still believed that AIDS was a tool of genocide against them
(see Boghardt, 2009). In a similar vein, the French far-right has coined the term “Islamo-leftist” to depict left-wing parties as allied with Islamist movements, degrading their Social fit among recipients.
A lighter version of this strategy is used by Turkish, Russian, and Chinese propaganda outlets, which disproportionately cover, amongst other news, malevolent social behavior from Western
authorities towards “the people” (police violence against protesters, corruption cases from high
officials… Charon & Vilmer, 2021). Compounded over time, this low-intensity propaganda
frame spreads distrust towards the recipient’s own government, which is progressively perceived
as an outgroup. This could lead recipients to be more trusting of the foreign propaganda source
than of their own news media, further increasing their susceptibility to propaganda.
The second strategy, to Divert, targets Attitudinal fit. Persuasion comes easily when a message already possesses high Attitudinal fit. But reactance can arise when targeting individuals
whose attitudes are opposed to a message’s prescriptions (e.g. message against political violence
aiming at violent extremists, Belanger et al., 2020). To minimize or avoid discrepancy in the
attitudinal domain, diversion can be used. To Divert and bypass recipients’ resistance, a
propagandist can try to change the attitude of reference for the comparison process. Instead of
directly confronting the recipient’s resistant attitudes, diversion will either blend in with the
recipients’ attitudes or activate a different set of nonresistant attitudes related to the prescription.
One way to do so is to shift the focus away and “zoom out” to redefine the problem at
hand. For instance, an anti-legalized abortion message would immediately encounter attitudinal
resistance among individuals who favor the progress of women's rights. Attitudinal fit can be improved by reframing the issue around abortion's alleged negative effects on women's mental health (while carefully omitting legal abortion's positive effects on maternal mortality;
Reardon, 2018). This way, de-legalizing abortion is presented as an improvement for women’s
mental health rather than as a regression in terms of women’s rights. A second strategy to Divert
uses a variation on this technique. Instead of redefining the problem to change the attitudinal
domain of reference, it consists in “zooming in” to the details of an issue to do the same. This
technique is extensively used to justify aggression, when the propagandist operates on behalf of a
perpetrator (i.e. victim blaming, Van den Bos, 2009). This technique is found in the Kremlin’s
propaganda narrative depicting their illegal invasion of Ukraine in 2022, framed as a response to
NATO eastwards expansion (which has no factual basis, Rühle, 2014). This type of narrative can
shunt resistant attitudes against war by shifting attention towards the putative causal mechanisms behind the invasion. A third strategy leverages moral framing (i.e. moralization, Mooijman et al., 2018) to Divert the recipient towards higher-order moral attitudes
(instead of attitudes towards the issue per se). In line with our previous example, Kremlin-
backed actors spread another narrative regarding their invasion, aiming to bypass the positive
attitudes of Western audiences towards support for Ukraine. According to this narrative,
supporting weapons delivery to Ukraine means “escalating” the conflict. This narrative turns
support for an invaded country’s right to defend itself into a hawkish choice leading to more
deaths (Tenzer, 2022). Thereby, it decreases Attitudinal fit between recipients' moral aversion to casualties and their support for Ukraine. A similar pacifist framing was already used by Nazi propaganda during WWII (Orwell, 2018). Likewise, lobbyists
frequently make use of euphemisms to tap into widely shared moral attitudes: who would object to a message framed around values as consensual as safety or freedom?
Unlike attitudes, which are sensitive to counter-positions, information is more malleable and less
subject to reactance. Because they are not tied to moral judgement, emotions, and social identities, factual descriptions tend to provoke fewer “backlash effects” when contradicting one's attitudes on a given issue (Ecker et al., 2020; Wood et al., 2019). However, this does not mean that Informational fit is less relevant than other types of fit: the way information is framed and facts are presented does matter, and it needs to be tailored to the knowledge of the recipient.
The Disrupt strategy fully exploits the information ecology in which recipients are
immersed. First, disruption can be created by owning temporal primacy over a narrative before
recipients can accumulate any alternative information. This can be done by exposing naïve recipients to the propagandist's narrative first. Indeed, psychological inoculation (McGuire, 1964) works by exposing individuals to facts before they encounter a specific piece of fake news in order to protect them against misinformation (Maertens et al.,
2021). However, psychological inoculation does not depend on the content of a message: any
early exposure to a counterargument will protect individuals from an argument encountered later
(Banas, & Rains, 2010). Therefore, propaganda can also be used for psychological “intoxication”
by feeding naïve recipients with counter-arguments against alternative, yet valid, information. This strategy is used by creationist online propaganda, which essentially provides
counter-arguments to scientific claims before their audience has been sufficiently exposed to the science itself. In other cases, however, recipients already possess extensive knowledge regarding the issue at hand (e.g. trying to
convince a geologist that the earth is flat). In these cases, inoculation and intoxication will not be
fit for purpose and Informational fit may not be a realistic goal. When this happens, another
strategy can be used to generate disruption through saturating available information. Saturation
aims not to generate fit between message and recipient, but misfit between an alternative
message and the recipient instead. This is typically done by feeding the recipient with alternative narratives: the mere availability of an alternative narrative could be sufficient to generate Informational misfit between the recipient's knowledge
and a narrative. This could be why the spokespersons for authoritarian regimes so confidently
repeat information or news that recipients and third-party observers know to be false. For
instance, the Chinese Communist Party’s spinning of their Xinjiang concentration camps for
Uyghurs as “educational” (BBC, 2020) could have a persuasive effect by breaking down observers' certainty about what they know to be true.
A variation on the saturation technique is to make use of true statements instead of fake
news. Accordingly, exploiting scientific doubt and uncertainty is a common practice of industry lobbies seeking to obscure the harms their products are contributing to (e.g. bee population decline). This is done to drown policy makers and the public
into fabricated complexity, which can then delay their deliberation and action (i.e. to achieve
dissuasion, see Bramoullé & Orset, 2018). This “benefit of doubt” tactic created by the mere
availability of alternative narratives is – in our view – one of the most pervasive forms of modern
propaganda. It is a defining feature of our post-truth era and of the political style of right-wing populist leaders.
The last propaganda strategy relying on Fitness is to Decoy by targeting Cognitive fit. The aim is to steer recipients' reasoning towards a propaganda-compatible conclusion. This can be achieved by tapping into various persuasion and framing
techniques afforded by social psychological research. The common feature of these techniques is
their reliance on generating a sense of fit between the message characteristics and the recipients’
psychological makeup. As such, successful Decoy will make the message feel intuitive to the
recipient, although its prescription could be contrary to their own beliefs and interests.
The first category of Decoy propaganda exploits cognitive consistency: the message is
crafted to fit the recipients’ cognitive architecture. Broadly speaking, mass campaigns towards
the general public can tap into universal cognitive biases (e.g. scarcity; Mittone & Savadori,
2009; sensitivity to semantic prosody; Hauser & Schwartz, 2018). Such messages can be cost-
effective, for instance when promoting health behaviors (gym attendance; Milkman et al., 2019).
However, to maximize impact, finer-grained propaganda can be crafted to fit the cognitive profiles of narrower audiences. For instance, emotionally charged content is more effective among ideological extremists (Mitts et al., 2022) because they are more responsive to it.
Perceptual and cognitive processes are not the sole targets of Decoy strategies. In fact,
emotional states too can be leveraged to generate Cognitive fit among recipients. Recent
evidence showed that matching the emotional tone of a message to recipients’ emotions like
enthusiasm and fear significantly increased its persuasiveness (Zarouali et al., 2022). This
emotional-congruence effect can even be leveraged to bypass attitudinal resistance. For example,
it is established that political conservatives tend to display higher chronic nostalgia than liberals
(Lammers & Baldwin, 2020). Consequently, messages with a past-focused temporal frame (“I
would prefer to go back to the old days, where people may have owned hunting rifles and pistols,
but no one had assault rifles.”) were found to increase conservatives’ endorsement of policies
that were opposed to their attitudes (increasing gun control; Lammers & Baldwin, 2018).
It is also possible to Decoy recipients based on personality. This strategy was used during
Donald Trump’s 2016 presidential campaign in the US. Data collected on social media allowed
the classification of voters based on their personality traits by a third-party company. This
company then sent targeted advertisements including, for example, threatening content for highly
neurotic individuals (who tend to focus more on negative stimuli; Bakir, 2020). Similarly,
messages matching the linguistic style of extraverts (confident, assertive) and introverts (open,
questioning) tend to achieve greater persuasiveness (Zarouali et al., 2022). It is possible to tap
into other salient individual differences, such as collective narcissism (Cichocka & Cislak,
2020). This was done in a recent campaign in Turkey emanating from the main opposition party
(CHP). The CHP aimed to sway partisans of the nationalist and anti-Western incumbent in the
upcoming 2023 presidential election. To Decoy this electorate, the pro-European CHP displayed
billboard messages using collective narcissist frames such as “Hey world… Turkey will not be
your refugee camp” and “…Turkey will not be your go to place for cheap labour” (Yeni Safak,
2022).
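The trait-matching logic described above can be sketched as a toy selection routine. To be clear, the trait labels, message variants, and winner-take-all rule below are illustrative assumptions for exposition, not a reproduction of any real campaign’s system:

```python
# Toy sketch of trait-matched message selection (the "Decoy" pathway).
# Variants and the selection rule are hypothetical illustrations only.

VARIANTS = {
    "neuroticism": "threat-focused variant",       # negative stimuli are more salient
    "extraversion": "confident, assertive variant",
    "introversion": "open, questioning variant",
}

def pick_variant(trait_scores: dict) -> str:
    """Return the message variant matching the recipient's most salient trait."""
    salient_trait = max(trait_scores, key=trait_scores.get)
    return VARIANTS.get(salient_trait, "generic variant")

# A highly neurotic profile is served the threat-focused variant:
profile = {"neuroticism": 0.9, "extraversion": 0.4, "introversion": 0.2}
chosen = pick_variant(profile)
```

In practice, of course, such targeting pipelines involve probabilistic trait inference and A/B testing rather than a single lookup; the sketch only makes the matching principle explicit.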
Figure 4.
The 5D of propagandist persuasion and their relationships with elements of the Fitness-
Validation Model (FVM).
Like efforts targeting Fitness, strategies to Disturb the Validation process can prove extremely
useful. Disturbance can take either a positive or a negative form. Positive disturbance aims at
boosting recipients’ perceived thought validity regarding the message. Examples of positive
disturbance include populist rhetoric flattering the epistemic virtues of “the people” over the
“intellectual elite”. Other forms of propaganda (e.g. traditional alternative medicine
advertisement, religious indoctrination) often refer to the superior validity of the recipient’s
intuition (or faith) relative to expertise and scientific or rational analysis (Evans et al., 2020;
Šrol, 2022; Yilmaz, 2021). Typically, this is done with messages including relativistic
arguments which depict recipients’ intuition as equally valid as that of experts or scientists.
Positive disturbance is, however, less likely to affect resistant recipients. This is why
autocratic and illiberal regimes base much of the disinformation propaganda they aim at well-
educated WEIRD populations (Muthukrishna et al., 2020) on negative disturbance. This type of
disturbance aims at paralyzing judgment and crippling action among recipients, instead of
bolstering them. Negative disturbance is akin to gaslighting in interpersonal relationships
(Johnson et al., 2021). Instead of flattering the recipient’s ability to reason and make
judgements, negative disturbance denigrates their capacity to make informed (or least biased)
judgements.
Negative disturbance can come in two variants. It can imply that the recipient cannot take a
position on an issue either because they do not possess the abilities to do so, or due to their
social or cultural identity. For instance, an overly abstract and complex presentation of
arguments devoid of logical coherence can bully individuals into adhering to a prescription
because they may feel unable to assess its veracity. Although the use of pseudo-profound
bullshit of this kind is quite frequent in religious, pseudoscientific, and new age propaganda
(see Pennycook et al., 2015), it is
often used by activists targeting educated audiences. This is a staple of modern far-left
propaganda, which deploys obscure jargon to overwhelm recipients, a tactic that seems to work
especially well among academic audiences (see Andreski, 1972; Sokal & Bricmont, 1997).
Illiberal state actors also appear to be making extensive use of negative disturbance based on
social and cultural bias. Authoritarian regimes often base their counter-propaganda efforts on
culturally relativistic arguments. For example, criticism of Islamism in the West is often
deflected by conflating it with Islamophobia and prejudice towards Muslims (Adam-Troian, 2021;
Imhoff & Recker, 2012). In a similar way, criticisms of the current denial of the Armenian
genocide in Turkey are often neutralized through such relativistic framings (Marchand & Perrier,
2022). Propaganda channels from illiberal actors targeting WEIRD recipients often make use of
narratives and documentaries about “American imperialism”, European colonialism, and the
ongoing “victimization” of non-Western diasporas in the West. This strategy based on competitive
victimhood (Young & Sullivan, 2016) can be very effective in neutralizing negative attitudes
towards the message source. As an illustration, in the wake of the 2022 invasion of Ukraine, 43%
of French citizens believed that the Russian military intervened to protect Russian-speaking
populations “from the persecution they suffer at the hands of Ukrainian authorities” (IFOP &
Reboot, 2022).
Discussion
In this paper, we have proposed the FVM, the first model of propaganda based on current findings
in social psychological research. This model relies on Fitness between message and recipient
along four dimensions, and on mediation by the metacognitive process of Validation. This
synthetic account makes it possible to integrate various social psychological subfields of
interest to the study of propaganda. It is also the basis for deriving five main pathways to
propagandist influence, the 5D. These strategies to Deceive social intuition, Divert resistant
attitudes, Disrupt information processing, Decoy reasoning and Disturb meta-cognition can
describe a variety of real-world propaganda efforts in different domains. Moreover, they
constitute the basic “building blocks” or ingredients from which propaganda and
counterpropaganda can be created and tested by policy makers, propaganda professionals and
applied researchers alike. Before concluding, we will now briefly discuss some of the
implications of the FVM and highlight some important limitations of the model.
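The model’s core claim, that influence is proportional to message–recipient fit and is mediated by perceived thought validity, can be expressed as a toy computation. The equal weighting, the [0, 1] scaling, and the identity mapping from fitness to validation below are simplifying assumptions for illustration; the FVM itself is a verbal, not a quantitative, model:

```python
# Illustrative toy formalization of the FVM; all numeric choices are
# assumptions for demonstration, since the model itself is verbal.

FVM_DIMENSIONS = ("social", "prescriptive", "descriptive", "framing")
# These pair with recipients' group identity, attitudes, knowledge,
# and cognitive makeup, respectively.

def fvm_influence(fit: dict, weights: dict = None) -> float:
    """Predicted influence of a message given per-dimension fit scores in [0, 1].

    Fitness (weighted mean of the four fits) shapes Validation (perceived
    thought validity), which in turn drives attitude and behavior change.
    """
    if weights is None:
        weights = {d: 1 / len(FVM_DIMENSIONS) for d in FVM_DIMENSIONS}
    fitness = sum(weights[d] * fit[d] for d in FVM_DIMENSIONS)
    validation = fitness   # simplification: perceived validity tracks fitness
    return validation      # influence proportional to validated fit

# A message fitting the recipient on all four dimensions out-persuades
# one that fits only on framing:
high_fit = {"social": 0.9, "prescriptive": 0.8, "descriptive": 0.9, "framing": 0.8}
low_fit = {"social": 0.1, "prescriptive": 0.2, "descriptive": 0.1, "framing": 0.8}
```

Even such a crude sketch makes testable predictions explicit, for instance that raising fit on any single dimension should raise predicted influence, holding the others constant.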
First and foremost, the FVM and its 5D constitute a substantial attempt at maximizing
explanatory power while keeping a low number of parameters. The FVM can explain a variety of
real-world propaganda elements, tactics and strategies that seem, on the surface, unrelated to one
another. Without such a model, it could be difficult to detect the common mechanisms at play
behind online far-right trolling, anti-Western rhetoric in international news channels and the
mass of studies emanating from the food industry’s PR departments. The FVM is organized
around persuasion and meta-cognition, rather than propaganda content, which may provide a more
accurate basis for classifying propaganda. In fact, the FVM allows a classification based on
dynamic psychological strategies (or “blueprints”) instead of fixed contents. The model possesses
ecological validity because it converges with other models and conceptualizations made
independently by other agents (e.g. “reflexive control” as theorized by the KGB; Till, 2021). The
FVM framework also enables researchers to test novel sets of hypotheses. As an example, the
processes of Informational fit and Disrupt suggest that the mere availability of alternative
information sources may, in itself, shape the effectiveness of propaganda.
The FVM can indeed offer a robust and integrative framework within which propaganda
researchers could build research programs for theoretical developments in the field. Yet, by
linking research in different areas, the FVM also creates affordances for fresh insights into
already intensively researched phenomena. For instance, over the past decade, much research has
been dedicated to studying the causes and consequences of conspiracy beliefs in various domains
(see Hornsey et al., 2022 for a recent overview). In light of the FVM, however, novel research
questions can emerge. At the moment, the role of conspiracy beliefs as persuasive narratives
remains underexplored. How and to what extent could conspiracist-framed messages foster
attitude change in domains other than intergroup relations? Likewise, the four Fitness domains
described by the FVM could be linked with the predictors of conspiracist ideation. This could
help to explain the psychological attractiveness of such narratives. Social fit matches the social
motives underpinning conspiracy beliefs (Sternisko et al., 2020), while Cognitive fit matches the
use of intuitive reasoning bias (e.g. conjunction fallacy) in conspiracist narration (see Douglas et
al., 2019).
More broadly speaking, the flexibility of the FVM and the scope of the 5D shed a novel light on
propaganda, extremism and misinformation. As we have seen, propaganda fully exploits the
principle of psychological equifinality (Heider, 1958): several causes can underlie a behavior
or attitude, and several mechanisms or processes can lead to the same outcome. As such, several
strategies and combinations of the 5D across various target publics can achieve a
propagandist’s goal. The use of one or several Ds over others is constrained only by the features
of the target audience and context. Moreover, the multiplicity of pathways for propaganda
suggests that the behaviors under study by influence and persuasion
researchers should be expanded. The behavioral repertoire of propaganda can include spreading
a message, rumor or fake news, arguing with relatives for the propagandist’s cause, engaging in
collective action seemingly unrelated to the cause (e.g. protest for “peace” in a conflict instead of
protesting in support of one side), inaction itself, and even lack of interest in an important issue
(i.e. depoliticization).
Similarly, extremism seen through the lens of the FVM may appear to have positive
functions depending on its content. Indeed, current research in political psychology seems to
implicitly assume that ideological rigidity is a flaw, and often approaches ideology through the
prism of its distorting influence on reasoning, regardless of content (e.g. Zmigrod, 2020). Yet,
the FVM offers a functional approach to extremism. Attitude certainty, rigidity or ideological
extremism may be functional in preventing persuasion attempts from opposed actors. If so,
extreme, certain, and rigid pro-democratic attitudes could paradoxically be the best inoculation
strategy against political violence. Importantly, the FVM promotes the idea that individuals are
not irrational and passive vessels of information to be “brainwashed” (see Mercier, 2016).
Instead, propaganda goes to great lengths to neutralize or excite motivated actors and shape
their outcomes. This is relevant to the current debate surrounding misinformation, because the
5D show that fake news is only one of the many tools of illiberal propaganda, and maybe among
the least useful ones for liberal democratic actors. The FVM can also stimulate applied research
to design ever more persuasive messages for different publics. In view of the 5D, we can see
that the current approach to fighting misinformation represents only a fraction of what can be
done. In fact, liberal democratic and scientific actors have the advantage of the truth when
confronting illiberal authoritarian ones and could think of more active and creative ways of
countering their actions. As an illustration, anti-ISIS propaganda in the West could target
conservative Muslims with video messages using Deceive, giving the real number of foreign
fighters (less than 0.1% of the community), and Decoying with nostalgia for “a time in the 60s,
before the spread of radical Islam, when Muslims were more secular and open”. In parallel, news
channels could disseminate documentaries about the crimes of Ottoman colonialism to Disturb
the anti-Western rhetoric of jihadis. Political communication to foster support for Ukraine
among Western publics could Divert the far-right by highlighting the actions of Russia against
the West and the patriotic nature of support for Ukraine. The far-left could be diverted with content
showing how the Russian State oppresses minorities (sending them to the “meat-grinder”,
McKinnon, 2022), and how support for Ukraine helps minorities “fight against imperialist
oppression”. The possibilities are numerous and solely bounded by the imagination of our liberal
democratic propagandists.
Before concluding, however, we must note three caveats with the FVM, which we hope
further developments could help address. The model’s main strength is indeed its breadth, but
it is also one of its main limitations. For instance, the FVM does not focus on the temporal
dynamics of propaganda and does not include elements related to repeated exposure and
conditioning (see Montoya et al., 2017). Of course, these elements are part of the Cognitive fit
aspect of the FVM, but their time-related features are nonetheless not yet fully integrated.
Second, the limited number of parameters, which is essential to achieve parsimony, leaves
questions open regarding the description of some influence processes in the model. For instance,
the principle of reciprocity (see Cialdini, 2006), which is at the heart of some very effective
lobbying strategies (e.g. in the medical community), could be compatible with both
Deceive/Social fit (relies on forming
“ingroup” relations) and Decoy/Cognitive fit (stems from universal cognitive biases towards
ingroups). This may be illustrative of an inherent limit of any model of propaganda. Although
ecologically valid, the FVM makes distinctions between features that may come bundled
together in the real world, which explains why some strategies may sometimes fit more than one
D at a time.
Conclusion
Although social psychology can be considered the science of propaganda par excellence,
propaganda as an operational concept has remained quasi-absent from the field until now. This
absence is painfully visible as, by the time this article was written, Europe was entering its
first year of a war of the highest intensity since WWII. In this war of genocide against the
Ukrainian people (Connolly, 2022), Russian neo-fascist disinformation (Umland & Eichstaett,
2009) stands among the main weapons on the informational battlefront (Abrams, 2022). In this
context, formal models and
empirical analyses of propaganda efforts from all protagonists could greatly benefit the liberal
democratic war effort of Ukraine and help save a substantial number of lives. This could be
achieved, for instance, by increasing Russian servicemembers’ surrender rates, the volume of
donations to humanitarian aid, and public opinion pressure to supply weapons to the Ukrainian
Armed Forces (thereby shortening the length of
Russian occupation). Likewise, the current issues confronting liberal democracies, such as the
resurgence of terrorism, conspiracism, and antiscientific and illiberal movements, are all issues
where propaganda acts as a pharmakon, being both the ill and the remedy. We hope that the present
paper provides decision-makers and researchers alike with an integrative basis and roadmap to
conduct ever more useful studies of the mechanisms at play behind propaganda and their
real-world applications.
References
Bauvois, J. L. (2007). La propagande dans les démocraties libérales [Propaganda within liberal
democracies]. Le journal des psychologues, (4), 39-43.
https://doi.org/10.3917/jdp.247.0039
Bechler, C. J., Tormala, Z. L., & Rucker, D. D. (2021). The attitude–behavior relationship
revisited. Psychological Science, 32(8), 1285-1297.
https://doi.org/10.1177/0956797621995206
Bélanger, J. J., Nisa, C. F., Schumpe, B. M., Gurmu, T., Williams, M. J., & Putra, I. E. (2020).
Do counter-narratives reduce support for ISIS? Yes, but not for their target
audience. Frontiers in psychology, 11, 1059. https://doi.org/10.3389/fpsyg.2020.01059
Bernays, E. L. (1942). The marketing of national policies: A study of war propaganda. Journal
of Marketing, 6(3), 236-244. https://doi.org/10.1177/002224294200600303
Bertin, P., Nera, K., & Delouvée, S. (2020). Conspiracy beliefs, rejection of vaccination, and
support for hydroxychloroquine: A conceptual replication-extension in the COVID-19
pandemic context. Frontiers in psychology, 2471.
https://doi.org/10.3389/fpsyg.2020.565128
Biddle, W. W. (1931). A psychological definition of propaganda. The Journal of Abnormal and
Social Psychology, 26(3), 283. https://doi.org/10.1037/h0074944
Boghardt, T. (2009). Soviet Bloc intelligence and its AIDS disinformation campaign. Studies in
intelligence, 53(4), 1-24.
Bohner, G., & Dickel, N. (2011). Attitudes and attitude change. Annual review of
psychology, 62, 391-417. https://doi.org/10.1146/annurev.psych.121208.131609
Bramoullé, Y., & Orset, C. (2018). Manufacturing doubt. Journal of Environmental Economics
and Management, 90, 119-133. https://doi.org/10.1016/j.jeem.2018.04.010
Briñol, P., & Petty, R. E. (2009). Source factors in persuasion: A self-validation
approach. European review of social psychology, 20(1), 49-96.
https://doi.org/10.1080/10463280802643640
Briñol, P., & Petty, R. E. (2022). Self-validation theory: An integrative framework for
understanding when thoughts become consequential. Psychological Review, 129(2),
340. https://doi.org/10.1037/rev0000340
Cialdini, R. B. (2006). Influence: the psychology of persuasion, revised edition. New York:
William Morrow.
Connolly, A. (2022, April). Canadian MPs unanimously back motion recognizing Russian
‘genocide’ in Ukraine. Global News. Accessed 21st December 2022 at
https://globalnews.ca/news/8791299/canadian-parliament-recognizes-russian-genocide-
ukraine/
De Bruycker, I., & Beyers, J. (2019). Lobbying strategies and success: Inside and outside
lobbying in European Union legislative politics. European Political Science
Review, 11(1), 57-74. https://doi.org/10.1017/S1755773918000218
De Souza, L., & Schmader, T. (2022). The misjudgment of men: Does pluralistic ignorance
inhibit allyship? Journal of Personality and Social Psychology, 122(2), 265.
https://doi.org/10.1037/pspi0000362
Ditto, P. H., Clark, C. J., Liu, B. S., Wojcik, S. P., Chen, E. E., Grady, R. H., ... & Zinger, J. F.
(2019b). Partisan bias and its discontents. Perspectives on Psychological Science, 14(2),
304-316. https://doi.org/10.1177/1745691618817753
Ditto, P. H., Liu, B. S., Clark, C. J., Wojcik, S. P., Chen, E. E., Grady, R. H., ... & Zinger, J. F.
(2019a). At least bias is bipartisan: A meta-analytic comparison of partisan bias in
liberals and conservatives. Perspectives on Psychological Science, 14(2), 273-291.
https://doi.org/10.1177/1745691617746796
Donohue, J. J., & Levitt, S. (2020). The impact of legalized abortion on crime over the last two
decades. American law and economics review, 22(2), 241-302.
https://doi.org/10.1093/aler/ahaa008
Doob, L. W. (1948). Public opinion and propaganda. Henry Holt.
Dovidio, J. F., Glick, P., & Rudman, L. A. (Eds.). (2008). On the nature of prejudice: Fifty
years after Allport. John Wiley & Sons.
Douglas, K. M., Uscinski, J. E., Sutton, R. M., Cichocka, A., Nefes, T., Ang, C. S., & Deravi,
F. (2019). Understanding conspiracy theories. Political Psychology, 40, 3-35.
https://doi.org/10.1111/pops.12568
Eagly, A. H., & Chaiken, S. (1984). Cognitive theories of persuasion. In Advances in
experimental social psychology (Vol. 17, pp. 267-359). Academic Press.
https://doi.org/10.1016/S0065-2601(08)60122-7
Ecker, U. K., Lewandowsky, S., & Chadwick, M. (2020). Can corrections spread
misinformation to new audiences? Testing for the elusive familiarity backfire
effect. Cognitive Research: Principles and Implications, 5(1), 1-25.
https://doi.org/10.1186/s41235-020-00241-6
Evans, A., Sleegers, W., & Mlakar, Ž. (2020). Individual differences in receptivity to scientific
bullshit. Judgment and Decision Making, 15(3), 401.
Festinger, L. (1957). A theory of cognitive dissonance. Row, Peterson
Gollwitzer, A., Martel, C., Brady, W. J., Pärnamets, P., Freedman, I. G., Knowles, E. D., &
Van Bavel, J. J. (2020). Partisan differences in physical distancing are linked to health
outcomes during the COVID-19 pandemic. Nature human behaviour, 4(11), 1186-1197.
https://doi.org/10.1038/s41562-020-00977-7
Gollwitzer, P. M., & Sheeran, P. (2006). Implementation intentions and goal achievement: A
meta‐analysis of effects and processes. Advances in experimental social psychology, 38,
69-119. https://doi.org/10.1016/S0065-2601(06)38002-1
Grazzini, L., Rodrigo, P., Aiello, G., & Viglia, G. (2018). Loss or gain? The role of message
framing in hotel guests’ recycling behaviour. Journal of Sustainable Tourism, 26(11),
1944-1966. https://doi.org/10.1080/09669582.2018.1526294
Griffith, E. E., Nolder, C. J., & Petty, R. E. (2018). The elaboration likelihood model: A meta-
theory for synthesizing auditor judgment and decision-making research. Auditing: A
Journal of Practice & Theory, 37(4), 169-186. https://doi.org/10.2308/ajpt-52018
Halperin, E., & Schori‐Eyal, N. (2020). Towards a new framework of personalized
psychological interventions to improve intergroup relations and promote peace. Social
and Personality Psychology Compass, 14(5), 255-270.
https://doi.org/10.1111/spc3.12527
Hameiri, B., Bar‐Tal, D., & Halperin, E. (2019). Paradoxical thinking interventions: A
paradigm for societal change. Social Issues and Policy Review, 13(1), 36-62.
https://doi.org/10.1111/sipr.12053
Hameleers, M. (2021). Populist disinformation in fragmented information settings:
Understanding the nature and persuasiveness of populist and post-factual
communication. Routledge.
Harmon-Jones, E. (Ed.). (2019). Cognitive dissonance: Reexamining a pivotal theory in
psychology (2nd ed.). American Psychological
Association. https://doi.org/10.1037/0000135-000
Haselton, M. G., Nettle, D., & Andrews, P. W. (2015). The evolution of cognitive bias. The
handbook of evolutionary psychology, 724-746.
Haslam, S. A., McGarty, C., & Turner, J. C. (1996). Salient group memberships and
persuasion: The role of social identity in the validation of beliefs. In J. L. Nye & A. M.
Brower (Eds.), What's social about social cognition? Research on socially shared
cognition in small groups (pp. 29–56). Sage Publications,
Inc. https://doi.org/10.4135/9781483327648.n2
Haugtvedt, C. P., & Petty, R. E. (1992). Personality and persuasion: Need for cognition
moderates the persistence and resistance of attitude changes. Journal of Personality and
Social psychology, 63(2), 308-319. https://doi.org/10.1037/0022-3514.63.2.308
Hauser, D. J., & Schwarz, N. (2018). How seemingly innocuous words can bias judgment:
Semantic prosody and impression formation. Journal of experimental social
psychology, 75, 11-18. https://doi.org/10.1016/j.jesp.2017.10.012
Heider, F. (1958). The psychology of interpersonal relations. Psychology Press.
Herman, E. S., & Chomsky, N. (2010). Manufacturing consent: The political economy of the
mass media. Random House.
Hogg, M. A., & Smith, J. R. (2007). Attitudes in social context: A social identity
perspective. European Review of Social Psychology, 18(1), 89-131.
https://doi.org/10.1080/10463280701592070
Hornsey, M. J., Bierwiaczonek, K., Sassenberg, K., & Douglas, K. M. (2022). Individual,
intergroup and nation-level influences on belief in conspiracy theories. Nature Reviews
Psychology, 1-13. https://doi.org/10.1038/s44159-022-00133-0
Hovland, C. I., Lumsdaine, A. A., & Sheffield, F. D. (1949). Experiments on mass
communication. (Studies in social psychology in World War II). Princeton University
Press.
Jones, C. R., Olson, M. A., & Fazio, R. H. (2010). Evaluative conditioning: The “how”
question. In Advances in experimental social psychology (Vol. 43, pp. 205-255).
Academic Press.
IFOP & Reboot. (2022). Observatoire Reboot de l’information et du raisonnement critique –
Désinformation et populisme à l’heure de la crise sanitaire et de la guerre en Ukraine.
[Reboot Information and Critical Thinking Observatory - Disinformation and populism
in the time of the health crisis and the war in Ukraine.] Accessed December 28th, 2022
from https://www.ifop.com/publication/observatoire-reboot-de-linformation-et-du-
raisonnement-critique-desinformation-et-populisme-a-lheure-de-la-crise-sanitaire-et-de-
la-guerre-en-ukraine/
Imhoff, R., & Recker, J. (2012). Differentiating Islamophobia: Introducing a new scale to
measure Islamoprejudice and secular Islam critique. Political Psychology, 33(6), 811-
824. https://doi.org/10.1111/j.1467-9221.2012.00911.x
Johnson, V. E., Nadal, K. L., Sissoko, D. G., & King, R. (2021). “It’s not in your head”:
Gaslighting,‘splaining, victim blaming, and other harmful reactions to
microaggressions. Perspectives on psychological science, 16(5), 1024-1036.
https://doi.org/10.1177/17456916211011963
Jost, J. T. (2017). Ideological asymmetries and the essence of political psychology. Political
psychology, 38(2), 167-208. https://doi.org/10.1111/pops.12407
Jowett, G. S., & O’Donnell, V. (2018). Propaganda & persuasion. Sage Publications.
Kende, A., & Krekó, P. (2020). Xenophobia, prejudice, and right-wing populism in East-
Central Europe. Current Opinion in Behavioral Sciences, 34, 29-33.
https://doi.org/10.1016/j.cobeha.2019.11.011
Keller, F. B., Schoch, D., Stier, S., & Yang, J. (2020). Political astroturfing on Twitter: How to
coordinate a disinformation campaign. Political Communication, 37(2), 256-280.
https://doi.org/10.1080/10584609.2019.1661888
Lammers, J., & Baldwin, M. (2018). Past-focused temporal communication overcomes
conservatives’ resistance to liberal political ideas. Journal of Personality and Social
Psychology, 114(4), 599. https://doi.org/10.1037/pspi0000121
Lammers, J., & Baldwin, M. (2020). Make America gracious again: Collective nostalgia can
increase and decrease support for right‐wing populist rhetoric. European Journal of
Social Psychology, 50(5), 943-954. https://doi.org/10.1002/ejsp.2673
Lasswell, H. D. (1927). The theory of political propaganda. American Political Science
Review, 21(3), 627-631.
Latt, S. M., Milner, A., & Kavanagh, A. (2019). Abortion laws reform may reduce maternal
mortality: an ecological study in 162 countries. BMC women's health, 19(1), 1-9.
https://doi.org/10.1186/s12905-018-0705-y
Leonardelli, G. J., & Toh, S. M. (2015). Social categorization in intergroup contexts: Three
kinds of self‐categorization. Social and Personality Psychology Compass, 9(2), 69-87.
https://doi.org/10.1111/spc3.12150
Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding
and coping with the “post-truth” era. Journal of applied research in memory and
cognition, 6(4), 353-369. https://doi.org/10.1016/j.jarmac.2017.07.008
Lewin, K. (1943). Forces behind food habits and methods of change. Bulletin of the national
Research Council, 108(1043), 35-65.
Lewin, K. (1947). Group decision and social change. Readings in social psychology, 3(1), 197-
211
Linebarger, P. M. A. (1954). Psychological Warfare: International Propaganda and
Communications. Duell: Sloan and Pearce.
Maertens, R., Roozenbeek, J., Basol, M., & van der Linden, S. (2021). Long-term effectiveness
of inoculation against misinformation: Three longitudinal experiments. Journal of
Experimental Psychology: Applied, 27(1), 1–16. https://doi.org/10.1037/xap0000315
McWhorter, J. (2021). Woke racism: How a new religion has betrayed Black America.
Penguin.
Marchand L., & Perrier, G. (2022). Les loups aiment la brume: Enquête sur les opérations
clandestines de la Turquie en Europe [Wolves like fog : Investigating Turkish
clandestine operations in Europe]. Grasset.
Mahfud, Y., & Adam-Troian, J. (2021). “Macron demission!”: Loss of significance generates
violent extremism for the Yellow Vests through feelings of anomia. Group Processes &
Intergroup Relations, 24(1), 108-124. https://doi.org/10.1177/1368430219880954
McGuire, W. J. (1964). Inducing resistance to persuasion: Some contemporary approaches. In C.
C. Haaland & W. O. Kaelber (Eds.), Self and society: An anthology of readings (pp. 192-230).
Ginn Custom Publishing (1981).
McGuire, W. J. (1973). The yin and yang of progress in social psychology: Seven
koan. Journal of personality and social psychology, 26(3), 446.
https://doi.org/10.1037/h0034345
McKinnon, A. (2022). Russia Is Sending Its Ethnic Minorities to the Meat Grinder. Foreign
Policy. Accessed December 29th, 2022 from https://foreignpolicy.com/2022/09/23/russia-partial-military-
mobilization-ethnic-minorities/
Mercier, H. (2016). The argumentative theory: Predictions and empirical evidence. Trends in
Cognitive Sciences, 20(9), 689-700. https://doi.org/10.1016/j.tics.2016.07.001
Milkman, K. L., Gromet, D., Ho, H., Kay, J. S., Lee, T. W., Pandiloski, P., ... & Duckworth, A.
L. (2021). Megastudies improve the impact of applied behavioural
science. Nature, 600(7889), 478-483. https://doi.org/10.1038/s41586-021-04128-4
Mittone, L., & Savadori, L. (2009). The scarcity bias. Applied Psychology, 58(3), 453-468.
https://doi.org/10.1111/j.1464-0597.2009.00401.x
Mitts, T., Phillips, G., & Walter, B. F. (2022). Studying the impact of ISIS propaganda
campaigns. The Journal of Politics, 84(2), 1220-1225.
Mooijman, M., Hoover, J., Lin, Y., Ji, H., & Dehghani, M. (2018). Moralization in social
networks and the emergence of violence during protests. Nature human behaviour, 2(6),
389-396. https://doi.org/10.1038/s41562-018-0353-0
Montoya, R. M., Horton, R. S., Vevea, J. L., Citkowicz, M., & Lauber, E. A. (2017). A re-
examination of the mere exposure effect: The influence of repeated exposure on
recognition, familiarity, and liking. Psychological bulletin, 143(5), 459-498.
https://doi.org/10.1037/bul0000085
Moscovici, S. (1981). On social representations. Social cognition: Perspectives on everyday
understanding, 8(12), 181-209.
Muthukrishna, M., Bell, A. V., Henrich, J., Curtin, C. M., Gedranovich, A., McInerney, J., &
Thue, B. (2020). Beyond Western, Educated, Industrial, Rich, and Democratic
(WEIRD) psychology: Measuring and mapping scales of cultural and psychological
distance. Psychological science, 31(6), 678-701.
https://doi.org/10.1177/0956797620916782
Muthukrishna, M., & Henrich, J. (2019). A problem in theory. Nature Human Behaviour, 3, 221-229.
https://doi.org/10.1038/s41562-018-0522-1
Nolan, J. M., Schultz, P. W., Cialdini, R. B., Goldstein, N. J., & Griskevicius, V. (2008).
Normative social influence is underdetected. Personality and social psychology
bulletin, 34(7), 913-923. https://doi.org/10.1177/0146167208316691
Obaidi, M., Kunst, J., Ozer, S., & Kimel, S. Y. (2022). The “Great Replacement” conspiracy:
How the perceived ousting of Whites can evoke violent extremism and
Islamophobia. Group Processes & Intergroup Relations, 25(7), 1675-1695.
https://doi.org/10.1177/13684302211028293
Oishi, S., Kesebir, S., & Snyder, B. H. (2009). Sociology: A lost connection in social
psychology. Personality and Social Psychology Review, 13(4), 334-353.
https://doi.org/10.1177/1088868309347835
Orwell, G. (2018). Notes on nationalism. Penguin UK.
Pennycook, G., Cheyne, J. A., Barr, N., Koehler, D. J., & Fugelsang, J. A. (2015). On the
reception and detection of pseudo-profound bullshit. Judgment and Decision
making, 10(6), 549-563.
Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in cognitive
sciences, 25(5), 388-402. https://doi.org/10.1016/j.tics.2021.02.007
Petty, R. E., Brinol, P., Loersch, C., & McCaslin, M. J. (2009). The need for cognition. In M.
R. Leary & R. H. Hoyle (Eds.), Handbook of individual differences in social
behavior (pp. 318–329). The Guilford Press.
Petty, R. E., Briñol, P., & Tormala, Z. L. (2002). Thought confidence as a determinant of
persuasion: The self-validation hypothesis. Journal of Personality and Social
Psychology, 82(5), 722–741. https://doi.org/10.1037/0022-3514.82.5.722
Picheta, R. (2022). Russian forces have staged illegal ‘referendums’ in Ukraine. What comes
next? Accessed December 26th, 2022 from
https://edition.cnn.com/2022/09/27/europe/ukraine-russia-referendum-explainer-
intl/index.html
Priolo, D., Pelt, A., Bauzel, R. S., Rubens, L., Voisin, D., & Fointiat, V. (2019). Three decades
of research on induced hypocrisy: A meta-analysis. Personality and Social Psychology
Bulletin, 45(12), 1681-1701. https://doi.org/10.1177/0146167219841621
Prost, M., Piermattéo, A., & Lo Monaco, G. (2022). Social representations, social identity, and
representational imputation: A review and an agenda for future research. European
Psychologist. Advance online publication. https://doi.org/10.1027/1016-9040/a000489
Proulx, T., Inzlicht, M., & Harmon-Jones, E. (2012). Understanding all inconsistency
compensation as a palliative response to violated expectations. Trends in cognitive
sciences, 16(5), 285-291. https://doi.org/10.1016/j.tics.2012.04.002
Reardon, D. C. (2018). The abortion and mental health controversy: A comprehensive literature
review of common ground agreements, disagreements, actionable recommendations,
and research opportunities. SAGE open medicine, 6, 2050312118807624.
https://doi.org/10.1177/2050312118807624
Rodrik, D. (2021). Why does globalization fuel populism? Economics, culture, and the rise of
right-wing populism. Annual Review of Economics, 13, 133-170.
https://doi.org/10.1146/annurev-economics-070220-032416
Roozenbeek, J., Freeman, A. L., & van der Linden, S. (2021). How accurate are accuracy-
nudge interventions? A preregistered direct replication of Pennycook et
al. (2020). Psychological Science, 32(7), 1169-1178.
https://doi.org/10.1177/0956797621102453
Rouquette, M. L. (1996). Social representations and mass communication research. Journal for
the theory of social behaviour, 26(2), 221-231.
Rosling, H., Rosling, O., & Rönnlund, A. R. (2018). Factfulness: ten reasons we're wrong
about the world--and why things are better than you think. Flatiron books.
Rühle, M. (2014). NATO enlargement and Russia: discerning fact from fiction. American
Foreign Policy Interests, 36(4), 234-239.
https://doi.org/10.1080/10803920.2014.947879
Ryan, R. M., Deci, E. L., Vansteenkiste, M., & Soenens, B. (2021). Building a science of
motivated persons: Self-determination theory’s empirical approach to human experience
and the regulation of behavior. Motivation Science, 7(2), 97-110.
https://doi.org/10.1037/mot0000194
Schultz, P. W., Nolan, J. M., Cialdini, R. B., Goldstein, N. J., & Griskevicius, V. (2018). The
constructive, destructive, and reconstructive power of social norms:
Reprise. Perspectives on Psychological Science, 13(2), 249-254.
https://doi.org/10.1177/1745691617693325
Shaffer, V. A., Focella, E. S., Hathaway, A., Scherer, L. D., & Zikmund-Fisher, B. J. (2018).
On the usefulness of narratives: An interdisciplinary review and theoretical
model. Annals of Behavioral Medicine, 52(5), 429-442.
https://doi.org/10.1093/abm/kax008
Sherif, M. (1936). The psychology of social norms. Harper.
Sokal, A., & Bricmont, J. (1997). Impostures intellectuelles [Intellectual Impostures]. Odile
Jacob.
Šrol, J. (2022). Individual differences in epistemically suspect beliefs: The role of analytic
thinking and susceptibility to cognitive biases. Thinking & Reasoning, 28(1), 125-162.
https://doi.org/10.1080/13546783.2021.1938220
Steiger, A., & Kühberger, A. (2018). A meta-analytic re-appraisal of the framing
effect. Zeitschrift für Psychologie. https://doi.org/10.1027/2151-2604/a000321
Sternisko, A., Cichocka, A., & Van Bavel, J. J. (2020). The dark side of social movements:
Social identity, non-conformity, and the lure of conspiracy theories. Current Opinion in
Psychology, 35, 1-6. https://doi.org/10.1016/j.copsyc.2020.02.007
Tajfel, H. (1974). Social identity and intergroup behaviour. Social Science Information, 13(2),
65-93. https://doi.org/10.1177/053901847401300204
Taylor, A. (2022). With NAFO, Ukraine turns the trolls on Russia. The Washington Post.
Accessed December 28th, 2022 from
https://www.washingtonpost.com/world/2022/09/01/nafo-ukraine-russia/
Tenzer, N. (2022). The heavy clouds of peace. Accessed December 26th, 2022 from
https://tenzerstrategics.substack.com/p/the-heavy-clouds-of-peace
Till, C. (2021). Propaganda through ‘reflexive control’ and the mediated construction of
reality. New Media & Society, 23(6), 1362-1378.
https://doi.org/10.1177/1461444820902446
Tormala, Z. L., & Rucker, D. D. (2018). Attitude certainty: Antecedents, consequences, and
new directions. Consumer Psychology Review, 1(1), 72-89.
https://doi.org/10.1002/arcp.1004
Umland, A., & Eichstaett, B. (2009). Fascist tendencies in Russia’s political establishment:
The rise of the International Eurasian Movement. Russian Analytical Digest, 60(9),
13-17.
Van Bavel, J. J., Baicker, K., Boggio, P. S., Capraro, V., Cichocka, A., Cikara, M., ... &
Willer, R. (2020). Using social and behavioural science to support COVID-19 pandemic
response. Nature Human Behaviour, 4(5), 460-471. https://doi.org/10.1038/s41562-020-
0884-z
Van Bavel, J. J., & Pereira, A. (2018). The partisan brain: An identity-based model of political
belief. Trends in Cognitive Sciences, 22(3), 213-224.
https://doi.org/10.1016/j.tics.2018.01.004
Van den Bos, K., & Maas, M. (2009). On the psychology of the belief in a just world:
Exploring experiential and rationalistic paths to victim blaming. Personality and Social
Psychology Bulletin, 35(12), 1567-1578. https://doi.org/10.1177/0146167209344628
Van Lange, P. A., Joireman, J., & Milinski, M. (2018). Climate change: what psychology can
offer in terms of insights and solutions. Current Directions in Psychological
Science, 27(4), 269-274. https://doi.org/10.1177/0963721417753945
Van Prooijen, J. W., Krouwel, A. P., & Pollet, T. V. (2015). Political extremism predicts belief
in conspiracy theories. Social Psychological and Personality Science, 6(5), 570-578.
https://doi.org/10.1177/1948550614567356
Vöhringer, M. (2011). A concept in application: How the scientific reflex came to be
employed against Nazi propaganda. Contributions to the History of Concepts, 6(2),
105-123. https://doi.org/10.3167/choc.2011.060207
von Hohenberg, B. C., & Guess, A. M. (2022). When do sources persuade? The effect of
source credibility on opinion change. Journal of Experimental Political Science, 1-15.
https://doi.org/10.1017/XPS.2022.2
Wagener, A. (2022). France and the moral panic of “islamo-leftism”. CFC Intersections.
Wallace, J., Goldsmith-Pinkham, P., & Schwartz, J. L. (2022). Excess death rates for
Republicans and Democrats during the COVID-19 pandemic (No. w30512). National
Bureau of Economic Research.
Webster, D. M., & Kruglanski, A. W. (1994). Individual differences in need for cognitive
closure. Journal of Personality and Social Psychology, 67(6), 1049-1062.
https://doi.org/10.1037/0022-3514.67.6.1049
Wood, T., & Porter, E. (2019). The elusive backfire effect: Mass attitudes’ steadfast factual
adherence. Political Behavior, 41(1), 135-163. https://doi.org/10.1007/s11109-018-
9443-y