
The Fitness-Validation Model of Propagandist Persuasion and the 5Ds of Propaganda
Jais Adam-Troian, PhD1
Abstract
Although pervasive in our history and modern ecology, propaganda has yet to be formally
described by social psychological science. The field has extensively documented the principles
of persuasion and social influence, but no integrative synthesis has been attempted to describe
the mechanisms of propagandist persuasion. In this paper, we propose to formalize a Fitness-
Validation Model (FVM) of propagandist persuasion. This synthetic model assumes that, in
essence, influence is proportional to the degree of fit between message characteristics (social content, prescriptive content, descriptive content, framing) and recipient characteristics (group identity, attitudes, knowledge, cognitive makeup). Fitness on these four dimensions then shapes
perceptions of thought validity regarding the message, and ultimately behavior change. Thus, we
propose that propaganda attempts to maximize fitness between message and recipient
characteristics and self-validation in the direction of the message. From the FVM, we then derive
five main pathways for propagandist persuasion and dissuasion, which we label the 5D: Deceive
social intuition, Divert resistant attitudes, Disrupt information processing, Decoy reasoning and
Disturb meta-cognition. The 5D are mutually inclusive and can be seen as the “building blocks”
of real-world propaganda. We discuss the theoretical implications of the FVM and conclude the
model should be used to craft more effective liberal democratic (counter)propaganda.
Keywords: propaganda, misinformation, psychological operations, fitness-validation, persuasion

Conflict of Interest: The author declares no conflict of interest that may have affected this work.

1 North-Atlantic Fellas Organization, NAFO Headquarters, Langley Falls, Virginia, USA
“Do not try to suppress or control an opponent unnaturally. Let attackers come any way they
like and then blend with them.”― Morihei Ueshiba, The Art of Peace.
“The supreme art of war is to subdue the enemy without fighting.” ― Sun Tzu, The Art of
War
Introduction

Propaganda can be broadly defined as the use of persuasion techniques by an agent with the aim

of modifying a recipient’s behavior in a targeted direction. Since its inception more than a

century ago, social psychology has extensively focused on the study of behavior and attitude

change, which, in essence, leads to applications for propaganda. As an illustration, the early

studies on social influence (Lewin, 1947) and norm formation (Sherif, 1936), cognitive

dissonance (Festinger, 1957), social representations (Moscovici, 1981) and social identity

(Tajfel, 1974) were all born out of experience with war, genocide, and dictatorship. In the US,

the applications of these theories directly fed State propaganda to civilians (e.g. easing meat

rationing during WWII; Lewin, 1943), Cold War counter-narratives against Soviet

disinformation (Hovland et al., 1949), and civil rights campaigns to reduce racism against African Americans among White audiences (Dovidio et al., 2008).

To this day, social psychologists still focus on attitude formation (Jones et al., 2010),

attitude change (Bohner & Dickel, 2011) and its moderators (e.g. personality traits; Haugtvedt & Petty, 1992), as well as the study of robust ways to predict and change related behaviors

(Schultz et al., 2018). Whether these attitudes pertain to the environment (Van Lange et al.,

health (Van Bavel et al., 2020), intergroup relations (Halperin & Schori‐Eyal, 2020) or

ideology (Jost, 2017), social psychological research is almost invariably justified by the aim of helping to understand and change attitudes or behaviors (that is, to craft better propaganda) to tackle

important societal issues such as global warming, vaccine hesitancy, racism, or violent
extremism. In fact, most of the prescribed solutions to these societal issues rely on applying

better, more effective techniques to foster change in a desired direction (e.g. attitudes towards

reconciliation in intergroup conflicts; Bar‐Tal & Hameiri, 2020).

Despite the obviousness of propaganda as a goal, a tool, and an object of social-

psychological research, an operational model of propaganda is still lacking in the field, which

impedes applied and theoretical progress. In this paper, we propose a first attempt to

conceptualize a coherent model of propaganda in modern social-psychological terms: the

Fitness-Validation Model (FVM) of propagandist persuasion. We do so by relying on the most

up-to-date advances in the fields of persuasion, namely Self-Validation Theory (Briñol & Petty,

2022), and by integrating advances in the fields of behavior change, conspiracy beliefs,

disinformation, motivated cognition, social influence, and identity. The FVM then allows us to

identify five derivative strategies of propagandist persuasion (5D) for applied purposes.

1. Why a social-psychological model of propaganda?

Propaganda as an object of investigation has been around for decades within the social sciences.

Early works on the topic sprang up in the 1920s, following the rise of modern marketing methods

and far-left and far-right ideologies making use of propaganda to their political advantage

(Lasswell, 1927; Bernays, 1942; Vöhringer, 2011). Propaganda was extensively studied from a

historical and political perspective, focusing on describing the materials used by and motivations

of propagandists, as well as the socio-political contexts that led to their emergence and attributed

success or failure (Jowett & O'Donnell, 2018).

In psychology too, attempts to explicitly define, model, and classify propaganda were

made in the wake of WWII. For instance, researchers tried to classify propaganda depending on
its origin and goals, thereby differentiating democratic from authoritarian forms of propaganda

(Bartlett, 1940). Other theoretical work tried to approach propaganda by analyzing its target,

namely public opinion (see for instance Herman & Chomsky, 2010). Finally, another line of

research on propaganda emerged from “sociological” social psychology (Oishi et al., 2009), this

time focusing on its dissemination and spread through rumors and collective beliefs (Beauvois,

2007; Rouquette, 1996). In parallel, social psychologists have developed several models and

theories which are relevant to propaganda without including propaganda per se. These pertain to

persuasion and processes of attitude change more broadly speaking (Briñol & Petty, 2022), the

psychology of fake news (Pennycook & Rand, 2021), the inner workings of cognitive and

reasoning biases (Haselton et al., 2015), the determinants of conspiracy beliefs (Hornsey et al.,

2022), the cognitive and motivational underpinnings of ideologies (Zmigrod et al., 2021) and the

mechanisms of motivated cognition (Ryan et al., 2021), including partisanship and social identity

(Van Bavel & Pereira, 2018).

The contrast between the two approaches is stark. On one hand, the intuition that

propaganda is a phenomenon of scientific relevance has led to “macro” theories with ecological

validity but low internal validity and usefulness for application. These theories often involve

unclear mechanisms, nebulous concepts and lie disconnected from modern social psychological

research. On the other hand, the experimental documentation of the cognitive mechanisms of

persuasion generated a wealth of “micro” models and empirical discoveries with high internal

validity but poor ecological validity. Disconnected from one another, these mechanisms often

yield small effects when applied and fail to mimic the features of real-world propaganda (e.g.

accuracy nudges; Roozenbeek et al., 2021). Therefore, a model bridging the gap between the

sociological (macro) and cognitive (micro) levels of analysis pertaining to propaganda is


required to bring about not only further theoretical developments but also greater applied effectiveness (see figure 1).

By proposing the FVM, we wish to offer a basis for working towards a true science of

propaganda (Silverstein, 1987) by integrating currently disjoint empirical phenomena such as

social influence, ideology, fake news, and conspiracy beliefs. We also aim to align the model

with early propositions to explicitly consider links between cognitive and sociological

phenomena in social psychology theories (see McGuire, 1973). Finally, the model is proposed to

address more recent calls for psychologists to focus on theorizing to overcome the

epistemological obstacles highlighted by the replicability crisis (Muthukrishna & Henrich,

2019). Let us now turn to the theoretical proposition itself.

Figure 1.
Situating a level of analysis for a social psychological model of propaganda.
2. Modeling propagandist persuasion: outlining the FVM

2.1. Defining propaganda

As per the introduction, we decided to adopt a consensual and broad definition of propaganda as

“the use of persuasion techniques by an agent with the aim of modifying a recipient’s behavior

in a targeted direction.” The term “techniques” admits any kind of intervention, ranging from

educational interventions to mass online advertisement. An “agent” can be any motivated actor,

individual or group of individuals from public or private organizations: activists, politicians,

lobbyists, State departments, companies’ PR departments, NGOs… Likewise, the “recipients” of propaganda need not be a vast entity such as public opinion. Any individual or group of

individuals can be a discriminate or indiscriminate target of propagandists.

Importantly, this definition assumes that ultimately, all propaganda aims at behavior

change (or prevention): attitude change is invariably instrumental in this regard. For instance, the

current “morale boosting” propaganda of the Ukrainian State targets attitudes towards war, but

ultimately aims at maintaining citizens’ behavioral efforts such as volunteering, donating, or

enlisting in the armed forces. Conversely, the regime-legitimizing propaganda prevalent in

authoritarian states (e.g. China, Russia, Turkey) seeks to prevent regime-challenging behavior,

such as tract distribution, online petition signing and protesting. Finally, the notion of “targeted

direction” is important, as it allows us to conceptually delineate propaganda from education and

information in liberal democracies. Unlike propaganda, the latter (theoretically) aims at

providing citizens with the least biased knowledge available on a given topic so they may exert

“informed” judgement on related issues (no targeted direction).

This broad definition avoids excessive focus on less relevant characteristics. For instance,

classifying propaganda based on its informational content (true or false, emotional, or rational;
Biddle, 1931) could miss critical aspects of the phenomenon, since much propaganda does make

use of true information framed in a persuasive way (Altay et al., 2021). Similarly, definitions of

propaganda based on a narrow subset of persuasion techniques (e.g. conditioning, repeated

exposure), specific actors (e.g. parties or States), or recipients (e.g. public opinion; Herman & Chomsky, 2010) could dismiss relevant empirical similarities between different forms of

propaganda. Finally, a distinction of propaganda based on its overt or covert nature (e.g. white

and black propaganda, see Linebarger, 1954) would be too reductive and may not fit modern

empirical examples of psychological operations, which can involve multiple parallel strategies

and actors (both covert and overt, online and offline) at the service of a single goal. Now that we

have an operational definition of propaganda, let us turn to the constitutive elements of the FVM.

2.2. Elements of the FVM: Fitness

The idea behind the FVM is to make use of the most up-to-date science of attitude and behavior

change and synthesize it into basic elements or parameters. This synthesis operates primarily on

two social psychological concepts, which sum up a substantial part of the literature and structure

our integrative model across social psychological subfields. The first, and perhaps most

important concept at the core of the FVM is that of Fitness.

Across models of persuasion and attitude change, messages tend to be separated from

other components, such as their source (Eagly & Chaiken, 1984; Briñol & Petty, 2009) or the

way they are framed (e.g. as explicit losses or gains, Grazzini et al., 2018). Here, we propose to

combine all these elements into basic dimensions of a message. We propose that messages can

vary along four basic dimensions: social content (social identity cues, including source), factual

content (informational value), prescriptive content (behavioral or attitudinal prescription) and

format (message framing). Similarly, social psychological research tends to separate the fields
investigating how group identity or partisanship shapes information processing (Abrams &

Hogg, 1990; Ditto et al., 2019a), how counter-attitudinal messages generate reactance (Belanger

et al., 2020, Proulx et al., 2012), how information shapes attitudes (Wood et al., 2019) and how

messages framed to fit the recipient’s psychological makeup generate more persuasion (e.g. use

of narratives, Shaffer et al., 2018). Instead of seeing those empirical phenomena as

compartmentalized entities, we propose to understand them as different domains of application for

influence processes. Hence, we propose that, just like for messages, recipients can be classified

along four basic dimensions of influence: social identity, information (factual knowledge),

attitudes (like-dislike stances on different issues), cognitive makeup (emotions, personality traits,

risk aversion…).

According to the Fitness principle, when an individual is exposed to a message, a

comparative fit test is engaged, involving both conscious and less conscious mechanisms. Fitness

will increase the message’s effectiveness, while discrepancies will do the opposite. This

comparative process occurs simultaneously across four domains, between the four above-

mentioned characteristics of message and recipient. We label these Social fit, Attitudinal fit,

Informational fit, and Cognitive fit (see figure 2).
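To make the comparative fit test concrete, the sketch below expresses it as a toy computation in Python. The FVM does not commit to a specific metric or aggregation rule, so the discrepancy measure, the 0-to-1 scaling, and the equal default weights are purely illustrative assumptions of ours.

```python
from dataclasses import dataclass

@dataclass
class Message:
    social: float        # social content: identity cues, including source (-1..1)
    prescriptive: float  # prescriptive content: behavioral/attitudinal prescription (-1..1)
    descriptive: float   # descriptive content: informational value (-1..1)
    framing: float       # format: framing and emotional tone (-1..1)

@dataclass
class Recipient:
    identity: float   # social identity position (-1..1)
    attitudes: float  # like-dislike stance on the issue (-1..1)
    knowledge: float  # factual knowledge position (-1..1)
    cognition: float  # cognitive makeup, e.g. preferred framing (-1..1)

def domain_fit(m: float, r: float) -> float:
    """Hypothetical fit metric: 1 when message and recipient coincide, 0 at maximal discrepancy."""
    return 1.0 - abs(m - r) / 2.0

def total_fitness(msg: Message, rec: Recipient,
                  weights=(0.25, 0.25, 0.25, 0.25)) -> float:
    """Aggregate the four domain fits (Social, Attitudinal, Informational, Cognitive)."""
    fits = (
        domain_fit(msg.social, rec.identity),         # Social fit
        domain_fit(msg.prescriptive, rec.attitudes),  # Attitudinal fit
        domain_fit(msg.descriptive, rec.knowledge),   # Informational fit
        domain_fit(msg.framing, rec.cognition),       # Cognitive fit
    )
    return sum(w * f for w, f in zip(weights, fits))
```

On this toy account, a propagandist maximizes influence by tuning the four message dimensions toward the recipient’s profile; the weights themselves would be empirical quantities to estimate rather than fixed constants.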

Social fit is the degree of fit between the social content of a message and the social

identity of the recipient. In line with social influence (Cialdini, 2006) and social identity research

(Hogg & Smith, 2007), Social fit is maximized when the social elements of a message resonate

with the social identity of the target: its content indicates a majority position shared by

the recipient’s ingroup (Schultz et al., 2018), its source is perceived to be ingroup or allied (Ditto et

al., 2019b), and the intended purpose behind the message is perceived to be benevolent towards

the recipient’s ingroup (i.e. positive imputation, see Prost et al., 2022). Although persuasion
research points at various relevant social characteristics (e.g. source expertise and credibility; von Hohenberg & Guess, 2022), we propose that their effects all stem from Social fit.

In general, messages from ingroup sources tend to be more persuasive on average (Haslam et al.,

1996; Wyer, 2010). Individuals will distrust experts on the basis of ingroup-outgroup

distinctions, or if they perceive malevolent intentions towards their ingroup (i.e. conspiracy

beliefs, Bertin et al., 2020). These social preferences are so strong that they can come at a high

price for the recipient. Among those who followed the wrong prescription because of Social fit during the COVID-19 pandemic, some paid with their lives (e.g. US Republicans; Gollwitzer et al., 2020; Wallace et al., 2022).

Figure 2.
The Fitness process and four domains of application.

Attitudinal fit is the degree of fit between the prescriptive content of a message and the

attitudes of the recipient. In line with decades of research on reactance, polarization (Axelrod et

al., 2021) and attitude change (Festinger, 1957; Harmon-Jones, 2019), Attitudinal fit is obtained
when the prescriptions of a message match the recipient’s attitudes. Conversely, research points

at so-called “backfire effects” when exposed to counter-attitudinal messages (Bail et al, 2018).

Thus, Attitudinal fit can also be achieved by not overtly displaying a counter-attitudinal

prescription or by framing the message in a way that logically extends the recipient’s attitude to

an in fine counter-attitudinal conclusion (i.e. paradoxical thinking, see Hameiri et al., 2019). As

an illustration, political propaganda to incite liberals to vote for an anti-immigration candidate

would achieve greater Attitudinal fit by making the case that welcoming more immigrants

(recipient’s attitude) is detrimental to immigrants themselves because it fuels right wing

populism (paradoxical conclusion, Rodrik, 2021).

We also propose a parallel process of Informational fit, which is the fitness index

between the descriptive content of a message and the recipient’s level of knowledge.

Informational fit is maximized when the information delivered by the message is consistent with

a recipient’s factual knowledge. However, unlike in the case of attitudes, information tends to be

more malleable, and recipients generally display low levels of factual knowledge on many

different topics outside their area of expertise (especially activists, Rosling et al., 2018).

Consequently, research shows that information which does not strictly align with one’s attitudes

may still be integrated without triggering reactance (Mercier, 2016; Ecker et al., 2020; Wood et

al., 2019).

Finally, Cognitive fit is the degree of fit between the format of a message and the

cognitive makeup of the recipient. It draws upon the numerous research findings covering

framing effects (see Steiger & Kühberger, 2018), models of persuasion (Albarracin &

2018), as well as the emotional, personality and cognitive moderators of persuasion (e.g. Need

for Cognition, Petty et al., 2009; Need for Closure, Webster & Kruglanski, 1994). Hence,
Cognitive fit can be achieved in many ways. For instance, research shows that when the

emotional tone of the message matches recipients’ emotional state (see Lammers & Baldwin,

2018), they can be led to adopt initially counter-attitudinal positions. Likewise, matching the

complexity or quality of a message to the recipients’ cognitive sophistication can improve central

(in-depth) processing and thus persuasion (Griffith et al., 2018). Conversely, breaking down a

prescription into simple steps may create a greater sense of clarity and ease, which in turn

facilitates behavioral achievement (Gollwitzer & Sheeran, 2006). Framing can also attempt to fit

to the recipient’s personality, for instance by feeding threatening information to individuals who

pay more attention to negative stimuli (high on neuroticism; Bakir, 2020).

2.3. Elements of the FVM: Validation

Having defined Fitness and its subcomponents, we will now turn to the second key element of

the FVM: Validation. The concept of Validation initially stems from the most up-to-date theory

of persuasion and attitude change: Self-Validation Theory (Briñol & Petty, 2022). In short, the

theory states that perceived thought validity is the main mediator of persuasive effects.

Individuals will change their attitudes if they perceive their thoughts about the message to be

valid (Petty et al., 2002; Briñol & Petty, 2009). This meta-cognitive aspect of persuasion

research is of crucial relevance for modelling propaganda. In fact, much of the effectiveness of

real-world propaganda comes from its capacity to instill doubt or to foster certainty

(i.e. metacognitive entities) depending on its objectives (see Charon & Vilmer, 2021).

Therefore, the Validation principle proposes that propagandist persuasion is also determined by its

capacity to generate more positive (or less negative) cognitions about the message among

recipients. This proposition is also consistent with research showing that attitude certainty and

confidence in one’s judgement are often better predictors of behavior than attitudinal valence
(see Tormala & Rucker, 2018 for a recent review). Accordingly, the FVM posits that Fitness

effects on behavior should be mediated by self-validation (see figure 3). In other words, the

effect of Fitness across the model’s four domains on attitude change should be due to its ability to foster increased perceptions of thought validity in the recipient. The higher the total

Fitness across domains, the more confident the recipient should be in their own positive

appreciation of the message. The concept of Validation also allows us to introduce the viability

of dissuasion (as opposed to persuasion) as an alternative goal for propagandists. Whenever

persuasion to incite desired behaviors through validation is not possible (as when targeting

ideologically opposed recipients), dissuasion through invalidation could be an effective way to

prevent the recipient from adopting undesirable behaviors instead.
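Because the FVM states a mediation hypothesis (Fitness drives Validation, which drives behavior), it is directly testable with conventional tools. The sketch below simulates data consistent with the model and runs the classic Baron and Kenny pair of regressions; the variable names, effect sizes, and sample size are illustrative assumptions, not estimates from any cited study, and bootstrapped indirect-effect tests would be preferable in practice.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Simulated standardized constructs: total Fitness, perceived thought
# validity (the hypothesized mediator), and behavioral intention.
fitness = rng.normal(size=n)
validity = 0.6 * fitness + rng.normal(size=n)
behavior = 0.5 * validity + 0.1 * fitness + rng.normal(size=n)

# Total effect of Fitness on behavior.
total = sm.OLS(behavior, sm.add_constant(fitness)).fit()
# Direct effect of Fitness once Validation is controlled for.
direct = sm.OLS(behavior,
                sm.add_constant(np.column_stack([fitness, validity]))).fit()

print(f"total effect:  {total.params[1]:.2f}")
print(f"direct effect: {direct.params[1]:.2f}")
# Under the FVM, the direct effect should shrink markedly relative to the
# total effect, indicating mediation by self-validation.
```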

Figure 3.
Overview of the Fitness-Validation Model (FVM).

3. Propaganda in action: the 5Ds of propagandist persuasion

As we have seen, the FVM comprises five parameters: four types of Fitness, and a mediating

mechanism of meta-cognitive Validation. From these, we can directly derive five main pathways
through which propaganda can generate persuasion (or dissuasion), and which we have labeled

the five Ds, or 5D. Each pathway runs through one parameter of the FVM: Deceive taps into Social fit, Divert into Attitudinal fit, Disrupt into Informational fit, Decoy into Cognitive fit, and Disturb targets self-validation. The 5D can be combined to maximize persuasive effects (see figure 4).
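For applied purposes, this one-to-one mapping can be written down as a minimal coding scheme, e.g. for annotating real-world propaganda materials by the FVM parameter each strategy targets. The scheme below is a hypothetical sketch of ours, not an existing instrument.

```python
# Hypothetical coding scheme: each D targets one FVM parameter.
FIVE_D = {
    "Deceive": "Social fit",         # hijack social intuition (source, norms, identities)
    "Divert":  "Attitudinal fit",    # bypass resistant attitudes
    "Disrupt": "Informational fit",  # exploit the recipient's information ecology
    "Decoy":   "Cognitive fit",      # fit framing to cognitive/emotional makeup
    "Disturb": "Validation",         # bolster or degrade perceived thought validity
}

def code_item(strategies: list[str]) -> set[str]:
    """Return the FVM parameters targeted by an item combining one or more Ds."""
    return {FIVE_D[d] for d in strategies}

# Example: an astroturfing campaign that also uses fear-congruent framing.
print(code_item(["Deceive", "Decoy"]))  # {'Social fit', 'Cognitive fit'}
```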

3.1. D is for Deceive


Deception is one of the most common and basic propaganda tactics. It relies on hijacking the

recipients’ social intuition by mischaracterizing the message as emanating from an ingroup source (i.e. one perceived as safe and benevolent) or by depicting it as widely approved among the recipient’s ingroup. Deception makes it possible to bypass partisan and intergroup biases, thus improving the Social fit of a

message. To Deceive the recipient, a propagandist can hide the initial source of the message and

use a mediating source to disseminate the message instead. One way to do so is through online

coordinated operations using bots or paid “trolls” (i.e. astroturfing, Keller et al., 2020). Another

strategy among industry lobbyists is to use intermediary organizations such as charities or NGOs

who seem unrelated but are partially or fully funded by the source company (De Bruycker &

Beyers, 2019).

Likewise, deception can manipulate perceptions of group consensus by adding social

content related to descriptive norms in a message. Descriptive norms depicting a majority

position on an issue can be powerful vectors of behavior change (Schultz et al., 2018), especially

since their influence goes unnoticed by the recipient (see Nolan et al., 2008). As an illustration,

the 2022 illegal Russian referenda in occupied Ukrainian territories depicted implausibly high

figures in favor of annexation, between 87% and 93%. According to the FVM, these referenda served a double deceitful propaganda purpose. First, they could have generated perceptions of false consensus among the occupied population, deterring them from resisting occupation (Bauman & Geher, 2002). Second, they could have perpetuated pluralistic ignorance among outside observers (i.e.

non-Ukrainian populations), and decreased support for helping the occupied populations (De

Souza & Schmader, 2022).

In addition to norm salience, a message can also Deceive a recipient by tapping into self-

categorization processes to activate or downplay the salience of specific group identities

(Leonardelli & Toh, 2015). This can be achieved using cues for specific group identities to

generate Social fit. In fact, a defining feature of populist rhetoric is the reliance on social identity

cues such as “the people” versus “the elite”, or “natives” versus “immigrants” (Bos et al., 2020).

Yet, cueing identities is just one of two ways to use social categorization. In fact, recent

successful propaganda efforts such as the initial video sparking the Yellow Vests protests in

France (Mahfud & Adam-Troian, 2021) or the North Atlantic Fella Organization online

campaign (NAFO, support for Ukraine, Taylor et al., 2022) all make use of completely novel

identities. These were crafted with the help of elements already familiar to the recipients: a yellow vest present in every French car (a legal obligation) to incite protesting against increases in gas taxes (“driver” identity), and a dog picture drawn from internet meme culture to leverage support for Ukraine (“Western” identity).

Conversely, if the context is not favorable to improving Social fit between a message and

a recipient, deception can be achieved by degrading Social fit between an alternative message

and a recipient. Conspiracist narratives accusing an outgroup of taking malevolent actions

against the recipient’s ingroup are particularly suited for this purpose. For instance, in the 1980s,

the KGB fabricated and disseminated narratives accusing the US of having manufactured AIDS

as a bioweapon, using marginalized groups as experimental subjects. These attempts to foster

anti-US government resentment among Western populations were so successful that, thirty years
later, 15% of African Americans still believed that AIDS was a tool of genocide against them

(see Boghardt, 2009). In a similar vein, the French far-right has coined the term “Islamo-leftist”

to discredit their political opponents in a conspiracist narrative depicting left-wing organizations

as enablers of Islamist terrorism (Wagener, 2022).

A lighter version of this strategy is used by Turkish, Russian, and Chinese propaganda

media in Western countries (TRT World, RT, CGTN). It deceives recipients by

disproportionately covering, amongst other news, malevolent social behavior from Western

authorities towards “the people” (police violence against protesters, corruption cases from high

officials… Charon & Vilmer, 2021). Compounded over time, this low-intensity propaganda

frame spreads distrust towards the recipient’s own government, which is progressively perceived

as an outgroup. This could lead recipients to be more trusting of the foreign propaganda source

than of their own news media, further increasing their susceptibility to propaganda.

3.2. D is for Divert


Propaganda targeting individuals whose attitudes are favorable or neutral towards the message already possesses high Attitudinal fit. But reactance can arise when targeting individuals

whose attitudes are opposed to a message’s prescriptions (e.g. message against political violence

aiming at violent extremists, Belanger et al., 2020). To minimize or avoid discrepancy in the

attitudinal domain, diversion can be used. To Divert and bypass recipients’ resistance, a

propagandist can try to change the attitude of reference for the comparison process. Instead of

directly confronting the recipient’s resistant attitudes, diversion will either blend in with the

recipients’ attitudes or activate a different set of nonresistant attitudes related to the prescription.

One way to do so is to shift the focus away and “zoom out” to redefine the problem at

hand. For instance, an anti-legalized abortion message would immediately encounter attitudinal
resistance among individuals who favor the progress of women’s rights. Attitudinal fit can be

improved in this case by focusing on the negative consequences of legalized abortion on

women’s mental health (while carefully omitting its positive effects on maternal mortality;

Reardon, 2018). This way, de-legalizing abortion is presented as an improvement for women’s

mental health rather than as a regression in terms of women’s rights. A second strategy to Divert

uses a variation on this technique. Instead of redefining the problem to change the attitudinal

domain of reference, it consists in “zooming in” to the details of an issue to do the same. This

technique is extensively used to justify aggression, when the propagandist operates on behalf of a

perpetrator (i.e. victim blaming, Van den Bos, 2009). This technique is found in the Kremlin’s

propaganda narrative depicting their illegal invasion of Ukraine in 2022, framed as a response to

NATO’s eastward expansion (a claim with no factual basis; Rühle, 2014). This type of narrative can

shunt resistant attitudes against war by shifting attention towards the putative causal mechanisms

leading up to the aggression.

In addition, a third technique consists in focusing on the morality of an issue (e.g.

moralization, Mooijman et al., 2018) to Divert the recipient towards higher-order moral attitudes

(instead of attitudes towards the issue per se). In line with our previous example, Kremlin-

backed actors spread another narrative regarding their invasion, aiming to bypass the positive

attitudes of Western audiences towards support for Ukraine. According to this narrative,

supporting weapons delivery to Ukraine means “escalating” the conflict. This narrative turns

support for an invaded country’s right to defend itself into a hawkish choice leading to more

deaths (Tenzer, 2022). Thereby, it decreases Attitudinal fit between recipients’ moral aversion

towards war and alternative messages inciting them to support Ukraine.


This subversion of pacifism to the benefit of aggressive, warmongering political actors

was already used by Nazi propaganda during WWII (Orwell, 2018). Likewise, lobbyists

frequently make use of euphemisms to tap into widely shared moral attitudes: who would

consider themselves anti-life, or think that not all lives matter?

3.3. D is for Disrupt

Unlike attitudes, which are sensitive to counter-positions, information is more malleable and less

subject to reactance. Because they are not tied to moral judgements, emotions, and social identities, factual descriptions tend to provoke fewer “backlash effects” when they contradict one’s attitudes on a given issue (Ecker et al., 2020; Wood et al., 2019). However, this does not mean that Informational fit is less relevant than other types of fit. The way information is framed and how facts are presented and described do matter, and need to be tailored to the knowledge of the recipient.

The Disrupt strategy fully exploits the information ecology in which recipients are

immersed. First, disruption can be created by owning temporal primacy over a narrative before

recipients can accumulate any alternative information. This can be done by exposing naïve

recipients to propaganda-compatible interpretations of an event or issue. For example,

psychological inoculation (McGuire, 1964) works by exposing individuals to facts before they

encounter a specific piece of fake news, in order to protect them against misinformation (Maertens et al.,

2021). However, psychological inoculation does not depend on the content of a message: any

early exposure to a counterargument will protect individuals from an argument encountered later

(Banas & Rains, 2010). Therefore, propaganda can also be used for psychological “intoxication” by feeding a naïve recipient with a counterargument against alternative, yet valid, information. This strategy is used by creationist online propaganda, which essentially provides counterarguments to scientific claims before its audience has been sufficiently exposed to the scientific evidence against creationism (Aechtner, 2010).

Sometimes, however, owning temporal primacy is not possible, or targeted

recipients already possess extensive knowledge regarding the issue at hand (e.g. trying to

convince a geologist that the earth is flat). In these cases, inoculation and intoxication will not be

fit for purpose and Informational fit may not be a realistic goal. When this happens, another

strategy can be used to generate disruption through saturating available information. Saturation

aims not to generate fit between message and recipient, but misfit between an alternative

message and the recipient instead. This is typically done by feeding the recipient with alternative

– contradicting – information. Irrespective of content, the mere existence of an alternative

narrative could be sufficient to generate Informational misfit between the recipient’s knowledge

and a narrative. This could be why the spokespersons for authoritarian regimes so confidently

repeat information or news that recipients and third-party observers know to be false. For

instance, the Chinese Communist Party’s spinning of their Xinjiang concentration camps for

Uyghurs as “educational” (BBC, 2020) could have a persuasive effect by breaking down

Informational fit among recipients, even those knowledgeable of the issue.

A variation on the saturation technique is to make use of true statements instead of fake

news. Accordingly, exploiting scientific doubt and uncertainty is a common practice of industry

lobbyists (e.g. pesticide manufacturers). They can do so by funding studies investigating

alternative causes (e.g. climate change) to a phenomenon their company is accused of

contributing to (e.g. bee population decline). This is done to drown policy makers and the public in fabricated complexity, which can then delay their deliberation and action (i.e. to achieve dissuasion, see Bramoullé & Orset, 2018). This “benefit of the doubt” tactic, created by the mere availability of alternative narratives, is, in our view, one of the most pervasive forms of modern

propaganda. It is a defining feature of our post-truth era and of the political style of right-wing

populist politicians in the West (Lewandowsky et al., 2017).

3.4. D is for Decoy

The last propaganda strategy relying on Fitness is to Decoy by targeting Cognitive fit. The

purpose of Decoy is to leverage individuals’ cognitive and reasoning processes towards a

propaganda-compatible conclusion. This can be achieved by tapping into various persuasion techniques afforded by social psychological research. The common feature of these techniques is

their reliance on generating a sense of fit between the message characteristics and the recipients’

psychological makeup. As such, successful Decoy will make the message feel intuitive to the

recipient, although its prescription could be contrary to their own beliefs and interests.

The first category of Decoy propaganda exploits cognitive consistency: the message is

crafted to fit the recipients’ cognitive architecture. Broadly speaking, mass campaigns towards

the general public can tap into universal cognitive biases (e.g. scarcity; Mittone & Savadori,

2009; sensitivity to semantic prosody; Hauser & Schwartz, 2018). Such messages can be cost-

effective, for instance when promoting health behaviors (gym attendance; Milkman et al., 2019).

However, to maximize impact, finer-grained propaganda can be crafted to fit the cognitive

peculiarities of targeted subgroups. As an illustration, ISIS online propaganda using violent

content is more effective among ideological extremists (Mitts et al., 2022) because they are more

psychologically inclined towards sensation-seeking (Zmigrod et al., 2021). Likewise,

conservatism is associated with increased perceptual caution (Zmigrod et al., 2021). Accordingly, political propaganda targeting conservative electorates makes extensive use of


societal threats such as economic decline, crime increases and demographic change (Obaidi et

al., 2022; Kende & Krekó, 2020).

Perceptual and cognitive processes are not the sole targets of Decoy strategies. In fact,

emotional states too can be leveraged to generate Cognitive fit among recipients. Recent

evidence showed that matching the emotional tone of a message to recipients’ emotions like

enthusiasm and fear significantly increased its persuasiveness (Zarouali et al., 2022). This

emotional-congruence effect can even be leveraged to bypass attitudinal resistance. For example,

it is established that political conservatives tend to display higher chronic nostalgia than liberals

(Lammers & Baldwin, 2020). Consequently, messages with a past-focused temporal frame (“I

would prefer to go back to the old days, where people may have owned hunting rifles and pistols,

but no one had assault rifles.”) were found to increase conservatives’ endorsement of policies

that were opposed to their attitudes (increasing gun control; Lammers & Baldwin, 2018).

It is also possible to Decoy recipients based on personality. This strategy was used during

Donald Trump’s 2016 presidential campaign in the US. Data collected on social media allowed

the classification of voters based on their personality traits by a third-party company. This

company then sent targeted advertisements including, for example, threatening content for highly

neurotic individuals (who tend to focus more on negative stimuli; Bakir, 2020). Similarly,

messages matching the linguistic style of extraverts (confident, assertive) and introverts (open,

questioning) tend to achieve greater persuasiveness (Zarouali et al., 2022). It is possible to tap

into other salient individual differences, such as collective narcissism (Cichocka & Cislak,

2020). This was done in a recent campaign in Turkey emanating from the main opposition party

(CHP). The CHP aimed to sway partisans of the nationalist and anti-Western incumbent in the

upcoming 2023 presidential election. To Decoy this electorate, the pro-European CHP displayed
billboard messages using collective narcissist frames such as “Hey world… Turkey will not be

your refugee camp” and “…Turkey will not be your go to place for cheap labour” (Yeni Safak,

2022).

Figure 4.
The 5D of propagandist persuasion and their relationships with elements of the Fitness-
Validation Model (FVM).

3.5. D is for Disturb

Like efforts targeting Fitness, strategies to Disturb the Validation process can prove extremely useful. Disturbance can take a positive or a negative form. Positive

disturbance is rather straightforward as it aims to bolster the recipient’s perceptions of thought

validity regarding the message. Examples of positive disturbance include populist rhetoric

flattering the epistemic virtues of “the people” over the “intellectual elite” and traditional

marketing praising the expertise of consumers. Likewise, pseudoscientific propaganda (e.g.

alternative medicine advertisement, religious indoctrination) often refers to the superior validity
of the recipient’s intuition (or faith) relative to expertise, scientific or rational analysis (Evans et

al., 2020; Šrol, 2022; Yilmaz, 2021). Typically, this is done with messages including relativistic

arguments which depict recipients’ intuition as equally valid to that of experts or scientists (i.e.

post-factual relativism, see Hameleers, 2021).

Positive disturbance is, however, less likely to affect resistant recipients. This is why

autocratic and illiberal regimes base much of the disinformation propaganda they aim at well-

informed and educated citizens from liberal-democratic countries (so-called WEIRD

populations, Muthukrishna et al., 2020) on negative disturbance. This type of disturbance aims at

paralyzing judgment and crippling action among recipients, instead of bolstering them. Negative

disturbance can be seen as the higher-order equivalent of “gaslighting” in interpersonal

relationships (Johnson et al., 2021). Instead of flattering the recipient’s ability to reason and

make judgements, negative disturbance denigrates their capacity to make informed (or least

biased) judgements.

Negative disturbance can come in two variants. It can imply that the recipient cannot take a position on an issue either because they do not possess the abilities to do so, or because of their social or

cultural identity. For instance, an overly abstract and complex presentation of arguments devoid

of logical coherence can bully individuals into adhering to a prescription because they may feel

unable to assess its veracity. Although the use of pseudo-profound bullshit of this kind is quite frequent in religious, pseudoscientific, and new age propaganda (see Pennycook et al., 2015), it is also often used by activists targeting educated audiences. This is a staple of modern far-left

propaganda, which makes use of unfalsifiable sociological theories to intellectually intimidate

recipients, a tactic that seems to work especially well among academic audiences (see Andreski,

1972; Sokal & Bricmont, 1997; McWhorter, 2021).


Alternatively, propaganda involving international relations issues (though not exclusively) seems to

be making extensive use of negative disturbance based on social and cultural bias. Authoritarian

regimes often base their counter-propaganda efforts on culturally relativistic arguments. For

instance, secular criticism of Islam as a religion is consistently countered by Islamist

organizations as “Islamophobic”, although research shows no correlation between such criticism

and prejudice towards Muslims (Adam-Troian, 2021; Imhoff & Recker, 2012). In a similar way,

criticisms of the current denial of the Armenian genocide in Turkey are often neutralized by

Turkish State counternarratives regarding past colonial crimes committed by Europeans

(Marchand & Perrier, 2022). Propaganda channels from illiberal actors targeting WEIRD

recipients often make use of narratives and documentaries about “American imperialism”,

European colonialism, and the ongoing “victimization” of non-Western diasporas in the West.

This strategy based on competitive victimhood (Young & Sullivan, 2016) can be very effective

in neutralizing negative attitudes towards the message source. As an illustration, in the wake of

the 2022 invasion of Ukraine, 43% of French citizens believed that “the Russian military

intervention […] is supported by Russian-speaking Ukrainians who wanted to free themselves

from the persecution they suffer at the hands of Ukrainian authorities” (IFOP & Reboot, 2022).

Discussion

In this paper, we have proposed the first model of propaganda based on the current findings in

social psychological research, the FVM. This model relies on Fitness along four dimensions

between message and recipient, and a mediation by the metacognitive process of Validation.

This synthetic account allows us to integrate various social psychological subfields of interest to the

study of propaganda. It is also the basis for deriving five main pathways to propagandist

influence, the 5D. These strategies to Deceive social intuition, Divert resistant attitudes, Disrupt
information processing, Decoy reasoning and Disturb meta-cognition can describe a variety of real-world propaganda in different domains. Moreover, they constitute the basic “building-blocks” or ingredients from which propaganda and counterpropaganda can be created and tested by

policy makers, propaganda professionals and applied researchers alike. Before concluding, we

will now briefly discuss some of the implications of the FVM and highlight some important

limitations to our approach.

First and foremost, the FVM and its 5D constitute a substantial attempt at garnering

explanatory power while keeping a low number of parameters. The FVM can explain a variety of

real-world propaganda elements, tactics and strategies that seem, on the surface, unrelated to one

another. Without such a model, it could be difficult to detect the common mechanisms at play

behind online far-right trolling, anti-Western rhetoric in international news channels and the

mass of studies emanating from the food industry’s PR departments. The FVM is organized

around persuasion and meta-cognition, rather than propaganda content, which may provide a more accurate basis for classifying propaganda. In fact, the FVM allows a classification based on dynamic

psychological strategies (or “blueprints”) instead of fixed contents. The model possesses

ecological validity because it converges with other models and conceptualizations made

independently by other agents (e.g. “reflexive control” as theorized by the KGB; Til, 2021). The

FVM framework also enables researchers to test novel sets of hypotheses. As an example, the

processes of Informational fit and Disrupt suggest that the mere availability of alternative

information could undermine trust in true information, regardless of its content.

The FVM can indeed offer a robust and integrative framework within which propaganda

researchers could build research programs for theoretical developments in the field. Yet, by

linking research in different areas, the FVM also creates opportunities for fresh insights into
already intensively researched phenomena. For instance, over the past decade, much research has

been dedicated to studying the causes and consequences of conspiracy beliefs in various domains

(see Hornsey et al., 2022 for a recent overview). In light of the FVM however, novel research

questions can emerge. At the moment, the role of conspiracy beliefs as persuasive narratives

remains underexplored. How and to what extent could conspiracist-framed messages foster

attitude change in domains other than intergroup relations? Likewise, the four Fitness domains

described by the FVM could be linked with the predictors of conspiracist ideation. This could

help to explain the psychological attractiveness of such narratives. Social fit matches the social

motives underpinning conspiracy beliefs (Sternisko et al., 2020), while Cognitive fit matches the

use of intuitive reasoning bias (e.g. conjunction fallacy) in conspiracist narration (see Douglas et

al., 2019).

More broadly speaking, the flexibility of the FVM and the scope of the 5D allow us to shed a

novel light on propaganda, extremism and misinformation. As we have seen, propaganda fully

exploits the principle of psychological equifinality (Heider, 1958): several causes can underlie a behavior or attitude, and several mechanisms or processes can lead to the same outcome. As such, several strategies, i.e. combinations of the 5D across various target publics, can achieve a

propagandist’s goal. The use of one or several Ds over others is constrained only by the features

of recipients and the means available to propagandists. Consequently, the multiplicity of

pathways for propaganda suggests that the behaviors under study by influence and persuasion

researchers should be expanded. The behavioral repertoire of propaganda can include spreading

a message, rumor or fake news, arguing with relatives for the propagandist’s cause, engaging in

collective action seemingly unrelated to the cause (e.g. protest for “peace” in a conflict instead of
protesting in support of one side), inaction itself, and even lack of interest in an important issue

(i.e. depoliticization).

Similarly, extremism seen through the lens of the FVM may appear to have positive

functions depending on its content. Indeed, current research in political psychology seems to

implicitly assume that ideological rigidity is a flaw, and often approaches ideology through the

prism of its distorting influence on reasoning, regardless of content (e.g. Zmigrod, 2020). Yet,

the FVM offers a functional approach to extremism. Attitude certainty, rigidity or ideological

extremism may be functional in preventing persuasion attempts from opposed actors. If so,

extreme, certain, and rigid pro-democratic attitudes could paradoxically be the best inoculation

strategy against political violence. Importantly, the FVM promotes the idea that individuals are

not irrational and passive vessels of information to be “brainwashed” (see Mercier, 2016).

Instead, propaganda goes to great lengths to neutralize or mobilize motivated actors by adapting to, infiltrating, and hijacking individuals’ sophisticated reasoning processes to redirect

their outcomes. This is relevant to the current debate surrounding misinformation, because the

5D show that fake news is only one of the many tools of illiberal propaganda, and maybe among

the least effective ones (Altay et al., 2021).

Also, the 5D can provide liberal democratic actors with a useful guide for constructing a (counter)propaganda campaign. They can also stimulate applied research to design ever more persuasive

messages to different publics. In view of the 5D, we can see that the current approach to fighting

misinformation and malevolent propaganda in WEIRD countries represents just a fraction of

what can be done. In fact, liberal democratic and scientific actors have the advantage of the truth

when confronting illiberal authoritarian ones and could think of more active and creative ways of

countering their actions. As an illustration, anti-ISIS propaganda in the West could target
conservative Muslims with video messages using Deceive (giving the real number of foreign fighters, less than 0.1% of the community) and Decoy (appealing to nostalgia for “a time in the 60s, before the spread of radical Islam, when Muslims were more secular and open”). In parallel, news

channels could disseminate documentaries about the crimes of Ottoman colonialism to Disturb

the anti-Western rhetoric of jihadis. Political communication to foster support for Ukraine

among public opinion could Divert the far-right by highlighting the actions of Russia against the

West and the patriotic nature of support to Ukraine. The far-left could be diverted with content

showing how the Russian State oppresses minorities (sending them to the “meat-grinder”,

McKinnon, 2022), and how support for Ukraine helps minorities “fight against imperialist

oppression”. The possibilities are numerous and solely bounded by the imagination of our liberal

democratic propagandists.

Before concluding, however, we must note three caveats with the FVM, which we hope

further developments could help improve. The model’s main strength is indeed its breadth, but this is also one of its main limitations. For instance, the FVM does not focus on the temporal

dynamics of propaganda and does not include elements related to repeated exposure and

conditioning (see Montoya et al., 2017). Of course, these elements are part of the Cognitive fit

aspect of the FVM, but nonetheless, their time-related features are not quite integrated yet. The

limited number of parameters, which is essential to achieve parsimony, leaves questions open

regarding the description of some influence processes in the model. The principle of reciprocity

(see Cialdini, 2006), which is at the heart of some very effective lobbying strategies (e.g. in the

medical community), could be compatible with both Deceive/Social fit (relies on forming

“ingroup” relations) and Decoy/Cognitive fit (stems from universal cognitive biases towards

ingroups). This may be illustrative of an inherent limit of any model of propaganda. Although
ecologically valid, the FVM makes distinctions between features that may come bundled

together in the real world, which explains why some strategies may sometimes fit more than one

D at a time.

Conclusion

Although social psychology can be considered the science of propaganda par excellence,

propaganda as an operational concept remained quasi-absent from the field until now. This

absence is painfully visible as, by the time this article was written, Europe was about to enter its first

year in a war of the highest intensity since WWII. In this war of genocide against the Ukrainian

people (Connolly, 2022), Russian neo-fascist disinformation (Umland & Eichstaett, 2009),

Ukrainian psychological operations and Western counterpropaganda constitute the main

weapons on the informational battlefront (Abrams, 2022). In this context, formal models and

empirical analyses of propaganda efforts from all protagonists could greatly benefit the liberal

democratic war effort of Ukraine and help save a substantial number of lives. This could be

achieved by maximizing the effectiveness of messages which aim to increase Russian

servicemembers’ surrender rates, the volume of donations to humanitarian aid, and public opinion

pressure to supply weapons to the Ukrainian Armed Forces (thereby shortening the length of

Russian occupation). Likewise, the current issues confronting liberal democracies, such as the

resurgence of terrorism, conspiracism, antiscientific and illiberal movements are all issues where

propaganda acts as a pharmakon, by being both the ill and the remedy. We hope that the present

paper provides decision-makers and researchers alike with an integrative basis and roadmap to

conduct ever more useful studies of the mechanisms at play behind propaganda and their

application in the service of secular, scientific and liberal democracies.


References
Abrams, Z. (2022). The role of psychological warfare in the battle for Ukraine. Accessed December 23rd, 2022 from https://www.apa.org/monitor/2022/06/news-psychological-warfare
Abrams, D., & Hogg, M. A. (1990). Social identification, self-categorization and social
influence. European review of social psychology, 1(1), 195-228.
https://doi.org/10.1080/14792779108401862
Adam-Troian, J. (2021). The French (non) Connection: A Closer Look at the Role of
Secularism and Socio-Educational Disparities on Domestic Islamist Radicalization in
France. Journal for Deradicalization, 24, 39-66. ISSN: 2363-9849
Aechtner, T. (2010). Online in the evolution wars: An analysis of young earth creationism
cyber-propaganda. The Australian Religious Studies Review, 23(3), 277-300.
https://doi.org/10.1558/arsr.v23i3.277
Albarracin, D., & Shavitt, S. (2018). Attitudes and attitude change. Annual review of
psychology, 69(1), 299-327. https://doi.org/10.1146/annurev-psych-122216-011911
Altay, S., Berriche, M., & Acerbi, A. (2021). Misinformation on misinformation: Conceptual and methodological challenges. Accessed December 28th, 2022 from https://hal-sciencespo.archives-ouvertes.fr/hal-03700770
Andreski, S. (1972). Social sciences as sorcery. Deutsch.
Axelrod, R., Daymude, J. J., & Forrest, S. (2021). Preventing extreme polarization of political
attitudes. Proceedings of the National Academy of Sciences, 118(50), e2102139118.
https://doi.org/10.1073/pnas.2102139118
Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. F., ... &
Volfovsky, A. (2018). Exposure to opposing views on social media can increase
political polarization. Proceedings of the National Academy of Sciences, 115(37), 9216-
9221. https://doi.org/10.1073/pnas.1804840115
Bakir, V. (2020). Psychological operations in digital political campaigns: Assessing Cambridge
Analytica's psychographic profiling and targeting. Frontiers in Communication, 5, 67.
https://doi.org/10.3389/fcomm.2020.00067
Banas, J. A., & Rains, S. A. (2010). A meta-analysis of research on inoculation theory. Communication Monographs, 77(3), 281-311. https://doi.org/10.1080/03637751003758193
Bar-Tal, D., & Hameiri, B. (2020). Interventions to change well-anchored attitudes in the context of intergroup conflict. Social and Personality Psychology Compass, 14(7), e12534. https://doi.org/10.1111/spc3.12534
Bartlett, F. C. (1940). Political propaganda. Cambridge University Press; Macmillan.
Bauman, K. P., & Geher, G. (2002). We think you agree: The detrimental impact of the false consensus effect on behavior. Current Psychology, 21(4), 293-318. https://doi.org/10.1007/s12144-002-1020-0
Bauvois, J. L. (2007). La propagande dans les démocraties libérales [Propaganda within liberal democracies]. Le Journal des psychologues, (4), 39-43. https://doi.org/10.3917/jdp.247.0039
Bechler, C. J., Tormala, Z. L., & Rucker, D. D. (2021). The attitude–behavior relationship
revisited. Psychological Science, 32(8), 1285-1297.
https://doi.org/10.1177/0956797621995206
Bélanger, J. J., Nisa, C. F., Schumpe, B. M., Gurmu, T., Williams, M. J., & Putra, I. E. (2020). Do counter-narratives reduce support for ISIS? Yes, but not for their target audience. Frontiers in Psychology, 11, 1059. https://doi.org/10.3389/fpsyg.2020.01059
Bernays, E. L. (1942). The marketing of national policies: A study of war propaganda. Journal
of Marketing, 6(3), 236-244. https://doi.org/10.1177/002224294200600303
Bertin, P., Nera, K., & Delouvée, S. (2020). Conspiracy beliefs, rejection of vaccination, and support for hydroxychloroquine: A conceptual replication-extension in the COVID-19 pandemic context. Frontiers in Psychology, 11, 565128. https://doi.org/10.3389/fpsyg.2020.565128
Biddle, W. W. (1931). A psychological definition of propaganda. The Journal of Abnormal and
Social Psychology, 26(3), 283. https://doi.org/10.1037/h0074944
Boghardt, T. (2009). Soviet Bloc intelligence and its AIDS disinformation campaign. Studies in Intelligence, 53(4), 1-24.
Bohner, G., & Dickel, N. (2011). Attitudes and attitude change. Annual Review of Psychology, 62, 391-417. https://doi.org/10.1146/annurev.psych.121208.131609
Bos, L., Schemer, C., Corbu, N., Hameleers, M., Andreadis, I., Schulz, A., ... & Fawzi, N. (2020). The effects of populism as a social identity frame on persuasion and mobilisation: Evidence from a 15-country experiment. European Journal of Political Research, 59(1), 3-24. https://doi.org/10.1111/1475-6765.12334
Braddock, K. (2022). Vaccinating against hate: Using attitudinal inoculation to confer resistance to persuasion by extremist propaganda. Terrorism and Political Violence, 34(2), 240-262. https://doi.org/10.1080/09546553.2019.1693370
Bramoullé, Y., & Orset, C. (2018). Manufacturing doubt. Journal of Environmental Economics and Management, 90, 119-133. https://doi.org/10.1016/j.jeem.2018.04.010
Briñol, P., & Petty, R. E. (2009). Source factors in persuasion: A self-validation approach. European Review of Social Psychology, 20(1), 49-96. https://doi.org/10.1080/10463280802643640
Briñol, P., & Petty, R. E. (2022). Self-validation theory: An integrative framework for understanding when thoughts become consequential. Psychological Review, 129(2), 340. https://doi.org/10.1037/rev0000340
Charon, P., & Vilmer, J. B. J. (2021). Chinese Influence Operations: A Machiavellian Moment. Institute for Strategic Research (IRSEM). Accessed December 28, 2022, from https://www.irsem.fr/report.html
Cialdini, R. B. (2006). Influence: The psychology of persuasion (Revised ed.). New York: William Morrow.
Cichocka, A., & Cislak, A. (2020). Nationalism as collective narcissism. Current Opinion in Behavioral Sciences, 34, 69-74. https://doi.org/10.1016/j.cobeha.2019.12.013
Connolly, A. (2022, April). Canadian MPs unanimously back motion recognizing Russian ‘genocide’ in Ukraine. Global News. Accessed December 21, 2022, from https://globalnews.ca/news/8791299/canadian-parliament-recognizes-russian-genocide-ukraine/
De Bruycker, I., & Beyers, J. (2019). Lobbying strategies and success: Inside and outside
lobbying in European Union legislative politics. European Political Science
Review, 11(1), 57-74. https://doi.org/10.1017/S1755773918000218
De Souza, L., & Schmader, T. (2022). The misjudgment of men: Does pluralistic ignorance
inhibit allyship? Journal of Personality and Social Psychology, 122(2), 265.
https://doi.org/10.1037/pspi0000362
Ditto, P. H., Liu, B. S., Clark, C. J., Wojcik, S. P., Chen, E. E., Grady, R. H., ... & Zinger, J. F. (2019a). At least bias is bipartisan: A meta-analytic comparison of partisan bias in liberals and conservatives. Perspectives on Psychological Science, 14(2), 273-291. https://doi.org/10.1177/1745691617746796
Ditto, P. H., Clark, C. J., Liu, B. S., Wojcik, S. P., Chen, E. E., Grady, R. H., ... & Zinger, J. F. (2019b). Partisan bias and its discontents. Perspectives on Psychological Science, 14(2), 304-316. https://doi.org/10.1177/1745691618817753
Donohue, J. J., & Levitt, S. (2020). The impact of legalized abortion on crime over the last two decades. American Law and Economics Review, 22(2), 241-302. https://doi.org/10.1093/aler/ahaa008
Doob, L. W. (1948). Public opinion and propaganda. Henry Holt.
Douglas, K. M., Uscinski, J. E., Sutton, R. M., Cichocka, A., Nefes, T., Ang, C. S., & Deravi, F. (2019). Understanding conspiracy theories. Political Psychology, 40, 3-35. https://doi.org/10.1111/pops.12568
Dovidio, J. F., Glick, P., & Rudman, L. A. (Eds.). (2008). On the nature of prejudice: Fifty years after Allport. John Wiley & Sons.
Eagly, A. H., & Chaiken, S. (1984). Cognitive theories of persuasion. In Advances in
experimental social psychology (Vol. 17, pp. 267-359). Academic Press.
https://doi.org/10.1016/S0065-2601(08)60122-7
Ecker, U. K., Lewandowsky, S., & Chadwick, M. (2020). Can corrections spread
misinformation to new audiences? Testing for the elusive familiarity backfire
effect. Cognitive Research: Principles and Implications, 5(1), 1-25.
https://doi.org/10.1186/s41235-020-00241-6
Evans, A., Sleegers, W., & Mlakar, Ž. (2020). Individual differences in receptivity to scientific
bullshit. Judgment and Decision Making, 15(3), 401.
Festinger, L. (1957). A theory of cognitive dissonance. Row, Peterson.
Gollwitzer, A., Martel, C., Brady, W. J., Pärnamets, P., Freedman, I. G., Knowles, E. D., &
Van Bavel, J. J. (2020). Partisan differences in physical distancing are linked to health
outcomes during the COVID-19 pandemic. Nature Human Behaviour, 4(11), 1186-1197.
https://doi.org/10.1038/s41562-020-00977-7
Gollwitzer, P. M., & Sheeran, P. (2006). Implementation intentions and goal achievement: A
meta‐analysis of effects and processes. Advances in experimental social psychology, 38,
69-119. https://doi.org/10.1016/S0065-2601(06)38002-1
Grazzini, L., Rodrigo, P., Aiello, G., & Viglia, G. (2018). Loss or gain? The role of message
framing in hotel guests’ recycling behaviour. Journal of Sustainable Tourism, 26(11),
1944-1966. https://doi.org/10.1080/09669582.2018.1526294
Griffith, E. E., Nolder, C. J., & Petty, R. E. (2018). The elaboration likelihood model: A meta-
theory for synthesizing auditor judgment and decision-making research. Auditing: A
Journal of Practice & Theory, 37(4), 169-186. https://doi.org/10.2308/ajpt-52018
Halperin, E., & Schori‐Eyal, N. (2020). Towards a new framework of personalized
psychological interventions to improve intergroup relations and promote peace. Social
and Personality Psychology Compass, 14(5), 255-270.
https://doi.org/10.1111/spc3.12527
Hameiri, B., Bar‐Tal, D., & Halperin, E. (2019). Paradoxical thinking interventions: A
paradigm for societal change. Social Issues and Policy Review, 13(1), 36-62.
https://doi.org/10.1111/sipr.12053
Hameleers, M. (2021). Populist disinformation in fragmented information settings:
Understanding the nature and persuasiveness of populist and post-factual
communication. Routledge.
Harmon-Jones, E. (Ed.). (2019). Cognitive dissonance: Reexamining a pivotal theory in
psychology (2nd ed.). American Psychological
Association. https://doi.org/10.1037/0000135-000
Haselton, M. G., Nettle, D., & Andrews, P. W. (2015). The evolution of cognitive bias. The
handbook of evolutionary psychology, 724-746.
Haslam, S. A., McGarty, C., & Turner, J. C. (1996). Salient group memberships and
persuasion: The role of social identity in the validation of beliefs. In J. L. Nye & A. M.
Brower (Eds.), What's social about social cognition? Research on socially shared
cognition in small groups (pp. 29–56). Sage Publications,
Inc. https://doi.org/10.4135/9781483327648.n2
Haugtvedt, C. P., & Petty, R. E. (1992). Personality and persuasion: Need for cognition
moderates the persistence and resistance of attitude changes. Journal of Personality and
Social Psychology, 63(2), 308-319. https://doi.org/10.1037/0022-3514.63.2.308
Hauser, D. J., & Schwarz, N. (2018). How seemingly innocuous words can bias judgment:
Semantic prosody and impression formation. Journal of Experimental Social Psychology, 75, 11-18. https://doi.org/10.1016/j.jesp.2017.10.012
Heider, F. (1958). The psychology of interpersonal relations. Psychology Press.
Herman, E. S., & Chomsky, N. (2010). Manufacturing consent: The political economy of the
mass media. Random House.
Hogg, M. A., & Smith, J. R. (2007). Attitudes in social context: A social identity
perspective. European Review of Social Psychology, 18(1), 89-131.
https://doi.org/10.1080/10463280701592070
Hornsey, M. J., Bierwiaczonek, K., Sassenberg, K., & Douglas, K. M. (2022). Individual,
intergroup and nation-level influences on belief in conspiracy theories. Nature Reviews
Psychology, 1-13. https://doi.org/10.1038/s44159-022-00133-0
Hovland, C. I., Lumsdaine, A. A., & Sheffield, F. D. (1949). Experiments on mass
communication. (Studies in social psychology in World War II). Princeton University
Press.
IFOP & Reboot. (2022). Observatoire Reboot de l’information et du raisonnement critique – Désinformation et populisme à l’heure de la crise sanitaire et de la guerre en Ukraine [Reboot Information and Critical Thinking Observatory – Disinformation and populism in the time of the health crisis and the war in Ukraine]. Accessed December 28, 2022, from https://www.ifop.com/publication/observatoire-reboot-de-linformation-et-du-raisonnement-critique-desinformation-et-populisme-a-lheure-de-la-crise-sanitaire-et-de-la-guerre-en-ukraine/
Imhoff, R., & Recker, J. (2012). Differentiating Islamophobia: Introducing a new scale to measure Islamoprejudice and secular Islam critique. Political Psychology, 33(6), 811-824. https://doi.org/10.1111/j.1467-9221.2012.00911.x
Johnson, V. E., Nadal, K. L., Sissoko, D. G., & King, R. (2021). “It’s not in your head”: Gaslighting, ‘splaining, victim blaming, and other harmful reactions to microaggressions. Perspectives on Psychological Science, 16(5), 1024-1036. https://doi.org/10.1177/17456916211011963
Jones, C. R., Olson, M. A., & Fazio, R. H. (2010). Evaluative conditioning: The “how” question. In Advances in experimental social psychology (Vol. 43, pp. 205-255). Academic Press.
Jost, J. T. (2017). Ideological asymmetries and the essence of political psychology. Political Psychology, 38(2), 167-208. https://doi.org/10.1111/pops.12407
Jowett, G. S., & O'Donnell, V. (2018). Propaganda & persuasion. Sage Publications.
Kende, A., & Krekó, P. (2020). Xenophobia, prejudice, and right-wing populism in East-
Central Europe. Current Opinion in Behavioral Sciences, 34, 29-33.
https://doi.org/10.1016/j.cobeha.2019.11.011
Keller, F. B., Schoch, D., Stier, S., & Yang, J. (2020). Political astroturfing on Twitter: How to
coordinate a disinformation campaign. Political Communication, 37(2), 256-280.
https://doi.org/10.1080/10584609.2019.1661888
Lammers, J., & Baldwin, M. (2018). Past-focused temporal communication overcomes
conservatives’ resistance to liberal political ideas. Journal of Personality and Social
Psychology, 114(4), 599. https://doi.org/10.1037/pspi0000121
Lammers, J., & Baldwin, M. (2020). Make America gracious again: Collective nostalgia can
increase and decrease support for right‐wing populist rhetoric. European Journal of
Social Psychology, 50(5), 943-954. https://doi.org/10.1002/ejsp.2673
Lasswell, H. D. (1927). The theory of political propaganda. American Political Science
Review, 21(3), 627-631.
Latt, S. M., Milner, A., & Kavanagh, A. (2019). Abortion laws reform may reduce maternal mortality: An ecological study in 162 countries. BMC Women's Health, 19(1), 1-9. https://doi.org/10.1186/s12905-018-0705-y
Leonardelli, G. J., & Toh, S. M. (2015). Social categorization in intergroup contexts: Three
kinds of self‐categorization. Social and Personality Psychology Compass, 9(2), 69-87.
https://doi.org/10.1111/spc3.12150
Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding
and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369. https://doi.org/10.1016/j.jarmac.2017.07.008
Lewin, K. (1943). Forces behind food habits and methods of change. Bulletin of the National Research Council, 108(1043), 35-65.
Lewin, K. (1947). Group decision and social change. Readings in Social Psychology, 3(1), 197-211.
Linebarger, P. M. A. (1954). Psychological Warfare: International Propaganda and Communications. Duell, Sloan and Pearce.
Maertens, R., Roozenbeek, J., Basol, M., & van der Linden, S. (2021). Long-term effectiveness
of inoculation against misinformation: Three longitudinal experiments. Journal of
Experimental Psychology: Applied, 27(1), 1–16. https://doi.org/10.1037/xap0000315
Mahfud, Y., & Adam-Troian, J. (2021). “Macron demission!”: Loss of significance generates violent extremism for the Yellow Vests through feelings of anomia. Group Processes & Intergroup Relations, 24(1), 108-124. https://doi.org/10.1177/1368430219880954
Marchand, L., & Perrier, G. (2022). Les loups aiment la brume: Enquête sur les opérations clandestines de la Turquie en Europe [Wolves like fog: Investigating Turkish clandestine operations in Europe]. Grasset.
McGuire, W. J. (1964). Inducing resistance to persuasion: Some contemporary approaches. Reprinted in C. C. Haaland & W. O. Kaelber (Eds.), Self and society: An anthology of readings (pp. 192-230). Lexington, MA: Ginn Custom Publishing, 1981.
McGuire, W. J. (1973). The yin and yang of progress in social psychology: Seven koan. Journal of Personality and Social Psychology, 26(3), 446. https://doi.org/10.1037/h0034345
McKinnon, A. (2022). Russia Is Sending Its Ethnic Minorities to the Meat Grinder. Accessed December 29, 2022, from https://foreignpolicy.com/2022/09/23/russia-partial-military-mobilization-ethnic-minorities/
McWhorter, J. (2021). Woke racism: How a new religion has betrayed Black America. Penguin.
Mercier, H. (2016). The argumentative theory: Predictions and empirical evidence. Trends in
Cognitive Sciences, 20(9), 689-700. https://doi.org/10.1016/j.tics.2016.07.001
Milkman, K. L., Gromet, D., Ho, H., Kay, J. S., Lee, T. W., Pandiloski, P., ... & Duckworth, A.
L. (2021). Megastudies improve the impact of applied behavioural
science. Nature, 600(7889), 478-483. https://doi.org/10.1038/s41586-021-04128-4
Mittone, L., & Savadori, L. (2009). The scarcity bias. Applied Psychology, 58(3), 453-468.
https://doi.org/10.1111/j.1464-0597.2009.00401.x
Mitts, T., Phillips, G., & Walter, B. F. (2022). Studying the impact of ISIS propaganda
campaigns. The Journal of Politics, 84(2), 1220-1225.
Montoya, R. M., Horton, R. S., Vevea, J. L., Citkowicz, M., & Lauber, E. A. (2017). A re-examination of the mere exposure effect: The influence of repeated exposure on recognition, familiarity, and liking. Psychological Bulletin, 143(5), 459-498. https://doi.org/10.1037/bul0000085
Mooijman, M., Hoover, J., Lin, Y., Ji, H., & Dehghani, M. (2018). Moralization in social networks and the emergence of violence during protests. Nature Human Behaviour, 2(6), 389-396. https://doi.org/10.1038/s41562-018-0353-0
Moscovici, S. (1981). On social representations. Social cognition: Perspectives on everyday
understanding, 8(12), 181-209.
Muthukrishna, M., Bell, A. V., Henrich, J., Curtin, C. M., Gedranovich, A., McInerney, J., & Thue, B. (2020). Beyond Western, Educated, Industrial, Rich, and Democratic (WEIRD) psychology: Measuring and mapping scales of cultural and psychological distance. Psychological Science, 31(6), 678-701. https://doi.org/10.1177/0956797620916782
Muthukrishna, M., & Henrich, J. (2019). A problem in theory. Nature Human Behaviour, 3, 221-229. https://doi.org/10.1038/s41562-018-0522-1
Nolan, J. M., Schultz, P. W., Cialdini, R. B., Goldstein, N. J., & Griskevicius, V. (2008).
Normative social influence is underdetected. Personality and Social Psychology Bulletin, 34(7), 913-923. https://doi.org/10.1177/0146167208316691
Obaidi, M., Kunst, J., Ozer, S., & Kimel, S. Y. (2022). The “Great Replacement” conspiracy:
How the perceived ousting of Whites can evoke violent extremism and
Islamophobia. Group Processes & Intergroup Relations, 25(7), 1675-1695.
https://doi.org/10.1177/13684302211028293
Oishi, S., Kesebir, S., & Snyder, B. H. (2009). Sociology: A lost connection in social
psychology. Personality and Social Psychology Review, 13(4), 334-353.
https://doi.org/10.1177/1088868309347835
Orwell, G. (2018). Notes on nationalism. Penguin UK.
Pennycook, G., Cheyne, J. A., Barr, N., Koehler, D. J., & Fugelsang, J. A. (2015). On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making, 10(6), 549-563.
Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388-402. https://doi.org/10.1016/j.tics.2021.02.007
Petty, R. E., Briñol, P., Loersch, C., & McCaslin, M. J. (2009). The need for cognition. In M. R. Leary & R. H. Hoyle (Eds.), Handbook of individual differences in social behavior (pp. 318-329). The Guilford Press.
Petty, R. E., Briñol, P., & Tormala, Z. L. (2002). Thought confidence as a determinant of
persuasion: The self-validation hypothesis. Journal of Personality and Social
Psychology, 82(5), 722–741. https://doi.org/10.1037/0022-3514.82.5.722
Picheta, R. (2022). Russian forces have staged illegal ‘referendums’ in Ukraine. What comes
next? Accessed December 26, 2022, from
https://edition.cnn.com/2022/09/27/europe/ukraine-russia-referendum-explainer-
intl/index.html
Priolo, D., Pelt, A., Bauzel, R. S., Rubens, L., Voisin, D., & Fointiat, V. (2019). Three decades
of research on induced hypocrisy: A meta-analysis. Personality and Social Psychology
Bulletin, 45(12), 1681-1701. https://doi.org/10.1177/0146167219841621
Prost, M., Piermattéo, A., & Lo Monaco, G. (2022). Social representations, social identity, and
representational imputation: A review and an agenda for future research. European
Psychologist. Advance online publication. https://doi.org/10.1027/1016-9040/a000489
Proulx, T., Inzlicht, M., & Harmon-Jones, E. (2012). Understanding all inconsistency
compensation as a palliative response to violated expectations. Trends in Cognitive Sciences, 16(5), 285-291. https://doi.org/10.1016/j.tics.2012.04.002
Reardon, D. C. (2018). The abortion and mental health controversy: A comprehensive literature
review of common ground agreements, disagreements, actionable recommendations,
and research opportunities. SAGE Open Medicine, 6, 2050312118807624.
https://doi.org/10.1177/2050312118807624
Rodrik, D. (2021). Why does globalization fuel populism? Economics, culture, and the rise of
right-wing populism. Annual Review of Economics, 13, 133-170.
https://doi.org/10.1146/annurev-economics-070220-032416
Roozenbeek, J., Freeman, A. L., & van der Linden, S. (2021). How accurate are accuracy-nudge interventions? A preregistered direct replication of Pennycook et al. (2020). Psychological Science, 32(7), 1169-1178. https://doi.org/10.1177/0956797621102453
Rosling, H., Rosling, O., & Rönnlund, A. R. (2018). Factfulness: Ten reasons we're wrong about the world – and why things are better than you think. Flatiron Books.
Rouquette, M. L. (1996). Social representations and mass communication research. Journal for the Theory of Social Behaviour, 26(2), 221-231.
Rühle, M. (2014). NATO enlargement and Russia: discerning fact from fiction. American
Foreign Policy Interests, 36(4), 234-239.
https://doi.org/10.1080/10803920.2014.947879
Ryan, R. M., Deci, E. L., Vansteenkiste, M., & Soenens, B. (2021). Building a science of
motivated persons: Self-determination theory’s empirical approach to human experience
and the regulation of behavior. Motivation Science, 7(2), 97-110.
https://doi.org/10.1037/mot0000194
Schultz, P. W., Nolan, J. M., Cialdini, R. B., Goldstein, N. J., & Griskevicius, V. (2018). The
constructive, destructive, and reconstructive power of social norms:
Reprise. Perspectives on Psychological Science, 13(2), 249-254.
https://doi.org/10.1177/1745691617693325
Shaffer, V. A., Focella, E. S., Hathaway, A., Scherer, L. D., & Zikmund-Fisher, B. J. (2018).
On the usefulness of narratives: an interdisciplinary review and theoretical
model. Annals of Behavioral Medicine, 52(5), 429-442.
https://doi.org/10.1093/abm/kax008
Sherif, M. (1936). The psychology of social norms. Harper.
Silverstein, B. (1987). Toward a science of propaganda. Political Psychology, 49-59. https://doi.org/10.2307/3790986
Sokal, A., & Bricmont, J. (1997). Impostures intellectuelles [Intellectual Impostures]. Odile Jacob.
Šrol, J. (2022). Individual differences in epistemically suspect beliefs: The role of analytic thinking and susceptibility to cognitive biases. Thinking & Reasoning, 28(1), 125-162. https://doi.org/10.1080/13546783.2021.1938220
Steiger, A., & Kühberger, A. (2018). A meta-analytic re-appraisal of the framing effect. Zeitschrift für Psychologie. https://doi.org/10.1027/2151-2604/a000321
Sternisko, A., Cichocka, A., & Van Bavel, J. J. (2020). The dark side of social movements: Social identity, non-conformity, and the lure of conspiracy theories. Current Opinion in Psychology, 35, 1-6. https://doi.org/10.1016/j.copsyc.2020.02.007
Tajfel, H. (1974). Social identity and intergroup behaviour. Social Science Information, 13(2), 65-93. https://doi.org/10.1177/053901847401300204
Taylor, A. (2022). With NAFO, Ukraine turns the trolls on Russia. The Washington Post. Accessed December 28, 2022, from https://www.washingtonpost.com/world/2022/09/01/nafo-ukraine-russia/
Tenzer, N. (2022). The heavy clouds of peace. Accessed December 26, 2022, from https://tenzerstrategics.substack.com/p/the-heavy-clouds-of-peace
Till, C. (2021). Propaganda through ‘reflexive control’ and the mediated construction of reality. New Media & Society, 23(6), 1362-1378. https://doi.org/10.1177/1461444820902446
Tormala, Z. L., & Rucker, D. D. (2018). Attitude certainty: Antecedents, consequences, and new directions. Consumer Psychology Review, 1(1), 72-89. https://doi.org/10.1002/arcp.1004
Umland, A., & Eichstaett, B. (2009). Fascist Tendencies in Russia’s Political Establishment:
The Rise of the International Eurasian Movement. Russian Analytical Digest, 60(9), 13-17.
Van Bavel, J. J., Baicker, K., Boggio, P. S., Capraro, V., Cichocka, A., Cikara, M., ... & Willer, R. (2020). Using social and behavioural science to support COVID-19 pandemic response. Nature Human Behaviour, 4(5), 460-471. https://doi.org/10.1038/s41562-020-0884-z
Van Bavel, J. J., & Pereira, A. (2018). The partisan brain: An identity-based model of political
belief. Trends in Cognitive Sciences, 22(3), 213-224.
https://doi.org/10.1016/j.tics.2018.01.004
Van den Bos, K., & Maas, M. (2009). On the psychology of the belief in a just world:
Exploring experiential and rationalistic paths to victim blaming. Personality and Social
Psychology Bulletin, 35(12), 1567-1578. https://doi.org/10.1177/0146167209344628
Van Lange, P. A., Joireman, J., & Milinski, M. (2018). Climate change: what psychology can
offer in terms of insights and solutions. Current Directions in Psychological
Science, 27(4), 269-274. https://doi.org/10.1177/0963721417753945
Van Prooijen, J. W., Krouwel, A. P., & Pollet, T. V. (2015). Political extremism predicts belief
in conspiracy theories. Social Psychological and Personality Science, 6(5), 570-578.
https://doi.org/10.1177/1948550614567356
von Hohenberg, B. C., & Guess, A. M. (2022). When Do Sources Persuade? The Effect of
Source Credibility on Opinion Change. Journal of Experimental Political Science, 1-15.
https://doi.org/10.1017/XPS.2022.2
Vöhringer, M. (2011). A Concept in Application: How the Scientific Reflex Came to be
Employed against Nazi Propaganda. Contributions to the History of Concepts, 6(2),
105-123. https://doi.org/10.3167/choc.2011.060207
Wagener, A. (2022). France and the moral panic of “islamo-leftism”. CFC Intersections.
Wallace, J., Goldsmith-Pinkham, P., & Schwartz, J. L. (2022). Excess death rates for
Republicans and Democrats during the COVID-19 pandemic (No. w30512). National
Bureau of Economic Research.
Webster, D. M., & Kruglanski, A. W. (1994). Individual differences in need for cognitive
closure. Journal of Personality and Social Psychology, 67(6), 1049-1062.
https://doi.org/10.1037/0022-3514.67.6.1049
Wood, T., & Porter, E. (2019). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, 41(1), 135-163. https://doi.org/10.1007/s11109-018-9443-y
Wyer, N. A. (2010). Selective self-categorization: Meaningful categorization and the in-group persuasion effect. The Journal of Social Psychology, 150(5), 452-470. https://doi.org/10.1080/00224540903365521
Yeni Safak (2022). Altılı masayı iyice karıştıracak: Kılıçdaroğlu'ndan adaylık ilanı gibi afişler [It will thoroughly stir up the table of six: Posters from Kılıçdaroğlu that look like a candidacy announcement]. Accessed December 27, 2022, from https://www.yenisafak.com/gundem/altili-masayi-iyice-karistiracak-kilicdaroglundan-adaylik-ilani-gibi-afisler-3897202
Yilmaz, O. (2021). Cognitive styles and religion. Current Opinion in Psychology, 40, 150-154.
https://doi.org/10.1016/j.copsyc.2020.09.014
Young, I. F., & Sullivan, D. (2016). Competitive victimhood: A review of the theoretical and
empirical literature. Current Opinion in Psychology, 11, 30-34.
https://doi.org/10.1016/j.copsyc.2016.04.004
Zarouali, B., Dobber, T., De Pauw, G., & de Vreese, C. (2022). Using a personality-profiling
algorithm to investigate political microtargeting: assessing the persuasion effects of
personality-tailored ads on social media. Communication Research, 49(8), 1066-1091.
https://doi.org/10.1177/0093650220961965
Zmigrod, L. (2020). The role of cognitive rigidity in political ideologies: theory, evidence, and
future directions. Current Opinion in Behavioral Sciences, 34, 34-39.
https://doi.org/10.1016/j.cobeha.2019.10.016
Zmigrod, L., Eisenberg, I. W., Bissett, P. G., Robbins, T. W., & Poldrack, R. A. (2021). The
cognitive and perceptual correlates of ideological attitudes: a data-driven
approach. Philosophical Transactions of the Royal Society B, 376(1822), 20200424.
https://doi.org/10.1098/rstb.2020.0424
