
Memetic Warfare: A Study of Psychological Tactics in the Digital Age


Understanding Memetic Warfare

Richard Dawkins coined the word “meme” in his book The Selfish Gene (1976), defining it as an idea, action, or behavior that spreads through culture while evolving and adapting, much like a biological gene (Solon, 2013). Memes have since migrated into the digital realm, becoming one of the most recognizable features of online culture. Internet memes are deliberately created by humans and exist in a perpetual state of flux, constantly changing and mutating (Solon, 2013, para. 3). Their significance lies in their ability to capture emotions and ideas in succinct visual, video, or textual formats, spread through peer-to-peer dissemination across social media platforms. Dawkins argues that memes function as cultural analogues to genes, transmitting information and influencing behavior much as biological inheritance does.

 

Memetic warfare, by contrast, remains a relatively niche concept, although it has been steadily gaining notoriety in recent years. Giesea defines memetic warfare as “competition over narrative, ideas, and social control in a social-media battlefield” (2015, p. 69). He notes that it shares features with psychological warfare, both of which draw on the toolkit of propaganda. Giesea argues that if propaganda and public diplomacy are the conventional forms of psychological warfare, then memetic warfare is their guerrilla variation.

 

In the context of the social media battlefield, memetic warfare diverges from cyber warfare, which focuses mainly on control of data: it instead seeks to dominate the digital realm’s dialogue, narrative, and psychological space. Its primary purpose lies in the communication battlespace, where it works to disrupt or undermine an adversary’s efforts. Giesea also points out that memetic warfare has come to be actively used by state actors such as Russia and China as well as non-state groups like the Islamic State.

How Memes Alter Perception

Moreover, memes exert a profound influence on human behavior and beliefs. They create connections through simple symbolism, which opens a gateway to vast audiences. Like psychological operations, memes can garner support with minimal effort by using simple, effective stylistic devices to convey a message (Kreps, Lushenko, & Carter, 2023). Digital echo chambers illustrate this: memes are used to communicate and reinforce shared beliefs and to create a sense of close-knit community. Within these shared spaces, individuals express their views freely while aligning their worldviews with one another, and memes serve as the cultural medium that conveys ideas with minimal effort. As DiResta et al. note, “memes turn big ideas into emotionally resonant snippets, particularly because they fit our information consumption infrastructure… Memes are powerful because they can be easily recontextualized and reshared and act as ‘in-group’ cultural signifiers” (2018, p. 50). Memes therefore hold significant power in shaping perception: they bridge ideas across a society, are widely accessible, and can spread a thought or narrative in seconds.

Memetic Warfare: A New Frontier of Psychological Tactics

Memetic warfare played a notable role in the 2016 presidential campaign of Donald Trump, when Russian operatives sought to sway the American public in Trump’s favor through strategic internet trolling, including the use of memes. The Internet Research Agency (IRA), widely described as a Russian propaganda machine, strategically targeted popular social media platforms such as Facebook, Twitter, and Instagram in an attempt to suppress voter turnout and boost Trump’s standing in the election (Thompson & Lapowsky, 2018).

 

The IRA’s methods used memetic warfare tactics to deepen divisions within the American political landscape. For example, it generated political memes advocating for Trump that depicted him as a moral Christian, while the Democratic Party’s candidate, Hillary Clinton, was portrayed as a worshipper of Satan (Figure 1).


(Figure 1. A meme aligning Donald Trump’s political party with Christian values and Hillary Clinton’s with Satanism.)

Interestingly, Yevgeny Prigozhin, the leader of Russia’s Wagner mercenary group, claimed involvement in and funding of the Internet Research Agency (Faulconbridge, 2023). Given Prigozhin’s close ties to Vladimir Putin before his mutiny and fall from favor, many speculate that Russia actively employs memetic warfare to further its political objectives. Social media and digital platforms have thus become a new breeding ground for shaping political narratives and for memes used to manipulate public perception.

 

Since the outbreak of Russia’s full-scale war in Ukraine, both sides of the conflict have made significant efforts to disseminate propaganda in pursuit of their political objectives (Bhandari et al., 2023). Russia and Ukraine have employed numerous tactics to propagate their narratives about the conflict, from cultural events and news coverage to the exploitation of social media. This propaganda does not emanate only from the state level; citizens and sympathizers abroad also frequently amplify the spread of information online.

 

Furthermore, memes have become an influential tool in the digital information space during the war, serving as an instrument to mold public perception and discourse. Both Russia and Ukraine have strategically deployed them throughout the conflict to shape sentiment at home and abroad.

 

Citizens have also used social media in new ways during the war in Ukraine. As memes became one of the prevailing means by which individuals shared their experiences of the war across social media platforms, they forged a strong connection between citizen and institutional communication (Bracciale & Aglioti Colombini, 2023). This memetic social connection ultimately heightened both civic and institutional engagement, underscoring the profound influence of memes on political discourse.

 

Russia has implemented similar strategies to strengthen its propaganda campaign, leveraging memes as a political tool to disseminate and amplify its ideological narratives. For example, the Russian Embassy in the United Kingdom shared a meme on social media depicting Zelensky as Bart Simpson from The Simpsons, writing on a chalkboard: “I will not fire missiles towards Poland anymore,” with the post captioned: “Firing missiles at a NATO member state is probably not the best way to join the alliance” (Figure 2). The meme surfaced after a missile struck Poland, insinuating Ukrainian involvement in the strike on its neighbor while mocking Ukraine’s long-standing ambition to join the NATO alliance.


(Figure 2. The Russian Embassy in the United Kingdom shared a meme about the news of a missile hitting Poland in 2022.)

The Russian embassy’s meme is itself a form of memetic warfare, using humor and trolling to provoke and ridicule the Ukrainian government and undermine its authority. Interestingly, memes emerged as a Russian propaganda tool partly in response to high demand among Russian voters (Solovian & Khimiak, 2023).

 

Today, the Russian Federation solidifies its narrative mainly through the online information space. While Russia still relies on traditional propaganda such as war ceremonies, festivals, and state-controlled media, its use of the internet to convey its narratives goes beyond these older forms. From official institutions sharing memes to citizens and pro-Russian netizens shaping perceptions in line with the government’s objectives, these efforts distort the information space in unconventional ways.

Militant Groups and Memetic Warfare 

Strategic Use of Memes by Militant Groups

 

Additionally, militants often use memes across online platforms such as websites and social media to cultivate support within their communities and to reach far beyond them. When militants wield memes as a weaponized political tool, they often express their core values by combining humor or pop-culture references with extremist or violently threatening narratives (Crawford, Keen, & Suarez-Tangil, 2021).

 

For example, far-right accelerationists strategically use memes to shape perceptions and project a self-image tied to strength and dominance. Crawford, Keen, and Suarez-Tangil note that alt-right groups’ memes often portray the movement as a formidable political actor, incorporating imagery associated with Nazi ideology, such as the Black Sun or SS logos, while mutating the well-known meme character ‘Chad’ (p. 990) (Figures 3 & 4).


(Figure 3. A far-right accelerationist meme advocating violent action against society.)


(Figure 4. A politically neutral “Chad” meme depicting a stoic male facial expression.)

Part of what makes memes compelling is their ability to evolve, as the Chad meme shows. The original meme portrays an ordinary white male with a stoic expression, symbolizing an idealized masculinity (Figure 4), yet it has undergone significant ideological changes. In its mutated form (Figure 3), Chad wears a balaclava, carries a weapon adorned with alt-right symbols, and bears the Black Sun on his forehead. This darkly humorous mutation illustrates how memes can be adapted to specific ideologies.

 

The alt-right Chad meme also references real-world events, namely the far-right attacks in Poway, in the United States, and Christchurch, New Zealand. The meme glorifies and promotes the violent tendencies of right-wing accelerationism, aligning itself with the ideologies behind those attacks. More shockingly still, the comments accompanying such memes feed the same narrative: some subtly hint at, and implicitly endorse, future attacks reminiscent of Christchurch (Crawford, Keen, & Suarez-Tangil, 2021).

 

In the context of memetic warfare, these memes and comments play a considerable role in desensitizing individuals to violence. Psychological manipulation is used to dehumanize the ideology’s opponents and specific targets, thereby working on the audience’s psyche. Understanding this complex, multifaceted mode of communication helps us make sense of militants’ often extremist ideologies.

Case Study 1: The Proud Boys

The Proud Boys, a well-known chauvinist North American alt-right militant organization, have gained notoriety for advocacy that often promotes violence (Pilkington, 2022). Closely aligned with Donald Trump, many Proud Boys members took part in the infamous January 6 Capitol riot (Frenkel, 2021), marching alongside fellow Trump supporters. The group vehemently criticizes left-wing and progressive political affiliations and has been categorized by the FBI as an extremist group with ties to white nationalism (Solomon, 2018).

 

The Proud Boys’ ideology centers on rejuvenating Western civilization and includes traditionalist and nationalistic elements that encourage men to embrace hyper-masculinity. New recruits must pledge an oath affirming their pride in Western chauvinism, and the initiation ritual involves being struck by several existing members until the recruit can recite pieces of pop-culture trivia.

 

Furthermore, their violent tenets extend beyond recruitment. The Proud Boys have repeatedly engaged in street violence, often as provocateurs (Pilkington, 2022). They also leverage social media to disseminate their ideology, recruit new members, and coordinate movements. The group’s online communication mixes symbolic and violent registers, with memes playing a crucial role in shaping its community and culture (DeCook, 2018). A Proud Boys version of the Pepe the Frog meme, showing the character in the group’s uniform, illustrates how the organization has integrated alt-right imagery into its identity (Figure 5).


(Figure 5. A meme depicting Pepe the Frog in Proud Boys attire, making the ‘OK’ hand gesture, a symbol associated with white power.)

While Pepe the Frog began as a neutral meme, much like Chad, its adoption by online communities such as 4chan and Reddit, where right-wing discourse is often rampant, has tied it to alt-right symbolism (Roy, 2016). The Proud Boys’ use of Pepe wearing their uniform and making the OK hand sign contributes to the normalization of hate; even though members claim the gesture is merely trolling, it is also tied to white-power symbolism. The group’s extensive propaganda aimed at a predominantly white demographic, evident in its online discourse, is a form of memetic warfare deployed to advance its political objectives and contributes to the resurgence of populist fascism (Figure 6).


(Figure 6. A meme circulating among supporters of Donald Trump and the Proud Boys.)

Case Study 2: The Islamic State

Likewise, Daesh, better known as the Islamic State (IS), is a militant group and self-proclaimed caliphate asserting authority over all Muslims. It employs fear-driven tactics, showcasing videos of execution and torture, uses social media for recruitment, and is well known for taking multifaceted approaches to advance its political ambitions.

 

ISIS has incorporated elements of popular culture, including meme characters inspired by Hollywood and video games, to attract young audiences as potential recruits (El Ghamari, 2017). Notably, it has even used the popular video game Grand Theft Auto V (GTA V) as a recruitment tool (Tassi, 2014) (Figure 7). Its recruitment videos draw parallels between the violence depicted in the game and real-life combat, attempting to make violence exciting and appealing to potential recruits. This strategic use of pop culture aims to resonate with the audience and align them with the group’s ideological narrative.


(Figure 7. ISIS using Grand Theft Auto V footage in a recruitment video.)

Strategies to Counter Extremist Memetic Warfare

Accelerated radicalization often stems from active engagement on social media platforms and the internet. As Liang and Cross (2020) argue, one way to combat the spread of radicalization is through digital disruption and counter-narrative strategies. Although digital disruption methods are still in their early stages, they make it harder for online users to find radicalized content on the internet and social media (p. 18). Notably, after the self-radicalized far-right extremist Anders Behring Breivik killed 77 people in Norway in his ideologically motivated 2011 attack, the hacktivist group Anonymous turned to digital disruption. Anonymous launched “Operation UnManifesto” to blunt the wave of radicalization (Choney, 2011), altering Breivik’s manifesto to reduce its online viewership and contain the spread of his ideology. As Liang and Cross note, partnerships such as the one with Jigsaw, whose Redirect Method steers users who intentionally seek violent extremist content toward curated YouTube videos or non-extremist posts, work in a similar spirit (p. 19). Digital disruption thus helps keep dedicated far-right extremists from accessing the tools of inspiration and effectively counters extremist propaganda, undermining tactics such as memetic warfare employed by radicalized groups.

 

Another, more conventional, method is content moderation. This is mainly carried out by the social media platforms themselves, which take responsibility for removing content and users that violate guidelines on “hate speech, inappropriate content, support or celebration of terrorism, or spam” (Ganesh & Bright, 2020, p. 11). As Ganesh and Bright argue, civil society, government, and the private sector all need to initiate counter-extremist strategies to improve content moderation. The newer technologies involved in content moderation still require improvement. As Gunton (2022) highlights, the use of artificial intelligence (AI) is limited in effectiveness and potentially infringes on freedom of expression. AI-driven content moderation demands a high degree of contextual awareness, such as an understanding of the cultural and legal differences of each region (Gunton, 2022, p. 73), and it must become better at accurately identifying extremist content. This invites skepticism about the potential biases of AI moderation when it confronts moral and political questions that touch on freedom of expression. Gunton also argues that de-platforming can be effective against extremist content: pushing radicalized groups off mainstream social media and onto alt-tech platforms helps contain the spread of their ideological influence and reduces their opportunities to recruit (p. 75).

 

Moreover, as Sultan (2017) argues, ISIS’s recruitment strategies rely on online social media tools and communication models similar to those of American marketers (p. 47). Disrupting ISIS’s recruitment and ideological influence on the internet requires reshaping social media so as to dismantle extremist echo chambers. As Sultan writes,

"ISIS has developed World War Two style propaganda campaigns that now play out in News (AMAQ agency and global coverage), Video (YouTube, News, and Terror updates), Audio (sound clips and audio tweets), Social (Facebook, Instagram, Snapchat, Twitter, Weibo, etc.), Video Game mods (ARMA 3) as well as in social campaigns tied to #hashtags.” (pg. 48).

 

In response to ISIS’s growing synergy between social media savvy and propaganda campaigns, social media companies have taken various measures to stem the flow of extremist content; YouTube, Facebook, and Twitter, for example, have equipped content reviewers to moderate what is disseminated (Farag, 2017, pp. 863-864). However, building a counter-message that is both comprehensive and practical while ensuring the technological capability to combat extremist content raises ethical considerations. As Farag (2017) argues, governments should avoid burdensome regulatory frameworks on social networks so as to preserve a space where individuals can express their opinions freely even as extremist content is contained. Stakeholders should instead work toward a comprehensive strategy that enhances counter-messaging in partnership with social media companies and NGOs.

Conclusion

While memes are often associated with humor, the term has evolved to encompass characteristics that extend well beyond a simple laugh. Seen as vehicles for the rapid dissemination of cultural symbols within society, memes become a means of expression in themselves through their stylistic choices and symbolic references. In essence, memes have immense potential to sway people, to spread, and to shape public perception.

 

In an era of emerging post-truth societies, where memes trade on people’s emotions, delivering objective reporting can be challenging. Given its profound impact on the information space, memetic warfare also complicates researchers’ understanding of an ever-faster information landscape.

 

On the other hand, examining how militants strategically employ memes to propagate their ideologies offers valuable insight into their worldview and their sense of reality. A better understanding of militants and armed non-state actors lets us study their motivations and helps unravel how society allows these groups to form in the first place.

References

Bhandari, A., Shah, S. B., Thapa, S., Naseem, U., & Nasim, M. (2023). CrisisHateMM: Multimodal analysis of directed and undirected hate speech in text-embedded images from Russia-Ukraine conflict. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 1993-2002).

Bracciale, R., & Aglioti Colombini, J. (2023). Meme tales: Unraveling the function of memes in the Russian-Ukraine conflict. Rivista Trimestrale di Scienza dell'Amministrazione, 2023(4).

Choney, S. (2011). Hackers want you to rewrite Norway gunman’s manifesto. NBC NEWS. https://www.nbcnews.com/tech/tech-news/hackers-want-you-rewrite-norway-gunmans-manifesto-flna121821

Crawford, B., Keen, F., & Suarez-Tangil, G. (2021, May). Memes, radicalisation, and the promotion of violence on chan sites. In Proceedings of the International AAAI Conference on Web and Social Media (Vol. 15, pp. 982-991).

Dawkins, R. (1976). The selfish gene. New York: Oxford University Press.

DeCook, J. R. (2018). Memes and symbolic violence: #proudboys and the use of memes for propaganda and the construction of collective identity. Learning, Media and Technology, 43(4), 485-504.

DiResta, R., Shaffer, K., Ruppel, B., Sullivan, D., Matney, R., Fox, R., ... & Johnson, B. (2018). The Tactics and Tropes of the Internet Research Agency, New Knowledge Report prepared for the United States Senate Select Committee on Russian Interference in the 2016 Election.

El Ghamari, M. (2017). Pro-Daesh jihadist propaganda. A study of social media and video games. Security and Defence Quarterly, 14(1), 69-90.

Farag, D. (2017). From Tweeter to terrorist: Combatting online propaganda when Jihad goes viral. Am. Crim. L. Rev., 54, 843.

 

Faulconbridge, G. (2023). Russia’s Prigozhin admits links to what the U.S. says was election meddling troll farm. Reuters. https://reuters.com/world/europe/russias-prigozhin-admits-links-what-us-says-was-election-meddling-troll-farm-2023-02-14/

 

Frenkel, S. (2021). The storming of Capitol Hill was organized on social media. The New York Times. https://www.nytimes.com/2021/01/06/us/politics/protesters-storm-capitol-hill-building.html

 

Ganesh, B., & Bright, J. (2020). Countering extremists on social media: challenges for strategic communication and content moderation. Policy & Internet, 12(1), 6-19.

 

Giesea, J. (2015). It’s time to embrace memetic warfare. Defence Strategic Communications, 1(1), 67-75.

 

Gunton, K. (2022). The Use of Artificial Intelligence in Content Moderation in Countering Violent Extremism on Social Media Platforms. In Artificial Intelligence and National Security (pp. 69-79). Cham: Springer International Publishing.

 

Kreps, S., Lushenko, P., & Carter, K. (2023). Lessons from the meme war in Ukraine. Brookings. https://www.brookings.edu/articles/lessons-from-the-meme-war-in-ukraine/

 

Liang, C. S., & Cross, M. J. (2020). White crusade: How to prevent right-wing extremists from exploiting the internet. Geneva Centre for Security Policy, 11, 1-27.

 

Pilkington, E. (2022). Proud Boys memo reveals meticulous planning for ‘street-level violence.’ The Guardian. https://www.theguardian.com/world/2022/sep/19/proud-boys-document-jan-6-violence

 

Roy, J. (2016). How ‘Pepe the Frog’ went from harmless to a hate symbol. Los Angeles Times. https://www.latimes.com/politics/la-na-pol-pepe-the-frog-hate-symbol-20161011-snap-htmlstory.html

 

Solomon, M. (2018). FBI categorizes Proud Boys as an extremist group with ties to white nationalism. NPR. https://www.npr.org/2018/11/20/669761157/fbi-categorizes-proud-boys-as-extremist-group-with-ties-to-white-nationalism

 

Solon, O. (2013). Richard Dawkins on the internet’s hijacking of the word ‘meme.’ Wired. https://www.wired.co.uk/article/richard-dawkins-memes

 

Solovian, V., & Khimiak, A. (2023). A very dark humor: Memes as a tool of Russian propaganda. Ukraine Crisis Media Centre. https://uacrisis.org/en/a-very-black-humor-memes-as-a-russian-propaganda-tool

 

Sultan, O. (2017). Combatting the Rise of ISIS 2.0 and Terrorism 3.0. The Cyber Defense Review, 2(3), 41-50.

 

 

Tassi, P. (2014). ISIS Uses ‘GTA 5’ In New Teen Recruitment Video. Forbes. https://www.forbes.com/sites/insertcoin/2014/09/20/isis-uses-gta-5-in-new-teen-recruitment-video/?sh=74949623681f

 

Thompson, N., & Lapowsky, I. (2018, December 17). How Russian trolls used meme warfare to divide America. Wired. https://www.wired.com/story/russia-ira-propaganda-senate-report/
