CURATING THE FUTURE: THE SUSTAINABILITY PRACTICES OF ONLINE HATE GROUPS

By

Julia Rose DeCook

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Media and Information Studies — Doctor of Philosophy

2019

ABSTRACT

CURATING THE FUTURE: THE SUSTAINABILITY PRACTICES OF ONLINE HATE GROUPS

By Julia Rose DeCook

The rise of populist fascism and hate violence across the world has raised alarm bells about the role of the Internet in the radicalization process. Although there have been attempts in recent years to halt the spread of extremist discourse online, these groups remain and continue to grow. The purpose of this dissertation project was to examine online extremist groups’ responses to infrastructural failure, defined as an event such as deplatforming or other modes of censorship, in order to understand how these groups manage to persist over time. I examined the responses to these failures of three groups: r/TheRedPill; r/Incels and Incels.me; and r/AznIdentity, all affiliated with the larger “Manosphere” (a loosely connected online network of men’s rights activists, Incels, Pick Up Artists, and other groups connected to the alt right). What was revealed is that these groups’ practices not only build their communities and spread their discourse, but also sustain them. Previous research on hate groups tends to focus on the role of Internet platforms in amplifying hate speech, the discourses the groups create, or the political strategies enacted by the groups. I argued in this dissertation project that this does not get at the heart of why these groups manage to survive despite attempts to thwart them, and that studying the material structures they rely on as well as their social practices is necessary to develop better strategies to combat violent far-right extremism. Using an update to the grounded theory approach known as situational analysis, I observed and followed the groups for two years (January 2017 to 2019) and collected data in the form of text posts, images, and their networks. To inform the project, I relied on a theoretical framework guided by platform and infrastructure studies, communication, anthropology, and social movement studies. Ultimately, I argue that what these groups create through these sustainability practices is a symbolic infrastructure. Unlike material infrastructures (like large-scale electrical grids) or knowledge infrastructures, symbolic infrastructure is made up not just of the material artifacts that these groups create but is primarily premised on the shared practices that produce and reproduce their discourse, their identity, and their networks. Specifically, symbolic infrastructure is built up of three subpractices: archiving; fortification; and identity maintenance and network building. These practices, and their resulting product, are made possible by the material structures of the Internet and allow for the preservation and circulation of the group’s epistemic/discursive forms. Symbolic infrastructure, due to its nature, is easily transported from platform to platform even after the groups are faced with infrastructural failure or threatened by it. Although Susan Leigh Star famously observed that infrastructures only become visible upon breakdown, the case studies in this dissertation demonstrate that the mere threat of breakage is enough for the nature and role of infrastructure to be revealed.
Each of the groups in these case studies navigates and maneuvers around the constraints of the platform and digital infrastructure it relies on, but also manages to exploit their affordances in innovative ways. What this dissertation revealed is that the strategy of deplatforming has significant limits because of the nature of the Internet, and that the work of combating extremist thought cannot be relegated to the digital realm alone but must be extended beyond it.

Copyright by JULIA ROSE DeCOOK 2019

For my family - I could not have accomplished this without your love and support. I love you all so much. Thank you.
제 가족들에게 - 여러분의 사랑과 지원 없이 이 학위를 받을 수 없었을 겁니다. 너무나 사랑하고 감사합니다.
Grandma, you never had the choice to go to school. I dedicate this dissertation and my doctorate to you.
할머니, 할머니에겐 학교를 갈 수 있는 선택권이 없었습니다. 제 논문과 박사학위를 할머니에게 바칩니다.

ACKNOWLEDGEMENTS

“ … Life is not what you alone make it. Life is the input of everyone who touched your life and every experience that entered it. We are all part of another.” - Yuri Kochiyama, Passing it On, 2004

To my family and friends – you have all fundamentally shaped who I am, have been the ones I cried to and laughed with, and are why I am able to live a life filled with joy and love. I could not have gotten through this without you. Thank you for constantly reminding me of the potential of who I was, who I am, and who I can become. I love you all.

To my advisor, Kjerstin – I can’t express in words how grateful I am to you for taking me on and for being supportive, kind, and encouraging in my darkest moments. Thank you. You have taught me not only how to be an academic but also a human being.

To my committee – Beth, Chantal, and Casey – and my mentors: you have all fundamentally shaped me as a scholar and person, and your kindness and intelligence are what I aim to emanate in the future. Thank you.

To my community – it is an honor just to be Asian, and an honor to be a mixed race Korean woman. Without my community, without my heritage, I am nothing. I am eternally grateful to our organizations, culture, and elders. We really are our ancestors’ wildest dreams come true.

To those who have been injured or lost their lives because of racial and gendered violence – I also dedicate this dissertation to all of you and to those who are mourning your loss. I know that it will not replace the lives that have been cut tragically short, but it is my hope that this project helps to stop future violence and pain.
TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
CHAPTER 1: Introduction
    The Possibilities of Failure
    Infrastructural Imaginaries
    Dissertation Map
CHAPTER 2: Background and Methods
    [Online] Extremist Social Movements
    When Infrastructure Breaks
    Affordances for Radicalization
    Networked Publics and Cultural/Epistemic Production
    Discourse and Preservation
    Shaping And Maintaining Online Social Worlds
    Portable Infrastructure
    Methods
    Digital Ethnography
    Data Analysis
    Situational Analysis
CHAPTER 3: Case Study 1 – World Builders: r/TheRedPill
    Quarantine
    The Red Pill Universe
    r/TheRedPill Discourse and Ideology
    The Ban Wave Cometh: The Beginning of the Imminent End
    Contingency Plans: Doomsday Preparation on r/TheRedPill
    “Why Don’t We Just Leave Reddit On Our Own?”
    “Incels Gets Banned: What Does That Mean For TRP?”
    “A Civil War Is Coming.”
    “We Are Not Organized And That Is Our Strength.”
    Arming the Defenses
CHAPTER 4: Case Study 2 – Lost Civilizations: r/Incels & Incels.me
    Banned: November 2017
    The “Incelosphere”
    The 2017 Ban Wave
    September 2017
    October 2017
    Paradise Lost: Incels.Me
    “Elliot Rodger’s Legacy Lives On”
    “Incels Are Finally Being Taken Seriously.”
    Incels.Me is Dead
    “The Fire Rises.”
    The Incel Institution
    Diffused Extremism
CHAPTER 5: Case Study 3 – Refugees: MRAsians and r/AznIdentity
    The Asian Masculinity Ecosystem
    “Get Twitter Now.”
    Ban Wave 2017 And Networked Harassment
    MRAsians
    The Politics of Digital Space
CHAPTER 6: Discussion and Conclusion
    Anticipating Failure
    Migration Across the Digital Frontier
    Symbolic Infrastructure
    Identity Maintenance and Network Building
    Fortification
    Archiving
    Strategies of Subversion
    Failure as a Nexus for Action
    Follow the Leader?
    Portable Discourse and Culture
    “Culture Wars” and the Fight for Reality
    Centralized Hubs of Extremism
    Reproducing Structures
    Final Thoughts
APPENDICES
    APPENDIX A: Glossary
    APPENDIX B: Relational Maps
    APPENDIX C: Discourse Maps
BIBLIOGRAPHY

LIST OF TABLES

Table 1. Case Studies, Sites, and Date of Data Collection Start
Table 2. Description of Maps, from page xxiv of Clarke, Friese, and Washburn, 2017

LIST OF FIGURES

Figure 1. r/TheRedPill’s landing page after the quarantine
Figure 2. r/TheRedPill is Quarantined
Figure 3. r/TheRedPill’s universe on reddit
Figure 4. r/TheRedPill’s off-reddit universe
Figure 5. Response to a reddit announcement post about banning communities containing sexually suggestive content featuring minors/perceived minors
Figure 6. Valuable posts should be archived
Figure 7. A post from 2013 noting the shadow banning of accounts associated with Men’s Rights and TheRedPill
Figure 8. You are not a part of reddit
Figure 9. Alt-accounts
Figure 10. The landing page of Puerarchy.com
Figure 11. An email emergency alert system in the event of a forum shut down
Figure 12. When we get banned from the Internet
Figure 13. Forums.red
Figure 14. “Why don’t we just leave reddit on our own?”
Figure 15. The Purge
Figure 16. A post from October 2016 about Donald Trump on r/TheRedPill
Figure 17. A post about the Vice article that increased attention of the subreddit in 2017
Figure 18. A mod post responding to Yiannopoulos’ Twitter ban by moderator redpillschool
Figure 19. You can kill a man but you can’t kill an idea
Figure 20. “Don’t talk about Fight Club.”
Figure 21. User Bloodycurative speaking about their doxing experience
Figure 22. Incels got banned. What does it mean for TRP?
Figure 23. “For the Incels” from January 2018
Figure 24. Posting is locked, comments are open
Figure 25. Ahead of the curve
Figure 26. Free speech
Figure 27. “Curing” the issues surrounding free speech
Figure 28. Twitter is a Liberal Democrat Social Justice Swamp platform
Figure 29. Efforts to ban TRP have resurfaced
Figure 30. There’s a Civil War coming
Figure 31. Stop Reddit Censorship
Figure 32. Our Patreon
Figure 33. Our strength
Figure 34. Banned for violent content
Figure 35. Incels communities on reddit. A strikethrough indicates the subreddit has been banned at time of writing
Figure 36. Incels communities off-reddit (own communities, not subcommunities like 4chan’s)
Figure 37. What are some of the most toxic subreddits?
Figure 38. How is this sub even alt right?
Figure 39. “Slowly fade into non-existance.”
Figure 40. Trying to get the place banned
Figure 41. Stop giving them clicks
Figure 42. I refuse to be humiliated again
Figure 43. Incels and Hate Crimes
Figure 44. Examples of violence from r/Incels
Figure 45. Comments responding to the post about the policy change pt. 1
Figure 46. Comments responding to the post about the policy change pt. 2
Figure 47. The mods are the same as in reddit
Figure 48. You’ll never be able to gas us all out
Figure 49. Getting banned is a new beginning
Figure 50. Ten more shall take its place
Figure 51. They stole our legacy
Figure 52. This is a fresh start
Figure 53. Incel Wiki
Figure 54. Incels Store
Figure 55. A war zone
Figure 55. How much data is being saved?
Figure 56. Statement from SergeantIncel
Figure 57. This board is done for
Figure 58. Inceldom is mainstream
Figure 59. Incels.me does not condone violence
Figure 60. What do we do if the site gets taken down?
Figure 61. Doomsday
Figure 62. Mainstream recognition
Figure 63. We have a Twitter account now!
Figure 64. Wiki editors wanted
Figure 65. Featured all the time
Figure 66. Bitcoin
Figure 67. New Discord Server
Figure 68. The BBC interview will double our amount of users
Figure 69. Official backup domain
Figure 70. For immediate release
Figure 71. Almost roped
Figure 72. What happens if a mod gets laid/gets a girlfriend?
Figure 73. Beware American Incels
Figure 74. We have to preserve and archive this
Figure 75. Follow us on Gab
Figure 76. LiveStream and Whitepill.org
Figure 77. Don’t do interviews on Podcasts
Figure 78. Spokesmen
Figure 79. DDoS Attack
Figure 80. The Manosphere
Figure 81. r/AsianMasculinity’s roots in TRP
Figure 82. The MRAsian community on reddit
Figure 83. The MRAsian online ecosystem off reddit
Figure 84. A top destination for aware Asian Men (AM)
Figure 85. Get Twitter Now
Figure 86. AsianSoul forum announcement from 2016
Figure 87. Anonymize your Asian Activism from 2016
Figure 88. Sponsored Projects
Figure 89. Impeding our progress
Figure 90. Woke social network
Figure 91. Shouldn’t we stop using Cuck?
Figure 92. We condemn the glorification of violence
Figure 93. There is no official back up
Figure 94. The r/hapas sidebar in 2017
Figure 95. The importance of Asian Americans having children
Figure 96. Crowdfunded porn as activism
Figure 97. Donations
Figure 98. How to get the most out of your Twitter account
Figure 99. Mobbing
Figure 100. Emperor Magazine
Figure 101. Stop using the language of the alt right
Figure 102. Friends and Projects
Figure 103. Evidence
Figure 104. Sign up for AsianSoul.org
Figure 105. Revenge
Figure 106. The history of Asian reddit and their catalysts
Figure 107. Symbolic infrastructure and the practices that build it
Figure 108. Relational Map of r/TheRedPill
Figure 109. Relational Map of r/Incels
Figure 110. Relational Map of MRAsians and r/AznIdentity
Figure 111. Discourse Map 1 for r/TheRedPill
Figure 112. Discourse Map 2 for r/TheRedPill
Figure 113. Discourse Map 1 for r/Incels
Figure 114. Discourse Map 2 for r/Incels
Figure 115. Discourse Map 1 for MRAsians
Figure 116. Discourse Map 2 for MRAsians

CHAPTER 1: Introduction

In September 2018, Alex Jones was banned from Twitter. Jones, who for years had peddled dangerous conspiracy theories and misinformation across a vast media ecosystem of radio shows, podcasts, YouTube videos, and his own social media pages, is perhaps best known for promoting the conspiracy theory that the Sandy Hook Elementary School shooting in 2012 was a hoax (Salinas, 2018). Jones was not only banned from Twitter, however: Facebook and YouTube also followed suit and deleted Jones’ content from their platforms, and in April 2019 Jones and his media empire were banned from Facebook entirely (Schwartz, 2019). Part of a larger far right media system, Jones and others like him have made their fortunes from this mode of “outrage media,” meant to incite fear and rage among its consumers (Neiwert, 2017).

Jones’ banning from Twitter and other online spaces is part of a larger trend of attempts to control the spread of extremism and hate speech online. The rise of the alt right (an umbrella term for the loosely connected extremist right and white nationalist movement known for its strategic use of digital media) has often been attributed to that strategic use of digital technology and social media (Daniels, 2018; Neiwert, 2017). After the election of Donald Trump and a number of violent mass murders carried out by far right extremists (Southern Poverty Law Center, 2018), there were calls for online platforms to start taking responsibility for helping to facilitate radicalization and indoctrination into extremist thought (Daniels, 2018; Noble, 2018). In response, the major social media websites either started to more strictly enforce existing policies or enacted new ones to justify the removal of these extremist celebrities. These bans, also referred to as “deplatforming”, have been applauded by activists, scholars, and netizens alike. Other high-profile alt right celebrities who have been banned from most major social media platforms include former Breitbart editor Milo Yiannopoulos, Vice Media co-founder and former Proud Boys leader Gavin McInnes, and conspiracy theorist Louis Farrakhan, among many others (Schwartz, 2019). But it is not just the high-profile figures of these movements that have been subjected to bans during this massive shift in platforms taking responsibility for violent content and disinformation.
Entire communities have been removed from these platforms in an attempt to “detoxify the Internet” and to discourage the continual growth of extremism and dangerous mis/disinformation within them (Marantz, 2018; Statt, 2019). But attempts to halt the spread of extremist groups involve not just banning them from specific platforms but taking down entire platforms themselves, as in the cases of Gab, touted as an “alternative Twitter” for the far right, and the prominent neo-Nazi website The Daily Stormer, both of which were taken down by their domain registrars and server providers (Ingram, 2018). Despite these attempts to shut the websites down completely, the nature of the Internet allowed them to re-emerge on new domains and with new server providers. The same occurs in other web spaces: despite the bans that have occurred on major social media platforms, these deplatformed celebrities and groups still manage to find new web homes and to reestablish themselves. So despite these attempts by platforms, domain registrars, and even governments (J. Silverman, 2019) to censor extremist thought and its amplification, far right extremist groups manage to persist (Weill, 2018). The Internet allows for the easy creation of communities and the spread of information, and affords groups the ability to move from platform to platform and even to create their own online ecosystems despite the imminent threat of bans and censorship.

This dissertation focused on three groups that have been subjected to modes of online censorship: r/TheRedPill, r/Incels, and r/AznIdentity. The groups highlighted in these case studies fall under the general umbrella term of “The Manosphere”, a loosely connected online network of men’s rights activists, Incels, Pick Up Artists (PUAs), and other male-focused communities (Baker, 2017; Ging, 2017). Although many groups in the Manosphere attempt to distance themselves from the alt-right, particularly in recent years, the connections between them cannot be ignored: in fact, the Manosphere has been pointed to as a gateway ideology into more extremist thought, specifically the alt-right (Futrelle, 2017b). Reddit, in particular, has come under fire in recent years for its role in facilitating the violence that came from GamerGate (Barnes, 2018; A. Massanari, 2015), as well as for providing the community infrastructure for extremist groups (Caffier, 2017).

Despite these affordances, there are multiple ways that community infrastructure can fail online. On September 27, 2018, the notorious subreddit r/TheRedPill was placed under quarantine by reddit administrators. A quarantine on reddit indicates that the forum has been placed under a form of probation by the reddit administrators for not adhering to community guidelines. As opposed to a ban, which removes the community entirely from reddit’s infrastructure and erases all of its content, a quarantine is a kind of warning to the subreddit that it is being watched and that, if it does not “fix” its ways, it will ultimately be removed. In this case, administrators pointed to the group’s “shocking or highly offensive content” as justification for the quarantine. As a result, the group’s statistics on subscribers and active members are set to zero, all posts are heavily monitored by administrators from outside the subreddit, and instead of the subreddit’s homepage, visitors are met with the following page:

Figure 1. r/TheRedPill’s landing page after the quarantine
The quarantine of the subreddit did not come as a shock to its members or to the larger extremist digital community. Despite the ease of building online communities on platforms like reddit, extremist groups are also at the mercy of other users (who may seek to shut the community down) and of the platform administrators themselves in whether or not they can continue existing. Some groups persist for years with no fear of censorship, whereas others are shut down almost immediately depending on their content. However, some groups exist in an in-between space, where their content is offensive and shocking but not in violation of community rules in a way that warrants a quarantine or a ban. r/TheRedPill is one of these in-between groups: it has been the target of a number of ban attempts, has been heavily “brigaded”¹ by other reddit users, and has constantly walked a thin line between being banned and existing on the larger reddit infrastructure. Notably, r/TheRedPill was quarantined while other Manosphere groups like r/MGTOW or r/MensRights were not; but the moderators and users of r/TheRedPill had been preparing for years for an imminent ban and established archives and off-site forums in the event the subreddit goes dark. Established in 2012, the subreddit has always used this threat of censorship as fuel for its movement and has proactively established its presence off the reddit platform.

¹ “Brigading” is the term for a form of digital harassment enacted by reddit users to target a community: reporting posts, flooding the subreddit with irrelevant comments and posts, and “downvoting” all content to confuse the reddit algorithm.

In a way, deplatforming and other modes of failure can often serve to strengthen a group rather than discourage it, and can provide the necessary conditions for innovative practices to flourish. The three case studies in this dissertation demonstrate the ways that these groups maneuver, adapt, and continue to persist despite attempts to ban and silence them. I argue that these practices of navigating digital platforms and infrastructure are what allow these communities to persist, and each case demonstrates similar but also differing methods. This ability to maneuver around the constraints of platforms and web infrastructure demonstrates an innovative and sophisticated level of knowledge of these systems: although the ideas being peddled within these communities are not by any means “new” ideologies or discourses, the sociotechnical systems being used to create and spread them have changed not just the discourse itself but the methods by which it is reproduced. Each case study represents more than 18 months of ethnographic work, in which I collected thousands of posts, images, and links to understand the ways that these groups responded to threats of bans and censorship. Each of the three cases also represents a different relationship to digital infrastructure, and across cases there are a number of similar practices in negotiating this relationship but also differences in how the groups responded to the potential failure of that infrastructure. These practices in response to actualized or potential failure helped to shape the metaphors used to describe the groups introduced in the case studies: digital refugees, world builders, and lost civilizations.
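The dissertation describes this collection as ethnographic observation rather than as the output of any particular tool. Purely as an illustration of the kind of tooling that could support such an archive of subreddit posts, below is a minimal sketch assuming the publicly available Pushshift archive API of the 2017 to 2019 study window; the function name and output file are hypothetical and are not drawn from the dissertation itself.

```python
# Illustrative sketch only: the dissertation reports ethnographic observation,
# not a specific scraper. This assumes the Pushshift reddit archive API that
# was publicly available during the 2017-2019 study period.
import json
import time
import requests

PUSHSHIFT_URL = "https://api.pushshift.io/reddit/search/submission/"

def archive_subreddit_posts(subreddit, out_path, max_batches=10):
    """Save batches of submissions from one subreddit to a local JSON-lines file."""
    before = None  # walk backwards in time, newest first
    with open(out_path, "w", encoding="utf-8") as out:
        for _ in range(max_batches):
            params = {"subreddit": subreddit, "size": 100, "sort": "desc"}
            if before:
                params["before"] = before
            resp = requests.get(PUSHSHIFT_URL, params=params, timeout=30)
            resp.raise_for_status()
            posts = resp.json().get("data", [])
            if not posts:
                break  # no older submissions left to archive
            for post in posts:
                out.write(json.dumps(post) + "\n")
            before = posts[-1]["created_utc"]  # continue from the oldest post seen
            time.sleep(1)  # stay polite to the archive's rate limits

# Hypothetical usage: archive_subreddit_posts("TheRedPill", "trp_posts.jsonl")
```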
The Possibilities of Failure

To explore these responses to failure (real or threatened) and what they ultimately create, I draw on three case studies that examine different kinds of movement and archiving strategies of three extremist groups. The first case, r/TheRedPill, demonstrates how the group uses these strategies to build its worlds and to preserve them, in the event they are forced off one platform and have to re-establish themselves on another. The second case, r/Incels, represents a lost civilization, in that the original community lost all of its content after it was banned, and from these failures rebuilt itself in a different way. The third and final case study is of r/AznIdentity, which represents a different kind of preservation and movement: the members of this community were banned from another subreddit, r/AsianMasculinity, and instead of moving off-site decided to remain on reddit’s platform and establish an antagonistic group. They are digital refugees, who started the group out of spite and to antagonize the original group they were ousted from, and who relied on this initial event to build their group identity. To maintain this “new world,” they engage in archiving and world building practices similar to r/TheRedPill’s, establishing off-site forums, materials, and networks.

The purpose of this dissertation is not to establish that these groups exist; that much we are already aware of. What this dissertation aims to do is to study the archiving, preservation, and sustainability strategies of these extremist groups, which reveal their attempts to produce and reproduce certain discursive forms and epistemologies. Their archives represent social epistemologies that not only guide their worldviews (Stoler, 2002), but also demonstrate their knowledge of digital infrastructures and their affordances in sustaining the group. Rather than looking at these communities simply as they exist, this dissertation aims to understand the ways system failures in the form of bans, quarantines, and other platform-related censorship actions reveal something larger. In particular, the responses of these groups to real or imagined threats of failure and their sustainability practices constitute a mode of epistemic production, meant not just to sustain the group’s community but to continue its discursive forms. The ways that discursive forms that long predate the Internet are being reproduced demonstrate another dimension of these groups’ symbolic practices afforded to them by material infrastructures like the Internet. Although material infrastructural failures can be detrimental to the group, these threats and their consequences result in innovation. Thus, despite being constraining, infrastructural failures also enable innovative practices that sustain the community. As the saying goes, “necessity is the mother of invention,” and the need to preserve in the face of threats perhaps drives much of these practices.

Existing research on extremist groups online focuses on the makeup of the groups’ content, attempts to expose their members, their conversational dynamics, and their larger networks and uses of digital media in propagating their content (Daniels, 2009b; Ging, 2017; Marwick & Lewis, 2017; A. Massanari, 2015; Noble, 2018).
While all of these projects reveal the ways that extremism online functions and how the Internet facilitates the radicalization process and continues to reproduce already existing racist ideologies, what my dissertation hopes to answer is how these worlds are sustained when the groups are forced out, what happens when these groups are under constant stress and threats, and how they rebuild. By examining how these communities sustain themselves, this project will provide new insight into their persistence despite attempts to thwart their growth. Understanding that existing on only one platform is a strategy destined for failure, all three of these groups not only acknowledge the need to preserve the forum but aim to spread their reach beyond their hubs to strengthen their movement and ideology.

The question of why these hateful ideologies persist, and in the current political climate grow, warrants further investigation. In particular, what remains to be examined is how these communities harness the affordances of digital platforms but are simultaneously constrained by them through things like bans and quarantines, which also influence their practices. Research in the area of extremist hate groups tends to focus on either the material aspects of these groups or the symbolic aspects, but often does not study the two facets in tandem. Exploration of how the material and symbolic aspects of digital infrastructures serve as vehicles of propagation, and of how they work together in sustaining the group, is lacking, and understanding these practices may help to provide strategies for halting the spread of extremism more effectively. Therefore, the research questions guiding this dissertation are as follows:

1. What are the practices of these extremist groups to preserve and sustain their online communities?
2. What do their responses to infrastructural failure (or the threat of failure) reveal about their strategies to influence popular discourse and produce discursive forms?
3. What do these practices reveal about the creation and spread of digital discourse?
4. How do these groups engage in epistemic production to challenge cultural systems?
5. How do specific platforms, e.g. reddit, constrain and enable the practices of the groups to sustain their communities?
6. How are events like infrastructural failure used to strengthen group narrative and identity?

Using a qualitative methodological approach, specifically situational analysis, this project fills a gap in the literature by examining how extremist communities navigate platform and infrastructural politics and exploit their affordances.

Infrastructural Imaginaries

Situational analysis is a method meant to update the traditional grounded theory approach after the postmodern turn (Clarke, Friese, & Washburn, 2017). Focusing not just on groups and communities but taking into account the entire sociopolitical conditions and situations they find themselves in, situational analysis relies on mapping out social worlds and the practices of the groups within them to understand phenomena (Clarke & Star, 2008). This framework of theory and method for analyzing social worlds highlights meaning-making practices that are shared among groups of actors, both human and non-human, and aims to identify the relations within and across these worlds (Clarke & Star, 2008).
By looking not just at the material practices (e.g., designing archives and building off-site forums) but also at the symbolic practices of these groups (i.e., the creation of a collective identity and narrative), I view these groups’ sustainability practices as the creation of symbolic infrastructures that transcend the constraints of digital platforms. The practices used by these groups to withstand the shocks of infrastructural failure build up symbolic infrastructure, which they are able to carry with them from platform to platform to ensure the futurity of their group. I argue that it is this symbolic infrastructure that facilitates their persistence despite infrastructural failure, and this arrangement across platforms helps to fortify them against these shocks.

Infrastructure does not need to be material or something that can be “touched” for it to impact the ways that everyday life is structured, and the effects of infrastructure are often affective and exist in a kind of imaginary due to its hidden nature (Parks, 2015; Rothstein, 2015; Star, 1999). Despite the famous assertion that infrastructure only becomes visible upon breakdown (Larkin, 2013; Star, 1999), the groups I study in this dissertation demonstrate the ways that infrastructure becomes visible through the very threat of failure, and what they build in order to defend themselves in the event of a breakdown. What they build is symbolic, and symbolic infrastructure is built through three main practices: archiving, fortification, and identity maintenance and network building.

Archiving in these groups resembles the archiving work done by other activists and communities online that are more left-leaning and progressive, and sometimes not even political, as in the case of fandoms (De Kosnik, 2016). These “rogue archiving” practices, according to De Kosnik, demonstrate how the work of knowledge production and its preservation has “gone rogue,” meaning that instead of being the work of libraries and official institutions like governments, archiving has been harnessed by laypersons with the skills and knowledge to participate in it (De Kosnik, 2016). The preservation of knowledge is a political act that also ensures its reproduction; thus, these groups’ laying claim to truth and history is not merely about maintaining backup repositories of information but about ensuring the futurity of the group and its content, and proving that the group once existed (De Kosnik, 2016). Through archiving, the groups in this dissertation not only ensure that future members will be able to access historical knowledge of the group, but also ensure the continued socialization of these members into a larger historical and collective memory of the community that helps to sustain it.

Fortification, as a practice meant to build and support symbolic infrastructure, refers to the group’s resilience-building work of establishing off-site forums, websites, and repositories that are able to withstand the shocks of infrastructural failure. Fortification also consists of building and maintaining a strong group identity through the use of a shared language, discourse, and collective narrative. Social movements have traditionally relied on a shared language and narrative not only to shape the goals of the group but also for the creation of connective action (Brown, 2006; Hartley, 2015; Polletta, 1998a, 2009).
These narratives in particular are what are transported through time to build a collective identity (Koven, 2015), and fortification processes not only build these more symbolic pieces of group identity but also provide the material space needed to preserve and share them. Finally, identity maintenance and network building is a practice also meant to fortify and sustain the group; these official and unofficial affiliations show that these groups are aware of the importance of networks in maintaining their place in their digital ecosystem, both in terms of social capital and in ensuring the continued circulation of their discourse. All of these practices that make up the larger symbolic infrastructure of these groups also consist of smaller practices, like the sharing of stories and narratives, a shared identity, and the linguistic and interaction norms that are heavily enforced within the communities. All of these small acts that feed into the larger practices create and maintain a symbolic infrastructure, which ensures the sustainability and perseverance of the group.

Theorized through the lens of how these groups arrange themselves across platforms, the dynamics between platform, community, and infrastructure make the inherent political tensions of digital space more visible: r/Incels is a lost civilization trying to exist based on members’ memory alone, r/TheRedPill has anticipated the crumbling of its world, and r/AznIdentity is made up of refugees trying to build a similar, yet completely different, world. Each of these cases demonstrates similar practices of archiving and preservation but represents different kinds of movement and world building strategies. The refugees (r/AznIdentity), world builders (r/TheRedPill), and lost civilizations (r/Incels) of this dissertation demonstrate different practices in how they build their symbolic infrastructures, and differing levels of resilience in their communities. Thus, I argue that the reason these groups persist despite attempts to thwart them is that their practices build a symbolic infrastructure, which is able to withstand the failures of the material/digital infrastructures and platforms that these groups rely on for their existence.

The platforms and the infrastructure that these groups rely on are their opportunity structures and also their enemies, marking a political tension between needing the services and affordances of these digital spaces and actively engaging in an antagonistic relationship with them, as well as with other communities. The us vs. them dynamic that they build within their communities is directed not just at the progressive left (in particular, feminism) but also at the platforms themselves and their censorship practices. Further, the policies of the platforms and infrastructures these groups are on constrain their movement but also enable it, and thus are implicated within the larger critiques of these hate groups as well. By examining this tension between user, community, platform, and infrastructure, I aim not only to highlight these tensions but also to shed light on the ways that these practices and the building of symbolic infrastructure explain the continued sustainability of these extremist hate groups.
Dissertation Map

The organization of the dissertation is as follows: Chapter 2 is the background and methods section, which provides an overview of the literature on the topics presented in this dissertation and positions this work in a broader body of work in the fields of social movement studies, science and technology studies, platform and infrastructure studies, and communication. The case studies are organized into their own chapters. Chapter 3 is the case of r/TheRedPill; notably, r/TheRedPill has harnessed the true potential and power of the digital platforms and infrastructure that its community relies on, expanding its network not just horizontally on reddit through the establishment of other subreddits but also by building a vast network of off-reddit websites and affiliated blogs, podcasts, and other forms of media, and it has the most robust archiving practices of the three case studies. Chapter 4 presents the case of the r/Incels community, which was banned from the reddit platform in November of 2017. It exemplifies the case of the “lost civilization” due to its lack of preparedness and organization in saving the community’s content; the group re-emerged on an off-site forum and started engaging in this preservation work due to the hard-learned lesson of having failed to do so before. Chapter 5 is the last of the case studies and examines a community on reddit called r/AznIdentity, which serves as the hub of a group of men known as “MRAsians”, a portmanteau of “Men’s Rights Activist” and “Asians” (Chew, 2018). r/AznIdentity is an interesting case because it is the only one of the three case studies that focuses specifically on a racial identity and a very distinct form of racialized masculinity as the main basis for its community. The group became infamous in 2018 due to its targeted harassment campaigns against Asian women, in particular any Asian women they felt were “feminists” or “PAAs” (Progressive Asian Activists, their term, mimicking the “SJW” [Social Justice Warrior] insult of the right). One of their main points of “activism”, in particular, is attacking Asian women for not dating, marrying, or reproducing with Asian men, and thus their targets often involve Asian women who are not dating or married to Asian men (Ng, 2018). This group was started by users who were banned from another reddit community and who view themselves as refugees; they made a horizontal move, established a number of new subreddits, and actively antagonize other subreddits. Thus, like refugees, they were forced out and persecuted, and in retaliation they have built a more robust community and network that far transcends the boundaries of the reddit platform.

Chapter 6, the discussion and conclusion, presents the implications and themes and points the way forward in the study of extremism online. In the discussion, more about the practices and the building of symbolic infrastructure is explicated, as well as the implications for studying other communities using a similar approach and theoretical framework. This chapter provides more theoretical grounding for the project by linking theory to the empirical findings presented in the case studies and offering recommendations for future projects. Most importantly, the chapter highlights that these practices to sustain far-right extremist thought all serve these extremist groups’ goal of pushing their interpretations of reality, because they desire to take over culture itself (Daniels, 2009b, 2018; Kelly, 2017).
This desire to change cultural systems is not a new phenomenon: social and political movements have always had the goal of shifting the realities of their worlds to create the possibilities of a new one (Buechler, 2011; Polletta, 2008). Often referring to their actions and movement as a “culture war”, the far right has been engaged in a battle not just over enacting their policies but over morality, government, and culture itself (Jensen, 1997; Kilgore, 2018; Koleva, Graham, Iyer, Ditto, & Haidt, 2012). Although this battle for reality, and for holding the dominant discourse and ideology of any given society, has been fought for millennia, the rise of mass media and the Internet has helped facilitate and speed up the process of these changes and their spread (Virilio & Bratton, 2006). Thus, the struggle to detoxify the Internet and the continued persistence of extremist groups online exemplify how these networks have been built, embedded, and fortified through digital affordances.

CHAPTER 2: Background and Methods

The advent of the Internet has allowed many people to form communities and to find groups in which they can find information, express themselves, cultivate friendships, and gain support (Bambina, 2007). These online spaces have given many marginalized groups that may not have the resources in their face-to-face communities a place to grapple with issues that range from chronic disease (Coulson, 2005) to sexual orientation (Ybarra, Mitchell, Palmer, & Reisner, 2015) to activist movements (Velasquez & LaRose, 2015). These online groups are also instrumental for some in the formation of their identity, personal and/or social (Barker, 2009; Boellstorff, 2015; Chiu, Huang, Cheng, & Sun, 2015). However, despite the positive effects that online communities have granted to those who felt otherwise voiceless or lost, these positive effects do not extend only to them. Online communities are also a breeding ground for the propagation of conspiracy theories, misinformation, and extremist ideology (Banet-Weiser & Miltner, 2016; Chau & Xu, 2007; Chess & Shaw, 2015; Schafer, 2002; Wojcieszak, 2010). In particular, the men’s rights movement has only become more organized, and perhaps even more extreme, since the advent of the Internet (Gotell & Dutton, 2016). These mediated communities and their ability to transcend the spatial and temporal constraints of typical face-to-face communication have not only enabled the men’s rights movement and other far right groups to organize their ideology and communities but have helped the movement become a global one (Girard, 2009; Gotell & Dutton, 2016; Serradell, Cruz, & Mondejar, 2015).

Extremism online has been studied since the mid-1990s, when researchers and policymakers noted how white supremacist groups like the Ku Klux Klan and others were establishing online forums and harnessing the power of these new technologies (Schafer, 2002; Thiesmeyer, 1999). In recent research, qualitative methods, discourse analysis, and big data analysis of these communities (to name just a few methods) have been utilized to understand these groups and their online mobilization. Below, I outline some of the research on online extremist movements and also discuss literature regarding the theoretical lenses that guided this dissertation.
[Online] Extremist Social Movements

Research on extremist communities and violent discourse often focuses on networks of users and their circulation strategies on specific platforms like Twitter and YouTube (Faris et al., 2017; R. Lewis, 2018; Starbird, 2017); the role of online platforms in radicalization (Bartlett & Miller, 2010; Daniels, 2009, 2018; Marwick & Lewis, 2017; Massanari, 2015; Wojcieszak, 2010); as well as specific strategies used by these groups to amplify their messages and to build their ideologies (Chess & Shaw, 2015; Coston & Kimmel, 2012; Marwick & Lewis, 2017; Mortensen, 2016; Schafer, 2002; Shaw, 2014; Whine, 1999; Wojcieszak, 2010). Other studies include inquiry into how memes are used as discursive forms to fight against perceived “enemies” like feminists (Massanari & Chess, 2018), networked harassment campaigns and their language use (Ging, 2017; Marwick & Caplan, 2018), and the connections between misogyny and more extreme forms of activism (Futrelle, 2017b; Koulouris, 2018; Sundén & Paasonen, 2018).

But long before the rise of the alt right (a loosely connected network of extremist right-wing and white nationalist groups that have harnessed the power of the Web in building their ideology and movement) there was GamerGate. GamerGate, in particular, was perhaps the foundation from which the larger alt right adopted its tactics and strategies for sowing discord in online spaces (Lees, 2016). Reddit has been a hub for anti-feminist movements like GamerGate and is perhaps one of the most prominent platforms in facilitating the rise of a “toxic technoculture” wherein users actively engage in harassment campaigns and tactics to silence the voices of those they view as oppositional to patriarchal systems of power (Massanari, 2015). It would not be until recently, with the rise of the alt-right, that the role of infrastructuralized platforms like Google and Facebook in fostering the growth of hate groups would be scrutinized (Marwick & Lewis, 2017; Massanari, 2015).

Despite their lack of organizational structure, which was typically a criterion for determining whether or not a group could be considered a social movement (Gamson, 1992), as well as a seeming lack of consensus on the framing processes and goals of the movement itself (Benford & Snow, 2000), groups that fall under the umbrella term of alt-right should be considered a social movement because of one obvious marker of the movement’s purpose: they want to fundamentally change culture and the political sphere (Polletta, 2008). Thus, despite the struggle to balance the strategic, policy-driven concerns of the group with the more ideological ones, the tactics used by extremist groups online have had lasting consequences on the culture of the Internet and infrastructuralized platforms as a whole by attempting to push their cultural and ideological schemas into the mainstream (Alava, Frau-Meigs, & Hassan, 2017; Daniels, 2018a; Lewis, 2018; Marwick & Lewis, 2017b; Polletta, 2008a).

However, exposure by the mainstream media is a double-edged sword for political and social movements. Although news coverage and presence in mainstream sources can give movements exposure and help to recruit new members to their cause, it can also spell the downfall of the group in terms of how it is framed, not just by the media but in how the group decides to frame itself (Gitlin, 2003). This framing, both by the media and through the group’s own need to present itself via spokespersons and clearly outlined goals to be broadcast, can ultimately undo the movement due to conflicts within the group itself (Gitlin, 2003).
This framing, by both the media and the necessity for the group to frame themselves through the use of spokespersons and clearly 18 outlined goals to be broadcast, can ultimately spell the downfall of the movement due to conflicts within the group itself (Gitlin, 2003). The alt-right, and the rise of groups like the Manosphere, similarly have both benefited and struggled from the lack of “official” stances and spokespersons of their movement, and those that did speak on behalf of the organization are intensely polarizing figures within the movements themselves; and the news media have had a large hand in amplifying their rhetoric (Phillips, 2018). In an online context, previous research on political mobilization used case studies of progressive movements like Occupy Wall Street, the Arab Spring, and others to illustrate the utility and constraints of using online platforms for political mobilization. However, Bennett and Segerberg noted two logics that may be occurring simultaneously in these processes that also exist among the extremist right: “(1) the familiar logic of collective action associated with high levels of organizational resources and the formation of collective identities and (2) the less familiar logic of connective action based on personalized content sharing across media networks,” (Bennett & Segerberg, 2012, p. 739). For the extremist right, this form of connective action is visible by examining their use of forums like 4chan, reddit, and independently operated ones, of social networking sites like Facebook, Twitter, Instagram, as well as the use of instant messaging services like IRC (Inter Relay Chat), Discord, and even Slack. The use of these various websites demonstrate how these digital media, platforms, and services have served as organizing agents (Bennett & Segerberg, 2012; Daniels, 2009b; Hine et al., 2016a; Kitada, 2012; Koulouris, 2018). Through the sharing of narratives and identity in the form of blog and forum posts, memes, and other kinds of digital content, the extreme right constructed an ecosystem within the larger digital infrastructure that helped to foster the growth and ultimately, the spread of extremist thought online (Daniels, 2018b; Donovan, 2019; Hine et al., 2016b; Marwick & Caplan, 2018; Nissenbaum & Shifman, 19 2017). Using these information technologies, information about the movement and the ideologies, as well as its goals, were able to produced, spread, and built upon by others who otherwise would not have had access to the content were it not for the ubiquity of digital platforms (Milan, 2015a). By peeling back the layer of a seemingly chaotic social movement without a clear sense of direction, and by focusing on its ecological and relational properties with the infrastructures and platforms themselves, new insight into the nature of social movements online can be garnered. Although Occupy Wall Street, the Arab Spring, Black Lives Matter, and other social movements are, indeed, valuable case studies in what they have taught communication studies and the larger scholarly community on modern social mobilization, analyzing the extremist right and their instrumental (and often, manipulative) uses of digital infrastructures to push forth their agenda into the mainstream may reveal more about how the entanglement of the material and the symbolic properties of infrastructures, platforms, and humans are all expressing differing levels of agency across digital spaces and temporalities (Emirbayer & Mische, 1998). 
Bennett and Segerberg, Renzi, and Milan point to how these digital networks are much more than just spaces for exchanging information and messages; they are organizations in themselves [i.e., connective action] (Bennett & Segerberg, 2012; Milan, 2015a, 2015b; Renzi, 2015). As a result, however, we can view them as social structures that have no inherent stability outside human action, for even material agents like algorithms and platforms are built and maintained by people (Giddens, 1984), and the reproduction and spread of narratives are facilitated by the digital networks but are ultimately the result of human action through sharing. Infrastructuralized platforms like Google, Facebook, and Amazon can be manipulated for political gain by malicious actors (Daniels, 2018; Noble, 2018). However, when these groups are censored or removed from the major platforms, the infrastructural affordances of the Internet allow for the creation of alternative platforms (i.e., Gab in place of Twitter, Hatreon in place of Patreon) that further allow for the sustainability of the group (Benett, 2018). Further, when censorship or removal does occur, it is used as a way of further solidifying the collective identity by pointing to these corporate platforms as the enemy.

By studying extremist groups online, a multitude of revelations come to the surface that support traditional social theories but also shed new light on how platforms serve as digital intermediaries between otherwise disconnected persons and groups, and on what they reveal more broadly about digital culture itself (Gillespie, 2017; Massanari, 2015). The alt right and the Manosphere, as social movements, harnessed the logics of infrastructure and politics and used them for their benefit and also against the platforms themselves; thus, by studying their practices, more can be revealed about the infrastructure’s agency within this relational context. By studying this social movement within the context of the infrastructures it projects itself into, and by looking not at what the infrastructure does but rather at how it is itself an actor with its own agency that engages in its own politics, further theorizing can occur about the changing nature of political mobilization, the role of language and discourse, and the role of platforms in this evolution. By combining the perspectives of symbolic and cultural production through narrative, the creation of collective identity, and ideas of structure, agency, and the relational/ecological properties of infrastructures, further insight can be garnered into what happens after the formation of the logic of connective action and amid the increasingly powerful role of platforms in shaping culture, civic engagement, the larger media ecosystem, and political mobilization.

While all of these studies lay an important foundation, they often focus on specific elements of each of these groups, ranging from the material (i.e., their circulation networks on Twitter) to the symbolic (i.e., their strategies, ideology, and group identity). Further, many of these studies do not delve deeper into the practices that these groups employ to sustain their movements. Although there are studies that examine circulation networks and inquire about the larger societal context that gave rise to these groups, the strategies of the groups themselves to maintain their movements have not been examined in depth.
The analytical lenses for this dissertation are guided by research in infrastructure studies; knowledge and cultural production; discourse and collective memory; identity; and preservation practices, including research on how archives function as repositories of information as well as fundamental aspects of social knowledge for groups. By further inquiring into these groups’ practices in producing discursive forms, preserving their group knowledge, and responding to infrastructural failure, more can be revealed about the nature of social movements that rely on and yet are constrained by infrastructuralized platforms. In particular, this constraint is deeply felt when these infrastructures fail.

When Infrastructure Breaks

In communication studies, science and technology studies has helped to inform the conceptual language used in describing the sociotechnical characteristics of media and information technologies as culturally and socially situated systems; for STS, communication studies has provided a robust body of research for understanding the relationship between mediated content, behavior, social structures and processes, as well as the cultural forms, practices, and meanings enacted in these mediated contexts (Boczkowski, 2007). However, these two bodies of literature, despite their increasing relevance to one another, have not necessarily informed the scholarship in terms of critically analyzing the relationship between the material (infrastructure and platforms) and the symbolic (social systems, language and discursive systems, and agency).

Traditionally, the study of infrastructure grew out of science and technology studies, history, anthropology, and sociology. More recently, media studies and communication studies have been turning their attention to the study of infrastructure: to the underlying mechanisms and logics that dictate action. The scholarly analysis of infrastructure developed from the study of large technical systems like electric power grids and telephone networks, eventually grew into a study of webs and networks, and then developed into the work by Star and Bowker, who studied the sociology of infrastructure and how the human elements that sustain these systems, as well as infrastructural failures (e.g., electrical blackouts), affect people (Plantin, Lagoze, Edwards, & Sandvig, 2018). Infrastructure, from this perspective, is not a system that exists outside of the one created by humans but is rather a part of a larger network that entangles itself in the daily lives of the humans who create it, sustain it, and depend on it. Infrastructural systems, particularly information and knowledge infrastructures, are built up of communities of practice that require learned membership to be a part of the group, to share collective actions and knowledge, and to continually shape the meanings that are placed on these actions and knowledge (Borgman et al., 2013).

Infrastructure plays a key part in the ways that digital platforms and, by extension, online communities are organized. Digital infrastructure provides the necessary technical support for these platforms and their communities to function, and it often works in the background with many users not paying it much mind, until a failure like a quarantine or a ban occurs. Star famously said that infrastructures are, by their very nature, invisible until they break down (Star, 1999).
However, stating that infrastructures are invisible shrouds the ways in which infrastructure itself asserts a certain level of agency over the systems that it dictates. By approaching infrastructure as an agent within a larger system and viewing it not only as the skeleton that holds up systems but as a key actor within them (Deleuze & Guattari, 1987; Latour, 2005), we start seeing the ways that infrastructure is always visible and always present. But how do we see infrastructure? According to Deleuze and Guattari, who championed the concept of the rhizome as a replacement for the traditional root-tree model of culture, which traces causality and searches for an original "source" (i.e., the root), the rhizome "has no beginning or end, it is always in the middle, between things, interbeing, intermezzo," (Deleuze & Guattari, 1987, p. 25). The rhizome moves horizontally and resists chronology and organization, and in this model of thinking culture spreads across, downward, and every which way. Negotiating this semi-chaotic but structured view of networks against the seemingly logical and ordered infrastructural networks like electrical grids, railroad systems, and the Internet requires a certain level of abstracted thinking beyond just analyzing what these systems do. Star and Bowker's work on how infrastructure is influenced by the actions of humans and vice versa may help to fill in these gaps (Bowker & Star, 1999). The sociological approach to infrastructure studies exemplifies the rhizomatic logic proposed by Deleuze and Guattari and further exemplifies the concept of assemblages put forth by them. Assemblage refers to a collection of things or pieces of "things" gathered into a single context. These assemblages can then bring about any number of effects, which include but are not limited to informative, machinic, consumptive, or aesthetic effects. Deleuze and Guattari choose to illustrate this with the example of a book, which exists as its own object that collects disparate parts but is not thereby removed from the possibility of entering into other assemblages like libraries, archives, stores, etc. (Deleuze & Guattari, 1987). Like the rhizome, the assemblage lacks organization, but it can either come into its own or become part of something larger. This perspective of how assemblages come together and have effects informs Star's assertion that infrastructure is "both relational and ecological – it means different things to different groups and it is a part of the balance of action, tools, and the built environment, inseparable from them," (Star, 1999, p. 337). From Star's perspective, although infrastructure itself may form its own "assemblage", it exists in relation to other assemblages, objects, and actors within its environment. If the study of culture is approached from these perspectives, we start to break open our traditional view of culture as a bounded object that resists external or internal change. Infrastructures are "learned as part of membership" in communities and nations (Star, 1999, p. 381), and they are not part of the background of human experience but rather enter into a symbiotic relationship where they both shape human experience and are shaped by it.
Star also points out that infrastructure is heteroglossic, which complements the Deleuzian view of assemblages in that multiple voices emerge from a single "text" (as defined by Bakhtin's discussion of the novel) but are embedded within a larger "text" or, in this case, a system, where the "end" is actually a beginning, a starting point for analysis (Star, 1999). As Larkin notes, "infrastructures also exist as forms separate from their purely technical functioning, and they need to be analyzed as concrete semiotic and aesthetic vehicles oriented to addressees," (Larkin, 2013, p. 329). Combining the perspectives of Star as well as the theoretical positioning of Deleuze and Guattari and Latour, Larkin notes how "focusing on the issue of form, or the poetics of infrastructure, allows us to understand how the political can be constituted through different means," (2013, p. 329). The study of infrastructure then is ultimately a study of structure and systems, and Larkin even pushes back on Star's claim that infrastructures are invisible until they break down (Larkin, 2013). Larkin notes that "infrastructures are metapragmatic objects, signs of themselves deployed in particular circulatory regimes to establish sets of effects," (Larkin, 2013, p. 336). Although many studies using the approaches proposed by Star reify her claim that infrastructures are invisible and are backstage actors (Star, 1999), this seems to be only one part of how infrastructures govern and structure the human relationship with them. Infrastructures, despite their seemingly omnipresent nature in structuring human experience and everyday life, are bound within larger systems (although this is becoming less true with the platformization of infrastructure, discussed further below). However, this does not mean that material and non-material objects cannot transcend these boundaries, giving rise to Star's concept of the boundary object, which can be a practice, a material form, information, or other aesthetic, material/non-material things that structure social systems and ways of being and knowing (Star, 2010). For these "metapragmatic" social objects, combining the theoretical view of infrastructure with Giddens' theory of structuration (1984) reveals that infrastructures themselves participate in the creation and reproduction of social systems because, despite their claimed invisibility, they operate as non-human agents that in day-to-day life have no beginning or end and endure over time and space. In a world where the focus on infrastructure is turning toward the digital and technological infrastructures that shape and govern our social systems, networked ICTs present a new set of challenges for how to analyze them (Postigo & O'Donnell, 2017). In the impending wave of Web 4.0, the platform's affordances, the users who produce content, and the networks that shape these experiences are all in an entangled relationship muddied by the fact that platforms themselves are corporations that dictate the form of social systems and experience. Infrastructure studies inform our larger understanding of how these visible/invisible sociotechnical systems shape the way that time, space, humans, and non-human actors are all embedded into a structure, with none of these taking precedence over the others in how these forms perform in everyday life and larger events.
As demonstrated by the case studies in this dissertation, the breakdown of infrastructures is not necessarily due to a mechanical system failure but rather to the act of human administrators removing group members from their infrastructure – replacing or removing parts – and this occurs at the level of platforms (i.e., being banned from Facebook or reddit) but also at the domain registrar level as well. For the groups in these case studies, negotiating the building of community against the constraints of infrastructural politics is a key part of navigating these systems. When a dominant sub-group is able to control the data and the knowledge within a system, they are then able to reinforce their power over the discourse within that area (Borgman et al., 2013). Since infrastructures are fundamentally about distribution, they are also about the practices within them, and thus their establishment and maintenance are a long-term endeavor (Borgman et al., 2013). It is through observation of how groups navigate infrastructure and maintain their communities that it becomes clearer how the web has provided a number of affordances for the growth and spread of extremism.
Affordances for Radicalization
The role of platforms and infrastructure in the rise of extremist groups has been well studied. Everything from neo-Nazi groups to ISIS recruitment has been scrutinized on digital platforms, particularly these groups' modes of manipulating online systems to amplify their discourse and recruit, examined through qualitative study of their online communities and through big data analysis (Daniels, 2009b; Whine, 1999; Wojcieszak, 2010). However, what these studies do not reveal is how these groups manage to persist online – though they do highlight the ways that the groups use digital platforms for their own ends. The platforms themselves, as they increasingly behave like infrastructure in their organization of everyday life (Gillespie, 2017; Plantin et al., 2018), tend to downplay their role in the growth and spread of these communities. The ambiguous and neutral semantics of the term "platform" obscures the significant political dynamics occurring within these spaces, which influence not only policy but also the potential uses of these digital spaces and the agency of users themselves (Gillespie, 2010, 2017). By asserting their agency over material and digital space, extremist groups then use a variety of practices, imbued with their own meaning, not just to create knowledge within these spaces but also to spread it from platform to platform. This movement across platforms is a key affordance granted to users of the Internet and requires some knowledge of navigating infrastructure to do so successfully. Although the term "affordances" is constantly invoked in terms of what technologies afford to users, a better approach to critically engaging with the dynamic nature of infrastructuralized platforms is to conceptualize affordances as "the 'multifaceted relational structure' (Faraj & Azad, 2012, p. 254) between an object/technology and the user that enables or constrains potential behavior outcomes in a particular context," (Faraj & Azad, 2012, quoted in Evans, Pearce, Vitak, & Treem, 2016). The relationship between user and platform – and increasingly, infrastructure – is a multidimensional one that needs to consider the relational aspects of all of these moving parts within these digital systems that inform larger cultural practices.
Affordances themselves are not embodied in technologies; rather, affordances must be seen in relation to "what emerges from the user's interactions with the object," (Evans et al., 2016, p. 5). Different affordances exist on platforms and infrastructures, indeed, but the blurring of what it means to be a platform and what it means to be an infrastructure gives rise to new challenges in analyzing these affordances – although the two concepts are complementary and exist in conjunction with each other, they emerge from different disciplinary contexts and come with different theoretical and methodological assumptions (Plantin et al., 2018; Postigo & O'Donnell, 2017). The term "platform" allows profit-driven companies like Google, Facebook, and others to play themselves up as "cultural intermediaries" (Gillespie, 2010, p. 353) and establish the rhetoric and criteria by which they are judged. Digital platforms do indeed afford the exchange of cultural flows (Jin, 2017); however, these flows are governed and policed by the platforms themselves in terms of who has voice, who has agency, and what kinds of behaviors are rewarded – or punished (Gillespie, 2017). This platform governance of which social and human behaviors are acceptable then trickles down into "real life" encounters and is capable of shaping culture itself. The affordances of these digital infrastructures are revealed in the ways that users build online communities – they interact with the platforms and the infrastructural systems in their choice of website for establishing their group, in their establishment of off-site forums, and in making use of the myriad communication choices available in the age of Web 3.0. These digital spaces have allowed for the extension of people's worlds and "publics" (by way of Habermas' conception), and have also paved the way for people to participate in social worlds in ways they could not before (Papacharissi, 2010). Social networking sites and online social platforms have given rise to new understandings of community, culture, identity, activism, and civic engagement. Specifically, networked technologies and social media platforms allowed for "new affordances for amplifying, recording, and spreading information and social acts," (boyd, 2010, p. 46). These affordances in turn shape these "networked publics" and also structure the ways that users negotiate their landscapes.
Networked Publics and Cultural/Epistemic Production
The rise of networked publics has significantly blurred the distinction between a "networked" public and the concept of "publics"; networked publics do not necessarily dictate people's behavior but rather provide the architecture that shapes user engagement (boyd, 2010). However, it is not just the dynamic between user and network that results in these shifts; people within these spaces are learning how to "work within the constraints and possibilities of mediated architecture," (boyd, 2010, p. 57). Harnessing the power of digital infrastructure, the users in this dissertation not only engage in building communities but also engage in epistemic production that will sustain the community for the long term. In the making of a symbolic infrastructure built on knowledge, cultural forms, and other modes of epistemic production like narratives, these case studies demonstrate how users interact not only with the technological components of infrastructure but also with modes of production that predate the Internet.
But the question remains not only how these users react to infrastructural failure, but also what is lost in the event of failure, in particular the knowledge that the group built. What is lost is often the content that the group has meticulously produced for the purpose of sustaining their movement – which means that preservation is a necessary part of their world-building strategy to maneuver around the logics of the platforms and infrastructure. The building of these worlds, then, not only helps to maneuver around the constraints of the infrastructure and platforms they find themselves embedded in, but allows for a hardened institution built through the capabilities of participatory culture, through which they understand their place in the larger ecosystem and within culture at large (Fischer, 2007; Jenkins, Ito, & boyd, 2015a). Part of the possibilities of action, the affordances, granted by networked ICTs and platforms is the ability for human actors to engage in the creation of repertoires of stories that help not only to provide cultural resources, but to define membership within their own communities, imagined or otherwise. Actors, human or not, do not engage in the shaping of technology apart from the social world but are constantly constructing and reconstructing the sociotechnical system of which they are a part (Beer, 2017; Latour, 2005; Postigo & O'Donnell, 2017). Content takes a cultural form, and norms are instituted and reproduced within this larger connected social system that influences day-to-day life and action. Anderson noted that imagined communities of nationhood were informed by literary and text forms like the novel and the news industry (Anderson, 2006), and Giddens and other theorists have pointed to the use of language and discourse in the formation of meaning, which is then embedded into larger social and discursive practices (Deleuze & Guattari, 1987; Foucault, 1982; Giddens, 1984; Latour, 2005) that then inform people's understanding of the world itself. Culture does not exist "outside" this larger system but is a part of the social system itself, with infrastructure mirroring certain aspects of culture as a human-created system with its own rules and practices. van Dijck writes: "The ecosystem of connective media does not reflect social norms; interconnected platforms engineer sociality, using real-life processes of normative behavior (peer pressure) as a model for and an object of manipulation," (van Dijck, 2013, p. 174). This manipulation of systems is most visible when we see responses to failure, and these subversive tactics reveal much about how these connective media function.
Discourse and Preservation
One of the most powerful shapers of culture, social life, and reality is discourse. Sociotechnical systems disseminate and diffuse discourse beyond the reaches of traditional communities and create new discursive forms. But discourse, despite its powerful structuring of lived experiences both on and offline, is an elusive and often vague concept (Sherzer, 1987). Depending on whose framework one works with, discourse is either a meta-level boogeyman that cannot be altered or, in the case of Bourdieu, is constantly being structured and restructured through interactions between people (Bourdieu, 1977). In the three case studies, we see discourse functioning not only as a governing logic by which the groups build their collective identity but also as something for the groups to fight against.
By producing their own knowledge and repositories of information, the groups enact a number of discursive strategies to access some kind of cultural power. In effect, they use the Internet as the front line of the larger cultural battleground, and through language practices, identity, and narrative they attempt to establish their "imagined community" (Anderson, 2006) and organize as social worlds that engage in action. Within these digital communities, narratives and other linguistic acts are performed similarly to how they are performed in face-to-face interactions, and like discourse, "technology is a repeatable social, cultural and material process … crystallized into a mechanism or a set of related mechanisms," (Sterne, 2003, p. 376). The structuration and figurations taking shape are influenced by a multiplicity of actors with varying degrees of agency – both the human and non-human within these structures reproduce the rules that govern behavior and action within them, with differing levels of possible action being enabled or constrained (Giddens, 1984; Latour, 2005). As a medium and like other media, technology is a space where gender, race, and other facets of identity can be performed, expressed, challenged, and reproduced (Boellstorff, 2015; Chun & Lo, 2015; Marques & Koven, 2017). Sexist and racist language and the ideologies related to them circulate within digital contexts (Chun, 2016; Nakamura, 2002), and meaning is "not limited to predefined linguistic or contextual factors but is necessarily mutable and partly unpredictable (Butler, 1997)," (Chun, 2016, p. 88). Therefore, despite certain similarities that exist in the virtual and the actual in terms of discourse, narrative, and gender/race expression, the digital environment has presented new challenges for the study of these dimensions of social life (De Fina & Perrino, 2017). A fundamental aspect of discourse, archives are a material artifact of these related processes that structure social life, and the ways that they are established and the content they contain reveal how these social worlds function and are sustained. The archive, as a part of the social structures that order its users' behaviors, emerges as a powerful source of knowledge and organizing principle (Stoler, 2002, 2008). Archives are not only places for extracting materials but powerful spaces where discursive forms are created and reproduced and, through the power of the archived materials, establish the law of "what can be said," (Foucault, 1982). By analyzing archives not only as repositories of information but by examining the systems of belief that they enable, a deeper understanding of the "social epistemologies" (Stoler, 2008) that dictate people's understanding of their worlds through collective memory can be revealed. In these case studies, archiving is a practice that is not only meant for preservation but is also a way of maintaining the social epistemologies that the group is fundamentally built upon. These digital archives contain narratives, reference materials, and other types of content that function not only as a "back-up", but as proof that the group ever even existed (De Kosnik, 2016). In the case studies presented here, the analytical focus is on how these groups interact with digital objects and navigate events like infrastructural failure through their preservation practices.
On the surface, creating back-ups of forum content demonstrates the affordances of the Web in allowing lay persons to establish digital archives, but a deeper analysis demonstrates that these archivization practices are a form of knowledge production. These archives do not merely function as repositories of content but preserve the language and glossaries of the communities, their collective identity, and the culture of the groups. By establishing their own communities and archives, the groups in this dissertation lay claim to their own regimes of truth and symbolic realities, which helps to sustain them even if they are ousted from the platforms they call home. Thus, the archive is less a tool and more of a status, a form of power that asserts its dominance over cultures and social worlds (Derrida, 1998; Foucault, 1982), and although these unofficial archives can serve to uphold already existing power structures by way of race, class, and gender, they are often used to shift the balance of power between archive and repertoire (De Kosnik, 2016). Archives, and what is in them, afford a status of legitimacy to the historical narratives and cultural products of one group over another (Trouillot, 2015), and thus the participatory culture of the Web has given those who are often removed and silenced in larger cultural narratives the ability to lay claim to their own historical narratives and thus to repertoires of action, culture, and truth (De Kosnik, 2016; Swidler, 1986; Trouillot, 2015). Web archives can be created by online communities and individual users and are (often) publicly accessible and changeable – however, digital infrastructures and platform rules themselves constrain the extent to which this process can be undertaken by users (De Kosnik, 2016; Proferes, 2016). Ordinary people – like those in these three case studies – have become much more involved in the production and reproduction of culture and cultural systems that serve as foundations of larger systems of belief, ideology, and identity (Beer & Burrows, 2013). This was ongoing long before the digital age – activists, for instance, have used their own systems of archivization to help socialize others into the group using works by other activists, political thinkers, and the like (Lee, 2016). These "Rogue Archivists", by De Kosnik's definition and as demonstrated by the three groups in the case studies, are not actively attempting to resist the cultural structures that assert power over them but rather are attempting to bring an entirely different kind of society into being by creating these epistemologies and discursive forms (2016). Through the use of personal narratives and other cultural and memory objects, digital archivists reassert a claim to historical and cultural truth. In a sense, archiving outside the realms of what is considered authoritative has always been a part of social and political movements that aim not just to preserve the group's memory and knowledge, but to socialize its current and future members into a certain line of thought and discourse. The material artifacts that social and political movements create, then, are a large part of their culture and help to shape the collective memory (and identity) of the group (Lee, 2016). Social movements attempt to usurp what they view as the dominant culture but still rely on a repertoire of tools that they had been socialized into in order to enact action toward cultural change (Swidler, 1986).
The cultural meaning created by the groups in these case studies, through cultural symbols, facilitates these patterns of action. However, because meaning itself is constructed by dominant forms of power, it also discourages some forms of action and constructs symbolic realities (1986). Access to power is also access to culture and the ability to shape its symbols, strategies, and meaning; and this is where we see the battle over determining regimes of truth, historical narratives, and cultural/symbolic realities (Derrida, 1998; Foucault, 1972; Swidler, 1986; Trouillot, 2015). This battle over shaping cultural/symbolic realities occurs online most often through the act of moderation, and moderation as a practice is what enables but also constrains communities at both a micro and macro level.
Shaping and Maintaining Online Social Worlds
Within online social worlds, moderation is a key shaper and constraint that constructs the realities of the users within these communities. Moderation is known to happen within online spaces; however, the decisions often exist in a black box, with many users not understanding what exactly this process entails (Gillespie, 2017, 2018). These practices by platform moderators and administrators are meant to protect users from disturbing content and serve a crucial function in preventing the circulation of not just disturbing but dangerous content (child pornography, gore, etc.). The issue, though, is the never-ending stream of content that moderators must deal with – in particular, it can be difficult to respond immediately to flagged/reported content or to enforce decisions fairly and consistently. Moderators, as in the case of Facebook, are either paid poorly to do this extremely difficult and often traumatizing work (Newton, 2019) or, in the case of reddit, not paid at all – the platform relies on volunteer moderators to control the content of its subreddits (Marantz, 2018). Moderation, though, also serves another purpose in these online communities: it defines the acceptable boundaries of practice and discourse within them, and moderators serve as the de facto leaders of the community (Massanari, 2015). Thus, moderators in their capacity as the "custodians" of these communities (Gillespie, 2018) not only define the acceptable boundaries, but also create and preserve the community in their vision as influenced by the policies of the platforms they are on. Moderation, however, occurs in tandem with other leadership practices that include more outward-facing tactics like official statements. It is this shaping and preservation of social worlds by these de facto forum leaders within the context of the case studies that informs the practices of the group – which all then lead to its preservation and sustainability. Preservation occurs in the form of archiving, the establishment of official narratives, and the cultivation of a group identity through the control of the discourse within the forum.
Portable Infrastructure
One window into how preservation strategies shape users is to analyze the ways that these groups maintain and preserve narratives in shaping the collective identity of their members. Narrative is not only an expression of identity but a claim to some kind of historical truth, which sustains the legitimacy of the nation, community, or group in the members' minds (Anderson, 2006; Polletta, 2008).
Thus, I am theorizing that these cultural and discursive practices function as a kind of symbolic infrastructure, the creation of which allows the groups to persist over time despite failure. Material infrastructures fundamentally shape the ways in which everyday life is lived (electrical systems, plumbing, roads) and shape the ways that people understand themselves, interact with each other, and understand their social worlds. But as demonstrated by the above sections, the practices of archiving, navigating the affordances and constraints of platforms, and building a collective identity all inform the building of a symbolic infrastructure. Containing both the material and the culturally symbolic, this infrastructure is what is able to be transported from platform to platform after breakages. Creation and preservation of communities is a fundamental aspect of the longevity of any online group. In particular, the role that infrastructure and platforms play in the construction and mobilization of political/social movements is that they serve as the "place between" (i.e., a liminal space, the interstitial) that enables mobilization through narratives, identities, and solidary networks built on platforms like blogging sites and social networking services, and through the sharing of links and other "soft" resources (Milan, 2015b). Polletta further argues that culture is not a subjective lens through which people perceive structures, but a key dimension of them (Polletta, 2008). Continued efforts to build and maintain this collective identity and culture are crucial to the continued existence of any movement or group, and in the case of this dissertation, they are a critical aspect of these groups' sustainability. While symbolic infrastructure has not been explicitly theorized previously, it builds on a synthesis of work on knowledge infrastructures and work on digital culture in the process of meaning making and the structuring of social worlds (Borgman et al., 2013; Bowker & Star, 1999; Couldry & Hepp, 2016; De Kosnik, 2016; A. Massanari, 2015; Milan, 2015b; Star, 1999). We can see glimmers of this concept in Bowker and Star's work on knowledge infrastructures (Borgman et al., 2013), in the work done by Milan and by Bennett and Segerberg on digital collective identity and connective action (Bennett & Segerberg, 2012; Milan, 2015a), in Polletta's work on narrative and social movements (Polletta, 2009), in De Kosnik's work on rogue archiving (De Kosnik, 2016), and in Massanari's work on toxic technocultures (Massanari, 2015). The literature on knowledge infrastructures, despite being a fundamental part of my thinking on symbolic infrastructure, does not take into account things like narrative and identity in its formation and spread. Additionally, Milan's work as well as Bennett and Segerberg's work on collective identity and connective action, despite informing how these processes occur in digital social movements, does not take into account the preservation strategies of the movements they analyzed. Polletta's work on narrative in social movements does not take into account the digital, nor do any of the above works explicitly deal with extremist social movements. Although De Kosnik's work does note the utility of the concept of rogue archiving in understanding political movements, it ultimately focused on fandoms and small feminist forums that were not part of a larger centralized platform like reddit.
Finally, the work done by Massanari on toxic technoculture and the ways that platforms facilitated the rise and spread of the toxic GamerGate movement has greatly informed this concept, but again did not consider how the community sustained itself. Rather, that work looked at the ways that algorithms and content moderation policies failed to police and control the growth of toxicity. In particular, symbolic infrastructure has an intentional quality to it: unlike culture, which is embedded and feels overwhelming and unchangeable, it enables these subcultures and social movements to embed themselves within the existing cultural infrastructure they see as oppositional to their goals. However, they are not only trying to embed – they are attempting to replace, to take over, and their practices of archiving and establishing a collective body of knowledge indicate this desire to spread out across online communities in an attempt to inject their discourse into the mainstream – and to shift culture.
Methods
To understand how responses to failure result in sustainability practices that allow for the continual existence of extremist groups, I focused on three groups in the "Manosphere": the subreddit r/TheRedPill, Incels, and MRAsians (Men's Rights Activists who are Asian). Often cited as a set of "gateway ideologies" into the alt-right, the Manosphere is a loosely connected digital ecosystem consisting of pick-up artists, men's rights activists, and other male-focused identity groups threaded together by violent misogyny (Futrelle, 2017a). Because of their views on gender, as well as race, these groups are often shown to have connections to far-right extremism, sharing similar rhetoric against feminism and women's liberation (Futrelle, 2017b). Each case was chosen because it represents a different mode of failure and, in turn, a different response to it: Incels were banned from the reddit platform and had to move to an off-site forum; MRAsians started their subreddit after being banned from the dominant Asian American subreddit; and r/TheRedPill has been the focus of many ban attempts by other users and is constantly on guard in the event that their forum is ultimately shut down. Although the websites I am looking at tend to serve as "hubs" of the movement, there is no set hierarchical structure, and the communities more accurately mimic rhizomes because of their horizontal, vast nature with no perceivable beginning or end (Deleuze & Guattari, 1987).
Digital Ethnography. I conducted digital, multi-sited ethnographic fieldwork for these cases beginning in January 2017 and ceased fieldwork in January 2019. I started collecting data about MRAsians later, around April 2017. I stumbled across the forum on my own when I was looking for forums to join for personal use. The ethnography spanned two years, and the websites I visited were mostly the subreddits r/TheRedPill, r/AznIdentity, and r/EasternSunRising, and the web forum Incels.me. Digital ethnography is a tangled web of never-ending data sources (Boellstorff, Nardi, Pearce, & Taylor, 2012; Postill & Pink, 2012), but limiting my data collection primarily to the "hubs" of each group helped to limit the scope of the study, with sources outside the hubs serving as supplemental data to build an image of the groups' larger networks. Table 1 presents the communities, the main "hub" of data collection for each, and the data collection start dates.
Table 1. Case Studies, Sites, and Dates of Data Collection Start.

Case Study                Main Website                                          Data Collection Start
r/TheRedPill              http://www.reddit.com/r/TheRedPill                    January 2017
r/Incels and Incels.me    http://www.reddit.com/r/Incels and http://incels.me   February 2017
r/AznIdentity             http://www.reddit.com/r/AznIdentity                   March 2017

Although my fieldwork timing was not necessarily as consistent as that of a physical, face-to-face ethnography because of the nature of digital platforms, I attempted to enter each digital space 1 to 2 times a week from January of 2017, with more frequent fieldwork during times of significant political or social events like mass shootings, elections, and legal cases. I exited the field for the purposes of this dissertation on January 15, 2019. Focusing primarily on the "main" landing page of each community, I would then collect data from posts that were getting a significant amount of attention and engagement (comments, upvotes, etc.), since a regular user of the forum would hypothetically engage mostly with the homepage content. User engagement on reddit is typically in the form of upvotes/downvotes (an upvote for a "like", a downvote for a "dislike"), and reddit's algorithm brings more highly upvoted posts to the top of the page (Massanari, 2015). Thus, I aimed to focus on some of the most visible content while also looking at controversial posts (a sorting option on reddit that surfaces posts with many downvotes but also many comments) to get an idea of the ways that accepted discourse functions while also exploring contested spaces of meaning. For my own personal safety, I opted not to interview or engage with participants directly and maintained a distance from them by only accessing websites, forums, and channels that are public (i.e., not private or only accessible through the creation of an account). However, by observing the digital spaces, the infrastructural changes in the communities (moving platforms, changes in archived materials, etc.), and the interactions of users within them, I collected enough data to discern their strategies of epistemic production, the ways that they attempt to disrupt cultural systems, their preservation strategies, and their networks. Along with posts and comments, I also collected data from their wiki pages, their glossaries, their rules, and other static materials (i.e., rulebooks and historical "records" the groups keep) that they made available to group members, to get an understanding of their archiving and socialization practices. From each community, I collected thousands of posts, comments, and screengrabs, and developed extensive background knowledge of each group and their off-site networks by just being in their online spaces. The dataset comprises scrapes done through Evernote Web Clipper (which captures the page as is, showing the kinds of interactions and the hierarchy of comments); text-based data from Google BigQuery and a reddit comment scraper written in Python; images (screengrabs of certain posts and images posted on the forums by users); and some YouTube videos and comments. I used Evernote Web Clipper because it captures the page with all of its elements: the way the website looks, users' profile pictures, the memes/gifs/links embedded in the posts, etc. The data is thus multimodal and extremely rich.
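The dissertation specifies only that text-based data came from Google BigQuery and a reddit comment scraper written in Python; it does not name a particular library or script. Purely as an illustrative sketch of what such a collection step could look like – assuming the PRAW library and placeholder credentials, neither of which is taken from the dissertation – the core loop might resemble the following.

# Illustrative sketch only: gather highly visible posts and their comment trees
# from the three "hub" subreddits. PRAW and the credentials are assumptions,
# not the dissertation's actual toolchain.
import csv
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # hypothetical credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="manosphere-fieldnotes-sketch/0.1",
)

HUBS = ["TheRedPill", "Incels", "AznIdentity"]

with open("posts_and_comments.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["subreddit", "post_id", "title", "score", "comment_id", "comment_body"])
    for name in HUBS:
        # .hot() approximates the landing-page content a regular user would see;
        # .controversial() could be sampled the same way for contested posts.
        for submission in reddit.subreddit(name).hot(limit=25):
            submission.comments.replace_more(limit=0)  # flatten "load more comments" stubs
            for comment in submission.comments.list():
                writer.writerow([name, submission.id, submission.title,
                                 submission.score, comment.id, comment.body])

Because r/Incels no longer exists on reddit, a sketch like this would only apply to forums that are still live; it is included to show the shape of the collection step, not as a reproduction of the actual scraper.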
By homing in on the questions outlined above to understand the ways that platforms-as-infrastructure and the groups' archiving strategies are intertwined, much of the data collected was not used. Despite this, the data not related to these strategies of knowledge production and distribution is still useful for understanding group dynamics, the larger discourses the groups enact, and their broader ideologies.
Data Analysis. In order to capture the various ways that infrastructure, discourse, and social movements are entangled within my case studies, I used a method known as situational analysis. Due to the nature of the questions as well as the context wherein my case studies are located, situational analysis is a more appropriate approach than traditional modes of ethnography and even digital ethnography because of how it views social worlds and their actors. My interest in the ways that knowledge is sustained within these groups stems from their use of archiving within the forums themselves: the use of wikis, off-site archives, and constant monitoring of the group's activity are ways for these groups to continually assert their claims to knowledge, and to truth and reality itself. Further, these materials serve as spaces of socialization into the group through the use of glossaries, readings, and other artifacts that are meant to be used as tools for navigating the digital habitus the groups have enacted. But all of these things have been influenced by the very platforms and infrastructural systems on which these epistemic and discursive forms are created, which also influence their spread and preservation.
Situational Analysis. Since the postmodern turn, the differences and complexities of social life, compounded by the rise of digital and participatory technologies, have necessitated an updating of the grounded theory approach (Clarke, 2003). Situational analysis updates, but also fills a gap in, the grounded theory approach previously proposed by Glaser and Strauss by taking into consideration the context and situatedness of the actors within social worlds, and it relies on the use of maps to analyze relations between human, nonhuman, discursive, and other elements of situational contexts (Clarke et al., 2017). Specifically, within these contexts, the things that should be observed and accounted for are not only the visible practices (in my specific case, the archiving practices and responses to infrastructural failure) but also the invisible ones that are rendered through silences (Clarke, 2003; Trouillot, 2015). Situational analysis is relational, situated in its approach to action, and considers all of the human, non-human, and discursive actors that make up social worlds. By using three kinds of maps – situational maps, social worlds/arenas maps, and positional maps – Clarke provides a methodological and theoretical framework to understand all of the actors in the situation (situational maps), to lay out the larger collective actors (social worlds/arenas maps), and to lay out the discourses and discursive materials (positional maps) within them. These maps frame and analyze the situation of inquiry in the following ways (Table 2).
Table 2. Description of Maps, from page xxiv of Clarke, Friese, and Washburn, 2017.
Map Type                     Purpose
Situational Maps             Lay out the major human, nonhuman, discursive, affective, geopolitical and other elements in the research situation of inquiry and provoke analysis of relations among them
Social Worlds/Arenas Maps    Lay out the major collective actors (social worlds, organizations, institutions, etc.) and the arena(s) of commitment and discourse with which they are engaged in ongoing negotiations in the situation of inquiry
Positional Maps              Lay out the major positions taken, and not taken, in discussions, debates, and extant discourse materials in the situation of inquiry vis-a-vis particular axes of difference, concern, and controversy about important issues

The situational maps capture not just the groups, but all of the elements that have influenced their inception and growth. The social worlds/arenas maps become more concise by limiting the focus to the communities; i.e., all three are associated with the larger men's rights movement (the "Manosphere") but position themselves differently. To understand a particular social world, a researcher "must understand all arenas in which that world participates and the other worlds in those arenas and the related discourses, as these are all mutually influential and constitutive of that world," (Clarke et al., 2017, p. 73). Effectively, the maps start broad and then become narrower to capture the complexity. Through this method, we overcome the issues of modeling digital community, bring forth the silences in historical narrative and production, and are able to visually display the complexities of social situations. Because situational analysis depends on understanding the ways that human and non-human actors are implicated within situations, as well as the role of discourse in those situations, it is an appropriate approach for data analysis. The data that did not reflect responses to infrastructural failure – i.e., the data that revealed the larger discourses and ideologies of the group and their positioning within this extremist ecosystem – helped to create the messy situational maps that start the process of situational analysis. This was a crucial step during analysis to visualize all of the moving parts, both material and sociopolitical, that influenced these groups and the situations that they were in. Although Clarke notes that the utility of this approach lies more typically in the realm of science and technology studies or in medicine (Clarke et al., 2017), it can be argued that its utility also extends to the current political situation, in which the alt-right has been able to establish its social worlds by way of digital infrastructures. Each case has its own situational map, social worlds/arenas maps, relational maps, and positional maps since, despite their similarity, the groups cater to different interests and found themselves in slightly different situations in terms of infrastructural failure. Creating these maps requires inductively coding data and looking for themes, in particular trying to parse out the groups' responses to infrastructural failure, as well as identifying the key actors (human and non-human) within any given situation. The relational maps of each group highlight how all of these pieces fit together and how they are related to and inform one another, whereas the social worlds/arenas maps illustrate the situation and the actors more succinctly.
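The relational and positional maps described here were produced by hand from inductively coded data, using diagramming software as noted below. Purely to illustrate the underlying data structure, a coded relational map can be thought of as a list of actor-relation-actor triples; a minimal sketch that serializes such an edge list to Graphviz's .dot format, which Graphviz can render and some diagramming tools can import, follows. The actors and relation labels are simplified stand-ins, not codes drawn from the dissertation's data.

# Illustrative sketch only: represent a coded relational map as an edge list of
# (actor, relation, actor) triples and write it out as a Graphviz .dot graph.
relations = [
    ("r/TheRedPill", "aligned with", "off-site Manosphere forums"),
    ("r/TheRedPill", "constrained by", "reddit administrators"),
    ("reddit administrators", "respond to", "media scrutiny"),
]

lines = ["digraph relational_map {", '  rankdir="LR";']
for source, label, target in relations:
    lines.append(f'  "{source}" -> "{target}" [label="{label}"];')
lines.append("}")

with open("relational_map.dot", "w", encoding="utf-8") as out:
    out.write("\n".join(lines))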
To serve as an aid in the analysis of these case studies, discourse mapping of all the groups' different positions highlights not only their similarities and differences but also the practices of the groups across platforms. These discourse maps highlight the similarities but also the key differences among the groups in terms of their organization and cohesiveness. To create the maps, I used diagramming software called OmniGraffle, and to code the data, I used a combination of Evernote's "notebook" feature and handwritten notes, which were then typed up and put into their own folder as fieldnotes. The maps were particularly helpful in understanding the vastness of the social worlds of the groups in this dissertation. As a visual aid, the social worlds/arenas maps are meant to demonstrate how specific connections influence the positional mapping and distribution channels of the groups across the digital realm. Since meanings are tied to practice, interpreting these actions becomes clearer when analyzing how these groups' situations change (Charmaz & Belgrave, 2012). The arena they occupy is nebulous and their social worlds are nestled within larger ones, and accounting for their rhizomatic nature reveals how all of these assemblages come together and interact (Clarke et al., 2017). These maps are included in the case studies – the groups' "worlds", their relational maps, as well as the discursive positions that they adopted in response to infrastructural threats. This approach was not without its limitations, however. The largest one was that I was not able to conduct interviews with any of the users of the groups that I studied – although this at first seems like a significant limitation on the conclusions I ultimately arrive at, observing their online interactions and their practices is still a valid method of discovery, and in particular this choice was made to ensure my safety (Boellstorff, Nardi, Pearce, & Taylor, 2012; Massanari, 2015). Further, this dissertation only looked at three groups within the Manosphere, and future work using the same theoretical and methodological approach can focus on more extreme groups that are actively participating in violent radicalization, like the ones that led to recent attacks (Anti-Defamation League, 2019; Marwick & Lewis, 2017). Despite the limitations, what this dissertation aimed to find was how these groups manage to persist over time – and it ultimately arrived at conclusions that may help to enact strategies to effectively combat radicalization both on and offline.
CHAPTER 3: Case Study 1
World Builders: r/TheRedPill
The first case study is that of r/TheRedPill, which is the largest of the three communities studied in this dissertation and perhaps the best known among scholars, activists, and netizens for its role in the Manosphere and its prominence on reddit at large. Founded in 2012, the community quickly became infamous on reddit for its content and discourse and was mostly perceived as an offshoot of the pick-up artistry community, which had homes on the Internet far before r/TheRedPill. What this case demonstrates is the concept of world building, where I show that the group not only created a subcommunity within the larger reddit platform but built an entire world encompassing its beliefs, discourse, and practices.
The subreddit is an immensely organized one, and throughout its life it has had to evolve its practices of moderation, preservation, and sustainability through platform policy changes as well as internal and external pressures. r/TheRedPill is a notable case precisely because of its prominence – being one of the key players in this larger movement, r/TheRedPill provided a blueprint and framework for other communities that were offshoots of it (or that were inspired by its practices) in not just maintaining but sustaining their communities. Serving as an example for other similar groups, r/TheRedPill may be an aspirational community due to its strategic practices of establishing archives, fortifying the community, and using identity maintenance and network-building techniques not only to strengthen the identity within the subreddit but also to attach itself to a larger network that would help to sustain the group in the event of an infrastructural failure. In some respects, the quarantine sent a shock throughout the larger Manosphere, but it was also not a surprise – no other similar community, such as r/MensRights, r/MGTOW, and other affiliated subreddits, was banned or quarantined on reddit, and perhaps due to r/TheRedPill's popularity its quarantine was meant to be a warning or to set an example of what was to come.
Quarantine
On September 27, 2018, r/TheRedPill was placed under quarantine by reddit administrators. A quarantine on reddit indicates that the forum has been placed under a form of probation by the reddit administrators for not adhering to community guidelines. As opposed to a ban, which removes the community entirely from reddit's infrastructure and erases all of its content, a quarantine is a kind of warning to the subreddit that it is being watched and that if it does not "fix" its ways, it will ultimately be removed. In this case, administrators pointed to the group's "shocking or highly offensive content" as justification for the quarantine. As a result, the group's statistics on subscribers and active members are set to zero, all posts are heavily monitored by administrators from outside the subreddit, and instead of being met with the subreddit's homepage, visitors are met with the following page:
Figure 2. r/TheRedPill is Quarantined
The quarantine came as no surprise to the moderators and users of r/TheRedPill: attempts to ban and censor the community had been ongoing since the forum's inception. The community was started in 2012 by reddit user pk_atheist, a local New Hampshire politician in his 30s named Robert Fisher (Bacarisse, 2017), and was built around his vision to indoctrinate men into a misogynistic ideology that is an amalgamation of men's rights activism and pick-up artistry. Both of these movements have long histories and often feed into one another, but r/TheRedPill is perhaps the most visible community that combines them.
The community is marketed as "[a] … discussion of sexual strategy in a culture increasingly lacking a positive identity for men." A possible reason for its success (currently more than 300,000 subscribers to the main forum alone) is its instrumental use of the reddit platform itself: rather than just being a typical web forum that exists only in one digital space, the moderators and members of the forum affiliate themselves with other subreddits that espouse Red Pill teachings, align themselves with other men's rights blogs and websites to build a larger network, and have been proactive in establishing off-reddit communities. The world that they have created encompasses a number of facets of Red Pill identity that are not only meant for men – communities exist for married Red Pill members, for Red Pill women, and even for Red Pill Parenting, where tips and suggestions are traded among members on how to raise children with this mindset. Thus, they not only were aware of the way that non-members perceived them, but actively started building a world beyond the subreddit itself. Although the community has often been targeted for shutdown within reddit by digital vigilante subreddits, since 2016 many news articles and think pieces have also brought attention to the group as well as the larger Manosphere. These attempts to shut down and censor the forum, from within reddit as well as from external pressures, have resulted in group members taking precautionary steps since the forum's inception in order for the group to persevere. The practices of the group then signify not only world building, but the preservation of this world in the event that it is removed from the reddit platform. These practices aim to preserve not only the group but also its discourse, ideology, and content, in the form of archives, off-site forums, and PDF handbooks available on an off-reddit website. Because of the constant barrage of attacks by non-members (referred to as "brigading"2), the subreddit early in its existence established fail-safe forums and archives for its content that look exactly like the subreddit. r/TheRedPill is a notable case study for (1) its popularity, (2) its preservation strategies, and (3) its establishment of off-site forums and archives. Although they were able to sustain their community on reddit despite constant threats of being shut down, the members have always been acutely aware of how their community comes across to non-members, and this outside perception is a significant part of how they respond to these threats of infrastructural failure. The situation the community found itself in was one of a shifting tide in terms of the role and responsibility of platforms and their complicity in amplifying hate speech, of reddit and other platforms beginning to exercise their power to ban and censor groups, and of increased media attention. In this case study, I explore the practices they use to protect their social world, and also the world that they created in response to the threat of censorship to fortify themselves.
2 Brigading is an online practice where non-users of a community will enter and start making posts, downvoting content, etc., and encourage others to do the same, to alter its content and cause chaos temporarily.
The Red Pill Universe
Wake up, Neo.
The discourse of TheRedPill, as well as its name and overall mission, is premised primarily on the scene from the film The Matrix from which the name of the forum is taken: in the scene, Morpheus offers the protagonist Neo two choices – take the blue pill, and he remains in the hallucination imposed by the robot overlords who have enslaved humanity; take the red pill, and he "wakes up" to reality and sees the world for what it truly is for the first time. In essence, TheRedPill offers an ideology and identity that replaces the reality the user knew before. The subreddit does not exist on its own, however, and tight moderation and the strong establishment of a network provide the infrastructure for its community. Subreddits allow for the functionality to establish community materials that include but are not limited to rules and glossaries for members to familiarize themselves with, but in the case of r/TheRedPill there were also a number of readings provided for new members in order to socialize them into the group.
Figure 3. r/TheRedPill's universe on reddit
r/TheRedPill built its world in two different ways: 1. by establishing multiple subreddits that aimed to cater to different needs of the community (for example, a subreddit for women interested in Red Pill ideology) and 2. by officially aligning the subreddit with a number of different blogs, websites, and off-site forums. Figures 3 and 4 represent these nodes, and r/TheRedPill serves as the main hub to which all of these connections point back. Figure 4 shows the universe of r/TheRedPill off the reddit platform. The data to build this "map" was taken from the community itself – they provide a number of links for their users to point them to additional resources or friendly communities. Through these links, the community's ideological leanings become more apparent: it is from these websites that many of the materials used to socialize new users into the group are taken, and they serve as the source dogma for the community's discourse and other content. These readings are kept in a very prominent space on the subreddit itself: the sidebar, which is a static element of the page that follows the user as they click on content throughout the subreddit. Containing a glossary, rules, and a number of "theory" readings meant to teach and socialize users into r/TheRedPill ideology, the sidebar is the first creation of a sort of archive, a roadmap into the group, and it governs the rules of interaction within the community. These identity maintenance and network-creating practices are a mode of grounding their version of reality. By existing in multiple spaces and nodes, and in multiple forms, the group exists outside of just its main subreddit, and this enables it to appeal to a broader user base as well as providing an interconnected community infrastructure. This infrastructure, however, far surpasses the constraints of the digital infrastructure they build their communities on and serves as an ideological infrastructure that helps to inform and sustain Red Pill thought. By building this world, they not only build their community but also the very discourses upon which it is founded – and these different sources of thought inform and support one another.
Figure 4. r/TheRedPill's off-reddit universe
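The reddit and off-reddit "universe" maps in Figures 3 and 4 were assembled by hand from the links the community itself provides in its sidebar and related materials. As a hedged illustration of how those outbound links could also be gathered programmatically – again assuming the PRAW library and placeholder credentials, which the dissertation does not specify – a short sketch follows.

# Illustrative sketch only: pull the outbound links a subreddit's sidebar offers
# (affiliated subreddits, blogs, off-site forums) as raw material for a network
# map like Figures 3 and 4. PRAW and the credentials are assumptions.
import re
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # hypothetical credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="sidebar-network-sketch/0.1",
)

sidebar_markdown = reddit.subreddit("TheRedPill").description  # sidebar text in markdown

# Markdown links look like [label](url); capture both parts and treat each as an
# edge from the hub subreddit to the linked resource.
edges = [("r/TheRedPill", label, url)
         for label, url in re.findall(r"\[([^\]]+)\]\((https?://[^)]+)\)", sidebar_markdown)]

for hub, label, url in edges:
    print(f"{hub} -> {label}: {url}")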
r/TheRedPill Discourse and Ideology

The readings, the affiliated subreddits, and the establishment of a vast network serve as world building practices that all echo similar discourse – that modern society has been plagued by feminism and progressive thought, and that masculinity and men are in crisis as a result. The community espouses pseudo-evolutionary psychology to support its assertion that women are morally, intellectually, and physically inferior to men. This discourse is also present in the movement that coined itself "The Dark Enlightenment," and Red Pillers as well as those within the Dark Enlightenment acknowledge that their rhetoric and beliefs go against "mainstream" culture – a culture that they see as antagonistic to their ideas and their discourse (a relational map showing these connections and their situation is included in Appendix B). Born out of the larger men's rights movement as well as pick up artistry, r/TheRedPill combines these two views with ideas from the eugenics movement regarding women and men being biologically different, but with far less discussion of race. Race is not discussed in the forum unless it is to reiterate that women of all races are the same (because they are women) and that a man's race may have an impact on his sexual success – race is treated as a biological reality that affects men but not women, further demonstrating the forum's attitudes toward biological determinism and sex. Regardless, the ideology present in the forum echoes the reactionary politics and ideals on which the men's rights movement premises itself. Because the subreddit is on a popular website, it is easily found and accessed by a multitude of users from different walks of life, and within it the identity of "man" is treated as more important than any other identity marker.

r/TheRedPill is central to the entire social world the group created, and the moderators of the forum serve as the de facto leaders of the community. These modes of organization – the establishment of the sidebar materials, the additional subreddits, and the alignment with other Manosphere blogs and websites – mean that the subreddit does not exist alone in the digital world but is just one very powerful node within a larger community premised on the same discursive forms. But being reliant on reddit's platform to build and maintain the community means that the moderators and subreddit members have little control over what the reddit administrators do to the community. Although the platform provided the functionality to establish a multitude of subreddits, resources, and discussion, the reddit administration ultimately holds the power to ban and censor communities on its platform as well as to limit access to its infrastructure. r/TheRedPill, and communities like it within the Manosphere, are at the mercy of the governance of these platforms and the policies that shape them. Thus, while they depended on the platform for their very existence, the platform also served as their main source of strife.

The Ban Wave Cometh: The Beginning of the Imminent End

Bans on reddit, as in many other online communities, are a part of how reddit administrators moderate the site. Since reddit's inception in 2005, there have always been community-wide guidelines and policies in place, and many subreddits are heavily moderated.
Motivated by an ideal of free speech and freedom of expression, reddit boasts thousands of communities and millions of members in groups dedicated to everything from gaming to hiking to science and history – however, as the site grew larger and the communities more controversial, changes in its policies occurred (A. L. Massanari, 2015). Users often pushed back against bans of communities by pointing to outside pressures on the reddit administrators from their stakeholders, as well as to public media scrutiny. Redditors often cried out in response to announcements of community bans by saying that this was the beginning of the end of free expression on the platform, a "slippery slope" that would be hard to come back from – even in the case of the banning of communities like r/jailbait and other communities dedicated to sexually suggestive content of minors/perceived minors in the summer of 2012. Responses to this were varied – on one hand, many users supported the removal of these communities, whereas other users were concerned about what this act of censorship would mean for other communities across the reddit platform (figure 5).

Figure 5. Response to a reddit announcement post about banning communities containing sexually suggestive content featuring minors/perceived minors

What does the history of a ban that predates r/TheRedPill (established October 2012) have to do with the community? This ban is relevant because of what it would foreshadow in the years to come: in 2014, reddit made the unpopular choice to ban a community called r/TheFappening, which was dedicated to posting the leaked iCloud images of nude or nearly nude celebrities (A. L. Massanari, 2015). This was met with a large outcry from members of the reddit community, who pointed to the same concerns about reddit administrators infringing on free speech and acting as tyrannical dictators as had been raised in response to the banning of r/jailbait and affiliated communities above. But the event that set off alarms in r/TheRedPill came after the ban of r/TheFappening: in June of 2015, reddit instituted what reddit users and other netizens refer to as a "ban wave," in which a large number of communities are banned from the platform en masse. This ban wave most notably received attention for its ban of r/FatPeopleHate, a fatphobic community dedicated to mocking and denigrating photographs and videos of overweight and/or obese persons, as well as to sharing text posts about members' disgust at these individuals. Other communities that were banned included groups that targeted trans individuals, Black people, and the gay community (Robertson, 2015a). These bans were instituted after a change in reddit's harassment policies in May of 2015, when reddit updated its policy on what constituted "harassment."

r/TheRedPill, however, took note of these bans and of the acts of digital vigilantism by other reddit communities that aimed to have offensive or depraved subreddits shut down, particularly the community r/AgainstHateSubreddits. Following the ban wave of 2015, r/TheRedPill registered the policy change and started to "prepare" against the perceived threat of its imminent demise. Moderators within the community (who are selected by existing moderators, with the initial moderator being the subreddit founder) started clamping down on their policies, banning users, and making public notices to the members about this change.
In a post from 2015 following the ban of r/FatPeopleHate, one of the users of the subreddit noted that the ban " … relates to TRP as we were regularly listed alongside FPH as 'the worst places on reddit' by multiple AskReddit threads." The poster, like many in the forum, blamed these administrative-level changes to reddit's policies and rules on Ellen Pao, then reddit's CEO, often referring to her as "Chairman Pao" and to the administrators as the "Pao Administration," and directly attributed the changes and the perceived threat to free speech to her. Hatred of Pao was nearly universal across reddit after news broke that Victoria Taylor, an extremely popular reddit employee responsible for coordinating the AMA (Ask Me Anything) series, had been fired – and, of course, the blame was pointed at Pao (Mills, 2015). Even though Pao was not responsible for the firing – rather, it was reddit co-founder Alexis Ohanian (a white-presenting, cis-straight man) – the ire and disdain were all targeted toward her (an Asian woman). Pao's tenure as CEO, as well as the bans that occurred before her hiring, set off alarm signals in the community long before any action was taken against the subreddit.

A post in the community seemed to point to some knowledge of what was about to take place before it happened. A post titled "[Meta] SJWs are reportedly working with admins to develop a stricter "no harassment" policy at Reddit. We should have an action plan in place for when TRP is banned" provided details and posts that pointed to the impending ban waves that threatened the community. At the time of this post (February 2015), the backup website Puerarchy.com had already been shared with the community as the backup in the event of a forum shutdown.

Figure 6. Valuable posts should be archived

In figure 6, the user WaynesCotting noted not only that there should be a backup forum (they suggest the alternative website Voat, which functions similarly to reddit and will be elaborated on later in this chapter) but also that the "valuable" posts and the sidebar materials should be archived. In response to the fear of being next in any upcoming ban wave, the subreddit moderators rejected suggestions of moving to Voat or 8chan and started to establish off-reddit forums and archives for members to go to in the event of the subreddit being banned. However, the moderators also limited the number of conversations within the forum that they believed were driven by the panic instigated by the shutdown of r/FatPeopleHate and other communities.

In response to the ban, many r/TheRedPill users pointed blame toward "SJWs" (Social Justice Warriors) and SJW culture for the shift in reddit policies: i.e., they claimed that reddit administrators were responding to outside pressure to violate the community norms regarding freedom of speech and expression on which reddit was founded. SJWs, "shitty feminists," and the "PC" (politically correct) movement were all pointed to as reasons why these subreddits were shut down, and fears were expressed that TheRedPill would be next on this so-called "hit list" of "politically incorrect" or "offensive" subreddits to shut down. However, concerns had been raised even prior to this first "ban wave" that saw the shutdown of r/FatPeopleHate, when a user noted that Men's Rights and TheRedPill user accounts were being banned (see figure 7).
Many redditors, not just those on r/TheRedPill, were concerned that reddit was going to go the way of Digg – another extremely popular website and content aggregator that effectively crashed when it lost users after a complete website redesign in 2010 (Marshall, 2012).

Figure 7. A post from 2013 noting the shadow banning of accounts associated with Men's Rights and TheRedPill.

Figure 7 is a post from 2013 that expressed concern about the shadow banning of accounts associated with Men's Rights and r/TheRedPill, and its edit responds to comments from other users claiming that they were experiencing the same. Fears in the community, real or imagined, about an imminent ban or shutdown (whether of individual accounts or of the community itself) had thus been circulating since the subreddit's founding. However, the instigating event that made TheRedPill start aggressively establishing off-site forums and a "contingency plan" was the change in policy regarding harassment that occurred in 2015 and led to the ban of r/FatPeopleHate and other communities. At this point, TheRedPill was relatively unknown outside of the reddit community, but it had nearly 100,000 members.

The establishment of off-site forums and archives points to an interesting practice within the community: members responded to external events as threats, and viewed these changed policies not just as an attack on the communities that were banned, but as an attack on the entire world of reddit and, even more broadly, as an attack on (1) men, (2) free speech, and (3) conservative ideologies. Many users of the forum understood that the community was not viewed favorably outside of the subreddit, and one user even proclaimed that "our exile is inevitable."

Figure 8. You are not a part of reddit.

r/TheRedPill also started making posts reminding users that they were "not a part of reddit" (see figure 8), that they needed to be aware that their community existed as an enemy in "a foreign land," and that the community was hated by other communities on the reddit platform. Thus, despite being a part of the "reddit community," the posters in the subreddit demonstrably viewed themselves as apart, and posts by the forum leaders and moderators continued to assert this claim, noticeably driving a further wedge to solidify a group identity – an identity that they knew was a contentious one. Notably, a user in the comments of this post also noted that they created alternative accounts to browse and post on reddit – the reddit platform allows users to look at other users' profiles and posting/commenting history, and many did not want to invite scrutiny or the risk of harassment for being a part of r/TheRedPill. As the profile of the community increased, and more and more people learned what the community was about and what it stood for, fears of this type were amplified – in particular, the fear of being found out as a "red piller" (see figure 9).

Figure 9. Alt-accounts

Although some users pushed back on this "doomsday prepper" attitude (as seen in figure 9) toward the subreddit's attempts to remind users of how they were perceived as members of the community, as well as toward the contingency plans, the moderators ultimately made decisions on their own about how the subreddit would proceed and how it would re-group in the future.
They did this by exploiting the nature of the Internet itself: the affordance of being able to create infrastructure off the reddit platform was used as a decentralizing tactic to remove the community from the power of the reddit administrators and the larger reddit community.

Contingency Plans: Doomsday Preparation on r/TheRedPill

Responses to Pao's tenure as CEO of reddit, as well as to the change in harassment policies leading to the Ban Wave of 2015, caused a mass exodus of reddit users to the website Voat. Voat functions much like reddit and even mimics the visual layout of the website, and it premises itself on the idea that all speech is allowed on the forum – i.e., it is an alternative reddit where almost anything is fair game. r/FatPeopleHate and other communities immediately set up communities on Voat, and even redditors who were simply against the change in policy or Pao's leadership left the website to join this alternative community (Hockenson, 2015). Even within TheRedPill, users were suggesting the establishment of a TheRedPill community on Voat in the event that TheRedPill was shut down. Although a Voat version of TheRedPill did exist, the official policy of the subreddit was that it would move to Puerarchy.com3, the official backup site of TheRedPill (see figure 10). An additional forum, TRP.red, mimicked the design and layout of the subreddit and would serve as the official "forums" of the community. The moderators started aggressively pushing members of the forum to bookmark these websites after their existence was announced.

These off-reddit websites were created in response to the ban waves that had threatened the community before. The moderators had also become stricter about posting guidelines and the content that was allowed in the subreddit, and the views of the forum members seemed mixed on the possibility of the community being banned. Despite the mixed feelings, the moderators established these off-site forums as a backup in the event of a ban, and for years these backups served more as a kind of "doomsday situation" bunker than as an actual place to move to in the event of a disaster. In posts announcing the new website, the moderators continued to tell users of r/TheRedPill that the forums would not be opened until a ban occurred and were meant to serve as an archive.

3 Puerarchy means "rule by boys," according to the moderator redpillschool.

Figure 10. The landing page of Puerarchy.com

Although Puerarchy.com was announced as the back-up site, from its creation to the time of writing it still had not unlocked the forums associated with it and served more as an actual "back-up," in that it held an archive of posts and other kinds of content related to r/TheRedPill. Panic in the community regarding the 2015 Ban Wave fizzled out fairly quickly, partially because the moderators strictly controlled the number of posts being made in the community about the event. The moderators chose to do this in order to control the panic and to keep the content of the subreddit focused on its goals. In August of 2015, however, another ban wave occurred, resulting in the banning of a number of racist subreddits (most notoriously r/CoonTown); under the new policies, the community r/rapingwomen had also been banned in July of 2015 (Chandrasekharan et al., 2017; Matney, 2015).
The Ban Wave of 2015 that started with r/FatPeopleHate then trickled over into the banning of other communities, many of whose users migrated off reddit, although not in the same numbers as they had had on reddit (i.e., they moved to Voat but with fewer members) (Hockenson, 2015). Although the banning of these communities was effective in combating hate speech on reddit, it did not mean that the hate speech went away completely – it just moved (Chandrasekharan et al., 2017), highlighting how easy-to-navigate alternative platforms and communities, widely known in advance, stood ready in the event of a breakdown.

Acute awareness in the forum of the cultural and political climate, particularly among the moderators, meant that a contingency plan (Puerarchy.com) had been in place even before the bans that occurred in 2015. It was in 2016, however, that more public posts about this plan began to appear. A post from the moderator redpillschool announced the contingency plan for members: in addition to Puerarchy.com, another website called forums.red mimicked TheRedPill visually and served as an archive for its posts, and another node in the network, "TRP.red," would function much like the subreddit but also include blogs, a newsletter, and content curated by some of the top celebrities within the forum, such as Rollo Tomassi, an active men's rights writer. In effect, the goal of TRP.red was to serve as a social networking site in its own right, an alternative social media platform the community could use to evade censorship attempts. In addition to the establishment of these off-site forums, a mailing list (see figure 11) was also created to alert members of when this move would occur – a move that, the moderators said, would only happen once, regardless of the other communities that existed on Voat, 8chan, and other archives. Although users had the freedom to scrape their own posts and maintain their own repositories of archived material, the moderators asserted that TRP.red and Puerarchy.com were the official sites in the event of a shutdown. Seeing themselves as the community leaders, the moderators wanted to have final say and full control over the official stances of the subreddit. Along with this, a website called redpillhandbook.com provided PDFs of top posts and other sidebar materials (which users are required to read before posting in the community), and a stickied link there also contained a downloadable zip file of the top 500 posts in the community.

Figure 11. An email emergency alert system in the event of a forum shut down

Like doomsday preppers who stockpile resources and materials out of fear of an imminent apocalyptic scenario in which the world will end, the moderators built these communities long before they were needed and sat in wait. After this initial wave of panic, the off-reddit websites continued to operate as read-only: posting was not allowed, but they provided a prepared and ready digital infrastructure to settle into in the event of a shutdown.
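Mechanically, the member-level self-archiving described above – scraping one's own posts, or the handbook site's downloadable zip of the top 500 posts – requires little more than reddit's public API. The following is a minimal sketch, in Python, of what such an archiving script might look like; it uses the third-party PRAW library, and the credential placeholders, choice of fields, and output filename are illustrative assumptions rather than details of any tool the community actually used.

```python
import json
import praw  # third-party reddit API wrapper

# Read-only API credentials (placeholders; obtained by registering an app with reddit).
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="subreddit-archive-sketch/0.1",
)

archive = []
# Walk the subreddit's top posts (up to the API's listing limit) and keep the
# fields needed to reconstruct each post outside the platform.
for submission in reddit.subreddit("TheRedPill").top(time_filter="all", limit=500):
    archive.append({
        "id": submission.id,
        "title": submission.title,
        "author": str(submission.author),
        "created_utc": submission.created_utc,
        "score": submission.score,
        "selftext": submission.selftext,
        "permalink": submission.permalink,
    })

# Write the snapshot to disk so it persists independently of the subreddit.
with open("top_posts_archive.json", "w", encoding="utf-8") as f:
    json.dump(archive, f, ensure_ascii=False, indent=2)
```

A local snapshot of this kind is, in effect, what the "top 500 posts" zip file and the read-only mirror sites amount to: the group's discourse is reduced to portable files that can be re-hosted on whatever domain the group controls next.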
Some members, as mentioned before, took matters into their own hands and started providing their own archived materials, but one poster from 2017 said it was not enough to have an alternative digital community: there needed to be offline, non-digital reserves of the information on the subreddit (see figure 12). In an almost conspiratorial tone, the poster states that the information from r/TheRedPill needed to be saved because it could one day be removed from the Internet altogether. These conspiracy-like theories were not rampant on the subreddit, but there was concern that the forum was being watched by "3-letter organizations" (i.e., the FBI, CIA, etc.) and that its demise would occur not only on reddit but across platforms and the Internet as a whole.

Figure 12. When we get banned from the Internet

This fear of being banned from the Internet altogether, although entertained and discussed on the forum, seemed to be a niche position rather than the main mindset of the community at large. TRP.red and Forums.red were designed to let users do exactly what they had been doing on the subreddit – post, interact, comment on content, and build community – and even mimicked the styling and structure of the subreddit itself (figure 13).

Figure 13. Forums.red

"Why Don't We Just Leave Reddit on Our Own?"

The fears in the community seemed to die down in 2016 and 2017, or at least were not as amplified as they had been in the immediate aftermath of the 2015 bans on reddit. The title of this section comes from a user comment (see figure 14) on a May 2016 post in which redpillschool shared the subreddit's "contingency plan," following the news that a number of subreddits had been quarantined, including r/European (J. Rogers, 2016). The subreddit r/European had become a white supremacist community, and after the quarantine it moved to Voat, as had many of the subreddits banned in 2015. In effect, Voat became Zion, a digital promised land to which members of these banned and censored communities could make a mass exodus, and it would continue to serve this function for subsequently banned or quarantined communities (Menegus, 2017). Many of the users within the subreddit made this plea to just leave reddit on their own terms before they were ousted, perhaps in an attempt to assert some agency over the situation rather than remain at the mercy of the reddit administrators (figure 14).

Figure 14. "Why don't we just leave reddit on our own?"

Subsequent ban waves on reddit in 2016 were targeted more toward communities like r/PizzaGate, a dangerous conspiracy theory community that contended that Hillary Clinton and other members of the Democratic Party were operating a child sex trafficking ring out of Comet Ping Pong, a Washington, D.C., pizzeria – this conspiracy existed outside the confines of reddit, but the community was particularly active on the reddit platform (C. Silverman, 2016). Notably, the PizzaGate conspiracy theory led a North Carolina man to travel to the pizzeria and fire multiple shots from an assault rifle because he believed that by doing so he would rescue the child sex slaves he thought were imprisoned at the restaurant (Jarrett, 2017). Because of the nature of that community, r/TheRedPill did not seem particularly concerned with its banning, but moderators did start banning accounts from their own community en masse again in 2017. These bans targeted members who, wishing the forum to remain apolitical, reacted against political content; members who did not meet or who violated posting requirements; complainers; and, seemingly, those who did not embody or perform r/TheRedPill principles well enough (see figure 15).
The bans seemed to be primarily a response to a post by moderator and frequent poster GayLubeOil about Donald Trump and the popular r/The_Donald community, and an attempt by the moderators to control and govern not just what is said in the community but who gets to access it at all. These bans are a sort of boundary-making around what counts as acceptable discourse within the community (Star, 2010), and such actions are used by moderators across platforms.

Figure 15. The Purge

The community's declining concern about the future viability of the subreddit can perhaps be attributed to the social and political atmosphere during these two years. The success of Donald Trump on the campaign trail, as well as the rise of the alt-right in 2016, served as political and social events that actually strengthened rather than hindered the community (Marwick & Lewis, 2017; Neiwert, 2017) (see figure 16). However, this does not necessarily mean that members supported his politics per se, but rather what Trump's success and the rise of the alt-right demonstrated. Trump's win was framed as a "cultural zeitgeist," a win against feminism, a reclaiming of masculinity, and "the misandry bubble popping" (see figure 16). Trump was lauded, by the moderators in particular, as a figure representing a brand of masculinity that harkened back to a patriarchal time before feminism; although the man himself was not necessarily the object of admiration, what he represented was. In 2016 and 2017, much of the discussion on the forum remained the same, but more scrutiny was paid to the types of content being posted because of the increased attention on the community.

Figure 16. A post from October 2016 about Donald Trump on r/TheRedPill

Interestingly, this post is no longer available on the subreddit but is available on the backup forum, TRP.red. It was perhaps removed, along with many others that openly supported Donald Trump and other content that could point to alt-right affiliation, because of the quarantine and the increased scrutiny of the subreddit brought by popular media attention. Particularly after the election, a number of popular media pieces were published by outlets like Vice, The Cut, CNN, and others that pointed to the Manosphere as a sort of "gateway ideology" into more extremist thought (Baker, 2017; Brooks, 2017; Futrelle, 2017b; Griffiths, 2017). These articles identified not only r/TheRedPill but also associated communities like Men's Rights communities, MGTOW (Men Going Their Own Way), and Incels (Involuntary Celibates) (B. Lewis & Marwick, 2017), all of which fall under the "Manosphere" label. This increased attention was well known throughout the community and was only amplified following a series of bans in early and late 2017 that targeted alt-right communities (Sacks, 2017; Statt, 2017).

Figure 17. A post about the Vice article that increased attention on the subreddit in 2017

Where a direct response to the media pieces about r/TheRedPill was posted (particularly for the Vice and CNN reports), these official posts thanked the media outlets for "boosting our traffic and adding to our readership" (see figure 17). The community's responses treated this increased visibility simultaneously as a blessing and as a cause for concern – but as seen in figure 17, these media pieces were also used to strengthen an us vs. them dynamic (Gamson, 1992; Gitlin, 2003).
During this time, the number of articles and exposés on these communities skyrocketed after they had previously not been a main topic of journalistic coverage – although some coverage did exist, particularly following Elliot Rodger's murder spree in 2014 (Hill, 2015). Increased media attention, as well as public outcry about the ways these social media platforms have facilitated the rise of extremism, finally led to a number of bans across platforms – in particular on Twitter, which banned a prominent alt-right/GamerGate/Manosphere figure, Milo Yiannopoulos (Ohlheiser, 2016). Yiannopoulos was an extremely prominent figure in a number of hate-based movements and profited off of his profile as a "professional troll," eventually becoming the lead technology editor at the far-right media outlet Breitbart (Lang, 2016). His ban set the precedent for a number of other bans across different media platforms, with Yiannopoulos eventually being banned from Facebook, Instagram, and a number of other websites.

In 2016 and 2017, renewed concern about the changing nature of censorship on the Internet and "free speech" was aroused by a number of far-right and Manosphere celebrities being banned on Twitter; in particular, Yiannopoulos' ban from the platform inspired claims in the forum that the "anti-male" conspiracy perpetuated by feminists, "SJWs," and the media was on full display. Yiannopoulos' ban from Twitter, as well as Gab.ai being censored by its domain registrar in 2017, heightened concerns within the forum about the inevitable end of the community. It was not necessarily fear that spread throughout the forum after Yiannopoulos' ban; rather, these bans and other acts of censorship on platforms served as points for fortifying the community, its practices, and its discourse (see figure 18).

Figure 18. A mod post responding to Yiannopoulos' Twitter ban by moderator redpillschool

Moderators and forum members were actively keeping their fingers on the pulse of activity occurring both outside of reddit and within it, and preparing in response. Archives of the content started being made, and users and moderators continued to reiterate that they could be removed from the platform but, as seen in figure 19, that their ideas would never die.

Figure 19. You can kill a man but you can't kill an idea

These backup forums and archives then had a stronger reason for existing, beyond the previous ban waves that had threatened the forum on the platform. The post in figure 18 by the moderator redpillschool, along with his replies in the comments, demonstrated the ways this forum built a world beyond the reddit forum – one free from the influence of their enemies, the SJWs and feminists, who in their eyes were aggressively pushing an agenda to censor free speech. Even offering up the alternative social media website TRP.red to Yiannopoulos, redpillschool again reiterates that the purpose of the alternative website is to serve as a place for the subreddit to regroup, away from the possibility of bans and censorship, where members would be able to move and speak freely without attention or scrutiny. A post about "The Red Pill App" (seen below) also confirms that the Manosphere and the larger far right are connected.
In particular, r/TheRedPill contributors and moderators were instrumental in the success of r/The_Donald, and in the post moderator GayLubeOil also seems to foreshadow a similar situation that r/TheRedPill would eventually find itself in. The moderator and founder of r/The_Donald, CisWhiteMaelstrom, was doxed, and the post seems to indicate the belief that this information was intentionally leaked by reddit administrators who were threatened by the size and popularity of r/The_Donald. The us vs. them dynamic is a critical part of shaping group identity and building worlds, particularly in social movements, as the post also states:

" [ … ] The reason The_Donald became the largest Trump community on the internet is because it utilized a Red Pill publicity strategy and was run by Red Pill Endorsed Contributors. The basic premise is that the left is far more numerous on Reddit and has more time to bitch about stupid shit. So the fastest way to grow a community is to write something deliberately provocative and make sure lefties hear about it. Then when lefties show up to virtue signal about your thought crimes, provoke them further by banning them and telling them to choke on your semen. Pretty soon lefties will tell other lefties about the injustice creating a positive feedback loop of publicity. Eventually Righties will hear about it and join your community. This strategy worked really well and The_Donald subsequently grew so big that it began to challenge the Left's monopoly on political discourse on Reddit. [ … ] However the admin's heavy handed tactics allowed CisWhiteMaelstrom to create an Us vs Them dynamic between his userbase and the Admins further driving The_Donalds growth. The admins had no choice but to dethrone an effective leader and hope for someone less aggressive. CisWhiteMaelstrom personal information was leaked to SJWs who immediately began placing threatening phone calls to his entire family including his pregnant sister. The admins can and do read your private messages, modify your comments and collude with their SJW allies to create leadership transitions. In fact they recently tried this with the Red Pill but were unsuccessful. [ … ]" – from the post, "The Red Pill App," June 2017, by user GayLubeOil.

Although this may have been common knowledge within the forum, the post asking for programmers to assist in the r/TheRedPill app-building process also explicitly stated that The_Donald became as successful as it did with help from r/TheRedPill moderators and contributors – in particular, the strategy of banning those who post to debate, argue, or troll on the forum was implemented because these posters would then tell others about the banning, creating free publicity that would draw in members who would otherwise not have known of the forum's existence. The excerpt above also claims that this approach of leaking moderators' information to dox them was attempted on r/TheRedPill but failed. But the post goes on to say that this illustrates a need for a communication platform outside of reddit and outside of "servers housed in SJW cuck4 shacks":

" [ … ] For this reason we are building a secure communication platform who's servers will not be housed in SJW Cuck Shacks. Communication and the exchange of ideas is the basis for all power, which is why we can no longer afford to allow Aids Skrillex and Trigglypuff to rifle through your personal messages.
Our goal is to create a secure platform that will allow our members to form private groups and eventually coordinate in person meetups.

Features Include:
* iPhone and Android chat apps.
* Online browser chat interface.
* Private groups similar to facebook but with privacy and anonymity as the default.
* Specialized features for vetting members and preventing infiltration.

In order to accomplish our objectives we either need a team of Alpha programers to volunteer their time and talents or half a Bitcoin to buy the necessary scripts. Send programer volunteer inquiries to: redpillschool@trp.red Send Bitcoin donations to: 1NeqAW41zBf1ujMzNMAZVuhRmkpB8CQL2X [ … ] " – from the post, "The Red Pill App," June 2017, by user GayLubeOil.

4 Short for "cuckold," a sexual fetish in which partners enjoy seeing their partners engaged in sex acts with others. Used as an insult in far-right circles.

The fear that private messages were being infiltrated, as well as a desire to start building offline, face-to-face communities, motivated the push to create the app in order to facilitate a move off of the reddit platform. By not only using the off-site forums as a gathering space but also attempting to build an app, the subreddit moderators demonstrate their knowledge of how these platforms and world building practices function – they even solicit donations through Bitcoin rather than through more identifiable channels like PayPal.

Further, a "mod" (short for moderator) post by moderator PaperStreetVilla stated that TheRedPill was not a democracy, and that this was to ensure not just the longevity of the community but also its continuation as a "viably beneficial system for men"; i.e., democracy was framed as antithetical to TheRedPill values, as well as detrimental to the long-term life of the ideas themselves.

"In essence, we cannot be fair, because we are outnumbered, and our opponents, those who disdain us, they don't play fair. They attempt to disrupt and subvert, and thus we must respond in kind by doing the same to their protests. We must dictate how the sub is run, and we must be benevolently ruthless in this endeavor, lest those who are less versed in RP, or our enemies, dilute the philosophy on Reddit and fracture the larger manosphere community as a result. For the sake of the continued preservation of our ideas, ideas which have immeasurably benefited the lives of thousands of men - we cannot cave to the notion that democracy is a superlatively superior form of governance. For us, as a community, it isn't, and it is unlikely it ever will be. If you don't like how we rule, then leave, because we want you not." – post "[Mod] The Red Pill is not democratic. It cannot be in order to be a viably beneficial system for men," by moderator PaperStreetVilla, June 2017, a repost from two years prior.

The above excerpt demonstrates how the moderators collectively made the ultimate decisions in dictating how the subreddit was run, particularly in the name of preserving the discourse and worldview the moderators were pushing forth. The fears of debate, on the forum as well as outside of it, were based on the idea that it would fracture the community to the point of dissolution. The moderators, then, positioned themselves as benevolent dictators. The excerpt also reminds members that they are outnumbered – and that this would ultimately be their downfall if they entertained debate with those who did not ascribe to their school of thought.
Echoing these fears of a fractured community, of censorship, and in particular of being "found out" as Red Pillers in their personal lives, posts started appearing in the forum saying that r/TheRedPill was "like Fight Club," in that the first rule of Fight Club is to not talk about Fight Club (see figure 20):

Figure 20. "Don't talk about Fight Club."

Responses to the post varied, with some users stating that they were not ashamed of their identity and affiliation with r/TheRedPill, but moderators and other more senior members responded that proselytizing r/TheRedPill principles was a bad idea and would only invite criticism and other negative consequences from the people they attempted to convert. Rather, moderators like redpillschool recommended building small "tribes" to "build power" instead of talking about the principles in large, public forums like Facebook. However, the pushback against the call not to talk about the community focused primarily on the claim that r/TheRedPill philosophy needed to be spread in order to save "Western society" and that civilization "needed more great men." Other posters who had suffered negative consequences agreed with the moderators, however, and said that they had lost friends and been shunned after attempting to spread r/TheRedPill ideology. Although posts similar to this had been made before the fears of bans, doxing, and censorship arose, the reactions always seemed mixed as to whether spreading the message should be attempted in members' personal lives.

Fears of being doxed by talking about the community outside of the subreddit were also warranted: a top poster was doxed after engaging with users in r/PurplePillDebate, a subreddit positioned between the "red pill" and the "blue pill" (thus "purple") where users debated the viewpoints of both. In the post, user Bloodycurative states, "I should know better. I've preached extensively that you cannot unplug someone when they themselves refuse to believe they are plugged in" (figure 21). These fears of being found out as a Red Piller had been echoed previously in the community, and the protective practice was to create alternative accounts, to not talk about r/TheRedPill in other communities, and to not try to "spread" the message to family and friends. By remaining "underground," so to speak, members would protect not only themselves but also the community.

Figure 21. User Bloodycurative speaking about their doxing experience

Not talking about the community outside of the subreddit, cautionary tales about debating non-members, intense control over the subreddit's content on the part of the moderators, and the introduction of new subreddits like r/RedPillRight to discuss politics off the main forum were all attempts not just to control the community but to hide evidence from prying eyes and journalistic outlets; together, they served as sustainability practices to continue the community's existence on the reddit platform. In early 2017, reddit started banning alt-right subreddits for violating the policies against doxing (Statt, 2017). But it wasn't until the Unite the Right rally in Charlottesville, Virginia, in August of 2017 that concerns and reminders to the community about the need for a back-up plan started appearing again. Of particular note is the commonly held belief within the forum that the neo-Nazis, Ku Klux Klan members, and the alt-right presence in general at Charlottesville were part of a mass conspiracy to censor free speech.
In this view, the Charlottesville rally and the responses to it were part of a "big test" to see whether the Internet could be censored, as was the "giving away" of ICANN5 to the world:

"I haven't been too concerned with neo-nazis and white supremacists because in my opinion they don't exist in numbers that would have any real effects in the USA. But the media has had a field day blaming them for everything that's wrong in our country, and one of Obama's big plays is coming to fruition. There were cries of danger when Obama gave away ICANN to the world, but I don't think many saw the long-game on this one. With ICANN firmly out of the grasp of USA juristiction, freedom of speech would no longer be a concern for ICANN and registrars would no longer be functioning as sub-contractors of a government contractor. This past two weeks was the big test to see if everything worked as planned.

Step 1. Get an enemy nobody can defend. In this one, they doubled down just to be sure. Nazis and KKK. Whether they were planted, or just organically in the right place at the right time, the media made sure to play it that the riots and protesters were 100% evil. This paved the way for step 2.

Step 2. Convince the country that hate speech isn't really free speech because it causes riots. Even though the speech itself hasn't really been a subject here, it was mostly the people's right to simply exist with different ideas and protest. The fact that their protests didn't end peacefully must be linked to speech they had elsewhere. Using this logic, we could argue McDonalds is guilty as well, given that many of them had lunch that day.

Step 3. Censor the website of those who have these offending ideas, and make it hard-to-impossible for those groups to contact each other anonymously. Use the new loophole: Cancel the domain since ICANN no longer has to follow US Constitution as it is no longer subcontractor of the govt. Doesn't matter if you have your own server on your own internet connection. Now they can take our domains.

What we've experienced here has flown mostly under the radar but should absolutely fucking scare everybody with half a brain. This was a test run to see if they could censor the internet. And they did it, and nobody cared. Up until now, censorship on the internet has mostly been a game of whack-a-mole where every attempt of censorship spurs a dozen alternatives. Gab, trp.red, the chans, all of these sites represent alternatives to mainstream services that are clearly guilty of censorship. Many who are too stupid to grasp what happened here will say, "who cares if they censor white supremacists?" Obviously if you're on TRP, then you know the real answer to this. The question isn't about whether hate speech is okay or not. The question is: what will the witch hunters label hate speech today? [ … ] " – a post by redpillschool, "Censorship on the Internet"

5 The Internet Corporation for Assigned Names and Numbers.

In this same post, redpillschool also noted that "There's no mistaking it. All unpopular ideas are already under fire as hate speech as evidenced by the fact that the media and mainstream have been hellbent on labeling everybody alt-right: gamergaters, libertarians, republicans, Trump supporters, red-pillers, as well as actual racists and Nazis," thus pushing back against the labeling of the Manosphere as being a part of the alt-right.
The fear, then, was not necessarily about being banned from a platform like reddit but about being cut off from the Internet entirely through the loss of a .com domain, a fear that was seemingly confirmed in 2018 when Gab was kicked off of its domain registrar (which Red Pillers discussed in the forum as proof that their earlier concerns had been warranted). On this post, as well as many others over the years, commenters recommended "moving to the dark web"; however, moderators and other dissenters noted that the community would then be impossible to find for others looking to join it. Moderators solicited not just help with programming but ideas in general on how to protect the community, with redpillschool stating in the "Censorship on the Internet" post: "If you have some ideas, I'd love to hear them here. We are no longer just a bunch of guys sharing tips on getting laid. We accidentally stumbled upon what happens when the mainstream culture deems legal speech as hate speech, and if we don't act now, we'll be next to see our voices muffled."

The excerpt from redpillschool's post about starting off as a community of "guys sharing tips on getting laid" and evolving into something much more identifiable as a social movement highlights the evolution of the forum, not just in terms of its content but in terms of its framing and organization. Although this evolution took place over a number of years, evolving the practices of maintaining and sustaining the forum was a necessary move to ensure its survival. The key practices pushed by the moderators to ensure this sustainability – not talking about the community, raising alarm bells about bans and acts of censorship happening on other platforms and to other people and communities, and staying aware of the social and political climate – were all tactics that the community used to fortify itself against the possibility of failure. But shortly after this post, r/Incels was banned from reddit, making the fear of being banned salient again.

"Incels Gets Banned: What Does That Mean for TRP?"

The post about "not talking about Fight Club" was made in December of 2017; one month prior to this plea to fellow community members not to talk about r/TheRedPill, the r/Incels community had been completely banned from reddit, on November 8, 2017 (Hauser, 2018). Although the Incels community was regularly mocked on the subreddit, and users and moderators actively distanced themselves from the Incels, forum members noted how Incels had served as a "decoy" for their community because of their more extremist views – they were an easier target, and now that they were gone, the shutdown and censorship attempts would turn on r/TheRedPill. Although r/TheRedPill members and moderators knew that communities all over reddit often talked about how awful they were and about the need for their ban, the banning of Incels seemed to bring to fruition the fear of these censorship attempts being directed toward their own community (see figure 22).

Figure 22. Incels got banned. What does it mean for TRP?

The case of the Incels ban will be explored in Case Study 2, but it is an important one for all of the case studies in this dissertation because of what the ban represented: the inevitable censorship of what reddit administrators and other platform administrators deemed unacceptable content and discourse within these digital worlds.
The post also highlights that Incels, TheRedPill, and Men's Rights in general were all seen as "the same" in the eyes of "the BP'ed6 manipulated man and the average reddit feminist". Some of the concerns expressed in response to Incels being banned were that members of the Incels subreddit would migrate to r/TheRedPill and other Men's Rights communities across reddit that had not been banned, and that these new members would post poor-quality content or hate speech that would serve as evidence of why those communities needed to be shut down. In figure 22, the poster also points to communities that aimed to see the subreddit shut down, saying that there are "more than a few redditors around that would be more than pleased to wipe us out."

6 "Blue Pilled," the opposite of "Red Pilled."

That members of the banned Incels forum had started migrating to r/TheRedPill can be inferred from the posts made addressing Incels who seemed to have begun posting in r/TheRedPill. Such posts started appearing in particular in January of 2018, addressing Incels who may have migrated to the forum following the shutdown of the r/Incels subreddit (see figure 23).

Figure 23. "For the Incels" from January 2018.

Rather than focusing on the potential impact of Incels-type posts on public perception of r/TheRedPill, these posts earnestly tried to help the new members understand the kinds of acceptable content and the strategies they could use in their attempts to pursue romantic or sexual relationships. Eventually, due to the influx of posts that did not meet posting guidelines or were outrageous (e.g., a post about "sperm traveling to a woman's brain" was often pointed to by users who noted that the quality of the content had gone downhill), posting was shut down in early January by the moderators, and only approved submitters were able to make new posts (see figure 24). From what could be inferred from the posts, the moderators did this to clean up the content in the moderation queue; they also began reposting only older posts and sidebar content to encourage newcomers, and even longer-standing members, to re-familiarize themselves with r/TheRedPill thought and the mission of the community. This moderation practice was not only meant to "clean up" the subreddit but also to re-establish the discursive norms of the community in an attempt to weed out those who refused to comply or did not belong. As seen in figure 24, the moderators were strictly enforcing not only the types of language that were acceptable within the community but also the kinds of discourse and behavior that were allowed at all.

Figure 24. Posting is locked, comments are open.

Posting was unlocked a few days after these announcements and the lockdown meant to clean up the content of the forum. In early 2018, the forum seemed almost to return to business as usual, though some discussion continued about the need for a different platform to house the community, now without the panic that had accompanied posts made in the wake of news of censorship and bans. The discussions moved toward the need for a new platform because of the perceived threat to "free speech" posed by platforms like Twitter and Facebook banning communities and alt-right/Manosphere celebrities like Yiannopoulos, and by the removal of alt-right subreddits from the reddit platform.
The discussion turned from panic ("what are we going to do") to one in which the establishment of off-site forums and the response to censorship were framed as fighting for free speech on the Internet for all, not just for r/TheRedPill. The discussion of preserving the community evolved from being about the community regrouping in the face of a ban (an infrastructural failure) to one in which the moderators and forum members saw it as their mission to begin building a platform protected from censorship in the name of free speech. Reddit administrators started banning more communities on the platform, not just the ones associated with hateful ideologies, and were making these moves based on violations of reddit policy: the alt-right subreddits were not officially banned for their rhetoric, but rather because they were doxing and engaging in harassment, thus violating reddit's rules. Posters noted that in 2018 "SJW thought censoring is at its height" (see figure 25), and another poster commented that redpillschool had been ahead of the curve in preparing for an imminent shutdown.

Figure 25. Ahead of the curve

In the post called "On Freedom of Speech," redpillschool noted that reddit administrators were no longer just censoring communities perceived to be violating policies or engaging in hate speech, but were "blindly" banning communities without justification. Further, redpillschool claims that moderators from other subreddits approach the r/TheRedPill moderators to ask how they can protect themselves, alluding to r/TheRedPill's skill in navigating the censorship and bans occurring on these platforms. Freedom of speech was seen as a mission not only for the well-being of the community but for the good of free speech itself and its longevity. In this post (see figure 26), redpillschool also solicits information from people who may have ideas about how best to preserve "free speech," not just the r/TheRedPill community.

Figure 26. Free speech

As one can see from figure 26, by shifting from protecting the subreddit toward the cause of free speech and decentralized platforms, r/TheRedPill positions itself not just as a leader in this area but almost as a martyr-like figure because of the constant threats of censorship the forum has faced. Combating censorship, then, becomes a mission for the good of all humanity (in a way) rather than just for the good of the community. Further, redpillschool notes that freedom of speech is necessary for the community to "build their own power" rather than relying on platforms they do not control. These preparations not only went hand in hand with the prior establishment of off-reddit websites, forums, and archives but also included soliciting advice from others on how best to fight the battle against platform censorship. At this point in time, platforms like Facebook and Twitter had started cracking down more severely on hate speech and hateful ideologies, including their figureheads, and reddit was following suit.

Figure 27. "Curing" the issues surrounding free speech

In figure 27, redpillschool also reiterates the problem with moving to "the dark web," which is primarily one of accessibility. Moving to the dark web would not only limit the possibility of new members finding and joining the community but was also perceived as a kind of surrender and defeat, a "going underground."
The theory that reddit was banning these communities because of pressure from advertisers and other stakeholders is also reiterated here, a common sentiment expressed in multiple posts about other communities being banned, the possible ban that would befall r/TheRedPill, and so on. Bans on other platforms, in particular, were also seen as a sign of changing attitudes toward what constituted hate speech and what counted as a hate group, and when a prominent r/TheRedPill member (IllimitableMan) was banned from Twitter (figure 28), it was not seen as a defeat but rather strengthened the community's belief that their views were the truth, a truth that "SJWs" and "feminists" did not want to hear and thus needed to control. Further, these acts of censorship were used to argue that the platforms themselves were the bigots for trying to weed out bigotry, and that ultimately these acts of censorship would be the platforms' downfall.

Figure 28. Twitter is a Liberal Democrat Social Justice Swamp platform

In being banned from Twitter, IllimitableMan was not seen as silenced; rather, commenters noted that the ban was retaliation for views that angered the status quo. The moves by major social media platforms like Twitter to weed out hate speech were interpreted by the community as confirmation that they were the ones speaking the uncomfortable truth, and that the SJWs who "controlled" discourse were afraid of it. However, in 2017 and 2018 in particular, posts reminded the members of the forum that they were not a movement – and that being a movement would lump them in with the dreaded SJWs who were a common enemy. They saw themselves as engaged in a "war," but one waged against each other as well as against the rest of the world, since r/TheRedPill teaches that men are on their own and should not rely on a collective for strength and power:

"Some of you are under the delusion that TRP is a movement, an army, a fraternity, or some kind of support group. We don't have conferences, fundraisers, bro walks, or well funded RP organizations, and we never will. The Red Pill, simply put, is information. [ … ] I've seen multiple people calling TRP a movement and it makes me laugh. This is not a movement. We're not social justice warriors and I would be insulted if anyone ever called me one. I'm just a man who figured a few things out, determined a sexual strategy that works for the lifestyle I want to pursue, went after it, and decided to share some of my wisdom. You're all my competition. You are learning how to raise your SMV and internalize game which will actually help you take my plates away, and you think I'm going to cheer you on? This is a war, gentlemen, and it is every man for himself. The feminists won't stop until they have complete female supremacy, and they can subjugate men as nothing more than worker class slaves for their husband: The State. I don't wish any of you ill will, and I have made some friends because of this community, but that's where it ends. You are a man, so go build your empire, but realize at the end of the day that you are ultimately on your own." – post by Triadis3, a repost of a post by moderator SoftHarem, 2018
Those are both the opposite of red pill to me." However, other commenters noted that by creating a collective space for knowledge in the forum and by sharing information, they were in effect a movement; this idea has been contested in the forum on and off for years. Wanting to remain apolitical in discussion, the moderators of the forum moved political discussion to r/RedPillRight in 2017, and kept the main subreddit r/TheRedPill for the discussion of sexual strategy and male empowerment – the two stated goals of the community. Regardless of their political orientation or political mobilization, the community was still being described in mainstream media as affiliated with the alt-right, and was the target of other redditors aiming to shut down the forum as well as of other kinds of "brigading" attacks that put the members of the forum on high alert. In particular, the Brett Kavanaugh Senate hearings heightened concerns in the community that an attempt at censorship would occur again, with many members condemning the hearings as an example of "SJWs" taking over politics. In August and September of 2018, concerns about the possibility of r/TheRedPill being banned were made real: the subreddit had been listed on "TheBanOut2018", a list made by a group of other subreddit moderators of communities they believed should be removed from the reddit platform (see figure 29).

Figure 29. Efforts to ban TRP have resurfaced

In the post, the moderators again identify the SJW as the common enemy, reiterate the rule not to discuss r/TheRedPill outside of the forum – either online or offline – and note that changing laws in other countries threatened free speech. These fears of censorship, which had been growing for years, would be actualized soon after this warning post, when the subreddit was placed under a quarantine by the reddit administrators.

"A Civil War is Coming."

As mentioned in the beginning of this chapter, in September of 2018, reddit administrators acted on the long-anticipated fears of the community: r/TheRedPill along with some other communities were "quarantined". Although not an outright ban, the quarantine feature had been used fairly sparingly by the reddit administration, and was redeployed for r/TheRedPill, r/cringeanarchy (a subreddit dedicated to mocking those on the left-leaning end of the political spectrum), and r/WatchPeopleDie (Asarch, 2018). In this quarantine, 20 subreddits including those mentioned above were effectively put on probationary status by the administrators, who claimed the function of a "quarantine," as opposed to a ban, was to " … prevent its content from being accidentally viewed by those who do not knowingly wish to do so, or viewed without appropriate context," (Reddit Announcements, 2018). A quarantine, unlike a ban, has an appeals process for the moderators of the community to go through to remove the "opt-in" system that requires users to choose to view the content. The announcement goes on to state that this is a method to "incentivize positive behavior," (Reddit Announcements, 2018). Thus, the administrators presumably would be monitoring the quarantined communities to ensure that they were abiding by platform policies and standards. The most common sentiment expressed on the forum in response to the quarantine was anger. In figure 30, the post "There's a Civil War Coming" identifies the reason for the quarantine as related to the Kavanaugh Senate hearings:

Figure 30. There's a Civil War coming.
Positioning the community as the "rational" discourse present on reddit, the poster acknowledges that political discussion is often not allowed on the subreddit but argues that the Kavanaugh hearings and the quarantine signal a need for a kind of paradigm shift within the community – to the point where they view these actions as a kind of information warfare, a mode of censorship akin to a dictatorship, and where "powerful white men" with opposing views are seen as such a threat that they are silenced. These infrastructural failures in the form of bans and quarantines, as well as the broader amplification of messages advocating for empowering women and believing survivors, were all seen as signs of a culture that was attempting to silence and demean men and men's rights. Thus, in a way these attempts to discourage growth only strengthened the group and their discourse, and their mission not necessarily to spread it but to preserve it. The calls for banning r/TheRedPill along with other communities may have originated from a subreddit formed by a group of redditors (r/TheBanOut2018) who identified toxic communities and advocated to reddit administrators for their bans. They also seemingly engaged in brigading, where members of other communities start posting in the target community or "downvoting" all of its posts to manipulate the reddit algorithm. Since reddit works on an upvote/downvote system where "upvoted" content has more points and is thus more visible, this is a common tactic used to sow discord on subreddits. In response to r/TheRedPill being added to the list for The Ban Out, redpillschool created a subreddit called "Ban Ban Outs", which was seemingly meant to mock the Ban Out subreddit but also to draw attention to some kind of "double standard" that he felt was present. Responses to the possibility of an impending ban were mixed. Some felt that it was an unnecessary panic, whereas others again voiced the need to move off of reddit before the community was banned. Nonetheless, administrators took action and on September 28, the community was quarantined. For a period of time, posting was locked on the forum and members would be met with the loading screen shown at the beginning of this chapter, where they would be told that the subreddit was dedicated to offensive or shocking content. In response, redpillschool started a petition at change.org and called for members of the subreddit to boycott reddit for one hour on November 1st at 4:00 p.m. PDT to send a message to reddit administrators, demanding the quarantine be reversed (see figure 31).

Figure 31. Stop Reddit Censorship

When posting was reopened on the subreddit, content responding to the quarantine ranged from members thanking the community to senior contributors and moderators posting content that reiterated what many long-term users of the forum had anticipated for years. The theories that abounded within the community were succinctly written in a post by GayLubeOil, a former moderator and one of the most frequent posters in the community, wherein three possible reasons for the censorship were given:

"If in 2012 I told you of a Radical Feminist conspiracy to push strong men out of positions of power and to pull young men down into depression and despair, you would tell me I'm crazy. That was the chorus for a long time. For a long time if you discussed institutional discrimination endured by men, you would be shrugged off as a loon. However as the years rolled forward the truth became more and more unavoidable.
In 2016 the American people had a referendum on that truth. In 2017 the British also had a referendum. They voted No! They voted against GloboHomoism, Liberalism, demographic replacement and all sorts of things outside the immediate purview of sexual strategy. That's when all hell broke loose and institutional actors began their efforts to change where the culture is moving. […] There are three theories on why Reddit pulled the trigger on the quarantine today. The first is the technical theory. Reddit fiddled with their algorithm about a month ago, which allowed the Red Pill to hit the front page and rapidly gain subscribers. Next is the MeeToo theory wherein, Reddit quarantined us a cultural response to the supreme Court nominee on controversy. Finally there is a Grand Cleanse theory wherein there is a multi platform conspiracy to deplatform non liberals and control the narrative. Here's what you need to know. Register an account on TRP.red so you can continue to be a part of this community. Whatever is happening will continue to happen and will in fact speed up. There is no ignoring this at some point this will affect your personal sexual life. Stock up on controversial books like Ride The Tiger Revolt Against The Modern World. An Amazon book ban isn't far behind and you'll be glad you bought it when you could." – post by GayLubeOil from 2018 on the quarantine

GayLubeOil's post seemed to predict the impending bans by observing the types of censorship that were occurring on and off the reddit platform: indeed, Amazon would ban the sale of books by notable Manosphere celebrity Roosh V in early September, just a few weeks before the subreddit was quarantined (McKay, 2018). Commenters on this post noted that it was not a conspiracy and that the moderators of the subreddit needed to allow political discussion in the community if they truly cared about preserving it – but also that preserving the subreddit would be in vain because reddit would ultimately try to hide the community away or ban it altogether. Particularly during the quarantine, redpillschool commented on nearly every post reminding users to migrate to TRP.red and forums.red, and to sign up for the Puerarchy mailing list to be notified when a shutdown would occur. Appeals were made to the reddit administration by the moderators of the forum, and updates would continually be given to the community while the subreddit was in the process of trying to have the quarantine removed. Although the initial mood in the forum was distraught, the quarantine ultimately started being mocked by users as a demonstration of the reddit administrators' lack of strength when it came to outside pressures to censor "free speech." Others began noting that other communities similar to r/TheRedPill, like r/MensRights and r/MGTOW, were not quarantined, and a comment posted by user itswr1tten contended that "The real purpose of this softban is to ASSOCIATE TRP with the shock/gore, incels, the white rights, and the "jews did everything" subs. Guilty by association is the modus operandi of the current political climate, and TRP is an order of magnitude or more bigger than everything else softbanned (besides a Faces of Death remake and a catchall sub). That's the real reason." Others again reiterated the belief that a ban, or a "soft ban" like a quarantine, was a knee-jerk reaction to the "truth" the community was espousing.
It was also during the quarantine that redpillschool and other moderators started asking for donations to a Patreon account as well as providing a bitcoin address to help fund the building of a new platform that would house the community without fear of quarantine, ban, or other modes of censorship, and we see this strategy again in later cases. Other commenters started comparing the censorship to George Orwell's novel 1984, equating the current political and social climate to the dystopian one described in the novel, with a comment by user Imperator_Red saying "Also, fuck this censorship. We've truly never seen anything like this in the west before. I will never forgive the leftists for this. They aren't just people with a difference of opinion. They are tyrants, and I consider them to be my personal enemies from now until the day I die." The comment was in response to a post by redpillschool claiming that the reddit administration quarantined the subreddit because the community had reached 300,000 subscribers, and that the censorship was purely ideological. In this post, redpillschool also announced the soft launch of forums.red, which had been established years prior but had not been opened to the subreddit's members until this point.

The responses and actions taken during the quarantine, as well as before it, point to the moderators in particular as having a sophisticated level of knowledge of not only how to manage an online community, but also how to use different digital platforms to accomplish their goals. Whether it was a backup website, an archive, a handbook, a Patreon page, or a bitcoin address, the moderators anticipated this infrastructural failure and had built in many precautions. Although the boycott was unsuccessful (as figure 31 demonstrates, only around 500 people signed the petition), redpillschool, as the de facto leader of the community, took action where he felt it was necessary in order to preserve the group and to take a stance on how they felt about the quarantine – and asked others to join them (figure 32).

Figure 32. Our Patreon

The boycott, the discussion of outside political and social events, and the other actions taken by the moderators seemed to go against the subreddit's policy of remaining apolitical in their discussions and actions. Perhaps after the quarantine, the desire to remain apolitical disappeared because the threat was now on their doorstep and affecting the community. Despite previous claims that the subreddit was not a movement and that r/TheRedPill existed only to provide information and a community to discuss sexual strategy and masculinity, the impending threats to r/TheRedPill in the years following the 2015 ban wave seem to negate this "apolitical" claim and stance. Viewing themselves as guerilla fighters in a "culture war", the community then seemed to band together more in their practices to maintain the subreddit (on or off the reddit platform). However, one thing of note is that r/TheRedPill is not its own ideology despite being its own digital community and social world: the ideology that fuels them is based on the men's rights movement, misogyny, and hegemonic masculinity and patriarchy. But their attempts to preserve these discursive forms – by strategically building the world out from their main subreddit to encompass a wide swath of digital space in the form of other affiliated subreddits, blogs, websites, and back-up websites and archives – are practices meant to ensure that these forms endure.
"We Are Not Organized and That is Our Strength."

A post on r/TheRedPill titled "We are not organised [sic] and that is our strength" reiterated that the subreddit was not a movement, and that what the members and moderators were engaged in was a kind of guerilla warfare against a shifting culture and society that they viewed as detrimental to men and masculinity. However, the practices of the moderators and the buy-in from the community members seem to indicate the opposite: they are extremely organized, and exhibit all of the qualities of an organized political and social movement, yet vehemently try to distance themselves from this label as a way of distinguishing themselves from their dreaded enemies, the "SJWs". Their organization in establishing off-reddit forums, archives, and back-ups demonstrates that they were well aware of the changing policies, realized that the end would eventually come, and took measures to ensure the sustainability of the community. They may not see themselves as a social movement, per se, but it cannot be denied that they have a high level of organization and have been prepared for this for a long time. These moderators, and the forum members who assisted in building this world outside of reddit, are not using these digital platforms in fundamentally different ways, but are using them exactly as they were intended and designed to be used – the infrastructural nature of the Internet allowed for the creation of archives, forums, and back-up websites that would preserve the ideas of the group, if not the community. But what sets r/TheRedPill and other extremist groups apart is how they innovatively harness and exploit this power despite its constraints. In the post "we are not organised [sic] and that is our strength" (figure 33), the poster not only reiterates that r/TheRedPill and MGTOW are not formal "movements" but also says something at the end of the post that highlights the ways that platforms simultaneously facilitate and constrain the movement of these groups, pointing to a level of self and group awareness. The poster notes that if r/TheRedPill were to be shut down, the members would just move to other forums. Even if they are not "r/TheRedPill," they would still be able to find spaces online where they can discuss similar ideas and exchange strategies for the things r/TheRedPill advocates. The capability of the Internet to allow for this kind of movement across space despite attempts at bans and censorship is a fundamental part of how the moderators and community members built their world, and also of what they did to preserve it.

Figure 33. Our strength

These practices to preserve and sustain the community in the event of an infrastructural failure can be interpreted as a mode of connective action (Bennett & Segerberg, 2012), and are practices that preserve discourse so that it can be reproduced. The practices for preserving discourse on the Internet are just as important to examine as the practices that create it, and continued claims to truth and reality are premised upon these preservation tactics. The moderators of the forum continually noted that communication was a form of power, and that building platforms and communities that weren't controlled by reddit administrators was a fundamental part of maintaining that power.
The governance of these platforms by the administrators did not extend beyond the boundaries of reddit; however, a seismic shift in governance occurred when domain registrars and website hosts started banning websites like The Daily Stormer and Gab, which was also an alarm bell in the forum about the changing nature of the Internet itself. Thus, the forum not only built off-reddit websites and forums, but is also in the process of building an app for the community to allow for secure communication and the free exchange of r/TheRedPill ideas and discussion. The structure of the Internet and the necessary skills for building these kinds of platforms all facilitated this shift in mission, and although censorship was a concern on reddit and other similar platforms like Twitter, the moderators had the knowledge, skills, and financial means to navigate it. Like other modes of discourse and thought, the discussions that happen in online spaces and the communities that are built there are premised on the idea of discourse as power – communication as power – and the ability to produce and exchange discursive forms are ways to access this power. Earlier in this chapter, the point was made that "ideas can't be killed," and there is a historical truth to this claim. What digital platforms do differently from the archives and public spheres of the past, however, is provide infrastructure accessible to laypersons who want to lay claim to this truth, to this power, and to alter the cultural realities they find themselves in. Despite attempts to quarantine, ban, and censor from a multitude of different levels of platforms and infrastructure, there is always another loophole that the members of the forum can go through to maintain the community. By responding as a community to the bans in a somewhat conspiratorial fashion (i.e., the alt right was planted and created by the left to censor free speech), they push back against mainstream culture and view culture as an experimental practice, one where power ebbs and flows based on the exchange of ideas and the creation of online communities.

Arming the Defenses

Much like attempts to contain a superbug, the attempts to limit the influence of the community on the reddit platform through quarantine only served to fortify the subreddit against other potential attempts to lessen its spread. Responding to events occurring outside of their community, the subreddit took a number of preventive measures to fortify itself against a possible ban and actively built up its defenses like an army preparing to be attacked (see Appendix C for visual representations of the official positions of the forum). This position of defense, rather than offense, ramped up the sentiment in the community that they were being repressed, and that the resources they were sharing were of such threat that the only way to silence them was through a ban (or in this case, a quarantine). The moderators of the forum and its power users aimed to control the narrative of their community not only externally but also internally, and thus took a number of official stances, not just on their strategy to combat censorship but also on where the community would move in the event of infrastructural breakdown. The moderators' practices of strengthening the group identity and controlling the content of the subreddit served two purposes: (1) a strong sense of group belonging would ensure the futurity of the group, and (2)
controlling the content meant that the archives would reflect what the moderators wanted to be representative of the group for past, present, and future members. The community was built with a single vision and managed to grow in ways that the initial founder perhaps did not anticipate. The media reports exposing the forum, the constant attempts at bans from reddit and outside of it, and the subreddit's reputation were all seen as positive things, in part because all of these events brought attention to the community and recruited new members. The fear of being banned, although present, was assuaged by the moderators through constant reminders that there was a backup, that members should reserve their names on the new forum, and that they would be safe on an off-reddit platform. At the time of writing, the subreddit is still quarantined with no signs of the quarantine being lifted, but it has maintained consistently high levels of activity despite the initial panic. In a way, reddit's attempt to quarantine the community and to keep the rest of reddit "safe" from its infection may have only strengthened the virus it was attempting to contain. In some way, the persistent threat of failure was what made the community act in the first place and start putting in place official practices meant to preserve the group – and so when the threat became imminent and was realized, the community was able to absorb the shock.

CHAPTER 4: Case Study 2
Lost Civilizations: r/Incels and Incels.me

The second case of this dissertation project focuses on a community known as the Incels, specifically the r/Incels community on reddit that was ultimately banned for its violent content. Among the public, Incels are perhaps the best-known community studied in this dissertation, particularly because of their association with violent attacks in recent years that have resulted in a number of deaths. What this community demonstrates is different from r/TheRedPill and the third case study, r/AznIdentity: they were not prepared or organized for the event of a community shutdown (an infrastructural failure) and had not taken the necessary precautions to defend their community. Because of this, when the subreddit was shut down, all of its content was lost unless it had been saved by individual users. Contributing to this lack of preparation is that the world they inhabited was a fractured, diffuse one that did not have a strong sense of cohesion across its multiple factions, and even the subcommunities themselves did not have a unified vision of how the community would be organized and what its practices were. But this would all change following a number of failures: in addition to the ban from reddit, the new off-reddit forum would also face a shutdown by its domain registrar. Seemingly having learned their lesson, the forum reappeared immediately with all of its content having been backed up and archived, but this did not save all of the individual users. r/Incels is a notable case study because of how much they evolved and shifted their practices, not just in the ways that they produced knowledge, but in how they started to preserve it by harnessing the affordances of digital infrastructure. Their attempts to fortify and defend themselves against the shocks that they would experience would change with each subsequent threat, and a part of this defense strategy consisted of a professionalization of the group.
Their ban, and the subsequent shutdown of their off-site forum, serve as fodder for other groups in presenting examples of censorship and oppression of non-mainstream ideas; but their status as a lost civilization means that much of their historical past lives on only in the cultural memory of users of those past worlds.

Banned: November 2017

On November 7, 2017, when users attempted to go to the r/Incels subreddit, they were greeted with an image notifying them that the community had been banned for violating reddit policies regarding violent content (see figure 34). Established in 2013, the community was part of a larger online subculture of "Involuntary Celibates" (from which "Incels" is derived), and marketed itself as a support group for those who were unable to successfully engage in romantic or sexual relationships. Other communities like it, such as r/ForeverAlone and other Incel subreddits, exist; however, r/Incels was the only one to be banned because of its particularly violent content regarding girls and women. The subculture had a high level of infamy on reddit and on the Internet at large because of the rhetoric espoused in it – ranging from violent threats against women, to talk of suicide due to loneliness, to ideological stances marketed as "the Black Pill." By the time the forum was banned from reddit, it had amassed around 40,000 subscribers and was often the target of mockery and brigading attempts.

Figure 34. Banned for violent content

Where did the Incels come from? What does this subculture purport, and what are its goals? (A relational map of Incels can be found in Appendix B.) What can be confirmed is that this was not the only "Incels community" that existed on the Internet, with multiple subreddits and other web forums dedicated to discussion about "inceldom". For the most part, Incel is more a self-appointed identity marker than a political ideology. The loosely connected community is made up of men – and women – who have not had success in attaining sexual or romantic relationships despite desiring them. The origin of the term is widely contested in the Incel community, but what popular media have reported in their investigations is that a woman named Alana (who withheld her last name) coined the term nearly 20 years ago to describe the loneliness she felt as a result of her failure to attain a romantic or sexual relationship (Kassam, 2018). The website started as a simple text forum called Alana's Involuntary Celibacy Project, where Alana would post links, theories, and other kinds of information as well as maintain a mailing list. It was meant to serve as a support community, and was by no means a "movement." However, Alana would ultimately sell the website, and for years the term "Incel" remained largely unknown (Kassam, 2018; Tolentino, 2018).

The formerly friendly and supportive community of lonely people then morphed into a violently misogynistic online subculture, and a horrific event injected the term into the mainstream. In 2014, Elliot Rodger killed six people and wounded 14 others in Isla Vista, California, and after investigation into his YouTube channel and forum use, it was discovered that he identified himself as an Incel. His attack was motivated by rage toward women who rejected him, and the subculture upholds him as a "saint" who represents their community (Dewey, 2014; Kassam, 2018; McGuire, 2014; Tolentino, 2018).
The Incels "community" is spread across a number of different online platforms and forums, and is also constantly at war with itself – some groups allow women to join, whereas others vehemently reject the inclusion of women. The focus of this case study, however, will be on the r/Incels subreddit that was shut down in 2017. Although there were other Incels and Incel communities spread across the digital sphere, the subreddit was perhaps the most visible and most popular online community dedicated to "inceldom," and made attempts to distinguish itself from other Incel communities. The Incels subreddit would regroup and re-establish its community in an off-reddit web forum, incels.me. But this move was not necessarily coordinated or planned before the subreddit was shut down. In effect, because the community was so fragmented and vast, there were no efforts to organize its materials or establish backup plans, and when the subreddit was shut down, it seemed to be the end of the reddit community. Although they did reappear on incels.me, the subreddit is a lost civilization, and the posts and artifacts that remain are still present because of the nature of the Internet and are continually reproduced through the memory of its members. Unlike r/TheRedPill, r/Incels knew that their end was imminent but didn't have a community-wide backup plan or backup forum established before the ultimate banning of their subreddit. Despite this, their responses to censorship and bans were similar – panic, confusion, and lamenting the loss of their community.

The "Incelosphere"

Incels, like r/TheRedPill, had been a community targeted by other redditors for as long as the subreddit had been around. However, what is notable about Incels is that it was not the only subreddit where members congregated: on reddit alone, there were a number of Incel communities like r/Braincels, r/Trucels, r/IncelsWithoutHate, r/ForeverAlone, r/MaleForeverAlone, r/supportcels, r/IncelSelfies, r/AskAnIncel, and others. Off reddit, the most visible Incel communities were on 4chan, and on 4chan's /r9k/ board specifically (see figures 35 and 36 for a visual representation of all these communities). All of them, however, existed on a spectrum in terms of their misogynistic content as well as whether or not women were allowed into the forum, and they saw each other as competition and enemies rather than being affiliated with one another in a larger Incel "network." So unlike r/TheRedPill, which created affiliate subreddits with similar moderators, goals, and subreddit culture, the Incels community was extremely fragmented not only as a community, but in its ideology and discourse as well. Separating themselves from the Manosphere, Incel communities often refer to this conglomerate of Incel communities as the "Incelosphere".

Figure 35. Incels communities on reddit. A strikethrough indicates the subreddit has been banned at time of writing.

Figure 36. Incels communities off-reddit (own communities, not subcommunities like 4chan's)

Regardless of the specific community any individual Incel belongs to, the term Incel is used as an insult on reddit and other web forums whenever someone posts something deeply misogynistic or self-defeating in regard to a lack of success in pursuing romantic and sexual relationships (i.e., "don't be such an Incel" or "go back to r/Incels").
On the more extreme side of the spectrum, r/Incels and those who adhered to its ideology believed that the world only favored women, "Alpha Males," and "Chads" (their term for an attractive man) and left out "low status males". Identifying as "Beta Males", Incels harbor a deep hatred of women as well as extreme envy toward other men who are sexually and/or romantically successful. Incel communities, regardless of their level of misogynistic rhetoric, also all contained posts from members about their struggles with depression and other mental health issues due to loneliness, and posts that expressed a desire to end their lives. The posts on the forum also included the typical rapport between forum members – discussions about films, music, video games, and so on were frequently present. However, the main mission of the r/Incels community was to spread its own version of r/TheRedPill's "red pill" and to take it a step further, calling it the "Black Pill": Red Pill truths with an even darker, more nihilistic, and more extreme interpretation of the world. Due to this more extreme view of the world and culture at large, Incels, and r/Incels in particular, have always been under a certain level of threat due to outside perceptions of them. Although the term "Incel" would not necessarily enter the public lexicon until Rodger's murderous rampage, the community itself had been brewing for quite some time under labels like TFLers (True Forced Loneliness) and the anti-pickup artist movement (McGuire, 2014). The r/Incels community, however, came under threat after the changes to reddit's policies regarding harassment and violent content. The community was well known for its extremely misogynistic posts that purported that men were owed sex by women and that women were "depriving" them of this right, posts that fantasized about harming or killing women, posts justifying rape, and even posts that had videos, gifs, and photographs of women being beaten or even killed. The community was also rife with derogatory slurs for women, racial minorities, and LGBTQ+ persons. Another subreddit called r/IncelTears functioned as a watchdog community of r/Incels and was dedicated to finding and uploading screenshots of the forum's most outrageous, violent, or hateful posts. Although the community had frequently been the target of brigading, trolling, and other modes of surveillance from other subreddits like r/IncelTears, the forum was active from 2013 to its closure in 2017.

The 2017 Ban Wave

r/Incels, despite having existed for years, was relatively safe from the ban waves that had threatened communities like r/TheRedPill, perhaps because the community was small and not very visible. They hit 1,000 subscribers in August 2016 (RedditMetrics, n.d.), meaning that although the content was always present, they had not been under the scrutiny of other redditors or the administrators, perhaps due to their small community size. After the 2016 election and the events of the Unite the Right rally in Charlottesville in August 2017, the community grew significantly and became more and more well-known across reddit – one reason for this may be the establishment of r/IncelTears, the previously mentioned "watchdog" community, which was started in May of 2017 (RedditMetrics, n.d.).
Although Incels perhaps did not grow as a direct result of Donald Trump becoming president and the rise of the alt-right, the community did grow during this timeframe and political/social moment. More likely, what grew the community is that it started being linked to or mentioned in other reddit communities, in a mocking or degrading way. In 2017, the same policy change that threatened r/TheRedPill, and the panic in that forum following popular media pieces on toxic reddit communities, affected r/Incels as well. In response, the r/Incels community took measures to protect the subreddit by going into "private" mode after the community started being infiltrated by users of r/IncelTears, and after it was frequently mentioned as one of the "worst subreddits" in AskReddit threads like the one seen in figure 37, which was posted in May of 2017. AskReddit is a community where users pose questions to other redditors about specific experiences, their opinions, and so on, to garner responses. "What is the worst subreddit" or "what is the most toxic subreddit" is a frequently posted question, and r/Incels was frequently mentioned (see figure 37).

Figure 37. What are some of the most toxic subreddits?

As indicated in the figure above, where the user notes that they were banned for posting a question, r/Incels moderators had a policy of banning users who entered their forum to ask questions of the community. In particular, after r/IncelTears was started, the community went from having around 10,000 subscribers in February 2017 to having 30,000 subscribers by August. This large jump was not met with enthusiasm by all community members: many r/Incels members believed that the high subscriber count came from users of IncelTears and other "lurkers" who wanted to mock the community and to police them. One user, on a post titled "We're being plagued by a different kind of normie7/female invader," from August 2017 asked the community:

"Has anyone else noticed how most posts and comments are being mass-downvoted? Usually, the normies who come here upvote everything like they're watching a circus show or something. We must've been linked on a sub/website filled with easily triggered, bluepilled brainlets." – post by user wearezero

Responses to this post speculated that it could have been a bot created just to target the r/Incels community. Whether it was a bot or groups of people downvoting all of the content on the community, the subreddit realized that it was being "infiltrated" and chose to make the subreddit private for one to two days in August 2017. These downvotes did not discourage the members in the way the attackers perhaps hoped they would – rather, the community members saw these attacks as proof that they were threatening the status quo, of them "winning the argument," so to speak, because the tactic the "invading normies" resorted to was downvoting everything rather than engaging with the community. After the subreddit went private, more information came from the community members in a post asking, "Does anybody know where we were linked?" with the poster noting that they had over 5,000 guests viewing the community within minutes. One possibility was an article published on August 18, 2017 on Quartz that identified r/Incels as an alt-right community (Squirrell, 2017).
7 "Normie" is a term used on web forums like 4chan, 8chan, and some reddit communities that means "normal person," and is meant to be derogatory (i.e., someone who ascribes to mainstream culture and beliefs).

Another user pointed out that the community had been linked not just by r/IncelTears, but also by r/justneckbeardthings and r/cringeanarchy (which, ironically, would be banned the next year).

"There was a post on justneckbeardthings, cringeanarchy, and of course inceltears. The neckbeard one was a teenager resisting the black pill and thanking the sub for it...? Cringeanarchy was the one about whether you would help a femoid. Inceltears is just... yeah." – comment by user Catzy94

Other responses to the "alt right" classification in the Quartz article mocked the categorization. Many of the commenters on the post, as well as in the Incel community in general, reject that they are a part of the alt-right and attest that a majority of the forum's members are non-white (see figure 38). Note, however, that the commenter who provided the link to the Quartz article referred to it as "liberal media." Although many of the forum members denied the alt-right affiliation, many did not deny that they disagreed with "liberal politics"; in particular, anti-feminism is a sentiment that is core to Incel ideology and discourse.

Figure 38. How is this sub even alt-right?

In response to the Quartz article, there were a number of posts that not only questioned why the subreddit had gone private, but also openly mocked the researcher behind the article, posting links to the researcher's Twitter page. Although most of the subreddit members celebrated the community going private, meaning that anybody who wanted to join needed to be approved by moderators or needed to already be subscribed to the subreddit, one commenter on the post "Apparently /r/Incels is now part of the alt-right, according to this researcher (read his article on qz.com)," argued that the subreddit going private would eventually spell the downfall of the community (see figure 39).

Figure 39. "Slowly fade into non-existance."

This presented a conundrum for the community: by going private, they were not as easily accessible to people who wanted to join the community (since they would need to be approved and undergo some kind of vetting process to gain access), but by remaining open they were vulnerable to being brigaded, surveilled, and mocked by redditors from other subreddits. Before the subreddit went private in August 2017 following the linking from other subreddits and the Quartz article, r/Incels users were asking the moderators to make the subreddit private, and were wary of posting during this wave of subreddit infiltration by outsiders. After the subreddit did go private, posts saying "We're private now you can post again" appeared, with many users celebrating the privacy and freedom to engage within the community again. However, some of the members were concerned that there were "impersonators" in the forum who were intentionally posting extremely violent content to troll the community and to increase the perception that the subreddit was a violent group (see figure 40) in order to get it banned.

Figure 40. Trying to get the place banned

Similar to r/TheRedPill and other communities that were being targeted by trolling, brigading, and other modes of "infiltration," as the r/Incels users called it, users of the subreddit started calling for the "normies" who had invaded their ranks to try and debate them.
In a post titled "Dear Normies: Post your argument in the comments, and I will subsequently destroy it," user TheIncelPill posted:

"r/Incels is not, in fact, a bitter hate group. It is in fact the ultimate reflection of reality. I believe there is no argument against the Black Pill that doesn't stem from emotionalism. If you're ugly and socially awkward, you can't get laid. End of discussion. The rest of our ideology is a mix of Red Pill, MGTOW, and LMS Theory that, while successful at offending women, has never been properly debunked. I am confident I can defend this ideology from anything." – post by user TheIncelPill

This statement from one of the top users of the forum, which not only outlined a base-level analysis of the Incel ideology but openly challenged non-members to debate them, demonstrates an interesting mode of engagement within the forum: they knew they were being watched, they knew that outsiders viewed their group negatively and as a hate group, and they believed so strongly that the group's discourse reflected reality that they were prepared to argue with anybody who saw otherwise. It is not only the community's confidence in its beliefs that is being defended; rather, this is a way of engaging in a form of epistemic production in which the truth of the Incels community is demonstrated as being closer to reality than the beliefs of outsiders. The posts addressing the visitors varied from low-level insults of the non-Incel members, to invitations to debate, to posts addressed to "Lurking Females" – these posts toward women were particularly misogynistic and engaged in sexual shaming, asking things like "Does it disturb your conscience to know that whomever your husband/bf is, you've already fucked multiple guys before him? Do you not feel any guilt?" (a post by user lthelthe). Regardless of the content, these posts demonstrated that the new subscribers as well as the "guests" on the forum (subreddits show how many people are currently viewing the content) were viewed by the existing Incels members as outsiders, warranting their identification and challenges to their worldviews. However, along with these posts that aimed to insult, alienate, and even drive out the outsiders, there were posts that defended the views of the forum, with some members even claiming that they may be misogynists online but that in "real life" they were respectful toward women:

"But thats because I am so angry at the world. Im sick of being treated differently simply because im subhuman. The funny thing is, is im one of the nicest guys out there. Ive always treated women with respect outside of the internet, but it just doesnt work if u are subhuman because they dont give a shit. Ive done plenty of nice favours for my friends and their gf's and nothing changes. You are still seen subhuman and ugly and a loser at the end of the day. Time to rot till i die." – post titled "I might be a complete misogynist online…" by user PM_ME_STRIPPERS

This image management on the part of this user, as well as of others on the forum stating that r/Incels was not a hate group, was perhaps an attempt by the forum members to lessen the attention the forum was receiving from outsiders who were attempting to shut the community down.
The events that took place in this forum leading to its ultimate ban all happened within a span of a few months in 2017, but what perhaps made the forum more vulnerable was the release of an article from VICE identifying the most toxic subreddit communities. Although the Quartz piece did receive considerable attention, it was written in a more academic style that may have been less accessible to the general public. The VICE article, on the other hand, was written for lay audiences and readership, and had considerable circulation across social media platforms, thus bringing the community into the public limelight.

September 2017

The 2016 presidential election as well as the Charlottesville Unite the Right rally in August 2017 signaled a considerable shift in the political and social climate, and it could not be denied that the rise of the alt-right and other extremist groups was partly due to the Internet and websites like reddit that allowed for the establishment and spread of extremist ideology (Daniels, 2018; Kelly, 2017; Neiwert, 2017). Particularly after the Unite the Right rally, where members of the alt-right, neo-Confederates, neo-Nazis, the Ku Klux Klan, and other groups all marched in Charlottesville, Virginia and brutally beat counter-protestors, even killing one counter-protestor who was hit by a car, media attention was amped up to identify the communities where these ideologies were being bred. The VICE article mentioned above drove traffic to the Incels community once again – and the number of subscribers skyrocketed as a result. Although GamerGate had brought attention to reddit before, the groups being identified by the VICE article and others like it were not necessarily a "part" of GamerGate when it occurred – but their tactics and discourse were similar. The VICE article, published on September 11, 2017, was posted in the r/Incels forum and viciously torn apart by the community members, who not only mocked the author's choice of subreddits but attacked the author himself. Other commenters noted that the examples used to highlight that r/Incels was a "toxic community" were obviously satire, highlighting the importance of understanding not just the language of the community but also its humor, which serves as a subversive and group identity-building tactic:

"LMAO he directly linked StAlia's infamous satirical post "This is why you can't get laid" and treated it as complete seriousness. Holy fuck, how are you like 29 with a college degree and can't even use basic reading comprehension?" – comment by user BF8211

Other commenters brought attention to using archive.fo to link to articles/websites in order to avoid site visits resulting in advertising revenue for the organizations that posted the attack pieces on the community (see figure 41). This is a common tactic used across web communities for two main purposes: (a) archiving the link and its content means that even if it is taken down by the original source, the archived copy remains, and (b) it avoids having the community's "clicks" generate revenue by way of the advertisements posted on the page. In many cases it is reason (a) that motivates archiving – and not necessarily just posts of the community, but also posts by someone they are attacking, so that there is a record of the target's post in case they try to delete it – but for r/Incels it was primarily to deny VICE the financial satisfaction (see figure 41). However, the VICE article, in its attempt to highlight the toxic communities that had flown under the radar (as opposed to those that were banned previously or the more explicitly "alt right" subreddits), may have also driven people to join or subscribe to these communities, since now they knew that they existed.
However, the VICE article, in its attempt to highlight the toxic communities that had flown under the radar (as opposed to those that were banned previously or the more explicit “alt right” subreddits), may have also driven people to join/subscribe to these communities as well since now they knew that they existed. 118 Figure 41. Stop giving them clicks The VICE article and the subsequent attention that it brought the community meant that the community locked itself down again – not only going private but not even allowing for posting of new posts or comments within the forum. This persisted for several days, and an old moderator was brought back to moderate the forum again in light of these “attacks” from non- members of the forum. The moderation of the forum became extremely strict in order to control the amount of traffic from non-members and guests that the r/Incel community viewed as harassers of their subreddit. The rules of the forum were already strict before these actions were taken to lock down the fort, so to say, but became even more stringent to not only quell the number of outsiders who were looking to cause chaos in the community but also to control the public perception of the community as well. The actions taken by the moderators included having a list of approved submitters, but this was an unpopular decision on part of the moderators with community members complaining that all of the moderators, except for azavii (the one who was brought back to moderate) were 119 “cucks”. One poster even noted that the subreddit was “garbage now” and wanted to make a new subreddit – particularly because one of the approved submitters was a woman, with one commenter (user BlackPillDealer) on the post saying, “Wait a minute, a roastie8 is an approved submitter, yet they won't approve me? Jesus. What's going on mods?” The move to make the subreddit private in order to protect itself was met with mixed results – some said that it needed to be public in order for other users to find the community, and that going private was a sort of admission of defeat by the community. The subreddit did go back and forth between being private and public, but some users wanted the subreddit to remain private because of the humiliation they felt by having their posts and comments appearing in popular media articles, on other subreddits, and other online spaces (see figure 42). Comments to this post asking why the subreddit didn’t just remain private listed a number of reasons that were common within the forum (1) it would be harder for other incels who were not a part of the community yet to find the subreddit and join them and (2) that staying public meant that they would be able to “black pill” normies and convert them to this ideology. Figure 42. I refuse to be humiliated again 8 A derogatory term for women in the Incel community, referring to the appearance of a vulva looking like “roast beef”, and specifically having this appearance due to a high number of sexual partners (this is obviously false, but is a false equivalency frequently supported and asserted within this community) 120 Debates around whether to remain private or not centered on staying public, on continuing the practice of going private and then public, and even opting to go private on weekends. The intense moderation by the moderators of the forum, and the return of the moderator azavii, quieted things down in the community for some time. 
The subreddit was reopened and posting was allowed once more, and the number of guests lurking on the forum seemed to go down. This act of making the subreddit private following increased attention from other subreddits as well as media attention happened in February, June, August, September, and October 2017. The increased attention to the forum in August, however, perhaps occurred not just because of the Quartz article but also because the subreddit made it to the "front page" of reddit in July 2017 – what makes it to the front page varies, but the algorithm on reddit chooses these posts based on popularity. The option to go private, as well as increased moderation, helped the subreddit "stay alive" for the time being, by diverting attention away from the community by making it inaccessible and by removing the outsiders. But in October 2017, this would all change.

October 2017

Following the subreddit's choice to go private in September and the subsequent decisions on how the community should move forward, posts on the subreddit in October of 2017 continually pointed out that the subreddit functioned as a kind of "zoo" for non-Incels members and that this level of harassment and discrimination was detrimental to the community. Although this was a common sentiment expressed in the forum, the establishment of subreddits like r/IncelWatch and r/IncelTears perhaps amplified the feeling in the community that mocking Incels should be "considered a hate crime" (see figure 43). On October 25, 2017, reddit administrators made an announcement that they had updated reddit's terms of service and policy regarding the posting and sharing of violent content on the platform.

Figure 43. Incels and Hate Crimes

The "oppression" that the community faced from outsiders and from communities like r/IncelTears would become a more serious concern after this change in reddit policy on what constituted violent content, described by reddit administrator landoflobsters:

"In particular, we found that the policy regarding "inciting" violence was too vague, and so we have made an effort to adjust it to be more clear and comprehensive. Going forward, we will take action against any content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or a group of people; likewise, we will also take action against content that glorifies or encourages the abuse of animals. This applies to ALL content on Reddit, including memes, CSS/community styling, flair, subreddit names, and usernames." - post by administrator landoflobsters

Thus, it would not only be the content itself that could be in violation of this updated policy but also flairs (little images or text attached to usernames), subreddit names themselves, usernames, and even the styling of the subreddit (since subreddits can have custom CSS/HTML styling applied to them). After this announcement was made, the communities r/NationalSocialism, r/Nazis, and r/Far_Right were all banned for being in violation of this policy. These updates to reddit policy were notable considering that in previous years, particularly when reddit came under fire for GamerGate in 2014, then-CEO Yishan Wong defended the platform's "hands-off" moderation approach and placed responsibility on the individual users.
This change in policy, as well as the ones that came before it, signaled a seismic shift in the ways that administrators viewed themselves as responsible for the content on the platform – a platform that had been a hotbed of extremism for some time. In the post announcing the policy change, users who commented came ready with examples from r/Incels as a community in violation of the new policy (figure 44). Although some users pointed out that the policy was still vague, a number of comments on the post identified subreddits like r/The_Donald, r/Incels, and others as being in violation yet still active.

Figure 44. Examples of violence from r/Incels

The frustration of the commenter in figure 44 demonstrates that this user, and entire communities, had been trying to get the administrators to act in response to the violence posted in r/Incels for quite some time – note that the commenter says "Because I'm starting to think reddit outright approves of this fucking sub." This comment, and many others, provided a number of links to examples of the community inciting violence, and expressed frustration that the reddit administrators had not banned the community. In response to the announcement, the Incels moderators immediately made a post to alert community members of this policy change at the reddit level and not just the subreddit level. Many of the comments from subreddit members pointed out that much of the "violent content" that people attributed to the r/Incels community was either misinterpreted (because it was a joke/satire) or had been posted by "normie infiltrators" to make the subreddit seem more violent than it was. Some of the comments on the announcement were not hopeful at all – many of them seemed to understand that this was the end, and that it would soon be over for the community (figures 45 and 46):

Figure 45. Comments responding to the post about the policy change pt. 1

Figure 46. Comments responding to the post about the policy change pt. 2

The comments shown in figures 45 and 46 are not necessarily in order, but all highlight the concerns of the community: where will they go? Where can they go? Are there any back-ups? At this point in time, there was no official "plan" on the part of the subreddit moderators – although a large community, the r/Incels subreddit, unlike r/TheRedPill, had no plans in the event of a forum shutdown. Users suggested "going back" to sluthate, an off-site forum that was a prominent Incels-like community before "Incel communities" started proliferating across the Internet, as well as moving to Voat or 8chan and having dedicated communities there. Regardless of the suggested platform, these responses point to the users looking to the moderators as the leaders of the community – the users asked the moderators where they would go, as opposed to providing solutions themselves. In figure 45, one user even says that they "hope for a mass exodus", demonstrating that the love affair with reddit as a place for "free speech" and the expression of unpopular thought and ideologies was waning and beginning to die.

Paradise Lost: Incels.me

In a slightly unusual turn, the subreddit had actually gone private on November 4, 2017, and thus was private when it was ultimately banned. r/Incels was banned on November 7, 2017, not even two weeks after the change to reddit community policies and standards.
On the same day as the ban, an off-site forum, incels.me, was launched. This suggests the forum existed before the ban, given the time and effort it takes to create the infrastructure for a web forum, pointing to some degree of preparation, although the subreddit never officially acknowledged that this backup community existed. The same moderators from r/Incels also joined the forum (figure 47) and were the first moderators of incels.me – but despite having this alternative community ready in the event of a ban, there was no backup archive of the content that had previously been on r/Incels. One possible reason the new forum was not announced on the subreddit is that the moderators knew the community was being heavily surveilled by non-members and wanted to lessen attention from outsiders. Thus, the lack of advertising of the new forum may have been intentional on the part of the moderators.
Figure 47. The mods are the same as in reddit
Despite the lack of an official announcement on the subreddit, some r/Incels members knew about the forum and migrated there immediately after the ban. The subreddit had a server on Discord, an instant messaging application and chat platform, which required an invitation to join but still had "outsiders" in it, and the link to the incels.me forum was dropped there. Interestingly, some of the first posts in the forum questioned the validity of the space and whether or not it was a "honeypot," a forum meant to entrap Incels, and fears of the forum being a space for surveillance by groups like IncelTears, the government, or others emerged in the very first days of the forum (see figure 47). Although there were some of these paranoid responses, the vast majority of responses from the new members of incels.me expressed relief at no longer being under reddit's rules and at the prospect that the community would be allowed to flourish without intervention. In figure 48, the user not only states that "we can finally be at peace," but that "idels [sic] are eternal. We are eternal. You will never eradicate us."
Figure 48. You'll never be able to gas us all out.
The post above demonstrates a sentiment that is often expressed following some kind of ban or censorship, i.e. an infrastructural failure: the event is used to validate and strengthen the community rather than being seen as something negative that harms it. The ban, despite having a detrimental effect, was responded to more positively than non-members of the forum might have anticipated. Many of the forum members, presumably all r/Incels members who had migrated to the new forum, saw the ban as a positive: the reddit administrators would no longer control the community, the community would (for the time being) be watched less by curious redditors, and the lack of official community policy coming down from a large platform like reddit meant that things that were previously off limits in the community would now be allowed. In figure 49, we see the user fukmylyf even explicitly stating that "getting banned was merely a new beginning."
Figure 49. Getting banned is a new beginning
This can be interpreted as the community viewing the ban not as a failure of the community itself, but rather as a failure on the part of the reddit administrators for pandering to the "SJWs" and being politically correct.
The ban, then, was a sort of phoenix-like death that allowed for the forum's rebirth, free of the constraints that had been imposed upon the community not just by reddit administrators but by other reddit communities like r/IncelTears. In particular, the ban was seen as a sort of blessing, and the sentiment that ideas cannot be killed and that the Incels would never "truly go away" despite the ban was echoed throughout the community. Thus, it served as an event that bonded the users together more strongly rather than breaking them apart. In figure 50, the user theultimate341 started a post, "Thank God We Moved Out of Reddit," interpreting the ban, along with the bans of other communities, as reddit bowing down to women and "their ideology." In response, user makeouthill writes "THEY CUT OFF ONE OF OUR HEADS, TEN MORE SHALL TAKE ITS PLACE!"
Figure 50. Ten more shall take its place
This metaphor in figure 50 alludes to the Hydra of Lerna, a monster from Greek mythology with many heads that would regenerate when severed, making the monster nearly impossible to kill. Equating the Incels community to the Hydra captures both how the forum views itself (an indestructible mythical beast with many heads that seems impossible to kill) and how others perceive the forum (a monster with many heads that never seems to die). What is interesting, however, is how fractured the Incels community was, and still is – there are multiple communities both on and off reddit that affiliate themselves with "Inceldom," but the ways the forums discuss these issues, as well as their attitudes toward women, vary wildly. Despite the differences in some of the base discursive tenets of the Incel subculture, we can return to the metaphor of the Hydra: Incels exist in many different forms, in many different places, and each "head" or community is its own entity. In r/Incels and incels.me, the general attitudes of the users were deeply misogynistic and violent, and thus the forum was seen as perhaps the best thing to happen to the community because these views could be freely expressed, shared, and built upon.
Figure 51. They stole our legacy
Although there was a backup forum in place after the ban of r/Incels from reddit, there had been no collective effort to organize and archive content from the subreddit, meaning that all of the content posted there was lost (see figure 51). The text posts in particular were the greatest loss, because images that had been uploaded through Imgur and then shared were still available. The loss of this collective body of knowledge that had been built in the subreddit was lamented by community members, and others noted that fellow r/Incels members who did not know about the new forum were probably highly distressed. Despite the high level of offensive and hateful content on the subreddit, it had acted as a digital community for many otherwise lonely individuals who had no other outlet or space to share their views and feelings. The loss of the content was likewise met with responses expressing a high level of distress that this knowledge was all gone (see figures 51 and 52). Users lamented that the loss of the materials from the subreddit meant that their "legacy" was also lost, and that this legacy that had been wiped away was all that they had.
Although some of the posts were available through Internet archives like the Wayback Machine and some users had maintained their own personal archives of material, for the most part the community suffered a huge loss in terms of its collective knowledge and memory – or at least, the physical artifacts of it.
Figure 52. This is a fresh start
Despite a high level of dismay over the subreddit's posts being lost, many saw the forum as a "new change" and a "new beginning," a "fresh start" for the community (see figure 52), and perhaps a sign that things would be done differently. After the forum was established, the moderators made efforts to establish an "Incels FAQ," a "glossary," and posts with "useful links" and other kinds of information that had been available on the subreddit, but these new iterations were updated and more organized than their predecessors. Similar to r/TheRedPill's practice of providing resources, links, and other means of sharing knowledge in a static location (in its case, the sidebar), incels.me began maintaining these crucial community components in a more strategic manner. There was even the establishment of an Incel wiki, which started in 2018, and an online store selling "Incel fashion" and other accessories to help raise funds to maintain the forum (see figures 53 and 54).
Figure 53. Incel Wiki
Figure 54. Incels Store
For some time, it seemed peaceful in the Incelosphere, and forum members began sharing the news with former r/Incels subreddit members about where the forum had moved. Some rules were enacted in the forum, in particular regarding the use of VPNs (virtual private networks) or proxies to access the website: anyone attempting to view the site while using these services would be blocked, a measure taken primarily to prevent banned users from returning to the forum. Still, the forum was not "free" of infiltrators – because the link was being shared in YouTube channels and comments, in Voat communities, on 4chan and 8chan, and even on Discord servers, there was always the possibility of non-Incel guests lurking in the space. But what drew significant media attention to the forum, and sent the number of guests skyrocketing much as had happened on the r/Incels subreddit before, were extremely violent acts. The events attributed to or embraced by the Incels community in early to mid 2018 were the Marjory Stoneman Douglas High School shooting in Parkland, Florida (Futrelle, 2018), and, most notably, the Toronto attack in April 2018 by Alek Minassian, who explicitly mentioned the Incels community in a Facebook post before the attack (Tolentino, 2018).
"Elliot Rodger's Legacy Lives On."
On February 14, 2018, Nikolas Cruz entered Marjory Stoneman Douglas High School in Parkland, Florida, and opened fire, killing 17 people. Cruz had been a student at the school but was expelled the previous year, and after the shooting, news of his online life emerged – he often expressed racist, Islamophobic, homophobic, and misogynistic views on social media websites and in group chats, and had participated in exercises with a white nationalist paramilitary group in Florida (Futrelle, 2018; Kennedy, 2018).
Although Cruz was not explicitly tied to the Incels community, Incels celebrated the day of the massacre – Valentine's Day, because the holiday is about romance and love – and noted that he "looked" like an Incel and that his attack had killed many "normies" and "Stacies" (the Incel term for attractive women, used in a derogatory sense).
Figure 55. A war zone
Figure 55 shows a meme posted in the thread "Elliot Rodger's Legacy Lives On," on Incels.me soon after the Parkland shooting. It shows a screen capture from one of Rodger's YouTube videos; the "friend zone" is a derogatory term signifying a lack of romantic interest by women, meaning that women do not see these men as "boyfriend material" but only as "friends," thus "friend zoning" them. The Parkland shooting, although not officially aligned with the Incels community, was still seen as a celebratory event in the community not only because of whom Cruz killed, but also because of what Cruz looked like. Whether or not to applaud the actions of a mass killer has always been fraught within the community, even with Elliot Rodger (their so-called "saint"), and many members seemed uncertain whether it was appropriate to celebrate the deaths of innocent people. In response, posts were made claiming that regardless of Cruz's motivation, this was something to be celebrated because it was a battle in a long-running war:
"This is a war. And every war has casualties. They've changed society...forever. Ever since the sexual revolution began man after man has turned into incel after incel. All while women climb the latter. You *should* be happy, because every death is just one step closer to the truth becoming unraveled. The social contract was broken long ago. These Saints that you see are merely fighting back against it. If MEN learn the social contract then that is one step closer we are to putting foids back into their place. How many times do normies get online virtue signaling about how they wish this or that incel didn't commit suicide? *crickets*" – post by user eliotrogerhere on Incels.me
This idea of a cultural war is common not just in Incel and Manosphere communities but in the far right at large. Thus, the actions of mass killers like Elliot Rodger, Nikolas Cruz, and even mass killers who came before them were seen as victories – whether or not they were Incel. In the community, there were a number of swirling debates as to whether or not Cruz was an Incel and whether he should be inducted as a "saint" for the community (much like Rodger). One chilling post noted that:
"Why care about his identity, or whether he's Incel or not? We should just rejoice for what he has done, it's the actions that make the man and in this case, it was this hERo who stood up for Incels all over the world and thought that enough is enough. I feel no sympathy towards those who have died or gotten injured, expect for the Incels who were unfortunate enough to be a victim of the mass shooting." – post by user Octopusgun2 on incels.me
Note the way that ER has been capitalized in the word "hero" in the quote above. This is a common tactic in the Incel community and is meant to symbolize the initials of Elliot Rodger – thus, when someone is a hERo, it is a very specific signal toward Rodger's violent acts.
This use of language within the community, and this shifting of symbols and their meaning, is a key part of Incel forums as well as other far-right-affiliated groups. These rules of language and interaction are often outlined in glossaries that the moderators of the forum make available, as in the case of r/TheRedPill and Incels.me – r/Incels did not have a glossary as such while on reddit, although some definitions were included in the community's "FAQ" post. These formalized language rules, grammar, and slang define the community and help to make it distinct from others, though there is significant overlap with other groups in the use of certain terms ("Chads", "cucks", "SJW", etc.). This use of language in these circles is significant because it also signals the ideologies of the members themselves and helps to shape the contours of the discourse that occurs there – even if Cruz was not an Incel, and even if February 14 being Valentine's Day had no bearing on his attack, the Incels community projected its own meaning onto not only the act itself but also the person and the time at which it took place. After the shooting, there was still some discussion about Cruz's intentions and actions, but as many conversations are wont to do in the age of the 24-hour news cycle, it all but disappeared from the main discussion after a short period of time. In February and March, however, concerns started circulating in the forum again about the number of "guests" that had started flocking to the off-site forum of incels.me, and posts started appearing asking how safe the website really was – particularly for its users. Because a number of articles had come out in recent years about how Incels were aligned with the alt right and had a reputation for condoning and advocating violence, the concern was not just about prying eyes from journalists and members of groups like r/IncelTears but also about law enforcement officials themselves. One post in particular highlighted this fear by asking the administrators of the forum – specifically SergeantIncel, who was featured earlier in this chapter reassuring newcomers that the same moderators from r/Incels were running the forum – whether or not the IP addresses of users were logged (see figure 55).
Figure 55. How much data is being saved?
Although some members claimed "I don't care" or seemed to mock this user's concerns about user privacy, these fears would be actualized: after the Toronto attack, media attention on the community skyrocketed.
"Incels are Finally Being Taken Seriously."
Alek Minassian drove his van into a group of pedestrians on a sidewalk near a busy intersection in Toronto on April 23, 2018, killing 10 and critically injuring 16 (Mezzofiore, 2018). Inquiry into Minassian's motivation led authorities to a Facebook post by him that specifically stated "The Incel rebellion has already begun!" (Mezzofiore, 2018) and also hailed the actions of Elliot Rodger. As a result, the Incels community was deeply divided on the events – a large faction celebrated the attack and its explicit affiliation with the Incel community, but many users condemned the violence that had been tied to their subculture, and SergeantIncel even published an official post on the forum stating that Minassian was not a known user of the site and condemning his actions (see figure 56).
Figure 56. Statement from SergeantIncel
Debate in the forum about whether individual users or even factions of the community supported or condemned the attack continued for quite some time, even after the news media attention died down. However, there were some truths that could not be contested: (1) the number of "guests" viewing the forum was in the thousands (Incels.me had a counter showing how many users were online, as well as the number of guests), and (2) Minassian's Facebook post speaking of the Incel rebellion and community, and referencing Elliot Rodger, meant that media outlets from CNN to The Guardian to USA Today to the CBC and BBC, and even Fox News, all had reports about the Incel community. Even Alex Jones, the noted conspiracy theorist and far-right figure, featured the community on his podcast, and for some time it seemed nearly every media outlet, institutional or fringe, had a piece about the Incels community. This media attention was of course met with scorn – the community felt that it was being watched, and the website was slowing down not just because of journalists writing pieces but also because of people flocking to the forum after learning about it. Although the posts varied wildly between anger at the attention (figure 57) and celebration of it (figure 58), one user lamented that the attention meant that the Incel community could no longer be free to do as it pleased on the forum:
"After the recent attack, Incels will eventually no longer be in the shadows on the internet. We will be in the spotlight for the whole leftist media to see. Expect much more sub-8 male hatred and ultra-radical feminsm invading every corner of the internet. Braincels will be shut down. This site may even be shut down. Maybe other non incel sites with a large incel population (pol,r9k), will be forced to shut down. Instead of being simply shunned and hated by society, will be will driven away completely. Anyone with a non-bluepilled ideology will be branded a terrorist, no matter how peaceful they are. All incels want is to be loved, and our genetics sadly makes that impossible. It was over since the beginning." – post by user Ap0calypse
Figure 57. This board is done for.
Figure 58. Inceldom is mainstream
The increased media attention also resulted in members panicking about what would happen to the forum – a fear that was amplified after the Discord server was shut down following the attack and its association with the Incel community. Members were also resentful of the "misrepresentation" of the community, and many noted that they supported what Minassian did but were upset that he had mentioned the community by name, with one user stating: "I hate how this fucking autist mentioned us by name. A shoutout to ER would have been sufficient. I completely condone what he did, but this is the worst timing possible." (comment by user fukmylyf). The mass influx also prompted the administrators to disable the creation of new accounts and to place a permanent banner at the top of the forum announcing that new accounts could not be created, along with a link to an "Intro to Inceldom" post as well as SergeantIncel's statement (figure 59):
Figure 59. Incels.me does not condone violence.
This attempt at public-relations-style damage control on the part of the moderators is an interesting shift from the previous Incels community on reddit – although posts had been made by individual users addressing "normies" and other guests, there had never been an official "statement" made by moderators in that community, whereas here there was not just an official statement but a stickied banner that ran across the top of the page for the influx of guests. The move to close down registrations was also in part due to the number of new accounts being made following the news that Minassian had affiliated himself with the movement – something that was hotly contested not just in the statement but in the forum itself. This PR strategy, and this control over the narrative the forum was putting out, was a sign of a shift in how the moderators viewed their role in the community – not only in terms of their power to control the content, but also in terms of their power to help the group survive. The banning of the Discord server in particular set off alarm bells for some forum members because, although the increased media attention might bring some new members to the group, the influx consisted largely of guests who were there either for voyeuristic purposes or to act as watchdogs on the community. Posts like "We need serious steps" and "Place to regroup?" and the like began to appear (see figure 60).
Figure 60. What do we do if the site gets taken down?
The suggestions in the post "We need some serious steps?" (figure 60) pointed to an intense fear of remaining on the "open" web and proposed establishing the community on the deep web, where it would be free of constant infiltration and surveillance from outsiders. These suggestions were common in this community, as in other Manosphere and alt-right communities, and they were always countered by the moderators, who argued that going private would make the community difficult for newcomers to find. New registrations remained closed for a number of days until early May, when registration was reopened on the forum. After the Toronto van attack, Elliot Rodger's YouTube channel was also quietly taken down, having been allowed to stay active for years after his murderous rampage. Minassian's explicit references to Incels and Elliot Rodger resulted in hits to the community that were peripheral but still close enough to feel – the Discord server was banned, its administrators and moderators were banned from Discord itself, and Rodger's YouTube channel was erased. Interestingly, a "Doomsday Preparation" post had existed in the forum for quite some time before Minassian's attack; it was posted on February 13, 2018, one day before Cruz's murderous rampage in Parkland (see figure 61). In it, SergeantIncel lists a number of reasons why the site could go down: DDoS (Distributed Denial of Service) attacks, domain problems, or hosting problems. Rather than focusing on things like harassment, infiltration, and mass surveillance, SergeantIncel pointed to things that were part of the very infrastructure of the website itself as the reasons the group could go down. In the post, he also states, "As you probably know Incels aren't exactly welcome anywhere anymore. That's why it's important for us to have a clear plan laid out in case anything ever happens to the site, and that you understand the problems that can arise" (see figure 61).
Figure 61. Doomsday
What was interesting about the shift in strategy for organizing the community involved not only SergeantIncel's post shown in figure 61, but also the official statement that was released and the establishment of archives for the forum. After the massive influx of guests, users started suggesting that archives be created not just for current community members but for new members as well, with one user suggesting:
"My suggestion is to create "Archives/Libraries". They are like subforums and can be read and commented, but you can't post new threads there. The way the posts get there is when mods or trusted incel members replace thread of exceptional quality to there from the other subforums. Why should such an strange and work-intensive thing be created? Because even the best content will eventually be lost in the flood of new posts, and no new member is gonna scroll back to page 100, missing some gems this way. Through an archive they can be passed on to future generations of incels." – post by user Erenor
An archive forum called "The Blackpill Archive" would be established on the website soon after, containing what the moderators viewed as the best examples of Black Pill science to have come from the community, although they would do away with this subforum after a few months. Instead, a post called the Blackpill Archive was made and stickied at the top of the "Meta and Feedback" forum on the website, along with the Intro to Incels post and the forum's rules and terminology. The Incels forum would see another massive influx of guests shortly after the Toronto van attack, on May 18, 2018, after Dimitrios Pagourtzis entered his high school in Santa Fe, Texas, and opened fire – killing 10 and injuring 10. His act was heralded as a success in the community, particularly after news was released saying that he had killed a young woman who had rejected his advances (Hennessy-Fiske, Pearce, & Jarvie, 2018). The increased attention to the forum was again a point of distress but also celebration, with some Incels (seen in figure 62) even stating that the media was acting as a free publicity agent for the community.
Figure 62. Mainstream recognition
Interestingly, on the day the news of the Texas shooting broke, the incels.me moderators also made an announcement that the community now had an official Twitter page – something that had never existed before for the Incels community, pointing to some intimate knowledge of how moving off-platform and having an "official" account on Twitter might help not only to reach more members but also to speak on behalf of the community to a much larger audience (see figure 63). In particular, in the post announcing the Twitter page, the moderator Master noted that they had called out David Futrelle on Twitter, the operator of the well-known Manosphere watchdog blog We Hunted The Mammoth.
Figure 63. We have a Twitter account now!
The massive influx of guests following the attacks carried out by Minassian and then Pagourtzis crashed the servers of the forum, because they could not handle the level of traffic coming in. To help offset the cost of maintaining the forum as well as the servers that hosted it, the moderators launched an online store selling Incel apparel, mugs, and the like for members to purchase to help support the forum financially. Launched in May of 2018, the online store was hosted on TeeSpring.com, and a link was provided at the top of the page for users to access it (as seen in figure 54).
But despite these attempts by the community leaders to calm users and the establishment of a store to raise funds, concerns about the future of the forum, given the large influx of guests, continued to be echoed in the community. Another interesting development occurred in the community in response to the shootings: the perceived need for a "safe word" for Incels to identify themselves to avoid being killed:
"I'm surprised nobody has made a thread on this already, or maybe it has been and I just wasn't around to see it, so i'll make it again. We need to establish a widely known and agreed upon "code word" or "code phrase" that an incel can use to identify himself as one so that an incel shooter can identify a "brother in arms" and let him go free. At some point some of us are going to be present at one anothers ER events, and in the heat of it there'll be no time to explain or make long sentences, a quick phrase that can just be blurted out is the ideal thing." – post by user BlkPillPres
Suggestions for the "safe word" included "Go ER" (Elliot Rodger) and "Natural Selection", but discussion around this community-wide effort to avoid being gunned down by other possible Incels seemed to be relegated to this one post. Regardless, the forum seemed to shift direction in terms of not only how it would support itself financially but also how it would preserve its knowledge (particularly its Black Pill knowledge), a shift already visible in the earlier launches of the Twitter page and the online store. In late May, the Blackpill Archive (mentioned earlier in this section) was launched, and discussion also surfaced around the need for Incels to become more organized as a political movement – something they vehemently denied being, particularly after mainstream media identified them as such. Whether or not they organized as a political movement, the discussion did point to a need for some kind of organized response to mainstream media and other "normies" in regard to Incel discourse and the Incel community, which the actions of the moderators seemed to prefigure. Further, discussion around who was allowed to be in the community intensified during this time, but one thing that remained constant was that women were not allowed in the forum. The move toward "official stances" as well as archives culminated in the establishment of an Incel wiki in July of 2018, but unlike other wikis, where editing is open to most users, the Incel moderators wanted only a set number of approved editors for the wiki (see figure 64).
Figure 64. Wiki editors wanted
These strategies were a massive departure from the previous r/Incels subreddit, which was disorganized and had no repositories of information, let alone a store, a Twitter account, or a wiki documenting the community's terms, history, and other entries relevant to the preservation of community knowledge. The forum would be a continued source of journalistic reporting for months following the Toronto attack, and although an account had been established on Twitter, the Incels community itself did not need to try to spread its message into mainstream media and other platforms – rather, the media was doing this for them, and this was celebrated on the forum as much as it was despised (see figure 65).
Figure 65. Featured all the time
In addition to the store, a Bitcoin address was also created and posted in the forum by SergeantIncel to help offset the costs associated with maintaining the site (see figure 66).
During the summer of 2018, particularly in August when this post regarding Bitcoin information was made, the focus of the forum shifted away from "doomsday prepping" and toward strategic maneuvers to address the cost of maintaining the website as well as the establishment of repositories of Incel knowledge. Despite the barrage of attacks, media coverage, and other things that could be seen as detrimental to the community (and they were), the general atmosphere of the forum seemed positive, and as figure 67 shows, this continued with the establishment of a new Incel server on Discord and another forum: Looksmax.org.
Figure 66. Bitcoin
Figure 67. New Discord Server
The new forum was established for Incel members who consistently requested new subforums or resources regarding looks, surgery, and other methods of "escaping inceldom," topics that did not have their own tags or subforums on incels.me. This was a way for the moderators to control the discussion on the main forum while providing another space for these kinds of discussions, pointing again to the strategic change in how the community organized itself across digital spaces. Like Incels.me, Looksmax.org had a strict "no women" policy, and it allowed posts that were not permitted on the Incels.me forum but still pertained to the so-called "Incel lifestyle." The increased attention from the media seemed to have been one of the motivating factors for the establishment of the store, the Bitcoin address, and the "archives" in the form of the wiki as well as the Blackpill Archive. On the note of increased media attention, forum members also did interviews with the BBC, which was working on a documentary about the Incel community, and this was seen as a positive move for the community because it would attract more members (see figure 68). This highlighted the group's sentiment that what it was espousing was a counter-narrative to a mainstream narrative that was increasingly being rejected.
Figure 68. The BBC interview will double our amount of users
In September, the same month in which the BBC had solicited interviews from the Incels.me community, reddit quarantined not just r/TheRedPill (the subject of the previous chapter) but also the other remaining Incel community, r/Braincels, with which incels.me did not associate itself. The reaction in the forum to the quarantine was more disdain than fear, particularly disdain toward the idea that r/Braincels members would then migrate to the forum and that the quality of posts would suffer as a result. Concerns arose in the forum again about the number of "imposters," infiltrators, and people from r/IncelTears and other places having free access to all of the content in the forum, and requests continued for the forum to go private, or at least to add a "members only" section. At this point, "Incel" had more than cemented itself as a term in the public lexicon, and the community was well known due to the number of reports, documentaries, YouTube videos, and podcasts made about it. But with increased visibility come consequences. In a sort of prophetic act, the moderator Master posted a link to a backup domain in the event the incels.me website went down (see figure 69). This would indeed happen in October 2018, and incels.me would have to move, once again.
Figure 69. Official backup domain
Incels.me is Dead
In mid-October 2018, the moderators of the forum had another running banner telling users to use the domain incels.is until further notice, due to an issue with the domain provider. Although initially the moderators told users that this would be a temporary issue, it ultimately became permanent: the domain registrar suspended the domain because the forum did not follow its anti-abuse policy, and this pointed to an infrastructural failure at the highest level – the domain itself – which affected the whole of the community. However, unlike the situation when r/Incels was banned, the website reappeared exactly as it had been on incels.me: all old posts were preserved, and the site did not change at all except for the domain (.is as opposed to .me) and the banner, which was changed to "Incels.is". Despite the posts outlining the backup domains, this move still resulted in the loss of hundreds of members from incels.me, meaning that although the actual content of the website was preserved, its main userbase was not necessarily preserved with it. In another interesting move that departed from the way r/Incels was run, the announcement about the domain registrar suspending incels.me and the move to incels.is included a formal press release by the moderator Master (see figure 70). The statement included not only the news about the domain being suspended by the .me registry but also information for a media contact. The .is registration meant that the forum's domain was now managed by Iceland's registry, a country that SergeantIncel noted in his post as being better at ensuring free speech and freedom of expression. This was the second time that the community had come under attack by forces outside of its control, and although the suspension was at first seen as temporary, it quickly became evident that it was permanent. As mentioned earlier, increased visibility brings consequences, and despite the forum having been on the .me registry for nearly a year, the violent attacks attributed to the Incels community as well as the massive influx of media attention perhaps alerted the domain registrar to the true nature of the community.
Figure 70. For immediate release
During this time, posts were made by users who had bookmarked the backup domain, as well as by those who eventually found it by other means, noting that they were extremely distressed when they tried to go to the incels.me website (see figure 71). Despite the moderators' efforts to alert users to the backup domain, perhaps those efforts were not aggressive enough to make users aware of the "plan" in the event of a forum shutdown. Unlike r/TheRedPill, which advertised its backup forum aggressively for years, the moderators of incels.me had only a few posts pertaining to the backup plan. Regardless, users eventually made their way to the forum, but in smaller numbers than on the incels.me domain – something that had also happened when r/Incels shut down and incels.me became the backup.
Figure 71. Almost roped
In figure 71, the poster Brahcel notes that he "almost roped" because he could not find the community ("roped" or "roping" refers to suicide by hanging, i.e. a noose). However, he goes on to say that the promotion of violence needed to stop in the community if it was to avoid being censored again.
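As a brief technical aside on why this kind of migration is relatively easy to survive: a domain name is only a pointer in the Domain Name System, a record that resolves a name to the address of a host, so a registry can revoke the name without touching the server or the content behind it. The sketch below illustrates that distinction in miniature; the domain names are hypothetical placeholders, not the forum's actual infrastructure.

```python
import socket

# Minimal sketch (not the forum's actual setup): a domain name is only a DNS
# record pointing at a host, so suspending one domain removes the pointer, not
# the underlying server or its content. The names below are hypothetical
# placeholders for illustration.
def resolves(domain: str) -> bool:
    """Return True if the domain currently resolves to an IP address."""
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

for name in ("suspended-forum.example", "mirror-forum.example"):
    status = "resolves" if resolves(name) else "does not resolve (suspended, expired, or unregistered)"
    print(name, status)
```

In this framing, what the .me registry removed was the pointer; the hosting, the database of posts, and any bookmarks of the backup name were left untouched, which is consistent with the site reappearing unchanged under the new domain.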
The banning of incels.me seemed to indicate a shift in the moderators' policies as well: before, moderation seemed focused on weeding out trolls and other infiltrators, but on incels.is the moderation policies seemed to have evolved. Although much of the content remained violent and misogynistic, the moderation team expanded and more users began to be banned for violating rules on the forum. However, posts in the new domain continued to ask what had happened to all of the members, and one comment listed some possibilities:
"Couple factors probably:
- After the incels.me to incels.is migration many people were lost.
- Google is not listing incels.is posts highly in their search. I used to think they were blacklisting us, but we're starting to rise again which means it was just a side effect of the domain name change.
- Lots of people get banned due to aggressive moderation and the fact that many people can't post on here without insulting and trolling others constantly which results in bans.
- Some turnover is expected as many people are only "temporarily incel" then get laid/dates.
Certainly it's not due to the incel epidemic slowing. The opposite is true for that. There are more of us being made every day." – comment by user RageAgainstTDL
The loss in membership was attributed not to a slowing down of the Incel community but rather to things at the infrastructural level – the domain move, not being listed highly in Google search results, and people being banned. But another concern about the loss of members also points to the status of being an Incel: since the term stands for Involuntary Celibate, users who successfully engage in romantic or sexual relationships are no longer allowed to be a part of the community (although exceptions are made for sexual encounters with escorts and/or prostitutes). The need for a contingency plan extended beyond backup domains and chat servers to the question of what would happen if the status of a moderator changed (see figure 72).
Figure 72. What happens if a mod gets laid/gets a girlfriend?
The identity of Incel is a core component of being a part of the group, but since "Incel" is a state of being rather than a more stable identity marker (like race or nationality), it is far more likely to change. Symbolically, to the group, a moderator who has sexual or romantic relationships should not be a moderator of the community, demonstrating that there is a significant amount of gatekeeping not just for the users but for the moderators themselves. For the community, only somebody who had never had sex or never had a girlfriend should be allowed to be an Incel (barring encounters with escorts and/or prostitutes, though whether sex with prostitutes and escorts lets one escape Inceldom is itself a hotly contested issue in the Incel community). However, the tension over this question of who gets to be an Incel would be highlighted following two attacks in November of 2018, shortly after the move to Incels.is.
"The Fire Rises."
In November 2018, two shootings occurred in the same week: one at a yoga studio in Tallahassee, Florida, and another at a country music bar in Thousand Oaks, California. The Florida yoga studio shooting was carried out by Scott Beierle, and an investigation of his online presence revealed that he not only expressed deeply misogynistic views but also celebrated the actions of Elliot Rodger (McLaughlin & Cullinane, 2018).
This, to the Incels community, was of utmost importance: although shootings had occurred before, only the attack carried out in Toronto by Minassian had any explicit ties to the "Incel" community, and even then there was only a single Facebook post. Beierle, on the other hand, had YouTube videos and even a SoundCloud with songs he had made himself featuring misogynistic lyrics. He also identified with the Incels community in these videos and posts, and after news broke of the attack that left two women dead and five people injured, YouTube immediately took down his channel. However, when media outlets released photos of the shooter, the Incel community immediately decried him as a "Chad", i.e. an attractive male who is romantically and sexually successful, with one user even commenting on the thread about the shooting:
"Damn. Now watch us get blamed ONCE AGAIN for Chad's actions. Chad kills 2 foids and we'll be the ones blamed for it, just like it's Chad comitting domestic abuse and rape (And getting away with it because rich, good looking Chads ALWAYS WILL) and then the bill gets passed to us when we've done fuck all, because it's human nature for society to blame uggos for all that is bad in the world." - comment by user Bronzehawkattack
The comments above this one, posted before photos were shared, were all celebratory and congratulatory toward the shooter and the community itself – in particular, toward the influence of Elliot Rodger. However, this quickly turned when the community did not "see" (literally) him as one of them due to his appearance. Another thing to note about the discourse around the community is that many Incels pushed back against the media narrative of the community being majority white – a common acronym in the community is JBW (Just Be White), meaning that white men have no issues in pursuing sexual or romantic relationships, and many users noted that a majority of the userbase was not white. Although this cannot be proven by mere observation of their posts and their community (since usernames and avatars do not reveal much about the identity of a user), race is a major topic of conversation in the forum and antisemitism is rampant – but the common binding thread across racial and political lines in the community is misogyny. However, the pushback against labeling the Incel community as a social movement is particularly due to the divides in terms of race – despite all being part of the same community, one user noted that there is no "Incel solidarity or Incel brotherhood":
"And that's the real black pill. Just read what white supremacist incels like @StormlitAqua write about ethnics (including ethnic incels). They all want an ethnic genocide. That's their dream. Just look at how ecstatic white incels are when Trump said he's going to stop birthright citizenship - because they know only ethnics (including ethnic incels) will be effected. Just look at how they constantly gaslight and deny JBW even though it's a proven fact. No such thing as incel solidarity. No such thing as an incel brotherhood. Every incel is on his own. Especially ethnic incels." – post by user rabitter
Regardless of the shooter's race, as well as his physical appearance, he started being referred to as "Saint Yogacel" in the community due to his online content, his acknowledgement of Elliot Rodger, and his identification with the Incels community.
As with the other mass murders that brought attention to the community, users were torn about whether or not the shooting should be seen as a positive – one user posted that it makes things worse, and in response another user stated that "The recent shootings bring me happiness, and bring our cause into the spotlight. They are good things." (comment by user ManletHalfCurry). Despite the cause being "brought into the spotlight", there were concerns about the increased visibility of the forum as one associated with violence (see figure 73).
Figure 73. Beware American Incels
In figure 73, the poster warns Incels in the United States that they will be used as a political scapegoat and be blamed for any and all attacks regardless of the attacker's affiliation with the community – the poster even tells users to "get ready for war", echoing a sentiment in alt-right circles that a war for cultural dominance and power is ongoing, or imminent. Online, this power is asserted through censorship and bans of extremist content, and Incel community members responded to this not just through the migration of the subreddit but also through the establishment of archives and other knowledge repositories. In any war, there are factions, and Incels.me (and then Incels.is) seemed to be committed to organizing a more collective identity and approach for the community. In the vein of preservation, when Beierle's YouTube channel was taken down, a user on incels.is found Beierle's website (or what appeared to be his website) and preserved it, along with Beierle's remaining YouTube videos and the album he had uploaded to SoundCloud (see figure 74). Preservation, then, was not just for the content that came from the community itself, but also for any other content that members deemed relevant and worthy of preservation. Due to the nature of the Internet, anybody can save posts and other kinds of content, meaning that the archival process is in a way more democratic – for all groups.
Figure 74. We have to preserve and archive this
The second attack that occurred that week happened at a country music club in Thousand Oaks, California, where 11 people were killed – but the attack was not linked to the Incels community. Regardless, the Incels community did applaud the attack, particularly because of its location, a dancehall/bar. Speculation as to whether the shooter was Incel or not was again met with mixed messages – some hoped that he was, while others noted, "I can see your point but there's only so many shootings (((they))) will take before shutting us down." (comment by user Insomniac). (The parentheses around names and pronouns are called an "echo", a type of dog whistle used by antisemites to indicate a person or people of Jewish faith and ancestry; Anti-Defamation League, n.d.) The forum seemed torn on what it wanted, particularly in terms of the violent acts that some celebrated and others condemned. Compared to the other two case study communities, the Incels community is perhaps the least "together" in terms of its collectiveness; the moderators do their best, but because the Incel "identity" can change so quickly, the community suffers from a severe lack of cohesion. Censorship on Twitter would also result in the moderators starting an account on Gab, colloquially referred to as "alt-right Twitter" and the online platform where the Pittsburgh shooter who targeted the Tree of Life synagogue posted antisemitic, racist content (Lorenz, 2018). Gab is marketed as a "free speech" social media platform, and many prominent figures of the alt right who were banned from Twitter quickly migrated there.
Similar to how many subreddits banned from reddit moved to Voat, Gab acted as the "Twitter alternative" for people and communities who had been banned from the Twitter platform. Although the Incels account had not been banned from Twitter, in late November 2018 the moderator Master noted that they were "tired of censorship" due to nearly every post on the Twitter page being reported to the Twitter administrators (see figure 75).
Figure 75. Follow us on Gab
This highlights a tactic used by many far-right extremists: when one platform shuts down or bans a person or community, they simply move to another. Gab, in particular, was created by someone with far-right sympathies, and its existence demonstrates that despite the current Internet era being dominated by centralized platforms like Facebook, Twitter, and other social media sites, the capability and the infrastructure to create other spaces still exist – and they enable these migrations. After the move to incels.is – and even despite the mass murders that prompted another massive influx of visitors to the site – things quieted down in the community in terms of discussion about "backup plans" and the like. However, some events in the community did signify a continuation of what appeared to be the new strategy the moderators of the forum had in mind: the continual addition of new knowledge, an off-forum website dedicated to it (the "White Pill"), and even a weekly livestream (see figure 76).
Figure 76. LiveStream and Whitepill.org
During this time, members of the community were not only being featured in mainstream media documentaries (by outlets like the BBC and CNN), but the moderators and some power users of the site also started doing podcast interviews. This was met with some resistance, however, with one user of the community saying that the podcasts only made "Incels look stupid" and that there was no point in trying to speak with those who were not in the community or at least sympathetic to it (see figure 77).
Figure 77. Don't do interviews on Podcasts
Regardless of this user's response, the willingness of the moderators and top users of the forum to begin engaging with non-Incel community members and to start appearing on media sources outside the Incel world seemed to demonstrate a desire to bring more visibility to the movement on their own terms. Some users in the forum were supportive of this move but stated that there perhaps needed to be "official spokespeople" and that there needed to be more preparation for debates (figure 78).
Figure 78. Spokesmen
Although there was no follow-up on this post, the "unofficial" spokesmen of the forum can be regarded as the moderators themselves – they release official statements, make most of the decisions for the forum, and even introduce alternative platforms and forms of media for continued community building (forums, podcasts, livestreams, etc.). By this point, the community had become at the very least infamous, and some members lamented that the forum was no longer obscure, and thus no longer safe from prying eyes. But the increased visibility was also seen as a positive in that it would help the longevity of the forum, and thus the community:
"Various big youtubers such as Sargon,Paul,Lauren Chen etc. makes video about us,we always have way more guests in the forum than registered users, we have newcomers joining our site daily, the mainstream media smears us any way they can,they're movies and other media potraying us, not to mention foids calling themselves femcels for not getting a Chad and copying many of our viewpoints and perspectives and terms(LOL Pinkpill? Moids?), and lastly the 3 podcasts that some of our members decided to participate in thus bringing more attention to us. It seems as though MGTOWs and the alt right are no longer a hot commodity these days, now it's us and I wonder how long this will last. I wonder what kind of new group will pop in to take our throne as the hottest trend to talk about. Case in point, it looks like we're the hot new thing and to be honest I never been apart of a group that's at it's highest point of notoriety. It's hard to say how I feel about this. In one hand it looks as though our problems are being brought out into the limelight, however we're being demonized and we can't reveal we're incels in real life due to obvious reasons. In any case I'm going to have fun riding the waves of us being the current trend and I think you guys should do the same." – post by user TigerFestival
This post highlights an interesting tension felt by members of the group – despite some users appearing in documentaries and even on podcasts, real identities were not being shared in any of these media appearances. Even the livestream was all audio with no video, and the level of notoriety of the forum affected members' willingness to share their membership with others in their face-to-face lives. In January 2019, the Canadian Broadcasting Corporation (CBC) released a documentary about the Incels community on the show The Fifth Estate, and soon after, the website would temporarily shut down.
The Incel Institution
After the CBC show The Fifth Estate released its documentary on the Incels community in late January, the incels.is website was hit in February with a DDoS attack (Distributed Denial of Service, a tactic meant to flood a website's infrastructure, in particular its servers, with more requests than it can handle in order to shut the site down temporarily) that lasted around four days. Although the source of the attack was not shared (and perhaps not known), the moderator Master did note that it was a bit of a coincidence that the attack occurred shortly after the release of the documentary. As a result, moderators scrambled to work with their domain hosts and Cloudflare (a web security service that helps protect websites from DDoS attacks and other threats), and enlisted the help of a new anti-DDoS provider to get the site up and running again. They relied on the Twitter account to explain why the forum was down, to post updates on when it would be up and running again, and to alert users if it went down once more. In the post announcing the reason the site had been down for the past several days, the moderator Master encouraged users to check the Twitter account for updates – and also asked users to donate to the newly established SubscribeStar account to help with the costs of running the website (see figure 79).
Figure 79. DDoS Attack
In the post, the moderator not only shows a graph of the attack itself but states that they are determined and dedicated to keeping the site online – even if it takes their own money to do so.
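As a brief technical aside (not a description of Cloudflare's or the forum's actual configuration): the core idea behind the DDoS protection referenced above is to sit in front of the origin server and filter traffic, for example by limiting how many requests any one client may make within a short window. A toy sketch of such a sliding-window rate limiter, with arbitrary illustrative thresholds, might look like this:

```python
import time
from collections import defaultdict, deque

# Toy sliding-window rate limiter: a miniature of the per-client request
# filtering that anti-DDoS services perform at vastly larger scale. The
# thresholds below are arbitrary and purely illustrative.
WINDOW_SECONDS = 10
MAX_REQUESTS_PER_WINDOW = 20

_recent = defaultdict(deque)  # client IP -> timestamps of recent requests

def allow_request(client_ip: str, now: float | None = None) -> bool:
    """Return True if the request is within the client's rate budget."""
    now = time.time() if now is None else now
    window = _recent[client_ip]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()                  # forget requests outside the window
    if len(window) >= MAX_REQUESTS_PER_WINDOW:
        return False                      # suspected flood: drop or challenge the client
    window.append(now)
    return True
```

Commercial mitigation services typically combine this kind of per-client throttling with much broader techniques, such as absorbing traffic across large distributed networks and challenging suspicious clients, which is part of why the moderators turned to outside providers rather than trying to handle the flood on their own servers.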
Users in the forum panicked when the website was down for days following the documentary, and lamented that it was perhaps "over" for the community ("It's OVER" is a common phrase among Incels and in other online spaces). This, however, was not a commonly expressed sentiment; what did emerge were concerns that these attacks on the community would continue because of the extremely high visibility of the forum and because of how "Incels" had entered the mainstream. By appearing on podcasts and in documentaries, and even speaking with reporters through email, the power users of the forum as well as the moderators seemed to indicate a desire to control the narrative of the forum by asserting their own views of it to mainstream media and to fringe outlets, including YouTube channels and podcasts. The actions of the moderators, however, demonstrate something interesting in how they aimed to steer the direction of the community – although it was past the initial period of data collection, in March of 2019 the moderators announced that they were soliciting writers for a new website, "The Manosphere". Responses to the post ranged from mocking the idea of creating a semi-professional outlet to write about Incel and other Manosphere issues, to volunteering services such as software engineering and other IT support for the website. This act of establishing what appears to be a sort of online magazine, and soliciting a full-time social media manager, was a massive shift from the Incel community that had existed on reddit before it was shut down. Although members of this particular Incels community had not previously seen themselves as part of the Manosphere, despite being lumped in with it by other Incel communities, other Manosphere groups, and mainstream media, the moderators launched the website to encompass all arenas of the Manosphere, as seen in figure 80.
Figure 80. The Manosphere
This professionalization of the group marks an interesting departure from the vehement assertion by not just moderators but other group members that they were not a political or social movement – they were not only organized in terms of preserving content and having backup domains, but were creating an entire media ecosystem on behalf of the community and its beliefs. The massive shift of the Incels community from having lost everything after their subreddit was banned to eventually building an organized media ecosystem points to the moderators' intimate knowledge not only of navigating digital infrastructure but also of modern methods of shaping and shifting discourse. Although the post does note that they would not be able to pay people for these positions in the beginning, the hope was that the labor of the community would eventually result in some kind of monetary gain – making it an official group, one that brings with it professional and monetary gain as well as community.
Diffused Extremism
The Incel community is an interesting case of how a group that is spread out across various homes on the Internet responds to infrastructural failure. Perhaps more so than for r/TheRedPill, what keeps members coming back to the community are the bonds and social interaction that are possible through the forum itself. It is not only a space for validating the thoughts and feelings that they have, but also a sort of digital home for those who feel that they are at the margins due to their lack of physical attractiveness and lack of success in relationships.
Despite this, the community serves as a hotbed of bitter rage toward women, and validates and supports these worldviews through its content. The community allows for the creation and maintenance of bonds, and through this shared frustration and anger a new world is built. But by establishing new nodes of their community and by having official stances, press releases, and an institutionalized system for speaking on behalf of the community as well as supporting it, the r/Incels and incels.me/.is moderators created something new altogether on the ashes of something that was once destroyed. In some ways, the destruction of their community and its status as a lost civilization may make the community stronger than the others in this dissertation and those not included in this project. Invoking a sense of nostalgia for the past strengthens group members' affiliations with the community and with the identity as a whole, and the need to sustain the group became of the utmost importance. Although perhaps a reductive cliché, the moderators and leaders of the forum learned their lesson from not having an organized system and backup plan for their community, but because these practices emerged only after a loss, it appeared they were still muddling through how best to move forward. The frequent need to move from reddit to a new platform to a new domain registrar means that these bridges were continually being built and then destroyed, but the remnants remained like a trail of breadcrumbs to what once was. By having these bridges and bonds in the community, archiving and other modes of preservation and fortification could occur. This Incel community is the most prominent of all of those that span the Web, and is the most infamous due to high media coverage and discussion of it on other platforms. At the time of writing, the community has once again changed its domain name, to incels.co – and one of the purported reasons for this is to enhance Search Engine Optimization (SEO) so that the website appears higher in search results, meaning that the group not only wants to sustain the community as it is but also to continue attracting new members and growing. The community's resilience to failure and loss demonstrates the ways in which deplatforming may not always be the most effective method, and that destroying the "city" does not necessarily destroy its foundation. CHAPTER 5: Case Study 3 Refugees: r/AznIdentity and the MRAsians The third and final case study of this dissertation project is a study of a community of Asian men, specifically, r/AznIdentity. Although this "community" has existed for quite some time, the subreddit studied in this case study was established in 2015. r/AznIdentity, and its affiliated subreddits, push a deeply misogynistic discourse about hegemonic Asian masculinity and about Asian women, and are well known throughout Asian online spaces for their attacks and harassment campaigns that overwhelmingly target Asian women. Although not a large group, their practices as well as the institutionalization of their community are perhaps the most sophisticated of all three case studies. In some ways, their smaller size may help to facilitate a stronger connective bond and collective identity among users, and their focus on Asian issues as well as Asian identity among diasporic Asians is a strong binding thread.
Throughout its short life, r/AznIdentity has had to contend with similar issues as the other case studies in terms of being threatened by changes to reddit policy and outside scrutiny of their group’s discourse. Although they have yet to experience a ban or a quarantine, they are acutely aware of how they are perceived, and thus navigate changes in the system accordingly to preserve their group. But first, a caveat: the subject of this case study is perhaps the most personal and the most traumatizing one that I studied during the course of this project. For starters, I found it by accident – having been a redditor since around 2010, I knew of the existence of r/TheRedPill and the Incel community for quite some time before they became the subject of my dissertation. I found the r/hapas and r/AznIdentity community by accident – I was looking through my subreddit subscriptions (since reddit allows the functionality to follow specific subreddits to 168 populate your feed) and saw r/hapas. “Hapa” is a Hawai’ian term meaning “half”, possibly adopted from the Japanese term indicating mixed-race person hafu, and was specifically used to refer to mixed-race persons regardless of the “mix” per se, and was co-opted to mean “mixed race Asian” by half East Asian and half Southeast Asians in particular (Bernstein & De la Cruz, 2009; Gamble, 2009). In this context, r/hapas was primarily focused on building a community for half white and half Asian men, but specifically the children from this pairing of white men and Asian women. The community openly supports Elliot Rodger, the Isla Vista shooter, saying that it was this specific mixed race heritage of his (Asian woman, white man) that led to his psychological problems and the mass murder he committed. In effect, they sympathized with him, and used race-based science, personal anecdotes, and other means to prove that WMAF (White Male, Asian Female, their acronym) children were less attractive, self-hating, and psychologically troubled compared to other “combinations” of hapas. The group believes that half Asian women have it “easier” because their Asian-ness makes them exotic and sexually desirable, whereas half Asian men who pass as Asian are emasculated and unable to pursue romantic and sexual relationships in Western countries. As a half Asian woman, the forum sparked both disgust and intrigue because of the level of identity work that was being done in the forum but perhaps in a direction that wasn’t necessarily productive in terms of a positive identity for hapa men – or women. From this subreddit, I found the community r/AznIdentity which was recommended to me by the reddit algorithm while I was looking at the r/hapas page. The community was also talked about within r/hapas, and one day curiosity got the best of me and I clicked. I quickly learned that the community based itself on a very specific ideology – not unlike the r/hapas community – that villainized Asian and Asian American women and was deeply misogynistic, but hiding their 169 misogyny in rhetoric familiar to more progressive social justice circles. 
Using Asian/Asian American as the main nexus of their identity development work – in particular, being proud of being diasporic Asians (Bernstein, 1997, 2005), the group addresses many issues that pertain to Asian American or Asian activism in the West (lack of positive representation in media, fetishization of Asian women, emasculation of Asian men, the “bamboo ceiling”13, etc.), but distinguish themselves from other Asian American subreddits in their goals, aims, and their strategies to achieve them. Colloquially referred to as “MRAsians” (a combination of “men’s rights activist” and “Asian”), the r/AznIdentity community and its affiliates are known for leveling large-scale harassment campaigns against Asian/Asian American activists, particularly women, non-binary persons, and feminists in general (Ng, 2018). In particular, they target women who are dating non-Asian men, and shame them for not partnering with Asian men as well as sending disturbing death and rape threats (Ng, 2018). Seeing themselves as different from other “Asian” spaces on the Internet, r/AznIdentity position themselves almost like refugees who need to not only establish a community for themselves, but to actively antagonize the groups that they are ousted and kept out of. Started by a group of users who were banned from the subreddit r/AsianMasculinity, the r/AznIdentity subreddit has perhaps not undergone as many dramatic changes as the first two case studies in this dissertation – however, they have used very similar tactics and perhaps been more strategic in spreading themselves out over the digital media ecosystem to achieve their ends. They engage in the work of “us vs. them” to build this distinct Asian identity – a distinct Asian male identity – and the building of this alternative Asian (male) identity is the primary focus of this group’s activism. 13 The “bamboo ceiling” is similar to the term of the “glass ceiling” for women, who struggle to advance in careers to higher positions. For Asian persons, the term has been repurposed to “bamboo”. 170 The Asian/Asian American community online, and particularly on reddit, is fractured even among the subgroups themselves – for instance, r/AsianMasculinity had a moderator who was a moderator of r/TheRedPill, and was focused particularly on self-improvement for Asian men and came out of r/TheRedPill beliefs (see figure 81), but the users who started r/AznIdentity were banned from that community and chose to start their own. Figure 81. r/AsianMasculinity’s roots in TRP The Asian Masculinity Ecosystem r/hapas occupies an interesting space within this microsystem – they don’t actively organize, or even try to affiliate themselves with other groups to expand their network, but are a sympathetic group to r/AznIdentity and vice versa (see Appendix B for a relational map of this community ecosystem). r/AsianAmerican actively bans users from r/AznIdentity, and is painted as the main “other” in r/AznIdentity particularly because of the moderators at r/AsianAmerican focusing on feminist and more women-centered issues. These factions, divisions, and other 171 community fractures are perhaps less important for explaining these groups’ practices and how they manage to sustain themselves than what r/AznIdentity has done in order to grow their community and their group (see figure 82 and 83 for the MRAsian online universe) – they’ve done everything from establish chat servers, to Instagram and Twitter pages, to establishing their own online magazines and “media” conglomerates. 
They not only focus on expanding their community within a specific sphere of the Internet – like r/TheRedPill does with the Manosphere – but also attempt to inject themselves into popular media and culture more aggressively. Figure 82. The MRAsian community on reddit The reach of the r/AznIdentity subreddit expanded far beyond the confines of their reddit home. Strategically using a variety of platforms, the r/AznIdentity community not only aligned themselves with other communities on other platforms, and even media outlets, but also started their own projects (see figure 83). This network building serves as a way to access a wider audience as well as to continually reinforce and maintain group identity, which is something that the leaders are acutely aware of. Figure 83. The MRAsian online ecosystem off reddit Thus, this case study explores the responses of r/AznIdentity, and some of the responses of r/hapas, to perceived threats to their online communities. Especially in the last two years (it is currently 2019), the term "MRAsians" entered the public lexicon of at least the Asian/Asian American community online (particularly on "Asian Twitter"), and this is in part due to activists coming forward who had been attacked by the MRAsian community. On the other hand, this rise of the MRAsian was a coordinated plan that had been developing for years, and like the other two case studies, had been slowly simmering beneath the surface. "Get Twitter Now." The r/AznIdentity subreddit was started after the 2015 update to reddit policy, and r/hapas only had 100 members on April 1, 2015, meaning that both of these communities did not necessarily respond in the same way as r/TheRedPill to this policy change. The r/AznIdentity subreddit was started as an alternative community for Asian men in response to what a group of users saw as oppressive censorship practices by the moderators at r/AsianMasculinity. In December 2016, around one year after the r/AznIdentity subreddit was started, a moderator made a post celebrating that the subreddit had more active users than r/AsianMasculinity, and noted that this growth over the past year spelled out a hopeful future for the continued growth of the community (see figure 84). In response, the comments noted that r/AsianMasculinity moderators censored too much content, and that the focus of that subreddit was too narrow – r/AznIdentity helped to fill in the gaps left by r/AsianMasculinity, and had been actively trying to position itself as the premier digital community for Asian men on reddit since its inception. Figure 84. A top destination for aware Asian Men (AM) However, soon after the subreddit was established in 2015, the moderators of the forum started pushing the need to expand the discourse of the group beyond the confines of reddit. Rather than responding to bans and other imminent threats of infrastructural failure like the two previous case studies, where the practices were mostly in-group and within the community itself on a specific platform, r/AznIdentity moderators practiced expansion more explicitly across platforms. Although r/TheRedPill had a number of affiliate subreddits, r/AznIdentity set its sights on Twitter (see figure 85). Figure 85. Get Twitter Now In the figure above, from March 2016, the moderator arcterex117 made a post that revealed two things: 1. That the subreddit leaders believed in expansion outside the reddit platform and 2.
That Kulture Media, an online “magazine” and media watchdog of sorts were 175 affiliated – and even run – by the same community (see figure 85). The post also notes that the moderators have tried a number of different venues to spread their message – from Facebook to 9gag to other forums, but that they had had the most “success” with Twitter in terms of reach and engagement. They implore community members not to just follow the main account, but to actively retweet and try to drown out the other Asian voices that they viewed as antithetical to their goals (i.e., mostly to silence Asian women and feminists). However, one quote stands out in particular for why the moderators were motivated to try to amplify the voices of their community off-reddit, “Great ideas without an audience means our ideas die with us.” In the post, the moderator also notes that this is a main goal of the forum – to have “the guts to change the world” and to not just engage in “idle discussion” on reddit, pointing to an acknowledgement of the limits of possibility that the reddit platform affords. The main barrier to recruiting more members into the community and to accept their beliefs, however, lies in the way that they spoke about Asian women, particularly about Asian women who date outside of their race. The forum, despite marketing itself as being a “pro-Asian space” not marketed specifically to men, espoused rhetoric that engaged in sexual shaming of Asian women who date outside of their race, particularly those who date or marry white men, as well as claiming that the children of White Male, Asian Female (WMAF) pairings were mentally deranged (using Elliot Rodger as their main example of this). The scientific articles posted in the forum were often race-based scientific studies that claimed that mixed race children suffered from more mental health issues than “monoracial” children, and often posts in the forum were highly bitter and angry towards Asian women, with posts that often harshly criticized Asian women for rejecting Asian men in sexual and romantic relationships. Because of the high level of toxicity in the discourse about Asian women on the forum, the moderators had to enact stricter 176 rules about what kinds of discussion about Asian women and about dating Asian women were allowed on the forum. Commenters rejected this decision, pointing to how r/hapas discussed these issues and how their uncensored commentary on the subject actually boosted their traffic and membership: “If you look at /r/hapas. They've gone all-in on the war against WMAF and it hasn't hurt its traffic, its boosted it. If anything truly embracing that mission has attracted more passionate men and women and even more live activity than /r/AA. I agree with gently pushing for new ideas and insight, but I feel the full blood community is still pulling its punches. There are still a lot of apologists and macho men denialists. In real life I see few macho guys with gorgeous wives in CLC with kids that have the right to brag like that. I wish there were more. I get that a lot of guys are dating out (offensive team) feel ambivalent about publicly denounce WMAF. But let the defensive team do its job. As long as good AF are acknowledged, supported, and loved - let the bullets fly. 
If you guys don't like it, write articles about that awesome success dating out rather than dousing other hard working posters with cold water.” – comment by user fakeslimshady in response to a post regarding forum rules about Asian women discussion The moderators of the forum wanted to encourage constructive dialogue about Asian women and “dating out,” rather than allowing posts that focused on villainizing Asian women and calling them “sell-outs”. The derogatory term used in the forum to refer to Asian women they viewed as race traitors is “Aunt Lu” or “Anna Lu” or even just “Lu” for short, a sort of adoption of the “Uncle Tom” insult from Black culture meant to refer to a Black man who is seen as a race traitor by being subservient or accommodating of white people, and thus betraying their own culture and community (Spingarn, 2010). The equivalent of this term for Asian men in the forum is “Uncle Chan”, or “Chan” for short, and often the main targets for attack were Asian celebrities – but particularly famous women like actress Constance Wu or authors like Celeste Ng and Nicole Chung. The main discursive formation that they encouraged in the forum is the idea of the “self-hating Asian,” and any Asian women who dated white men in particularly were 177 the worst offenders of all. The forum’s discourse, then, is deeply ethnonationalist and misogynistic, and is where the crossover is clear between r/hapas and r/AznIdentity. It was this kind of rhetoric that the subreddit pushed in its discussions, although r/AznIdentity tried to rein it in in terms of providing “constructive” commentary on Asian women, as well as on its Twitter page. Particularly, the activism of the community on its Twitter through Kulture focused on pushing content about Asian men to paint them in a more positive light than what popular media depictions of them have – in particular, to push Asian men as attractive and objects of sexual desire, rather than emasculated and desexualized. The goal of Kulture, then, as an official sort of “media outlet” for the community, was to help not just build the reddit community itself, but to attract non-redditors to join their cause. In a post from December 2016 updating users of the community on the impact of Kulture Twitter on this, the moderator arcterex117 noted that it was a part of his “seven part strategy in Building Capabilities,” and celebrated the high profile engagements and followers that the account had attracted. Using strategies to boost engagement and traffic that are typical to helping increase visibility of accounts, the Twitter account and website grew: “We played a key role in mobilizing the community to stop Mail Order Family. We whacked around Ken Jeong until he blocked us; but got a lot of Asians who might be too wary to speak their mind about him to do so, and like and re-tweet our slams on him. We've also reinforced the #UnderratedAsian hashtag- example. The work to get here has been quite a bit. I won't go into gory detail. But we've had to think out of the box to get the ball rolling; starting is always hardest, and most accounts rarely get the kind of reach and visibility we do. We bought ad campaigns on Twitter, we promoted Tweets, we did Follow campaigns (by following others, some number followed us); but a lot of it was finding which tweets got the best response and doubling down on that. Steady, continued effort paid off. 
(I do remember starting this a year ago, spending weeks tweeting, and following hundreds of accounts, only to have a grand total of 1 (one) person following us back; at that point, I told the Slack activist group that so far our Twitter strategy 178 had been a failure, although we would still keep plugging along. /u/shadowsweep said we're just get started and it could get big. you were right dude!) We'll continue to build the channel and also leverage it to get across the message of the need to improve Asian representation in Hollywood and other Asian causes. I'm sure in light of the last election cycle, given the rise of AltRight, and election of a President who enables AltRight, we'll have a needed debate on what strategies Asians need in order to prevail in this shifting environment. My view has always been: build the capabilities to influence the world around us. This will hopefully be one of many such initiatives.” – post by arcterex117 This professional-level use of not only creating a new medium for sharing the views of the community and to spread its discourse (Kulture) but also using an alternative platform points to how the group sees itself and its mission: not just to push the issues that Asians face in the West, but to become the sole activist group. The group was concerned not just how their ideology was not being accepted or being fought against by other Asian groups, but also in maintaining superiority over the discourse that came out of other Asian activist spaces in the United States in particular. However, the post also talks about the coordination that was occurring on the Slack channel, which was a tactic by the community to avoid censorship on other platforms since Slack is a more private instant messaging service. As noted above, the moderators needed to make a post to hopefully control the kinds of hateful rhetoric that were being leveled towards Asian women in particular, but the moderators also seemed to acknowledge that Asian women needed to join their cause in order for the “movement” to be more successful and to become the preeminent voice of Asian issues in the West. In particular to highlight this division between this community and other Asian activist groups, the subreddit moderators and users constantly posted about how much more successful their community was compared to r/AsianAmerican, the formerly largest subreddit for Asian 179 Americans on the reddit platform. Although the reason why r/AznIdentity started was in response to bans and censorship on r/AsianMasculinity, the main target of vitriol and othering was focused on r/AsianAmerican – whose moderators and users the r/AznIdentity community viewed as race traitors and pseudo-activists. In March of 2017, the moderator arcterex117 made a post almost gleefully announcing that r/AznIdentity had more page views than r/AsianAmerican, had better content, and was overall a “better” community than what was the “dominant” community: “It is kind of amusing looking at these "mainstream" subs in the rearview mirror; they once seemed so large and imposing, we were concerned when they censored AI and prohibited mention to us. But in a way their pettiness played to our advantage, we didn't realize how many disaffected former AA members there were. I trust as our growth advantage increases, they will continue to try to characterize us in negative ways, as could be expected as per sour grapes. AI doesn't talk down to members. We don't censor people because we're afraid what white people will think. 
We don't portray Asian men as oppressors of Asian women. We don't ram SJW groupthink down people's throats. It's the kind of free exchange that Reddit was supposed to cultivate. The sad thing about AA is that the mods are a net negative; the free exchange would be possible without them. They are just in the way. AI is what a true Asian community is like when self- described 'leaders' largely get out of the way. We really never expected to overtake what was once the go-to sub for Asians so quickly. But it's remarkable what an active, vibrant community can achieve in a short time.” – post by moderator arcterex117 The moderators and commenters noted that the main issue with r/AsianAmerican was that the moderators and community members there were “Asian women who hated Asian men” or “Racist white boys with Asian girlfriends”. There was a lot of concern in the forum about trolling from white users on reddit, in particular when it came to issues about Asian masculinity and Asian women, and to combat this the moderators implemented early on a system where they required visual identification of the user to prove that the user was, indeed, Asian. This practice of not just active moderation in terms of content and banning users based on what they post, but 180 rather not allowing members in until they proved that they were Asian, was required for r/AznIdentity and their off-site forum, AsianSoul.org (figure 86). The forum had been established in May of 2016, and wasn’t necessarily in response to a fear of bans: it was established to be a space free of white trolls, brigading, and other account suspensions (censorship). Thus, it was still sold to the members of the subreddit as a space where they would be able to openly express their views free from fear of trolling, brigading, and censorship from subreddit moderators. Figure 86. AsianSoul forum announcement from 2016 Figure 87. Anonymize your Asian Activism from 2016 The moderators of the forum would also post guides for users on how to anonymize their activism, as seen in figure 87, and to protect their real identities online, still signaling to a strange 181 dichotomy of activism within these spaces: a desire for anonymity, in fear of retaliation in their personal lives for their views. All of these actions from the forum’s inception through 2017 were highly focused on building the community, but the discourse around Asian women – although improved and relegated to a weekly “Gender Issues Thread” rather than being allowed as general discussion on the subreddit starting in 2016 – alienated a user enough to establish their own subreddit. In March of 2017, the subreddit r/EasternSunRising was established by a frustrated r/AznIdentity user who singled out the policy regarding content about Asian women on r/AznIdentity and expressing their desire to have a space for East and Southeast Asians only. The user also states that they do not believe in censoring discussions or allowing white and non- Asian users to even engage with the community, let alone be members. The user, natalie_ng, goes on to state how the community is specifically meant to be an uncensored, Asians-only space that stresses AMAF solidarity (Asian Male, Asian Female): “And last but not least, I want to stress AMAF solidarity. I want to see it reflected not only in the content, but also in our mindsets and on our team. Yes, unlike the bullshit that outsiders (and white-worshippers) try to feed everyone, OUR culture is NOT misogynistic or sexist. 
If it was, China wouldn’t have the most self-made female billionaires as it does now nor would it have proverbs that state how “women hold up half the sky”. I want a team that actually reflect OUR culture and beliefs and let everyone see that AMs and AFs can certainly work together and get along spectacularly given certain circumstances (particularly on the behalf of the AFs to extend understanding/empathy to our men who are less advantaged than us in western society). Right now, I’m the only AF on the team (the others are all AMs) since I suspect that the majority demographic of the community will be AMs, so having a team that somewhat reflects that seemed the most fitting. But once it gets larger and attract more AFs, I hope to add on another AF. I would like a VERY pro-AM/Asian AF who is pretty knowledgeable about Asian history/facts like u/paintthefqnwalls but unfortunately, I feel insanely guilty asking her to mod again when I’ve already dumped at least a couple other modding jobs on her, lol. So if anyone knows any other AF that fits the criteria I’m looking for, do recommend. There are other objectives to the creation of my sub that I’m not going to get into for now since this is already getting a bit long (feel free to read the mission statement or the “about” section on the sidebar to know more) but to conclude, I 182 want to emphasize that I want a place where my group can feel the most at home in, something that I do think is rarely given to them. Nonetheless, given how much marginalization we endure, how many negative stereotypes others throw at us and are able to get away with it, how rampant brainwashing/white-worshipping occur in our community, for the few who survived it all and recognized we only have each other for support/understanding, we at least deserve our own place where we can connect and network together. And that is one of the goals that I hope r/EasternSunRising can achieve.” – post by user natalie_ng A comment to this post by a moderator of r/AznIdentity, user aznidentity, seemed to push back on the claims of the user natalie_ng in their claims about the subreddit appeasing white people and non-Asian people of color in their community as well as censoring discussion around “Lus”. Although they wished them well, they expressed skepticism about the new subreddit being able to achieve its stated goals. As of writing, the new subreddit is still smaller than r/AznIdentity with only around 1,500 members, in comparison with r/AznIdentity with nearly 20,000. However what is interesting here is the continued fracturing of a community that the r/AznIdentity moderators were attempting to bring together: the establishment of r/AznIdentity was in response to what they viewed as a symbolic failure by the r/AsianMasculinity moderators and the larger Asian activist community at large, and the establishment of r/EasternSunRising was also in response to this symbolic failure of r/AznIdentity in their policing of content regarding Asian women and their allowing of white and non-Asian people of color to comment on posts in the community. In short, the founder of r/EasternSunRising saw r/AznIdentity as having failed in its mission in Asian activism by censoring content. Throughout 2017, the subreddit continued its mission to expand the boundaries of the subreddit community through means like Twitter and encouraging off-platform engagement in a variety of ways. 
Most notably, and perhaps most distinctly from the other case studies in this dissertation project, r/AznIdentity made a post in April 2017 soliciting project proposals, to which they would then award $1,000 in support (see figure 88). In the post, the moderator arcterex117 noted that they had started a nonprofit to which the moderators donate money, and that he himself had donated $35,000 to it to encourage its growth. This, perhaps, is what makes the practices of the forum extremely distinct from the other two: the actual institutionalization of the movement (i.e., a nonprofit) and what appears to be a group of moderators/leaders who have the financial means to support the movement itself. The establishment of a nonprofit and the sizable donation made by just one moderator may point to how and why they were able to establish a media entity such as Kulture and had the means to push their online presence on Twitter by purchasing ads – and even to hire a freelancer to maintain the Twitter account. What is still left unknown, however, is who the moderators and leaders of the forum are – an anonymity they wanted to maintain to avoid fallout in their personal lives over their affiliation with the group. Figure 88. Sponsored Projects These sponsored projects, as seen in figure 88, were meant to be seen as a continuation of other r/AznIdentity-affiliated projects like Kulture, WokeAsians, and even a clothing collection. They were not only navigating the actual digital infrastructures that allowed for the establishment and strengthening of an online community, but branching out into different structures to push their group and ideology into mainstream discourse. During 2017, like other online communities, users were keeping track of what was going on around them on different platforms as well as the political and social climate that they were in. In August of 2017, Jeremy Long14 was banned from Twitter and Instagram. Long was an Asian American adult film star who was often talked about in the forum as a positive example of Asian masculinity who subverted stereotypes about Asian men, and had previously done an AMA (Ask Me Anything) discussion on the subreddit. Paranoia and conspiracy theories about bitter white men and "white Incels" who worked in information technology sectors controlling the proliferation of positive depictions of Asian masculinity and sexuality were raised after Long's accounts were banned from these two platforms (see figure 89). Figure 89. Impeding our progress The ban was seen as a move to impede the progress of the Asian masculinity movement that r/AznIdentity was pushing so aggressively. Another user saw this ban, as well as the rise of the alt-right and the Unite the Right rally in Charlottesville, as causes for concern about the futurity of the forum, and recommended that subreddit members sign up for the AsianSoul.org forum (see figure 90; they also misidentify Jeremy Long as Jeremy Lin, a basketball player). They identified that they needed to maintain communication on "woke" networks, which seems to point to a paranoia and skepticism about the freedom and openness afforded to this community by major platforms like Twitter, Instagram, and reddit. Although the forum had been implementing policies and trying to moderate content, particularly misogynistic content about Asian women, another adverse event would change the forum's strategies once more. 14 Long is currently serving a 10-year prison sentence. Figure 90.
Woke social network Ban Wave 2017 and Networked Harassment The 2017 change in reddit policy as well as the ban wave of 2017 that saw the removal of neo-Nazi and alt right groups from reddit, as well as r/Incels, was perhaps the first “adverse event” that became of major concern to the MRAsian reddit community. Interestingly, in September of 2017, the r/hapas community had a post where a user brought up the concern of using similar terms to the alt right – in particular, “cuck” – within the forum (see figure 91). Although a moderator responded to the post stating that they hardly saw the use of the term, they changed the AutoModerator function to remove posts and comments immediately with the term anyway. This is an interesting move by r/hapas that was also seen in r/aznidentity – the need to avoid using terms that would associate them with any other kind of hate movement, in particular the alt right. For r/hapas, the main concern was being affiliated with the r/Incels community. 186 r/hapas was often compared to r/Incels on different subreddits, however this comparison was vehemently contested and fought against by the r/hapas community. Figure 91. Shouldn’t we stop using Cuck? Shortly before the announcement of the reddit change in policy that would ultimately see r/Incels get banned (case study 2), reddit user chinglishese started a subreddit, r/againstharassment, and made a two-post series outlining their experiences of how reddit facilitated the continued violent harassment of Asian women. chinglishese was a moderator of r/AsianAmerican, the hated subreddit of r/AznIdentity and r/hapas, and outlined the years of harassment that she had faced as well as recounting the harassment that other moderators of r/AsianAmerican had experienced. In their post, she described the tactics that r/AznIdentity used to silence and suppress the voices of users and moderators of r/AsianAmerican. The harassment started with users from r/AsianMasculinity and r/hapas, but what’s perhaps most distinct about the harassment from r/AznIdentity is how aggressively they tried to recruit members from r/AsianAmerican to join their community that chinglishese outlines: “Fast forward another year. We bring on new mods to stem the tide as our subreddit grows popular. More new subreddits such as /r/aznidentity pop up, their 187 rhetoric mirroring and even going further than that of /r/asianmasculinity. We hold strong in our policy to give these trolls no attention. /r/Aznidentity, despite our best efforts to ignore them, try desperately to recruit from us. They buy ads targeting our subreddit, using the ‘sponsored’ posts to circumvent our rules. (We engage with the Reddit Admins and they eventually take it down after several rounds of prodding.) The /r/aznidentity mods actively tell their members to pm individuals in our subreddit inviting them to join their sub. They hint at reddit rule-breaking behavior to circumvent Reddit’s rules against creating alt accounts to circumvent bans. Just like in the masculinity subreddit, the more "moderate" users encourage couching their woman-bashing in less misogynistic language to keep their hatred under the radar, but have no problem letting it fester in their communities.” – excerpt from post by chinglieshese In the posts, chinglishese goes on to outline a long history of harassment and abuse from these Asian-male subreddits and more specifically about the vitriol she faced as an Asian woman in particular. 
Attempts to create moderated subreddits focused on Asian women and Asian feminism also fell short – both r/AsianTwoX and r/AsianFeminism were, and still are, constantly being brigaded and infiltrated, and their users harassed by members from these other MRAsian subreddits. Note how she also mentions that they would circumvent posting bans by buying "sponsored posts," which on reddit are advertisements, in order to still muscle their way into the r/AsianAmerican community. The level of financial expenditure by this community is, unfortunately, impressive, and is perhaps one of the prime drivers behind their attempts to take over Asian communities on reddit and online, and to silence and suppress them. In a follow-up post after the announcement on October 25, 2017, that reddit would be changing its policy on violent content, chinglishese wrote: "To those of you who have stood by me and followed this saga so far: Thank you. You know who you are. I find strength in knowing we might not be as vocal, but our voices are being heard. Just yesterday Admins have announced that they're beginning to ban subreddits that harbor and glorify violent content. (/r/aznidentity has already begun trying to scrub their image while just two months ago they were celebrating the opioid epidemic in America as being "karma". Archive). They may try to smear us by calling us "PAA" (Progressive Asian Activists) but there's nothing wrong with supporting important causes such as anti-black racism, solidarity among people of color, economic justice, and ending gender-based discrimination. We refuse to be bullied into silence by self-proclaimed "social justice leaders" who present the false dichotomy of siding with them or with white supremacists. We may disagree over policy and tactics, but we'll continue to fight for a space on Reddit for the Asian American community without making room for dangerous radicalism and harassment." – excerpt from post by chinglishese The term "PAA", referring to "Progressive Asian Activist", is the MRAsians' derogatory equivalent of "SJW" (Social Justice Warrior), used in far-right circles to mock and degrade people, and particularly women, in progressive social movements (A. Massanari & Chess, 2018). Chinglishese was hopeful here that the change in reddit policy would mean that the communities would be removed from the reddit platform. As she notes, r/AznIdentity responded to this news by trying to scrub their subreddit of anything that would be seen as being in violation of reddit rules. This was most apparent with their announcement on the subreddit on November 2, 2017, "condemning" harassment and violence against any demographic (see figure 92). In this post, the moderators commented that there were a number of accounts "claiming" to be from r/AznIdentity, r/AsianMasculinity, etc. that were sending harassing messages and hate speech to other users, and that these users were not actually representative of the community at all. Figure 92. We condemn the glorification of violence This move by the moderators of r/AznIdentity, as well as their attempts to clean up previous posts that would be in violation of this policy, seemed to indicate two things: 1. That they knew that they would need to have a public-facing statement on behalf of the community, and 2. That the "clean up" of offensive content was necessary to sustain the group.
The initial announcement that reddit was starting to crack down on far-right subreddits and ban them was met with comments in the community warning other users to be wary: "Be very wary. This is hardly something to celebrate. Just wait until PAAs start calling for this sub to be taken down as a "hate" sub." (comment by user inkedotli). r/AznIdentity also brought attention to this change in policy, in particular that the subreddit would be under more scrutiny than usual, and the moderator shadowsweep made a post with reminders of what content and behavior were prohibited. In regard to the possibility of the community going down, he pointed to the Twitter page to check for updates regarding the status of the subreddit: "If this sub goes down then we'll regroup. We will post updates on asnidentity on twitter for updates. Some stuff will be lost, but the main ideas/write ups have been archived already." – excerpt from post by moderator shadowsweep In comparison, r/hapas responded semi-nihilistically (i.e. saying things like "well it's been fun" rather than trying to figure out an alternative plan), but in April of 2017 a user of the community had made an archive of all r/hapas posts from the beginning of the community to April 2017. Whether or not this was done as a protective measure is uncertain, and although the archive had been stickied on the subreddit sidebar for quite some time, it would eventually be removed after the massive reddit site redesign and the reformulation of the subreddit's purpose following the number of policy changes that saw so many communities being banned. When r/Incels got shut down, the r/hapas community started to ask the moderators if there was any backup plan or alternative forum for the community to migrate to. However, these comments were not met with any comforting news or information: a moderator themselves even responded that there was no official backup (see figure 93). Like r/Incels, r/hapas was not nearly as organized or as vigilant in preserving its group's content, its knowledge, or its history. If one visits the community today, there are no links on the sidebar where there was previously an entire section of "resources", and perhaps this is to maintain the subreddit as one that is "apolitical", thus helping to avoid surveillance by other reddit users and reddit administrators in particular. Before the massive website redesign and shift in strategy for the community in general, there were links to the r/hapas archive, halfasian.org, a list of white nationalists with Asian wives, and hapavoice.com (see figure 94). Figure 93. There is no official back up. Figure 94. The r/hapas sidebar in 2017 Although r/hapas seemed not to be as invested in responding to the ban as aggressively as r/AznIdentity, r/AsianMasculinity made a post that mimicked the one on r/AznIdentity condemning violence, as a measure to protect its community as well. These public pronouncements of the subreddit's values are an interesting tactic – rather than trying to shame the reddit administrators for censorship, they seemed to take a defensive, fortifying approach to protecting the community. After the change in reddit policy, the r/AznIdentity moderators made a plea to their larger community in an attempt to help Asian women who wanted to be a part of the forum feel more welcome and less alienated.
In the long post about the state of the community and the need to expand its boundaries to include Asian women and all Asians, moderator AsianMovement wrote: "From the birth of this subreddit to this very day , r/aznidentity has always held a strong focus on Asian American issues , especially those concerning Asian men. Nevertheless, although this focus has remained strong throughout my time as a moderator on r/aznidentity , we , the moderators on r/aznidentity have tried to expand our roots from being a male-only subreddit , to a subreddit that is inclusive of all asian americans, male or female. We have worked towards making this subreddit more welcoming towards our sisters , and we've made substantial progress in the past year. The conversations we've enabled have allowed us to understand more about each other in a way that was not possible before. Although we have made great progress , their has been one remaining problem that has persisted throughout the years , which has festered as time has gone on. This problem , is perhaps the one problem that is the elephant in the room among the AM here.**Their are many guys on this subreddit who have persisted in a pattern , where they do not give our sister's the benefit of the doubt, choosing to immediately attack them over any perceived problem , over the slightest suspicion.**When it comes to criticizing AF however , they give them the benefit of the doubt. They talk over them, as if their opinion does not matter , or holds no weight what so ever. This is unfair to the Asian women have voluntarily participated in this subreddit. It is important for everyone to know this: These women wouldn't have joined this subreddit if they didn't care truly about the same issues that you and I do. They would not choose to say here time after time , even after being criticized at every point , if they didn't give a damn." – post by moderator AsianMovement Unlike the other case studies in this dissertation, r/AznIdentity actively tried to recruit women into the community (as seen in the post above and others throughout this case study) and to expand participation in the movement by doing so – however, women who attempted to participate were often flooded with misogynistic direct messages from verified users on the subreddit. Commenters responding to this plea from the moderator claimed that they would welcome women, but not "Lus", into the community, and in particular that they would not welcome any Asian women who did not take race into consideration as the first criterion when choosing a romantic or sexual partner: "For other Asian American women who want to reconcile with us and join our cause, ask yourself if you would give other non-white men a chance in courtship. If you find dating a black man, latino (mestizo) man, or swarthy middle eastern or west asian abominable, then we won't need you either. It's obvious that you place your preference on race and race only first. You will only consider the race of your potential beau or mate first before any other quality. No. We will not accept the cliche that it's a preference." – comment to moderator post by hotasianman This intense focus on dictating others' sexual and romantic partners is known as "mate guarding", a concept from biology and other natural sciences that, simply explained, refers to a set of practices to control romantic and sexual access to a mate, in particular for reproductive purposes (Flinn, 1988; Shackelford, Goetz, Guta, & Schmitt, 2006).
In effect, it can be understood in lay terms as the response to a perceived threat of another person, regardless of gender, trying to "take" one's romantic or sexual partner – i.e., jealousy. However, within this community, the mate guarding practice is more typically seen in response to interracial dating and dating "outside of the race" – and specifically for women, not for Asian men. Therein lies a largely hypocritical part of this group's discourse and their practices – they police the romantic and sexual choices of Asian women heavily, even going so far as to insist that Asian men and women need to reproduce together (see figure 95), but then advocate for and support Asian men dating non-Asian persons. Figure 95. The importance of Asian Americans having children Further, most of the harassment and hatred levied toward Asian women, whom the group refers to derogatorily as "Lus" and even with terms like "sluts" and "whores", is almost always targeted toward Asian women who are not married to or dating Asian men (Ng, 2018). After the reddit change in policy as well as the pleas from the subreddit moderators to control the criticism levied toward Asian women in the forum – which was marketed as a "need to recruit more members" but was perhaps really to avoid the scrutiny of the subreddit being labeled as misogynistic and perhaps being banned – the subreddit continued its aforementioned tactics in expanding their group outside of the subreddit itself. Continuing with their maintenance of Kulture, their other off-reddit communities, and the monetization and professionalization of the community, what the community is perhaps most infamous for is its crowdfunded porn featuring an Asian male lead, intended to recruit more Asian men into the pornography industry – both as performers and producers (see figure 96). Figure 96. Crowdfunded porn as activism The project would eventually be funded and feature an Asian man and a white woman, and the goal of the community to crowdfund pornography extends beyond mere media representation in this specific genre – rather, it invokes the issues around desire, sexuality, and heteronormative notions of relationships that are powerful discursive constructions in modern society. However, the subreddit still maintained its desire for its leaders and community members to remain anonymous for fear of the retaliation they would face if people in their face-to-face lives learned of their involvement within these communities: to avoid the risk of exposure, they offered a multitude of different avenues for donations (see figure 97), including cryptocurrency. This ability to send monetary support through a variety of anonymous channels online also highlights an affordance of digital infrastructure that has permeated every aspect of our lives – not just the cultural and the social, but the very building blocks that make up day-to-day functioning (banking, etc.). What is impressive about this case study compared to the others is the willingness of the community members to donate money to the projects and causes of the forum – although some donations occur on r/TheRedPill and r/Incels, most of the comments there are more mocking and derogatory in regard to donations, and less supportive than on r/AznIdentity.
Throughout 2018, the subreddit would continue to fund projects to this end – to push their vision of Asian men into media, particularly on digital platforms, and to engage in any tactic necessary to control these narratives. On the subreddit, in 2018 in particular, the moderators heavily pushed engaging in conversations using the hashtags they had created, as well as starting an Instagram hashtag (#fellowyellow) and telling users to post "sexy selfies" of themselves. Toward the end of the summer of 2018, however, the term "MRAsian" would make its official appearance, and an aggressive campaign would begin across their various platforms and channels to take control of the discourse and move it in the direction outlined by the moderators/leaders of the community. MRAsians Up until 2018, the term "MRAsian" had not been used to refer to this forum. Although it is uncertain who exactly coined the term, it started emerging in August of 2018 to refer to the collection of subreddits mentioned throughout this chapter, but in particular r/AznIdentity. Specifically, they achieved notoriety after author Celeste Ng openly tweeted about her experiences with these groups and referred to them as MRAsians, and linked to another tweet by another Asian woman who screen captured the subreddit's Twitter strategy to demonstrate how their tactics were not dissimilar to those of the alt-right in leveling Twitter harassment against women and other progressive activists (see figure 98). This document with their Twitter strategy is stickied on their sidebar, under "Activism", and also includes a long list of accounts to follow that belong not just to prominent celebrities that they see as useful to their cause, but also to other r/AznIdentity members. Figure 98. How to get the most out of your Twitter account In the guide, they also encourage community members not to use their real names or photographs and instead to just google search and find an image of a "decent looking Asian male" who isn't too famous, to avoid detection that the image is fake. Regardless of where the term came from, or who used it first, it does appear that Celeste Ng tweeting the term out to her audience on Twitter popularized it on Twitter and across other media platforms. A Google Trends search finds that the term had never been searched for until late August 2018, right after Ng and others used the term to describe the r/AznIdentity subreddit and their affiliates (see figure 98). This label was violently rejected, with the subreddit's community insisting that the issues they fought for were focused on race and racism against Asians, and not on masculinity or patriarchy, which men's rights movements are known for. However, long before this label emerged, as chinglishese's story of the harassment she faced shows, those affiliated with r/AznIdentity were using the tactics pushed by the moderators to harass Asian women and LGBTQ Asians on Twitter, and even emailing, Facebook messaging, and using other means of contact to harass them into silence (see figure 99). Figure 99. Mobbing This extremely aggressive strategy pushed by the moderators to spread the group's message and to silence and suppress the voices of those whom they viewed as unsupportive of and against their goals demonstrates the moderators' intimate knowledge of platforms and their capacity for harassment. These strategies not only had significant financial backing (as discussed above) but were a key component of the group's practices and goals.
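The Google Trends observation above can be reproduced programmatically. What follows is a minimal sketch, assuming the unofficial third-party pytrends package (a wrapper around the public Google Trends endpoint, not an official Google API); the keyword, timeframe, and printed summary are illustrative only, and Trends reports relative search interest on a 0 to 100 scale rather than raw search counts, so a term that had "never been searched for" shows up as a run of zeros.

# Minimal sketch: checking when a term first registers search interest on Google Trends.
# Assumes the unofficial pytrends package (pip install pytrends); values are relative
# interest (0 to 100), not absolute search volume.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(kw_list=["MRAsian"], timeframe="2018-01-01 2019-06-30")
interest = pytrends.interest_over_time()  # DataFrame indexed by week, one column per keyword

nonzero = interest[interest["MRAsian"] > 0]
print("First week with measurable interest:",
      nonzero.index.min() if not nonzero.empty else "none in range")

If the pattern described above holds, a query of this kind would show no measurable interest before late August 2018.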
Because they view Twitter as "the realm of feminists", their strategies on Twitter in particular are indicative of their awareness of the platform's reach and its ability to shift public opinion. It wasn't just Twitter, however, that they relied on in order to cement their status as the voice and community of the Asian American movement – they even decided to establish another online magazine, as seen in figure 100, to push their message and to serve as a "publication" for their statements regarding controversy about the community. Figure 100. Emperor Magazine Using Medium.com15, they decided to call it Emperor Magazine, and seemed to model it on other men's magazines like GQ in terms of its topics. 15 A blogging platform This move, as well as the establishment of Kulture and their Twitter strategy, demonstrates that the moderators of the forum seem to possess a great deal of knowledge about how to use channels of mass communication to mobilize, indoctrinate, and recruit. Another media outlet, NextShark, was not established by the community per se but was strongly supported; NextShark is similar to other online publications like Huffington Post in covering topics from politics to entertainment to lifestyle. However, NextShark is specifically meant for Asian audiences and covers Asian-interest topics, and often posts content produced by MRAsian celebrities like Albert Hur (who was banned from Twitter in 2018). One user responded to the news about Emperor Magazine by commenting that the proliferation of these online outlets would help the community's causes: "This is great. The more independent media we have out there representing us properly, the better. We need some more stuff out there that's not quite as politically correct and "bipartisan" as the more popular western-Asian-centric mediums out there like NextShark. As it stands, our voice is considered "extreme", even when we simply deliver woke messages based in rationality. The mainstream has villainized us so much that it's gotten to this point. Just browse the twittersphere and the big mainstream reddit subs. Even reasonable points of our view get admonished as "extreme", "hypermasculazn", "incel", "jealous", "bitter", "patriarchal", "misogynistic", etc. The mainstream rhetorical space/Overton window has been set-up to oppress and marginalize our views. Calling us extreme is just one way they do this. It matters not that we come from a place of logic, rationality, evidence, and ultimately, truth. What matters to them is that our views are against their self-interest. Thus, they are heavily incentivized to keep us down. Why would they want to give up the cultural and racial hegemony that makes every aspect of their lives easier? It's about competition. Thus, by proliferating mediums like online magazines that support our views, we can disseminate the truth, and even though it goes against the grain of the established mainstream rhetoric, over time, people will read our views and be recruited onto our side. We need to employ clever marketing and information dissemination tactics to continue to recruit more activists for our cause. I'd like us to have mediums that are more moderate and mild in tone that causal readers can easily digest, and mediums that are more harsh in getting at the very core truth of what's going on that can recruit passionate supporters - individuals who've felt the racism and oppression all their lives. Thank you for starting this.
As I see it, we're still in the nascent stages of hitting the mainstream consciousness.” – comment by user NAITNC The post above indicates a level of knowledge of not just the strategies needed to push their agenda, but also of the constraints that they face in the sociopolitical realm they are attempting to infiltrate and take over. These media strategies were not implemented alone, however, and there were also similar calls in the forum as in r/hapas to stop using the language of the alt-right and internet culture to avoid affiliation with these hate groups (see figure 101). The strategy, then, was not just placed on the saturation of media sources across platforms regarding their community and its discourse, but also to discontinue using language that they view as antithetical to their goals. As the quote above also mentions, there needed to be more ‘clever marketing and information dissemination tactics,” and the control over language that wasn’t synonymous with Internet and alt right culture was a move toward this tactic. Another commenter noted that “Yes, you need a media ecology, not one to rule them all. Things need to cycle around, a pluralism of viewpoints both keeping things in check and promoting the good stuff. We need to amplify signal, minimize noise.” (comment by user 78fivealive). 201 Figure 101. Stop using the language of the alt right Using the analogy of amplifying signal and minimizing noise, the call to stop using alt right language is perhaps an attempt to minimize noise (and scrutiny). The Twitter strategy and the establishment of Kulture, Emperor Magazine, a number of Instagram accounts, podcasts, YouTube channels and more are all a part of the larger r/AznIdentity project to saturate the Asian diasporic online social world with their discourse in lieu of others. On the subreddit, there is a stickied section on their sidebar that not only has links to their wiki (as a way of preserving and maintaining their knowledge), but also all of their “Friends and Projects” (see figure 102). Thus, they not only claim affiliation but a strong one with the use of the term “friends” and by including the community’s own projects to achieve this critical mass with the ultimate goal of cultural saturation of their discourse. 202 Figure 102. Friends and Projects Similar to r/TheRedPill, this community also has a list of affiliated subreddits, but unlike r/TheRedPill, the outside links are intentionally made and established outlets to spread the message of the group. r/TheRedPill features blogs, websites, and others that are similar in terms of their discourse, but are not started by the r/TheRedPill community per se. Note above that they also established a subreddit called r/AsianAmericans (plural) which was probably an attempt to divert traffic away from r/AsianAmerican (singular), their main “enemy”. Their manipulation of language, their practices to establish a vast media ecology, and the financial backing that they gave to their community and the encouragement of harassment campaigns that are perhaps being coordinated in their Discord and Slack servers (which are instant messaging applications they use to bypass reddit censorship and rules), show how a robust infrastructure has been built to push the agenda of the group into the mainstream. 203 The focus of this group, as opposed to the other case studies in this dissertation, seems to be primarily focused on two things: 1. Networked harassment to silence voices that aim to criticize their community and 2. 
Flooding the digital and popular media ecosystem with their discourse. As opposed to preservation and archivization, like r/TheRedPill and even to a greater extent r/Incels with their establishment of a wiki and other "archives", r/AznIdentity was more concerned with its public facing media content than with the maintenance of the actual subreddit itself. Their strategy of outward involvement across platforms, as opposed to inward involvement in cultivating their subreddit community, is a different kind of media strategy and movement that highlights the possibilities of the Internet and digital infrastructure. However, there would be a frightening moment for the community when r/TheRedPill and some other subreddits became quarantined, and their response to this highlighted the tension of remaining on reddit in lieu of a separate platform. In 2018, the subreddit and the aforementioned Emperor Magazine made official statements in response to Ng's Twitter posts about the MRAsian community, as well as in response to her piece published in The Cut that told the story of her experiences with harassment, the discourse of the communities, and the harassment that had been levied at other Asian women on Twitter and elsewhere. Shortly after the Twitter posts that Ng had made in response to the harassment she had faced from their community in August, the r/AznIdentity moderators published a piece in Emperor Magazine asking the community members to "move the discourse in this direction" for those engaging in "debates" on Twitter (i.e., harassment). In the article, the writer writes in all caps: "WE ARE NOT WINNING THE CONVERSATION AT THE MOMENT." (Emperor, 2018) This tactic of shaming the community members for their lack of action had been common for years in the subreddit, particularly in regard to the lack of Twitter participation among the users. Along with these outlined strategies for shifting the discourse on these topics to fight back against Ng and other "PAAs" on Twitter, AsianSoul.org made a subforum with archived posts for users to access as references when engaged in debates on other platforms about the issues that the MRAsian community pushed aggressively and actively sought to engage in (see figure 103). The link to this "resources" repository was shared on r/AznIdentity, and the community was encouraged to use it so that they would quickly have citations and references on hand when they needed to "prove their points." This organization of arguments as easily accessible reference materials further strengthens the group's mission and identity by clearly outlining and categorizing stances for the community.

Figure 103. Evidence

Notably, the database materials also had translations in other languages, which signifies another departure from the other case studies' tactics: the other communities in this dissertation focused heavily on the English language, but being focused on Asian communities, r/AznIdentity and its affiliates have a number of resources – including their glossary of terms – translated into Korean, Japanese, Chinese, and other languages, not just for accessibility purposes (i.e., English as a second language users) but also to engage in these debates in other online communities where English is not the language used. This obsession within all three communities with having "solid proof" to refute the claims made by progressive activists who were viewed as the enemy (SJWs, feminists, "blue pillers", PAAs, etc.)
signifies that there is a value placed on "science" and "knowledge" in these communities, and on citation and reference when making arguments. However, many of these articles were misinterpreted, taken out of context, or even manipulated to meet the goals of not just r/AznIdentity, but all of the case studies discussed in this dissertation.

Figure 104. Sign up for AsianSoul.org

Although the establishment of the subreddit r/TheBanOut2018 and the change in reddit policy that would result in the quarantining of r/TheRedPill raised some concerns in the community, the moderators simply pointed to AsianSoul.org as the backup forum in the event of a forum shutdown (see figure 104). Concerns about the r/AznIdentity subreddit being shut down were assuaged after the very subreddit calling for bans of communities was itself banned due to a lack of moderation, and the conversations quickly reverted to attacking Ng and other Asian women and feminists, as well as their supporters. The continued comparison between r/AznIdentity and the subreddit that the initial founders of the community migrated from, r/AsianMasculinity, was ongoing – particularly after r/AznIdentity surpassed r/AsianMasculinity in terms of members (see figure 105).

Figure 105. Revenge

In the post above celebrating the success of r/AznIdentity, fakeslimshady notes that this is an act of revenge, and that "r/AsianAmerican You're Next", meaning that the goal of the community was to overtake all of the other Asian communities on reddit and beyond like a swarm. However, they were acutely aware of their visibility now that Ng, and then later actress Constance Wu, spoke of the community, referring to them as "MRAsians" as well as "Asian Incels" (McHenry, 2018). Despite the increased visibility, the forum did not have an explosion in new subscribers (they are still currently hovering around the 19,700 mark) and was, at the time of writing, not under any threat of bans or censorship. The conversations and discourse that had taken place on the subreddit before the rollout of new reddit policies all moved to the off-site forum or occurred in invitation-only chat servers on Discord and Slack. One user in 2018 even attempted to make a graph to demonstrate the ways that "Asian reddit" has changed and shifted, and its different factions (figure 106), which is perhaps an attempt to historicize the community at large to make sense of the landscape that r/AznIdentity finds itself in. This historicizing attempt, however, of course fit the narrative of the forum; it even mockingly refers to chinglishese's establishment of r/AgainstHarassment and uses the language of the forum in reference to terms like PAA, WMAF, etc.

Figure 106. The history of Asian reddit and their catalysts

The subreddit, and all of its affiliated projects and off-reddit platform presence, saw itself as more accommodating, more open, and more "real" than any of the other communities or factions of Asian/Asian American activism. The members of r/AznIdentity lamented that it was a shame that the users of r/AsianAmerican did not view the world through their lens, that they did not support Asian men enough, and that r/AsianAmerican did not represent most Asian Americans at all.
One user, sam_rock_well, commented that "my point is not that they are regular asians, it's that they control the narratives because they control the institutions." Thus, r/AznIdentity and its network saw itself as challenging the dominant Asian American movement's narrative and its institutions – that is, the organizations that had already been established – and as pushing against a cultural dominance they saw as detrimental to Asian men. The battle here is being fought across a variety of digital platforms, with the financial backing to support it, and thus perhaps poses the best case study of how fringe groups use not just their money but their knowledge of navigating platforms to suppress voices and amplify their own.

The Politics of Digital Space

What the practices of r/AznIdentity exemplify is an intimate knowledge of how to navigate digital space, as well as the politics of not just creating, but taking over, the spaces of those they wish to conquer. The assemblages of communication that r/AznIdentity had access to – thanks to the rise of centralized platforms and the resulting ease of community establishment – and that facilitated these political power plays are highlighted not just in the subreddit, but in all of the tactics and strategies they put forth to push their narrative as reality. The moderators, as the leaders of the community, used any resource they had at their disposal (particularly, financial) to manipulate the contours of public discussion around Asian and Asian American issues in the United States, going so far as to offer how-to guides for engaging in harassment on Twitter and other platforms. The body of knowledge they pull from in the establishment of these digital spaces is fundamentally informed by the structure of these digital ecosystems, and facilitated by them. Although they may not be archiving and preserving as much as the other case studies in this dissertation, r/AznIdentity painted themselves as not just refugees, but pioneers in the Asian American movement, for focusing not just on issues that pertain to Asian men but also for their coordinated use of a variety of media systems – both traditional (i.e., a "magazine") and "new" (digital platforms). This community is extremely strategic in trying to hide its activities and its coordination from administrators and others who wish to surveil its movements (see Appendix C for maps outlining their stances). Because they are aware of the level of radicalism that their views espouse, despite vehemently holding on to the notion that their version of reality is correct and their discourse is the ultimate truth, they are careful not just in the language of the forum – so as not to alienate newcomers – but also in the very ways that they package these views. The packaging in the form of "official" platforms like Kulture and Emperor, as well as the actual financial backing of projects, makes them seem less like a hateful online community (an image that r/hapas never managed to escape) and more like a legitimate political and social movement. As opposed to being against media attention and coverage, they openly embrace it as a community, for they view this attention as being beneficial to their cause. They hide their discourse and its formations in meme pages, in online magazines and media conglomerates, and in the production of actual media projects like a crowdfunded adult film.
This hidden delivery system is the most sophisticated of any of the case studies examined in this dissertation, and in particular, because the community does focus on issues that have been in the realm of Asian American activism for years (representation, racism and prejudice against Asians, etc.), they perhaps bypass administrative censorship because they use social justice language to describe their plight. However, it is easy to see with just a few clicks how deeply misogynistic, essentialist, and ethnonationalist they are, and although the moderators have been strategic in trying to clean up this content, it remains embedded in the "official" articles and readings the subreddit provides its users, as well as in its antagonization of other Asian spaces on reddit (r/AsianAmerican and the Asian feminist subreddits in particular). Although they have never been truly threatened by a ban, this community started because of censorship on another subreddit, and in response they let a specific form of erotic rage drive their practices. The threats to this community come less from the digital platforms themselves than from a society and movement they see as not echoing their violent views on women, and they react and coordinate accordingly. If it weren't for the nature of the content, some would dare to say that these strategies and the establishment of entire ecosystems online are impressive, and they demonstrate the power of platforms from a slightly different perspective than that of their network and visibility as a subreddit. What gives the movement its strength is what is perhaps less visible upon first glance, and how they obscure their true goals through these symbolic practices. Similarly, what gives all three cases their strength is often the very thing that threatens their existence. What r/AznIdentity does well is staying one step ahead of administrators and outsiders: moving their more hateful discourse off the reddit platform, pushing money into their projects, and having a unified strategy that is supported by a majority of the community members. Further, another key part of their success is their sprawling media ecosystem that exists outside of reddit: although r/TheRedPill and r/Incels did do something similar, r/AznIdentity has been instrumental in hiding these affiliations on other platforms, similar to "cloaked" websites on the right where authorship is hidden to advance a certain political agenda (Daniels, 2009a). The difference here, however, is that the rhetoric is often packaged as Asian and Asian American activism, going so far as to use the same language and report on the same issues, though it quickly becomes clear what kind of message these cloaked sites are trying to push into the mainstream. Regardless, r/AznIdentity is similar to but wildly different from the other two case studies in one significant way: its defensive strategies as well as its financial support.

CHAPTER 6: Discussion and Conclusion

Is deplatforming an efficient method to halt the spread of extremism? Advocates claim that it is effective, but deplatforming is perhaps not the most effective solution in the long run. Concerns about deplatforming's effectiveness were raised in January of 2019, when the Southern Poverty Law Center released a report that detailed the activities of a web domain hosting service called Epik.
Owned by a man named Rob Monster, the company has quietly stepped in to host a number of platforms like Gab and BitChute (the "alt right YouTube"), and has served as the web host for the popular neo-Nazi podcast Radio Wehrwolf (Makuch, Koebler, & Mead, 2019; Southern Poverty Law Center, 2019). Although Monster denies that his political views align with those of the alt right, the company updated its Twitter biography to claim that it is a web hosting service keen on "protecting free speech," and Monster also made an account on Gab, regularly interacting with users (Makuch, 2019). Thus, despite the attempts to deplatform entire social media websites as well as individual podcasts, celebrities, and pages from Facebook, Instagram, and the like, the way these policies and bans are enforced is not consistent across platforms or providers (i.e., domain registrars, etc.). For instance, although Facebook, Instagram, and Twitter banned Milo Yiannopoulos, his YouTube page is still active. Despite the attempt to shut down Gab, the site merely moved to a new "home" by finding a domain registrar willing to house it. Because registrars and hosts like Epik provide the necessary infrastructure for these websites to exist, the sites manage to persist in the face of failure. Thus, although platforms attempt to halt the spread of these groups and ideologies, the infrastructure provides an alternative route for them to re-establish themselves. Much of the conversation surrounding these groups, both in popular media and in academic scholarship, tends to home in on what platforms can do to halt their spread, as well as calling out the logic of the platforms that were manipulated for malicious ends (Daniels, 2018; Kraus, 2018; A. Massanari, 2015; Noble, 2018). But the strategy of deplatforming to halt the spread of extremist thought is a double-edged sword. On the one hand, it can be an effective method of halting the spread of hate speech; on the other, the infrastructure of the web means that these websites, people, and communities can travel elsewhere and find new homes. Additionally, even though these celebrities are deplatformed, the movement itself does not die – the celebrities are merely figures for this ideology and discourse, and although they have a prominent place in the movement, they are ultimately not necessarily its "leaders." The double-edged sword element of deplatforming comes in the form of how deplatforming, and censorship, often serve to strengthen group identity and adherence to the cause rather than discouraging it, as shown in the cases of this dissertation. What deplatforming prominent figures like Jones, Yiannopoulos, and others does in turn is alert communities to start preparing for the possibility of failure – like a siren, the deplatforming of prominent figures as "examples" of what will happen to those who peddle hate speech does not necessarily function as a warning for others not to follow suit, but rather gives them time to strengthen their communities against this failure. The actions of these platforms are what these extremist groups respond to, and the old cliché that "every action has a reaction" is fitting for what is occurring in this complicated web of action. By focusing on the practices of these extremist groups and their meaning making actions, we are better able to see how they build something that is enabled by platforms and infrastructure yet transcends their limitations.
The hidden systems that these groups build take shape in the symbolic infrastructures that they create with the aid of material and digital infrastructure. Scholarly and other political commentary often discusses the ways platforms incentivize and allow these groups to grow - and the ways that platforms shed themselves of responsibility for what people do on them (Gillespie, 2010) - but we talk less about what these groups do, how they navigate these structures, and what they build. Therefore, what is missing from previous analyses is how these groups manage to persist – not due to the nature of platforms, but due to these groups' practices that ensure their futurity. A large part of what enables them to sustain their communities is that they build a symbolic infrastructure that is able to transcend the constraints of the digital platforms. The platforms that these groups use are simultaneously the very thing that facilitates their existence and the enemies that they are constantly fighting against. As such, critical analyses of online extremist groups must go beyond merely examining the groups and how they attempt to inject their discourse into the mainstream, and also consider what they do in order to sustain themselves. In an effort to begin understanding how these groups manage to persist despite all of the attempts to stop them, I observed these communities and their practices over the course of two years. Although I initially wanted to understand their discursive practices and how those practices build their communities, what became of greater interest was the sustainability practices that allowed them to persist even after attempts to silence them. At the time of writing, there have been a number of massive changes in the ways that platforms contend with the issue of hate speech and extremism in their digital spaces. Although alarm bells were raised in the mid-to-late 1990s about the growing number of white supremacist groups on the Internet (Blazak, 1999; Stern, 1999; Thiesmeyer, 1999), it seemed that many of these concerns fell by the wayside and were not perceived to be a legitimate threat. It would be nearly 20 years later, after the election of the 45th president of the United States, that laypeople acknowledged the power of digital infrastructure in facilitating the rise of extremism. Although technological determinism was rife, particularly among media outlets that pointed to these platforms as the cause of the rise of extremism, one cannot ignore the power of sociotechnical systems in facilitating a place for connection, for identity building, and for mobilization (Alava et al., 2017; Daniels, 2013, 2018; Koulouris, 2018; Neiwert, 2017). While acknowledging that the Internet has a hand in the radicalization process, scholars and policymakers concede that controlling hate speech and extremism online is a tricky endeavor due to the nature of discourse itself, which exists in and outside of digital platforms. Although many activist groups, individuals, and scholars are applauding these major platforms (Twitter, Facebook, and reddit) for moving towards a policy of deplatforming hate speech and violent communities, the discourses and ideologies of the online extremist groups continue to persist. It is not necessarily the nature of platforms or the nature of social movements exclusively that has enabled the continued growth and spread of these groups, but rather that all of these forces are dynamically interrelated.
For example, the groups in the case studies of this dissertation successfully navigated the constraints of their platforms, which paradoxically hindered and enabled their community construction. Their deeds, thusly, implicated the structure and administrative governance of the very platform that they antagonized. Despite the animosity leveled toward the platforms, these websites provided the necessary foundation and space for these groups to enact their practices. Given that these platforms simultaneously constrain and enable mobilization, the three case studies at the center of this dissertation all consisted of different kinds of movement and sustainability practices that point to how these modes of production, these changes in responding to the platform’s constraints, and being able to navigate the affordances of web infrastructure allow for their perseverance. These practices include archiving, fortification, identity maintenance, and network building, all of which contribute to the creation of a symbolic 215 infrastructure that can be carried from platform to platform. It is this infrastructure that they create through these practices that allows for them to move from platform to platform, shifting and evolving across the rhizome that is the web, and this possibility of movement while preserving their community is what allows them to persist despite infrastructural failures in the form of bans and censorship. The possibility of movement was highlighted in each of the case studies in this dissertation project, even as each group’s practices both mimicked and deviated from one another. The mimicked practices are obvious at the level of the nature the groups themselves (all are a part of the Manosphere, but have very different goals), but deviated because of the strength of their preparedness and their organization. In particular, the “mimicked practices” were due to the constraints and affordances of the platforms that each group found themselves on – they could only do so much, and go so far, within the boundaries of reddit. Ideologically, they can all trace their discourse to a misogynistic and racist worldview; but the amount of attention, pushback, and censorship they have faced occurred at different times and with varying intensity. The variance and intensity of the levels of infiltration, surveillance, and threats were mostly dictated by the size and infamy of the community, but they were all under similar kinds threats nonetheless due to their affiliation with more extremist groups. In particular, they were all under the most threat by the reddit platform itself, who continually changes its policies that threatened the groups’ existence. The responses to the situations that these three groups found themselves in all had different catalysts: policy updates on reddit; intense media attention; and pressure from other users were all involved in the quarantining, banning, or fear of banning that occurred in each group. 216 Despite the differences in catalysts for action, all three of the groups in the case studies adopted similar practices to bypass the rules of the platform and to exploit its affordances: 1. Stricter content moderation, including banning users and creating new affiliations for the group within and outside of it (identity maintenance and network building); 2. Establishing off-site forums and “contingency plans” (fortification); and 3. Archiving. 
It would not only be at the platform level that these threats and bans would occur - for instance, incels.me was banned by its domain registrar and had to move to another forum (incels.is) on a different domain. The threat to these groups did not exist just at the reddit/platform level, but at the infrastructural level in and of itself. These modes of infrastructural failure and platform failure are interrelated with one another, and informed the practices of the groups in tandem in response to these fears. To be more precise, at an infrastructural level, reddit provides the necessary digital space for communities to establish themselves without having to create their own forum infrastructure, making it easily accessible. However, reddit in its capacity as a platform has also evolved its practices of governing what users do on its site, reflecting a larger shift among platforms from absolving themselves of any blame for what their users do toward ultimately acknowledging their complicity in facilitating the growth of extremism (Daniels, 2018; Donovan & boyd, 2018; Lyons, 2017; Marantz, 2018; Noble, 2018). By enforcing stricter policies about what is acceptable within the platform, large-scale communities like reddit, Facebook, and other social media are now instigating failure for groups that fail to respect these rules. These changes in policy, which impacted the very infrastructural stability of the communities due to a shift in platform governance aimed at quelling extremism, are what instigated these failures, but this failure was not a surprise – it was anticipated. And because it was anticipated, the groups who were under the most threat could prepare accordingly. In the following sections, I discuss the ways that this anticipatory failure ultimately led to action, and resulted in defensive and protective measures being enacted to ensure the groups' futurity. It was this anticipation of failure, and failure's power to motivate action, that ultimately resulted in a set of practices which created something I have termed symbolic infrastructure. I explain this concept after summarizing the events and practices of the groups in the following pages, and explore some of the other emergent themes from the study of these three groups to better understand the nature of extremism and social movements in the 21st century.

Anticipating Failure

Every change in policy was a warning sign for each of these groups, who were keeping their finger on the pulse of the changing nature of platforms in their enforcement of stricter policies. In the case studies, each group had differing levels of preparedness in the event of an infrastructural failure: r/TheRedPill started its preparations in 2015 after the first wave of bans on reddit that would set the precedent for the banning of groups in the years that followed; r/Incels had no disaster preparedness plan set up years in advance, although the forum that they moved to had been created before the subreddit was ultimately banned; and r/AznIdentity was acutely aware of what was happening in other spheres of reddit, planned accordingly, and evolved its moderation practices with each new policy made by reddit administrators. r/TheRedPill did something similar, adapting their moderation of content and users of the subreddit in accordance with reddit's new rules in order to evade censorship and banning.
r/TheRedPill and r/AznIdentity established off-site forums and websites (including chat servers like Discord and Slack) long before any threat was actualized in their communities, but r/Incels did not do the same. Although it appears that r/Incels announced the new forum in a Discord channel, they lost a significant portion of their reddit membership in the move. The attempt to announce the new forum after the ban of r/Incels, which failed to reach a majority of its members, indicates that r/Incels, compared to r/TheRedPill and r/AznIdentity, did not actively try to build a stronger community that would pay attention to these impending platform bans. Although r/Incels members were aware of their infamy on reddit and on the Internet in general, the level of "disaster preparedness" in the forum was far less than that of r/TheRedPill and r/AznIdentity. The subreddit also had no archive of its content, and thus in the ban and move to incels.me, lost all of its posts, user information, and more. r/TheRedPill and r/AznIdentity both actively curate and maintain archives of their content, and community members also participate in this archiving process by offering their coding skills or their own personal collections, often created before the moderators' requests or announcements that archivization was occurring or being planned. Similar to the case studies done by De Kosnik on feminist web forums and fan communities (particularly fan fiction), these communities relied on a crowdsourced approach to archiving their content in order to access a certain level of power afforded to them by the ability to make and maintain archives at all (De Kosnik, 2016). The possibility of creating archives puts power in the hands of the users of these communities, who, by establishing them, gain the ability to shape a version of truth and reality that supports what happened within their websites. This labor, though, requires a certain level of devotion and loyalty to the group and belief in its cause (De Kosnik, 2016). Within the case studies of this dissertation, differing levels of loyalty and commitment to preserving the group were present: r/AznIdentity's members are perhaps the most committed to the group, often donating money for the community's initiatives like Kulture media and the crowdsourced pornographic film; r/TheRedPill has a handful of devoted members who help the moderators with their efforts; and r/Incels is perhaps the group with the least "devoted" members in terms of financially and structurally supporting the community. The practices of establishing archives and other sustainability practices to preserve the group and to strengthen it did not occur organically. In fact, the moderators of each group (who serve as the de facto leaders of their communities) were the ones actively pushing for this archivization and other modes of preservation in response to infrastructural failure. But this also brings up questions about the intensity with which these new moderation practices were enacted: r/AznIdentity and r/TheRedPill are still active and present on reddit, meaning that those two communities currently have very different content compared to, say, their beginnings, due to changes in reddit policy that the moderators took extremely seriously in fear of their community being banned.
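The crowd-sourced archiving described above, in which community members contribute scripts or personal collections to back up subreddit content, can be illustrated with a short sketch. This is a hypothetical example rather than code from any of the communities studied; it assumes the PRAW wrapper for reddit's public API, and the subreddit name and credentials are placeholders.

```python
# Minimal sketch of crowd-sourced backup work: pulling recent submissions
# from a subreddit into a local JSON file. Subreddit name and credentials
# are placeholders, not values drawn from the communities studied here.
import json
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder API credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="archive-sketch/0.1 by yourusername",
)

posts = []
for submission in reddit.subreddit("examplesubreddit").new(limit=1000):
    posts.append(
        {
            "id": submission.id,
            "title": submission.title,
            "author": str(submission.author),   # None-safe string form
            "created_utc": submission.created_utc,
            "selftext": submission.selftext,
        }
    )

# Write the collected posts to a local backup that others can mirror.
with open("subreddit_backup.json", "w", encoding="utf-8") as fh:
    json.dump(posts, fh, ensure_ascii=False, indent=2)
```

Backups of this kind, passed around among members, are part of what allows a community to re-seed its content elsewhere after a ban.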
The moderators of both forums did not apply these new, stricter moderation practices only to new content after the policy changes: they also actively went through and deleted past content as a protective measure. r/Incels attempted to subvert the surveillance from reddit vigilantes that drew attention from reddit administrators and the whole of reddit by making their subreddit private numerous times in 2017, but did not necessarily overhaul their moderation strategy the same way the other two communities had. This would change after they moved to their off-reddit forum, incels.me, which would also experience a failure, but without the massive consequence of losing everything. Further, by moving to an off-reddit forum, they were no longer subject to the rules that had brought about their downfall on reddit – but this did not protect them from failure. In 2018, the .me domain registrar banned the community following high-profile killings that were attributed to the Incels community, which marked a new shift from platform censorship to infrastructural censorship. This shift had of course occurred before in the extremist circles of the web, with the neo-Nazi website The Daily Stormer being banned from the web hosting service GoDaddy and then Google, ultimately bouncing around a variety of domain registrars until finding a new home (Lavin, 2018; Mettler & Selk, 2017). This ability to jump from web host to web host also highlights an important affordance of the web beyond just the establishment of forums and archives: the ability to move and establish new homes. These modes of digital migration are fundamentally a part of how these groups manage to persist, and can also serve as a shared experience that strengthens group identity.

Migration Across the Digital Frontier

The movement and establishment of new homes across the digital frontier creates an ecosystem; even if some of the "homes" are destroyed, they are remembered and kept as a part of the collective memory that informs the group identity. All three of the groups in the case studies exhibited some level of world building in the establishment of their ecosystems – the extent to which differed, as well as where this world was built. In the case of r/TheRedPill, the efforts to build their world were mostly concentrated on reddit and, up until the quarantine, the off-site forums and other websites they had built did exist but were not active. Focusing on reddit, perhaps because the community itself was first established there as a combination of Pick Up Artistry and Men's Rights, r/TheRedPill built an entire network of subreddits related to their discourse and ideological beliefs. The creation of a vast reddit network was intentional for two reasons: 1. To help moderate and control the discourse on the main forum and 2. To appeal to a larger group of people. To do this, they established r/RedPillWomen; r/RedPillParenting; r/AskTRP (for newcomers, etc.); and even r/altTRP (Red Pill teachings adapted for gay men). They did establish forums.red and the other backups after threats like the changes in policy dating back to 2015, but these were not the main "hub" of their community.
In contrast, r/AznIdentity was started by a group of users banned from r/AsianMasculinity, and although they did a horizontal move by staying on reddit, they also expanded their activism and community beyond reddit through the establishment of Kulture Media, their Twitter strategies, as well as Instagram and Facebook pages to share their content. In the case of r/Incels, it’s a bit difficult to hone in on the “togetherness” or the “hub” of the larger Incel community. Because Incels are spread out all over the Internet, there is no one “hub” like r/TheRedPill or r/AznIdentity for this particular group. The forum studied in this dissertation is perhaps one of the largest and most well-known Incel communities, and their world building practices – after they were banned – consisted of maintaining backups of the new forum, establishing a wiki, and starting to manage some form of archive for their content for their larger userbase. Despite this, the Incels community studied in this dissertation is still the smallest in terms of membership: r/TheRedPill has over 300,000 users, r/AznIdentity is over 20,000, and r/Incels is difficult to tell from their forum, but before the shutdown of their subreddit, they only had roughly 9,000 members. All three communities also hierarchically categorize their members – based on number of posts, their level of celebrity within the group, whether they are moderators/administrators, etc. As such, the management of the communities is based not just on maneuvering around digital infrastructures but enacting and reinforcing certain practices that have been decided by a small group of power users/moderators. The way this is reinforced, and enacted, is facilitated by the digital platforms that these groups find themselves in – demonstrating the entanglement of practice, action, infrastructure, 222 and platform. In particular, the way that these practices are enacted are most visible through the use of coded language within the forum to strengthen a group identity and to perhaps bypass censorship, as well as moving certain modes of discussion off the platform altogether to avoid surveillance. Jumping from platform to platform, and having the choice to, is an intentional one that shows a high level of knowledge of what is and isn’t allowed within a specific space as well as understanding the need to manage their public image. The identity that is created and put forth, and the one that is consistently supported through moderation practices and the items that these groups choose to preserve, is one that is created and reproduced by the de facto leaders of the forums. The initiatives, the strategies, and the contingency plans are not necessarily created and deployed after mass consensus of the forum users, but at the discretion of the forum leaders themselves. Despite their power, leaders are not a constant force within any community or movement. Although many movements see the eventual downfall of their leaders, including the alt-right in the case of Richard Spencer and even Steve Bannon (Weill, 2018) and a number of other social movements throughout history (Gitlin, 2003; Hardt & Negri, 2017; Milan, 2015a), many of these movements persist despite losing their “leadership.” What is left behind in the wake of leaders leaving the community, or even at times being ousted from them, are the social norms of the community, the identity that was cultivated, and the things that they created. 
The foundational discourse and practices that they establish and maintain during their tenure are continued – by providing the basic framework, they provide the necessary structure for new members and future leaders to build upon. This continued layering is made possible by the things that are preserved, thus creating a symbolic infrastructure – a shared history, a collective identity, a shared narrative, and the cultural memory of members both long and short term. Taking all of these practices, and the situations that the groups found themselves in, together, I argue that symbolic infrastructure is what these groups carry with them after material infrastructural failure, and that the pieces that make up this symbolic system are made possible by the affordances of digital platforms. In the following sections, I explicate symbolic infrastructure and how its creation allows for the persistence of hate groups over time despite attempts to thwart their spread. Facilitated and aided by existing web platforms and digital infrastructure, these groups are able to build something that transcends the material constraints of the platforms that threaten their existence and to navigate infrastructure in ways that ultimately allow the community to sustain itself. But what they carry with them is this symbolic infrastructure, and it is through their practices that it is created.

Symbolic Infrastructure

Infrastructures are layered and embedded systems that shape the very contours of everyday life and guide the actions and practices that make up societies and communities (Bowker, Timmermans, Clarke, & Balka, 2016; Larkin, 2013; Parks, 2015; Plantin et al., 2018; Star, 1999). Digital infrastructure, the technical systems that allow for the creation and growth of these groups, facilitates the practices that are the basis for these communities by providing a space for interaction and connection (Flesher Fominaya & Gillan, 2017; Milan, 2015b). What symbolic infrastructure means, as opposed to terms like "knowledge infrastructure" as coined by Bowker and others (Borgman et al., 2013), is an infrastructure that carries these groups' discourses across platforms despite infrastructural failure. Knowledge infrastructure, in comparison, is a system meant to facilitate the sharing of knowledge among institutions and organizations. But unlike institutions and formal organizations that develop standards and practices for the easy sharing of information from one group to another to facilitate work (often scientific, but in other industries as well), symbolic infrastructures are meant to preserve the culture and discursive practices of a community that often exists in a liminal space where its beliefs do not adhere to mainstream culture. Star once famously said that infrastructure only becomes visible upon its breakdown (Star, 1999), but these groups lived with the constant threat of breakage, meaning that although there had not yet been an infrastructural failure, the infrastructure became visible through their responses to the threat of breakdown. Symbolic infrastructure helps to assure the continued survival of the group, the ones included in this dissertation and otherwise, in the event of bans and censorship. These groups build vast archives of information and use spaces like the reddit sidebar to place readings that ensure socialization at scale; the symbolic infrastructure also consists of the cultural memory that members hold of the group, which is shared, moved, and built upon.
Symbolic infrastructure is made up of not just the material artifacts that these groups create, but is primarily premised on the shared practices of these groups that produce and reproduce their discourse, their identity, and their networks. The practices that build up and support this symbolic infrastructure include, broadly, the practices of archiving, fortification, and identity maintenance and network building. The subpractices that inform these larger practices are demonstrated in the diagram presented in figure 107 – but of course, these small subpractices all inform the larger creation of this symbolic infrastructure that these communities are able to carry with them from platform to platform. All of these processes (in particular, archiving) have occurred in a number of different web communities, social movements, and other groups attempting to shift their cultural realities (boyd, 2010; De Kosnik, 2016; Gamson, 1992; Lee, 2016; Polletta, 1998a). But taken together within the specific context of web-based communities and activism, they create the necessary infrastructure for these groups to continue to persist in the face of infrastructural failure. These symbolic infrastructures do not exist independently, and, like knowledge infrastructures, are often in interaction with and layered upon one another (Borgman et al., 2013; Star & Ruhleder, 1996). This is most evident in their shared discourses, but also in the shared spaces that these groups continue to inhabit (i.e., reddit) or used to inhabit next to one another. Symbolic infrastructures, like other forms of infrastructure, are relational and are contingent upon a collectively agreed-upon definition of what the group is, its boundaries, and its collective practices. The three case studies in this dissertation demonstrate the ways that these practices are enacted, reinforced, and evolved as the needs of the community change in relation to threats of censorship and bans – and what allows them to change, to evolve, are the symbolic infrastructures that they create. Symbolic infrastructure, unlike collective identity and culture, has an intentional quality to it, which enables these subcultures and social movements to embed themselves within the existing cultural infrastructure they see as oppositional to their goals. They are not only trying to embed – they are attempting to replace, to take over, and their practices of archiving and establishing a collective body of knowledge indicate this desire to spread out across online communities in an attempt to inject their discourse into the mainstream – and to shift culture. The groups are all connected by these practices as well as by the similar ideological leanings that shape their discourse. This shaping is a critical part of the creation of symbolic infrastructure because it highlights the curation process that helps to build it – i.e., what these groups choose to include in it – and these choices point to how they organize and distribute information and knowledge within as well as beyond the group itself (Borgman et al., 2013; Larkin, 2013; Renzi, 2015; Thorson & Wells, 2016). These possibilities of action, these affordances granted to these groups by the shape of digital infrastructure, are what enable these processes.
Below, I explicate specifically how these subpractices inform the creation process of symbolic infrastructure in the goal of epistemic production, and how each of these subpractices also interrelate to one another (see figure 107). Figure 107. Symbolic infrastructure and the practices that build it Identity Maintenance and Network Building. One of the key components and subpractices in the creation of a symbolic infrastructure are the practices of identity maintenance and network building. In particular, all three of the groups enact and maintain collective identities through the use of narrative-heavy “theory” readings, reinforcement of social norms 227 and forum rules within the communities, and by establishing their own language and continually socializing new members into the group through these practices. These readings, archives of top posts, their glossaries and rules, as well as strict enforcement of social rules within the community all serve to help these groups re-group and re-establish themselves even if they are kicked off of one platform. Even in the case of Incels, who had no archives or backups after they were banned from reddit, what helped the reestablishment of the group on their off-reddit website was the collective knowledge and memory of the users who migrated from the subreddit to the new forum. Although belief in the discourse and the ideology behind it is one fundamental motivation for users to continue participating in these communities, it is also the connection and collective identity that they feel with other users that keep them returning, sharing their knowledge, and reinforcing the doctrine. Identity maintenance and network building happen both internally for the group through encouraging relationships with other users within the forum and by maintaining a certain group identity as envisioned by the group leaders. Externally, this occurs through the establishment of new relationships with outside websites, groups, and even through public statements in the response to a media crisis. All of these practices place boundaries not only on the group itself, but also place boundaries around the discourse of the group, and helps to reify and reinforce a certain group identity. The creation of an us vs. them dynamic occurs with creating boundaries around the group itself (i.e., we are TheRedPill, not Incels) and the larger subcultures they found themselves within, but also with the extremely aggressive othering of non-members – whether this be the “normies” for the Incel community, “Blue Pillers” for TheRedPill community (Blue Pillers are those who are not “awake” yet, i.e. having not taken the Red Pill), or PAAs (Progressive Asian 228 Activists) for the MRAsians. The establishment of a collective identity is a key part of the identity maintenance activities which also include the use of a shared language that is a foundation for performing these identities (Benwell, 2014; Black, 2006; Bourdieu, 1999; J. Butler, 1988). Identity also informs the boundaries of the group – those that do not adhere to or support us are not a part of this community, as seen previously in Case Study 1 where r/TheRedPill continually reminded members that they were not a part of reddit, and occurred in the other two communities of r/Incels and r/AznIdentity with their vehement denial of being affiliated with other Manosphere communities pointing to a desire to stand apart. 
This denial of affiliation was also a strategic move by these groups after they noticed the ban of r/Incels and the quarantine of r/TheRedPill – by separating themselves from the larger Manosphere community, they were trying to sustain themselves. This practice of identity maintenance and network building is crucial across different organizations and institutions in the establishment of shared practices that ultimately create and maintain certain types of knowledge, privileging it above other forms or the knowledge of other groups (Bowker et al., 2016; Cozza, Gherardi, & Poggio, 2018). Narrative, in particular, is a key part of this boundary work within social and political groups because it not only expresses shared practice but a shared set of beliefs that guide the practices at all (Brown, 2006; Cozza et al., 2018). Although there was a lot of boundary enforcement concerning how each of these groups saw themselves within the larger digital ecosystem, in particular within the Manosphere, the way that their collective identity was supported was not necessarily through this distinction but rather the distinction between them and their non-Manosphere counterparts. Often, these would be the very people that they would often target, either individually or as a collective, for harassment 229 campaigns, as a “boogeyman” type character, or to refer to entire groups as a whole in order to continue building up the identity of the community itself. But a part of this identity maintenance and network building included expanding the boundaries of their groups – for instance, r/TheRedPill started establishing communities for women, gay men, parents, etc. and r/AznIdentity also did something similar by evolving their initial goals to include more communities to strengthen their group. This occurred not just in affiliating with certain people like in the case of Manosphere celebrities on r/TheRedPill but also in creating alternative subreddits, Instagram accounts, Facebook pages, chat servers, and more. This spreading out is a key component of the network building practice that helps to fortify the group against attacks – since they are not concentrated in one location, an attack on one (i.e., r/TheRedPill being shut down) does not mean that the entire group is damaged. This network building allows for the group to move, to seek support, and to strengthen themselves against future attacks. This strengthening, in particular, takes shape not just in maintaining group identity but also by using the material structures afforded to these groups by the structure of the internet in the shape of fortification. Fortification. The work of identity maintenance and network building also informs the practice of fortification and resilience building – in particular, the establishment of larger networks, affiliations, off-site forums/archives, and identity validation. Fortification is a practice that takes shape in both the symbolic and material forms – on one hand, the fortification occurs as a process along with identity maintenance and network building, but it also takes shape when the groups built these material/digital spaces for them to move to. The most obvious mode of fortification came in the form of establishing off-reddit platforms, chat servers, and other kinds of social media accounts like on Twitter and Facebook. 
Fortification occurred in different modes as well: aggressive moderation after each policy change on the reddit platform, and even on incels.me following their removal from the .me domain; cleaning up offensive content to avoid providing ammunition for digital vigilantes to point to these groups as toxic and extreme; and also "damage control," public relations-esque responses in the form of official statements, press releases, and even changes to their unofficial policies on speaking to the press. In terms of moderation, it was not only the heavier policing of new content and the removal of offensive old content, but also the practice of making subreddits private, disabling posting, and bringing in new moderators to help control the image of the specific community. Thus, fortification is more than just creating alternative homes (i.e., a kind of doomsday bunker); it is also the practice of defensive techniques in the face of threats. These fortification strategies were a way of building community resilience that would ensure the survival of the group. These sustainability tactics, in the form of off-reddit websites, the establishment of official stances and statements, and the sharing of "doomsday plans" from the moderators to the larger community, supported the continuity of the group and helped lessen the paranoia and fear of regular users. By fostering a sense of safety through these contingency plans, the fortification was not just material but affective as well, and helped to solidify the collective identity and allegiance to the community. Another key component of these fortification strategies was the building of off-reddit archives by regular users as well as moderators, who would enlist the help of the community in building these backups of information. However, archiving is far more than a practice of preserving content merely for extraction; it is a political act in and of itself that is facilitated by the affordances of digital platforms. Archiving as an attempt to gain access to power has been used in multiple contexts, but it is used by these groups in particular to sustain their communities. Archiving. All three of the case studies practiced some level of archiving of the content that was created and shared within the communities themselves. The archives ranged from text posts, to entire collections of "readings" that the moderators deemed important to the discourse of the community, to images/memes that the group shared amongst themselves. The archives existed not just as backup repositories of information for the community in the event of infrastructural failure, but to ensure that their ideas wouldn't die out with them in the event of failure. The realities that they were building within these communities suggest that their archiving and knowledge preservation practices were not just meant for the preservation of the group but were an attempt to make an incision into the body and fabric of history itself. As Lee noted with Indonesian activists, this "cutting into history" through the use of collective memory, archivization, and knowledge production and preservation made incisions on history and culture that "nobody thought to close" (Lee, 2016, p. 15). This incision created the necessary opening for those who did not have the access and power to shape the truth to enter, but in the case of extremist groups it gave them the opening to enter, infect, and spread.
By doing so, they did not enter wanting to change the status quo but rather to create something new. The possibility and the ability to create these archives, to cut into history, is what De Kosnik referred to as “rogue archiving,” carried out by nonprofessional archivists who engage in cultural preservation on the Internet and also in its remixing (De Kosnik, 2016). But this rogue archiving is not just the work of marginalized or minority groups; it can also be used to support and preserve the interests of dominant groups. Regardless, the creation of an archive gives a group a status of existence by providing proof that it was once there (De Kosnik, 2016). Archiving is a key process not just in preserving information but in preserving group identity and discourse. In Lee’s context, as in many other studies of how social movements use narratives, collective frames, and archivization to support their movements and communities, the activists were attempting to push back against oppressive regimes in the name of democracy and freedom, and they tended to lean left-wing. These archives served as more than just repositories of information, both in Lee’s context and in the cases in this dissertation – they were a way of shaping the past in order to inform the present and of projecting this new reality, grounded in historical time, into the future. The incision cuts into memory and consciousness, grounding the group in a historical imaginary that becomes actualized through these practices (Lee, 2016; Polletta, 1998b; Trouillot, 2015). This epistemic production occurs through the practices that build this symbolic infrastructure, and it is these epistemic forms that foreground and influence the group’s core identity, ideology, and discourse. It is through this epistemic production that these new realities are produced and maintained – symbolic infrastructure, unlike participatory culture, is not a playful practice and process but rather has the specific goal of shifting culture itself. In particular, symbolic infrastructure relies on a number of strategies of subversion meant to facilitate the continued survival of the group.

Strategies of Subversion

It cannot be ignored that the actions of the three groups were adapted and learned through participation in larger digital culture, particularly their involvement in “toxic technocultures” that led them to these communities in the first place (A. Massanari, 2015). Similar to other communities of practice, membership in these groups required a certain level of knowledge of how to use the platforms and navigate their design – as well as knowledge of the correct language and other social norms needed to participate in a meaningful way within them (De Kosnik, 2016; Escobar, Kommers, & Beldad, 2014; Johnson, 2001; Nakamura, 2002). These practices follow the logics and the constraints of the platforms the groups are on – particularly in the realm of moderation and surveillance by platform administrators (Gillespie, 2017, 2018). Beyond enacting their own rules for what is considered acceptable behavior within their social realm, the members of these groups – and particularly the moderators – enacted strategies to subvert, circumvent, and challenge the official “rules” that governed the entire platform rather than just their one small community. In particular, this subversion took the form of evolving moderation practices, concealing discussions by moving off the platform, and hiding their agenda in a variety of different mediums.
These subversions served not just to avoid censorship but to shift culture in a direction that favored the groups. Instead of having to abide by official versions of history and the rules that govern the authority to know and decide that truth, they are able to subvert these authoritative standards on who gets to decide what is history and what is truth (De Kosnik, 2016; Foucault, 1982; Stoler, 2008; Trouillot, 2015). The battle over power and truth is fundamentally shaped by access to the systems that grant the ability to decide history and to decide truth, and when the infrastructures that support these systems collapse, their fragility highlights the power dynamics inherent in them. In particular, these claims to truth and power are a mode of epistemic production, and this marks these groups as different from other communities online. Similar to other social and political movements, these groups needed to come up with plans to sustain their epistemic and discursive forms. Of course, the need for movements to sustain themselves over time is not unique to the far right. Throughout history there have been movements that lasted because the practices the groups adopted allowed for the movement’s reproduction over time (Hardt & Negri, 2017; Lee, 2016; Polletta, 2008, 2009; Polletta & Jasper, 2001). Narrative is a key component not just of recruitment but also of maintaining a group identity and common goal (Polletta, 2009); activist-created archives are necessary for socializing members into the group on the same ideological premise that motivated the group in the first place (Lee, 2016); and in online contexts, communicative action is a foundational part of online mobilization and organization that facilitates a politics of visibility (Milan, 2015a). But all of these works focused on more progressive social movements and did not necessarily take into account how these processes would occur within extremist, radical, right-wing groups. Different from more progressive, left-wing social movements, and deviating from the norms of participatory culture, what these groups attempt to do is not to remix culture, but rather to recreate it. In particular, the creation of symbolic infrastructure to engage in epistemic production is what makes these groups distinct from other online communities, like fandoms, which tend to remix cultural products and treat all of media as a malleable archive (De Kosnik, 2016); extremist groups, by contrast, are not aiming to remix, but rather to recreate. Recreation, though, occurs in subversive ways – one key factor that helps ground their epistemic and discursive forms is the way they maneuver around censorship using language. The hard truth of the matter is that the far right and its affiliated fringe groups (i.e., the Manosphere, etc.) use many of the same arguments, methods, practices, and rhetoric as these so-called “good” movements in order to achieve their goals. This method of adopting the strategies of the left also exists among the larger alt right, which has been noted for adopting the left-wing community organizing book Rules for Radicals by Saul Alinsky (Harkinson, 2017).
Among the communities studied in this dissertation, this was most apparent in r/AznIdentity: they hid their real agenda in the rhetoric of progressive social justice movements and affiliated themselves with causes that were of importance to other Asian activist groups in order to recruit more members and to skirt the increasing censorship of more extremist groups. The way that these communities cut across one another but also cut into more progressive social movements in their attempts to mimic, disrupt, and overtake is a strategic practice meant to give their movements legitimacy. Perhaps, in some way, the Internet, by providing an infrastructure on which hate groups can convene, connect, and communicate, has made an incision that nobody thought to close, and as a result the wound has only grown wider and deeper as governments, activists, and citizens of various countries grapple with difficult questions about how to stop the spread of hate-filled violence (Alava et al., 2017; Crawford & Lumby, 2013; Daniels, 2018; Donovan & boyd, 2018). On one hand, deplatforming as a strategy may be ineffective in halting the spread of hate speech and extremism because of how deeply embedded these discourses and ideologies are within cultural and infrastructural systems. But what the cases in this dissertation highlighted is that deplatforming may have the opposite effect of its intended purpose: it can strengthen the group and alert it to begin the necessary preparations to ensure its survival.

Failure as a Nexus for Action

The three case studies in this dissertation did not adopt these sustainability practices and begin their doomsday preparation without some kind of motivator – rather, after experiencing failure themselves or observing the failures experienced by adjacent groups, they took action to prevent an even larger-scale failure (like the complete disappearance of the group itself). By engaging in the practices of archiving, fortification, and identity maintenance and network building, they built symbolic infrastructures that would be able to absorb the shocks (Borgman et al., 2013) that happen within digital infrastructures. What differentiates symbolic infrastructure from other concepts like culture, collective memory, and community is the intention with which it is built and its material artifacts, which are accessible and even changeable. Collectively, these groups and their moderators construct mediated versions of reality that they try to push as the one truth to rule them all (Couldry & Hepp, 2016; Gamson, Croteau, Hoynes, & Sasson, 1992; Latour, 1999; Polletta, 2008). The moderators of these communities did not just engage in moderation practices to “save” their communities after policy changes that would have meant their closure; they were also actively cultivating a very specific worldview and community identity: for instance, the moderators of r/TheRedPill not only aggressively moderated the main board but began building out (and serving as moderators for) affiliate subreddits that would expand their reach and appeal and “red pill” more people into their beliefs. Culture, as an experimental system that is always changing and shifting (Fischer, 2007), is immensely powerful in the structuring of everyday life, and it often feels unchangeable and overwhelming to many oppressed groups and subcultures that exist within it.
It is in the face of failure and destruction that the fragility of infrastructure, and of culture, is revealed, along with the ways these systems are entangled with one another. This entanglement asserted itself in part because of the possibilities the Internet provides. Specifically, platforms and infrastructures are the medium of possibility for these practices to occur – they provide the necessary foundations, facilitate movement, and facilitate growth. What the findings of this dissertation reveal is that these groups manage to sustain their communities and persist in the face of failure not necessarily because of the accessibility and structure of the Internet itself, but rather because they are able to maneuver around the logics of these systems and to exploit their affordances. Because of the way that platforms are designed and their low threshold for participation, the exploitation of their affordances was similar in each group. Although previous scholarship has focused on the level of platforms, the focus should also be on how infrastructures inform these practices among extremist groups, warranting deeper investigation into the hidden systems that shape digital life. The communicative practices as well as the practices of preservation within these case studies highlight the potential and possibility of the Internet – but they also make transparent its faults, and the faults of our policies to control them (Kitada, 2012). The faults of the Internet and the struggle to control it are also what these groups were responding to, in the form of an ever-evolving sociotechnical landscape that significantly shifted its stance on its complicity in facilitating the rise of hate-based violence. Thus, despite the constant threat of failure, all of these levels of communicative possibility and interaction contribute to the sustainability practices of the groups. For example, the way that these platforms are designed makes it possible for average users without a high level of technical know-how to establish online spaces for their groups. This does not mean that the Internet was what caused these groups – rather, it provided a new digital space, not constrained by time, space, or borders, for groups of people with similar ideologies to convene – and to create. The structure of the Internet not only allows for connection, but also for sustaining these connections and groups through practices like archiving posts, building off-site repositories, and other means. It is not only the capacity of platforms and their affordances that allows these groups to persist by moving from platform to platform; they manage to survive because of what they are able to build. Thus, they persist not only because of the way that their rhetoric entered the mainstream, but because of what they created – facilitated by media infrastructures and their knowledge of how to navigate them – to sustain the movement. These sustainability practices, despite their goal of preserving the movement, were not without tension and conflict, and this conflict often arose from challenges to group hierarchy.

Follow the Leader?

Hierarchies emerge in any kind of community, online or face-to-face, and appear in different forms based on the cultural and social norms of that society. But in any situation where there is a hierarchy, there is also a fight for power.
As the case studies demonstrated, this structuring of their communities around a hierarchy of users that placed moderators (“mods”) and other well-known power users at the top would sometimes lead to tensions within the forums about not just the kinds of content being allowed, but also about the plans and future of the community itself. Across all cases, moderators exerted their control not just by cleaning up any content that would earn the communities unwanted attention and scrutiny – particularly after policy changes, when moderators would go through and delete a large number of posts that would invite this – but also by squashing dissent within the community. The users of the forum are subjected to this hierarchy and need to accept it in order to continue participating within the group, but sometimes this arrangement falls apart when users are banned for their behavior or outed as imposters. But the negotiation of hierarchy did not only occur internally; it also included the platform and digital infrastructure at the very top of the food chain, given their power to shut down the groups altogether. In this negotiated order, the “non-human” domain registrar/server provider, the platform, and the community all have differing levels of power that control the decisions, practices, and actions of the group as a whole. These non-human actors, along with the people shaping their decisions, served as powerful villains that were often blamed for any kind of failure – and this blame helped to strengthen group identity by pointing to these authoritative powers as the enemy. But the responses to them highlighted a tension: what is best for the collective versus what the collective actually wants. All three cases had thousands of members, though the number of active members was likely much smaller – most likely in the hundreds – but even so, the moderators had to walk a fine line between doing what they felt was best for their users and community and trying to exist within the boundaries imposed on them by the infrastructures and platforms they needed for their community to exist at all. Although a number of the things they built were “static,” like the archives and other repositories of information, the processes used to build these spaces and material objects were anything but. This dynamism marks a shift in our understanding not just of traditional social movements, but of how they operate online and how order and leadership are negotiated and enforced. The tension between the “leaders” of the forum – the mods – and the other users is part of what informed their very practices. In a time when centralized platforms are giving everyday people the ability to paint versions of reality that often collide with one another, the politics of knowledge building and these attempts to shift discourse are on full display (Foucault, 1982; Noble, 2018; R. Rogers, 2006; Swidler, 1986). The Web did not fundamentally alter human behavior but rather provided a new space for expression and for the disruption of the political tools of the past (Rogers, 2006). The three case studies, and the larger set of extremist social movements fueled by the Internet, do not necessarily use the Internet and the Web in innovative, groundbreakingly unique ways, but rather use the infrastructure of the Web just as it was intended to be used.
Their ability to maneuver around these systems’ constraints and exploit their affordances is what constitutes the innovative nature of the practices outlined above. These politics of knowledge have long been a fraught battleground for groups, communities, and nations, all vying to lay claim to truth and, thus, to power (Foucault, 1982; Trouillot, 2015). Archives, far from being merely static material spaces, are symbolically dynamic in the choices made about the objects within them, and they help to produce, reproduce, and transport discursive forms.

Portable Discourse and Culture

Transportability of material and symbolic objects is a key factor in the perseverance of any group. The things that the groups in these case studies built are both static and dynamic, and their discourse is a large part of what they carry with them. The way these things are preserved is facilitated by digital infrastructure – through this, the groups are able to build a symbolic infrastructure that is embedded within digital infrastructure but also within the structures of life and society. The materials they share and preserve, the language they use, and the very discourses they base their thought upon are artifacts that all exist outside of the infrastructure of the web, but they are diffused throughout the communities as a kind of glue that keeps them together. The culture that they create in opposition to dominant culture demonstrates the way that power circulates through these practices, particularly through the practice of archiving (Bolick, 2006; De Kosnik, 2016; Derrida, 1998). Thus, much as religious and political ideologies are complex, reflexive processes that struggle to be established as the regime of truth in a given society (French, 2012; Mair, 2012), culture and memory, through their silences and their reproduction of facts, establish their own regime of truth. The cases in this dissertation actively fight against culture in an attempt to establish their historical narrative and their symbols as truth, as the foundation of culture within their social worlds, and to gain access to higher forms of power (Foucault, 1982; Mair, 2012; Olick, 1999; Trouillot, 2015). Despite the limits of its possibilities, the Internet has helped facilitate an infrastructure for circulating information, community, and discourse that is malleable by groups attempting to lay claim to cultural norms and truths, resulting in an experimental system built through the practices of netizens who live in the world (Fischer, 2007). Through their practices, these groups are able to harden these infrastructures and institutionalize them, even temporarily, which becomes the basis for the creation of symbolic infrastructure. Although primarily built by the moderators, who serve as the groups’ leaders without “official” titles, symbolic infrastructure guides not only the rules of interaction but also the practices of the group as a whole (Bourdieu, 1977; B. Butler, Sproull, Kiesler, & Kraut, 2007). By subverting the standards not just of who has the authority to participate in archivization and knowledge production work but of reality itself, far-right social movements engage in an information politics that fundamentally structures how netizens understand their realities. This occurs in search engine result placement, in Google image results gamed to return racist content, and even on reddit, where the ranking algorithm is manipulated by users through the upvote and downvote functions (Beer & Burrows, 2013; A. Massanari, 2015; Noble, 2018; R. Rogers, 2006), as the illustrative sketch below suggests.
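To make concrete the kind of ranking logic that such voting manipulation exploits, the short sketch below reproduces the “hot” scoring function published in reddit’s formerly open-source ranking code (the production algorithm may have since changed); the vote counts are hypothetical, and the sketch is offered only as an illustration of why coordinated upvoting or downvoting – brigading – can quickly surface or bury content, not as part of this dissertation’s data or analysis.

from datetime import datetime, timezone
from math import log10

# Reddit's historical "hot" ranking, as published in its formerly open-source
# codebase; shown here only to illustrate how vote totals translate into rank.
REDDIT_EPOCH = datetime.fromtimestamp(1134028003, tz=timezone.utc)

def hot(ups: int, downs: int, posted: datetime) -> float:
    score = ups - downs
    order = log10(max(abs(score), 1))              # every tenfold jump in net votes adds 1
    sign = 1 if score > 0 else -1 if score < 0 else 0
    age = (posted - REDDIT_EPOCH).total_seconds()  # newer posts start from a higher base
    return round(sign * order + age / 45000, 7)

# One unit of vote "order" is worth 45,000 seconds (about 12.5 hours) of recency,
# so a coordinated burst of upvotes lifts a post as if it were many hours newer.
now = datetime.now(timezone.utc)
print(hot(10, 2, now))    # a post with organic votes
print(hot(800, 2, now))   # the same post after a brigade of upvotes

Because rank is a simple function of net votes and age, a relatively small, coordinated group can outweigh the organic activity of a much larger audience – which is precisely the dynamic described for brigading in the glossary.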
These practices, and the knowledge of how to game these systems and exploit the way they were designed, point to a new form of warfare that the users of some of these communities themselves referred to as “social guerilla warfare.” “Warfare” informs the way that these case studies and the larger far right view their capacity to change society and culture writ large. But their battle strategy is limited by the very platforms on which they engage in many of these tactics – harassment, disinformation and propaganda, and recruitment (Daniels, 2018; Donovan, 2019; Jeong, 2015; Koebler & Maiberg, 2018; R. Lewis, 2018; Marwick & Lewis, 2017). Other social movements and online communities have had to build similar archives and symbolic infrastructures because of this fear of failure – for instance, feminist message boards and fandom communities began archiving their content to make it freely accessible and usable by current and future members (De Kosnik, 2016). For all of these groups, attempting to archive and lay claim to what counts as valid knowledge worthy of preservation was not just for the present members of the community but also a way to ensure the community’s future, even if just to leave a message that they once existed. The groups studied in this dissertation are a departure from those studied by De Kosnik – particularly because their archiving practices are not focused on media and the remixing of cultural products (as in the case of fan communities) but rather on attempting to change culture itself through these practices. Culture is what grounds realities, and breaking culture allows for a destruction and reshaping of reality.

“Culture Wars” and the Fight for Reality

Destruction in the present in order to create a new future operated as a motivator for the modes of warfare these groups found themselves engaging in. A fundamental aspect of how these communities view themselves in relation to larger society lies in their use of the term “culture war,” in “Red Pilling” non-believers into “seeing the truth,” and in how they view themselves as engaging in a mode of social guerilla warfare by participating in these movements. Their definition of themselves as engaging in this mode of cultural and social warfare is a form of symbolic subversion meant to challenge what these groups view as oppressive authorities over reality itself (Armstrong & Bernstein, 2008; Gitlin, 2003; Latour, 2005). This frequent reference to the idea of a “culture war” and the need to engage in this mode of “warfare” is supported not just by the practices they use (i.e., harassment and the building of their communities and collective identities) but also by their attempts to push their discourse into the mainstream. Attempting to make room in popular discourse and on popular platforms for their hateful ideologies reveals another facet of preservation practices like archiving: the archive is not just a space of preservation but establishes the law of what can be said (Foucault, 1982). By making “official” accounts for their groups on mainstream social media platforms (particularly on Twitter), and, in the case of r/AznIdentity, creating other media forms like magazines and “watchdog” websites, the leaders of the subreddit dictate the acceptable discursive practices.
Building a nationalistic devotion to the group, the constant reminder that they are engaged in a form of cultural warfare indicates (a) that culture is a battleground, one inherently changeable only by some form of violence, and (b) that these groups see themselves not just as a community, but as a military of sorts dedicated to fighting for a certain set of ideals. But engaging in “warfare” takes many forms within these groups, and not all members are dedicated to acting as foot soldiers for the wills and goals of the leaders/moderators of the forum. The groups not only actively antagonize their perceived enemies, but are also often engaged in battles within the communities themselves due to infiltration by non-members, trolls, or even imposters – the work of the moderators is not just to ban these outsiders but also to discipline members who step out of line or violate the norms and rules of the community. Thus, they are not only negotiating the rules of the platforms they rely on for the continued growth of their movement but are also constantly negotiating hierarchy and order within their communities. This identity work and the negotiation of hierarchy are necessary strategies for the sustainability of the group, keeping it from fracturing and becoming disconnected from the rest of the established ecosystem. Beyond navigating the politics of the platforms and their shifting nature, the groups themselves have to continually evolve their strategies and practices to fortify themselves against possible attacks. The fight for reality and the assertion that members were engaged in warfare highlight this tension well – reddit policy mandated that there would no longer be any wiggle room regarding violent content and harassment, but these were two things that all three communities relied on for their battle strategies. This tension – between their discourse about fighting a culture war and having to fight this war within the confines and boundaries of what the platforms allowed – shows how the platform, the structure, and the practices are all bound together. In response to the platform cracking down on violent content, users moved to messaging applications like Discord and Slack and managed to maneuver around the rules. These groups, again, do not exist because of how these platforms are designed; rather, they exist because of the interactions they are able to perform individually and collectively. The term “social guerilla warfare,” used often on r/AznIdentity, is an interesting one because of what it represents: a type of movement and violence that is not constrained by space and time, and one that points to an intimate knowledge of how to navigate these systems – better than the enemy’s – making the enemy easy to ambush. The success of their tactics in “defeating” and silencing their critics and enemies was made possible by the centralization of social life online.

Centralized Hubs of Extremism

Centralization makes it easier for people to find, track, and surveil other groups, people, and movements because everything is typically in “one place.” Platforms are designed to be self-contained systems, and the centralization of the Internet experience has occurred due to the rise of these centralized systems (Andersson Schwarz, 2017; Gillespie, 2010).
Despite this, many platforms now work together to become one large self-contained system, one that is increasingly becoming an infrastructure and less of a platform (Plantin, 2017; Postigo & O’Donnell, 2017). Despite this centralization, the web’s infrastructure still exists much as it did in the past – and users are able to traverse the web environment in much the same way as before. Harnessing the power of web infrastructure, particularly its vastness and multiplicity of spaces for building community, these three case studies were able to break free of the confines of any single platform by relocating, either when they were ousted or as a protective measure to avoid this exile. To sustain their communities and their movement, they would also alter their practices so as to remain within the platform. Although all three of these groups used alternative spaces to post content and have discussions that were not allowed in the “home” community due to policy constraints, the building of these spaces, as well as the discussions involved in changing the accepted norms of the group, points again to a constant push-and-pull dynamic between the moderators and the users. Movement, despite its intent to preserve the group, also created confusion and chaos. Movement is a tricky thing in any kind of space, web or otherwise, for a large group of people. Although each group did have a collective identity, and users would often continue supporting and returning to the forum because of this shared connection, for all of the groups the basis of collective identity rested less on typical markers (race, ethnicity, nationality, interest, etc.) and more on a shared affect – an affect of rage toward women and “modern society.” This collective identity is built less on identity markers (except, in these groups, being men) and more on shared practices and emotional responses to a world they see as oppressive to them. Thus, when infrastructures fail – when these groups are banned or otherwise disrupted – the movement to a new platform occurs not just at the individual user level but as a collective kind of movement, one that relies on these shared sets of practices and knowledge. It is not only the centralization of digital life that has facilitated the growth and spread of extremist thought, but also the centralization of the shared practices and subversive tactics that these groups use to sustain their movements. This shared subversion and action are the very building blocks of the practices that create symbolic infrastructure and allow for its portability across structures in the event of failure.

Reproducing Structures

When one considers all of these practices and actions in relation to infrastructure and platform, the structure of it all is facilitated by reproductive practices that ensure the continual spread of certain discursive forms and ideologies (Bourdieu, 1999; Giddens, 1984; Levine, 2015). The modes of agency visible in the practices of the three case studies are meant to reproduce certain structural properties of their micro social systems so that they persist over time (Giddens, 1984), and taking advantage of the rhizomatic nature of the web has been a key component of how these hate groups manage to persist.
The web provided the necessary infrastructure to break down the material and physical constraints that once prevented many of the members of these groups from finding each other, but it also introduced new constraints in the form of policies, user agreements, and other modes of governance that structure the lived realities of netizens, particularly for social movements (DeNardis, 2012; Gillespie, 2017; Milan, 2015b). By looking at practice and the possibilities of action, it is possible to look past the technological determinism that many have invoked to argue that the web and the Internet are at fault, and to bring forth the human element and the element of agency in how these spaces are not just structured but created and supported. The digitally mediated nature of human existence and social life requires a reimagining of how the field understands the building of social worlds (Couldry & Hepp, 2016). Power circulates not only in the form of the digital infrastructures themselves but also makes itself known through the practices and the freedom of movement from platform to platform that are afforded by the Internet. Using situational analysis, grounded in the theories discussed earlier in this dissertation, to uncover the ways that far-right political actors are enabled by digital infrastructure to form community and establish archives – and yet are also constrained, since many of them are lay users who may not understand the complex technical aspects – helps illuminate the ways that digital platforms themselves engage in social action and also exist as the foundation for social systems. This complicated relationship of digital platforms with the rise of far-right social movements has raised significant concern over the role of social media in the political process, the manipulability of digital technology by individuals as well as by the platform corporations themselves, and the very idea of “reality” itself. Memory, history, knowledge, truth, and reality have all evolved along with digital technologies and warrant a different approach to the study of how power is being constituted and reproduced within these digital environments. Viewing reality through the lens of the specific social worlds and cultural systems that are established by human actors within the non-human actor of the digital platform demonstrates how the social situation spans both the political and cultural realm of the “actual” world and that of the digital one. Reality, like the cultural and social worlds that comprise and shape it, is relational and filled with a multiplicity of actors all vying for power to establish their own sense of order upon it (Couldry & Hepp, 2016; Fischer, 2007; Foucault, 1972, 1980; Trouillot, 2015) – and thus, upon others outside of its boundaries. Narrative, archives, and consistent communicative action are all building blocks for maintaining a movement and ensuring its reproducibility, which the groups in these case studies also recognized and utilized in building their symbolic infrastructures. Similarly, the practices that these three case studies adopted are meant for the continual reproduction not just of their movement, but of the discursive forms they create to disrupt the culture they are fighting against. Disruption is not only a projective goal (i.e., disrupting dominant culture) but also a fundamental part of the groups’ day-to-day existence (i.e., constant attacks and threats of failure).
The practices these groups used were almost always in response to some kind of event – in this case, the situation they found themselves in was a shifting sociopolitical climate concerning the role of the web and platforms in facilitating and supporting hate speech, fake news, and other toxic modes of discourse. But it can also be argued that the toxicity was not brought about by the platforms’ negligence; rather, the toxicity is inherent within our cultural systems and beliefs and was embedded within the infrastructure of the web and its platforms from the very start (Jeong, 2015; Noble, 2018; Shaw, 2014). Acknowledging this again highlights the role of the people who create these structures and the agency they have in this creation, and it brings back the role of the human in a world of non-human actants. Both need to be considered to have their own agency and processes of assembling, and this challenging of “the social” brings forth the assemblages of the human and non-human that structure reality itself (Latour, 2005). The very arrangement of these groups across platforms and how they create these assemblages is also a sustainability tactic – they spread out, meaning that if one “station” is attacked and shut down, there is always another one poised and ready to take over in the event of catastrophic failure. These assemblages are gathered together in order to bring about the effect of destruction (Deleuze & Guattari, 1987) – a destruction meant to topple mainstream culture in order to enforce their own. In a sense, these groups are evolving the concept of an imagined community (Anderson, 2006), in that they are establishing a new kind of nation meant to act as a political and social vehicle for causing social unrest. Using language, archives, and other modes of history and knowledge production, their practices point to a longing not just to be remembered, but also to be reproduced, and the consciousness and framing processes that occur within their communities are indicative of this longing (Anderson, 2006; Benford & Snow, 2000). The participatory nature of the web helps to break down the barriers to these processes, as well as providing the necessary infrastructure to do so and the power to establish these claims to truth (Andrejevic et al., 2014; Jenkins et al., 2015; Kelty, 2012). The focus of these groups is not just preservation but transmission and sustainability, and their practices are meant not just to preserve their discourse but to establish it as the culturally dominant discourse. These communities, under a guise of playfulness, are attempting to fundamentally alter social life and existence, and attempts to destroy or dismiss them can often backfire and sometimes make them stronger. Anticipating the consequences of enacting strategies will be crucial moving forward; although nobody can predict these consequences with complete accuracy, analyzing these groups’ responses to failure can inform better policies and strategies. Further, the analysis of these communities and political and social movements warrants a larger discussion of the concept of reality, of social worlds, of epistemic production, and of the very kinds of knowledge and discursive forms they attempt to distribute and reproduce. Reality is a frequently contested metaphysical and physical space that has been the site of epistemic battle since the establishment of historical archives meant to preserve narratives and knowledge (Trouillot, 2015).
In my dissertation, I engaged not only with conceptions of discourse and infrastructure but also with theoretical approaches to knowledge, reality, and the construction of historical truths and the establishment of archives; culture and cultural forms as spaces that reify and reinforce power; the role of technology in the shaping of these historical narratives; and methodological approaches to analyzing these communities. By analyzing the theoretical and the methodological in tandem, I attempted to highlight how these communities participate in the construction of knowledge and how the entangled forces of platform, infrastructure, and community all participate in this process. What I discovered is that what connects these cases is the similar set of practices they adopted in order to ensure the sustainability and perseverance of their groups – off-site forums, the use of different platforms or applications for communication, the creation of archives, the making of official accounts, and aggressive moderation strategies in the face of platform policy changes were some of the similarities across them. Of course, these similar practices are partly due to the constraints of the web infrastructures they are embedded in, but the similarities between the groups are too strong not to think that they are learning from each other as well. It is these shared practices, and not necessarily shared beliefs (since members have varying degrees of intensity and adherence to the dogma), that connect the members within each case to one another and also connect the seemingly disparate communities. As with other social movements, particularly more progressive ones, the affordances of archiving, creation, production, and reproduction extend to extremist hate groups as well (De Kosnik, 2016). These practices, and their discourses, are in a way heteroglossic in how they speak to one another and also invoke a certain kind of historical past that justifies not just the groups’ beliefs, but their actions (Bakhtin, 1983; Jaffe, Koven, Perrino, & Vigouroux, 2015). This project sought to answer important questions about the nature of the web, the platforms, and the users within these communities, and about the practices that point to how these ideologies are being sustained. Like other forms that govern the social realities of people within any given society (Levine, 2015), the patterns and rhythms that guide these groups’ practices are ones that have long been practiced and, in this case, subverted in order to ensure the longevity of oppressive power structures. But in the case of far-right extremism, the attempt is not only to create something new but also to recreate the past in the present and to project the past into the future, and this collapsing of time and space enabled by the web has fueled their movement.

Final Thoughts

Why do hate groups persevere? That was the main overarching question of this dissertation – particularly because, despite platforms’ attempts to quell hate speech, harassment, and extremism, the number of hate-related violent attacks and the number of hate groups continue to grow (Anti-Defamation League, 2019; Southern Poverty Law Center, 2018).
The case studies in this dissertation support a body of literature that points to the Internet as a space where radicalization occurs – not just by demonstrating how this radicalization takes place, but by revealing the practices the groups use to sustain their communities, which may shed light on why these groups not only persevere but, in some ways, thrive. Attention must be paid not just to the nature of the web and the platforms within its infrastructural systems, but to the larger sociopolitical and cultural climate that produced these ideologies and their forms long before the Internet (Daniels, 2009b, 2018; Levine, 2015; Noble, 2018; Thiesmeyer, 1999). Deplatforming is an effective, but limited, method of halting the spread of hate speech and extremism – partly because what is banned on one platform does not necessarily extend to all of the others, but also because deplatforming only removes one node of a vast assemblage of extremist discourse. Although other platforms often take note and follow suit after certain figures are banned from one platform, the nature of the web allows for the continual migration and reestablishment of communities – and even allows for entirely new spaces to be built, like Gab or Voat. Thus, despite attempts to deplatform, many of these communities and ideologies persist because of the possibility for them to continue in some other space. Although they lose members in each jump, the core group of users that sustains the community will continue to follow – and this migration is becoming increasingly sophisticated and organized, as in the case of the communities in this dissertation. What this dissertation revealed is not just whether deplatforming affects the groups but how these groups respond to this mode of infrastructural failure – and by studying these responses and practices, I attempted to discover why they manage to persist. This persistence mirrors how these discourses are facilitated in the “face-to-face” world, where they have been able to reproduce and persist over time (Levine, 2015). As many scholars have noted before, the web and the Internet are not a world apart but rather serve as mirrors to reality and “face-to-face” life (Boellstorff et al., 2012; Couldry & Hepp, 2016; Nakamura, 2002; Shaw, 2014; van Dijck, 2013). The Internet has allowed for the creation of worlds within our world and our reality, allowing people to break apart and play with reality, identity, and community in general, shifting and changing our understanding of how identity formation and radicalization occur (Black, 2006; Tan, Wang, & Gomes, 2016; Wojcieszak, 2010). But radicalization is not a passive process, and its dynamism was revealed by these groups’ responses to failure. Their symbolic infrastructure, the politics and power of platforms and infrastructure, and the sociopolitical and cultural climate that simultaneously antagonizes and fuels their anger are all actors in this perseverance. In some regard, hate groups and extremism online are like the kudzu plant – an ivy-like plant that will overtake its surroundings and continue to spread even after attempts to kill it, because it will only stop growing if it is destroyed at its source. The Internet and the platforms that enabled these hate groups are not the source of their movement, but rather have been the nurturing environment in which they flourish.
These structures, and these practices, are embedded in one another, and the technicians who built these structures did so with a utopian vision that aligned with their hopeful version of the world – but without realizing how powerful this tool would be in creating so many unofficial versions of reality (Couldry & Hepp, 2016; Han, 2017; Noble, 2018). The possibilities of the Internet also became its consequences, and revealed the extent of its limits. As Audre Lorde famously said, “the master’s tools will never dismantle the master’s house” (Lorde, 1984), and focusing strategies to combat extremism only on the platforms where it is visible generates no future, no space, for radical change. To truly combat the rising tide of extremism, the root of hate must be identified, acknowledged, and destroyed to prevent its spread. This work is not the responsibility of platforms, nor even that of digital infrastructure; it must occur at a social and cultural level to eliminate the source of hate – a root that lives, breathes, and inflicts violence in and on the bodies behind the screens.

APPENDICES

APPENDIX A: Glossary

4chan: An anonymous message board known for cultivating racist and misogynistic thought, but also dedicated to a number of other topics
8chan: An image board website, similar to but not related to 4chan, that was started after its creator observed increasing surveillance and a loss of free speech online; has minimal moderation as a result
Alt Right: A neo-conservative movement that differentiates itself from the political right by grounding its politics and ideology in white nationalism and white supremacy
Brigading: The practice of web forum users entering other forums (particularly on reddit) to create chaos within the target community by manipulating algorithms or saturating the board with posted content
Dark Enlightenment: An anti-democratic and reactionary movement that positions itself as antithetical to the Enlightenment; serves as the basis for the ideology of the alt right
Dark Web: World Wide Web content that exists on darknets, which are overlay networks that use the Internet but require specific software or authorization to access; makes up only a small portion of the “deep web”
Doxxing: The act of posting a target’s personal information in online spaces to encourage harassment
Incel: Short for “Involuntary Celibate,” a community of men (and women) online who are unable to successfully have sexual or romantic relationships
Manosphere: A loosely connected online network of Men’s Rights groups, Pick Up Artists, Incels, etc. that is affiliated with the larger alt right
MGTOW: Men Going Their Own Way, a group within the Manosphere
Mod: Short for moderator (plural: “mods”)
MRAsians: Portmanteau of Men’s Rights Activist and Asians
Normies: A derogatory term for “normal people” used in Internet communities like 4chan, 8chan, etc.
PAA: “Progressive Asian Activist,” used as a derogatory term in the MRAsian community for Asian activists on the progressive left
Red Piller: The term for a user of r/TheRedPill
Reddit: A web forum and social news content aggregating website, known for serving as the hub for a number of extremist movements before their ban
SJW: “Social Justice Warrior,” used as a derogatory term for those on the progressive left
Subreddit: A subcommunity within the larger reddit community, often called “sub” for short
r/TheRedPill: A prominent community and subreddit in the Manosphere that combines men’s rights activism with pick up artistry
Voat: An “alternative reddit” that was started after a series of bans on reddit in 2015

APPENDIX B: Relational Maps

Figure 108. Relational Map of r/TheRedPill
Figure 109. Relational Map of r/Incels
Figure 110. Relational Map of MRAsians and r/AznIdentity

APPENDIX C: Discourse Maps

Figure 111. Discourse Map 1 for r/TheRedPill
Figure 112. Discourse Map 2 for r/TheRedPill
Figure 113. Discourse Map 1 for r/Incels
Figure 114. Discourse Map 2 for r/Incels
Figure 115. Discourse Map 1 for MRAsians
Figure 116. Discourse Map 2 for MRAsians

BIBLIOGRAPHY

Alava, S., Frau-Meigs, D., & Hassan, G. (2017). Youth and violent extremism on social media: Mapping the research. UNESCO Publishing.
Anderson, B. (2006). Imagined Communities: Reflections on the Origin and Spread of Nationalism (3rd ed.). Verso Books.
Andersson Schwarz, J. (2017). Platform Logic: An Interdisciplinary Approach to the Platform-Based Economy. Policy & Internet, 9(4), 374–394.
Andrejevic, M., Banks, J., Campbell, J. E., Couldry, N., Fish, A., Hearn, A., & Ouellette, L. (2014). Participations: Dialogues on the participatory promise of contemporary culture and politics. International Journal of Communication, 8, 1089–1106.
Anti-Defamation League. (n.d.). Echo. Retrieved May 13, 2019, from Anti-Defamation League website: https://www.adl.org/education/references/hate-symbols/echo
Anti-Defamation League. (2019). Right-Wing Extremism Linked to Every 2018 Extremist Murder in the U.S., ADL Finds. Retrieved May 1, 2019, from Anti-Defamation League website: https://www.adl.org/news/press-releases/right-wing-extremism-linked-to-every-2018-extremist-murder-in-the-us-adl-finds
Armstrong, E. A., & Bernstein, M. (2008). Culture, power, and institutions: A multi-institutional politics approach to social movements. Sociological Theory, 26(1), 74–99.
Asarch, S. (2018, September 28). Reddit has placed more than 20 controversial subs in quarantine. Retrieved March 8, 2019, from Newsweek website: https://www.newsweek.com/reddit-quarantine-subs-toxic-controversial-moderators-1144663
Bacarisse, B. (2017, April 25). The Republican Lawmaker Who Secretly Created Reddit’s Women-Hating ‘Red Pill.’ The Daily Beast. Retrieved from http://www.thedailybeast.com/articles/2017/04/25/the-republican-lawmaker-who-secretly-created-reddit-s-women-hating-red-pill
Baker, P. C. (2017, June 13). Hunting the Manosphere. The New York Times. Retrieved from https://www.nytimes.com/2017/06/13/magazine/hunting-the-manosphere.html
Bakhtin, M. M. (1983). The Dialogic Imagination: Four Essays (Reprint edition; M. Holquist, Ed.; C. Emerson, Trans.). Austin, Tex: University of Texas Press.
Bambina, A. (2007). Online social support: The interplay of social networks and computer-mediated communication. Cambria Press.
Banet-Weiser, S., & Miltner, K. M. (2016).
#MasculinitySoFragile: culture, structure, and networked misogyny. Feminist Media Studies, 16(1), 171–174. https://doi.org/10.1080/14680777.2016.1120490
Barker, V. (2009). Older Adolescents’ Motivations for Social Network Site Use: The Influence of Gender, Group Identity, and Collective Self-Esteem. CyberPsychology & Behavior, 12(2), 209–213. https://doi.org/10.1089/cpb.2008.0228
Barnes, R. (2018). Lessons from #Gamergate. In Uncovering Online Commenting Culture (pp. 93–111). Springer.
Bartlett, J., & Miller, C. (2010). The power of unreason: conspiracy theories, extremism and counter-terrorism. Demos, London. Retrieved from http://westernvoice.net/Power%20of%20Unreason.pdf
Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1–13. https://doi.org/10.1080/1369118X.2016.1216147
Beer, D., & Burrows, R. (2013). Popular culture, digital archives and the new social life of data. Theory, Culture & Society, 30(4), 47–71.
Benett, T. (2018, April 5). Gab Is the Alt-Right Social Network Racists Are Moving to. Retrieved October 6, 2018, from Vice website: https://www.vice.com/en_uk/article/ywxb95/gab-is-the-alt-right-social-network-racists-are-moving-to
Benford, R. D., & Snow, D. A. (2000). Framing Processes and Social Movements: An Overview and Assessment. Annual Review of Sociology, 26(1), 611–639. https://doi.org/10.1146/annurev.soc.26.1.611
Bennett, W. L., & Segerberg, A. (2012). The Logic of Connective Action: Digital Media and the Personalization of Contentious Politics. Information, Communication & Society, 15(5), 739–768. https://doi.org/10.1080/1369118X.2012.670661
Benwell, B. (2014). Language and Masculinity. In The Handbook of Language, Gender, and Sexuality (pp. 240–259). https://doi.org/10.1002/9781118584248.ch12
Bernstein, M. (1997). Celebration and suppression: The strategic uses of identity by the lesbian and gay movement. American Journal of Sociology, 103(3), 531–565.
Bernstein, M. (2005). Identity politics. Annu. Rev. Sociol., 31, 47–74.
Bernstein, M., & De la Cruz, M. (2009). What are You?: Explaining Identity as a Goal of the Multiracial Hapa Movement. Social Problems, 56(4), 722–745. https://doi.org/10.1525/sp.2009.56.4.722
Black, R. W. (2006). Language, culture, and identity in online fanfiction. E-Learning and Digital Media, 3(2), 170–184.
Blazak, R. (1999). Youth and hate. Intelligence Report, 96, 24–27.
Boczkowski, P. J. (2007). Bridging STS and communication studies: Scholarship on media and information technologies. In The Handbook of Science and Technology Studies (3rd ed., pp. 949–977). Retrieved from https://www.scholars.northwestern.edu/en/publications/bridging-sts-and-communication-studies-scholarship-on-media-and-i
Boellstorff, T. (2015). Coming of Age in Second Life: An Anthropologist Explores the Virtually Human. Princeton University Press.
Boellstorff, T., Nardi, B., Pearce, C., & Taylor, T. L. (2012). Ethnography and virtual worlds: A handbook of method. Princeton University Press.
Bolick, C. M. (2006). Digital archives: Democratizing the doing of history. International Journal of Social Education, 21(1), 122–134.
Borgman, C. L., Edwards, P. N., Jackson, S. J., Chalmers, M. K., Bowker, G. C., Ribes, D., … Calvert, S. (2013). Knowledge infrastructures: Intellectual frameworks and research challenges.
Bourdieu, P. (1977). Outline of a Theory of Practice (1st English Ed edition; R. Nice, Trans.). Cambridge: Cambridge University Press.
Bourdieu, P. (1999). Language and Symbolic Power (Reprint edition; J.
Thompson, Ed.; G. Raymond & M. Adamson, Trans.). Cambridge: Harvard University Press.
Bowker, G. C., & Star, S. L. (1999). Sorting Things Out: Classification and Its Consequences. The MIT Press.
Bowker, G. C., Timmermans, S., Clarke, A. E., & Balka, E. (2016). Boundary objects and beyond: Working with Leigh Star. MIT Press.
boyd, danah. (2010). Social Network Sites as Networked Publics: Affordances, Dynamics, and Implications. In Z. Papacharissi (Ed.), Networked Self: Identity, Community, and Culture on Social Network Sites (pp. 39–58). Routledge.
Brooks, A. (2017, November 10). Popping the Red Pill: Inside a digital alternate reality. Retrieved March 9, 2019, from CNNMoney website: https://money.cnn.com/2017/11/10/technology/culture/divided-we-code-red-pill/index.html
Brown, A. D. (2006). A narrative approach to collective identities. Journal of Management Studies, 43(4), 731–753.
Buechler, S. M. (2011). Understanding Social Movements: Theories from the Classical Era to the Present (1 edition). Boulder: Routledge.
Butler, B., Sproull, L., Kiesler, S., & Kraut, R. (2007). Community effort in online groups: Who does the work and why? In S. Weisband (Ed.), Leadership at a Distance (pp. 171–194). Hillsdale, NJ: Lawrence Erlbaum Associates.
Butler, J. (1988). Performative Acts and Gender Constitution: An Essay in Phenomenology and Feminist Theory. Theatre Journal, 40(4), 519–531. https://doi.org/10.2307/3207893
Caffier, J. (2017, September 11). Here Are Reddit’s Whiniest, Most Low-Key Toxic Subreddits. Retrieved from Vice website: https://www.vice.com/en_us/article/8xxymb/here-are-reddits-whiniest-most-low-key-toxic-subreddits
Chandrasekharan, E., Pavalanathan, U., Srinivasan, A., Glynn, A., Eisenstein, J., & Gilbert, E. (2017). You can’t stay here: The efficacy of reddit’s 2015 ban examined through hate speech. Proceedings of the ACM on Human-Computer Interaction, 1(CSCW), 31.
Charmaz, K., & Belgrave, L. (2012). Qualitative interviewing and grounded theory analysis. The SAGE Handbook of Interview Research: The Complexity of the Craft, 2, 347–365.
Chau, M., & Xu, J. (2007). Mining communities and their relationships in blogs: A study of online hate groups. International Journal of Human-Computer Studies, 65(1), 57–70. https://doi.org/10.1016/j.ijhcs.2006.08.009
Chess, S., & Shaw, A. (2015). A Conspiracy of Fishes, or, How We Learned to Stop Worrying About #GamerGate and Embrace Hegemonic Masculinity. Journal of Broadcasting & Electronic Media, 59(1), 208–220. https://doi.org/10.1080/08838151.2014.999917
Chew, E. (2018). The Hypocrisy Of MRAsians And Why Their Harassment Derails Asian Advocacy. Retrieved November 26, 2018, from You Offend Me You Offend My Family website: https://www.yomyomf.com/the-hypocrisy-of-mrasians-and-why-their-harassment-derails-asian-advocacy/
Chiu, C.-M., Huang, H.-Y., Cheng, H.-L., & Sun, P.-C. (2015). Understanding online community citizenship behaviors through social support and social identity. International Journal of Information Management, 35(4), 504–519. https://doi.org/10.1016/j.ijinfomgt.2015.04.009
Chun, E. (2016). The Meaning of Ching-Chong: Language, Racism, and Response in New Media. In H. S. Alim, J. R. Rickford, & A. F. Ball (Eds.), Raciolinguistics: How Language Shapes Our Ideas About Race (pp. 81–96). Retrieved from http://www.oxfordscholarship.com/view/10.1093/acprof:oso/9780190625696.001.0001/acprof-9780190625696-chapter-5
Chun, E., & Lo, A. (2015). Language and Racialization. In N.
Bonvillain (Ed.), The Routledge Handbook of Linguistic Anthropology. Routledge. Clarke, A. E. (2003). Situational analyses: Grounded theory mapping after the postmodern turn. Symbolic Interaction, 26(4), 553–576. Clarke, A. E., Friese, C., & Washburn, R. S. (2017). Situational analysis: grounded theory after the interpretive turn. Sage Publications. Clarke, A. E., & Star, S. L. (2008). The social worlds framework: A theory/methods package. In The handbook of science and technology studies (Vol. 3, pp. 113–137). Coston, B. M., & Kimmel, M. (2012). White Men as the New Victims: Reverse Discrimination Cases and the Men’s Rights Movement. Nevada Law Journal, 13, 368. Couldry, N., & Hepp, A. (2016). The mediated construction of reality. John Wiley & Sons. Coulson, N. S. (2005). Receiving social support online: an analysis of a computer-mediated support group for individuals living with irritable bowel syndrome. CyberPsychology & Behavior, 8(6), 580–584. Cozza, M., Gherardi, S., & Poggio, B. (2018). Narratives as boundary objects. European Congress of Qualitative Inquire. Crawford, K., & Lumby, C. (2013). Networks of governance: users, platforms, and the challenges of networked media regulation. International Journal of Technology Policy and Law, 1(3), 270–282. Daniels, J. (2009a). Cloaked websites: propaganda, cyber-racism and epistemology in the digital era. New Media & Society, 11(5), 659–683. Daniels, J. (2009b). Cyber Racism: White Supremacy Online and the New Attack on Civil Rights. Lanham, Md: Rowman & Littlefield Publishers. Daniels, J. (2013). Race and racism in Internet studies: A review and critique. New Media & Society, 15(5), 695–719. Daniels, J. (2018). The Algorithmic Rise of the “Alt-Right.” Contexts, 17(1), 60–65. https://doi.org/10.1177/1536504218766547 De Fina, A., & Perrino, S. (2017). Introduction: Storytelling in the digital age: New challenges. Narrative Inquiry, 27(2), 209–216. 267 De Kosnik, A. (2016). Rogue Archives: Digital Cultural Memory and Media Fandom (First Edition edition). Cambridge, Massachusetts: The MIT Press. Deleuze, G., & Guattari, F. (1987). A Thousand Plateaus: Capitalism and Schizophrenia (2 edition; B. Massumi, Trans.). Minneapolis: University of Minnesota Press. DeNardis, L. (2012). Hidden levers of Internet control: An infrastructure-based theory of Internet governance. Information, Communication & Society, 15(5), 720–738. Derrida, J. (1998). Archive Fever: A Freudian Impression (1 edition; E. Prenowitz, Trans.). Chicago, Ill.: University of Chicago Press. Dewey, C. (2014). Inside the ‘manosphere’ that inspired Santa Barbara shooter Elliot Rodger. Retrieved November 25, 2017, from Washington Post website: https://www.washingtonpost.com/news/the-intersect/wp/2014/05/27/inside-the- manosphere-that-inspired-santa-barbara-shooter-elliot-rodger/ Donovan, J. (2019, March 17). How Hate Groups’ Secret Sound System Works. Retrieved March 18, 2019, from The Atlantic website: https://www.theatlantic.com/ideas/archive/2019/03/extremists-understand-what-tech- platforms-have-built/585136/ Donovan, J., & boyd, danah. (2018, June 1). The case for quarantining extremist ideas | Joan Donovan and Dana Boyd. The Guardian. Retrieved from https://www.theguardian.com/commentisfree/2018/jun/01/extremist-ideas-media- coverage-kkk Emirbayer, M., & Mische, A. (1998). What Is Agency? American Journal of Sociology, 103(4), 962–1023. https://doi.org/10.1086/231294 Emperor, T. (2018, August 30). A Tectonic Shift In the Discourse on Asian American Men. 
Retrieved April 1, 2019, from Medium website: https://medium.com/emperor- magazine/a-tectonic-shift-in-the-discourse-on-asian-american-men-1a630973abbb Escobar, M. L., Kommers, P. A., & Beldad, A. (2014). Using narratives as tools for channeling participation in online communities. Computers in Human Behavior, 37, 64–72. Evans, S. K., Pearce, K. E., Vitak, J., & Treem, J. W. (2016). Explicating affordances: A conceptual framework for understanding affordances in communication research. Journal of Computer-Mediated Communication, 22(1), 35–52. Faraj, S., & Azad, B. (2012). The materiality of technology: An affordance perspective. In P. M. Leonardi, B. A. Nardi, & J. Kallinikos (Eds.), Materiality and organizing: Social interaction in a technological world (Vol. 237–258, p. 258). 268 Faris, R., Roberts, H., Etling, B., Bourassa, N., Zuckerman, E., & Benkler, Y. (2017). Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election (SSRN Scholarly Paper No. ID 3019414). Retrieved from Social Science Research Network website: https://papers.ssrn.com/abstract=3019414 Fischer, M. M. J. (2007). Culture and Cultural Analysis as Experimental Systems. Cultural Anthropology, 22(1), 1–65. https://doi.org/10.1525/can.2007.22.1.1 Flesher Fominaya, C., & Gillan, K. (2017). Navigating the technology-media-movements complex. Social Movement Studies, 16(4), 383–402. Flinn, M. V. (1988). Mate guarding in a Caribbean village. Ethology and Sociobiology, 9(1), 1– 28. https://doi.org/10.1016/0162-3095(88)90002-7 Foucault, M. (1982). The Archaeology of Knowledge: And the Discourse on Language (3988th edition). New York, NY: Vintage. French, B. (2012). The Semiotics of Collective Memories. Annual Review of Anthropology, 41, 337–353. Futrelle, D. (2017a). Inside the Dangerous Convergence of Men’s-Rights Activists and the Alt- Right. Retrieved April 2, 2017, from The Cut website: http://nymag.com/thecut/2017/03/what-james-jackson-reveals-about-mgtow-and-the-alt- right.html Futrelle, D. (2017b, August 17). Men’s-Rights Activism Is the Gateway Drug for the Alt-Right. Retrieved March 9, 2019, from The Cut website: https://www.thecut.com/2017/08/mens- rights-activism-is-the-gateway-drug-for-the-alt-right.html Futrelle, D. (2018). Incels hail “our savior St. Nikolas Cruz” for Valentine’s Day school shooting [UPDATED]. Retrieved March 29, 2019, from We Hunted The Mammoth website: http://www.wehuntedthemammoth.com/2018/02/14/incels-hail-our-savior-st-nikolas- cruz-for-valentines-day-school-shooting/ Gamble, A. E. (2009). Hapas: Emerging identity, emerging terms and labels & the social construction of race. Stanford Journal of Asian American Studies, 2, 1–20. Gamson, W. A. (1992). The Social Psychology of Collective Action. In A. P. A. D. Morris, A. D. Morris, C. M. Mueller, & A. P. C. M. Mueller (Eds.), Frontiers in Social Movement Theory (pp. 53–76). Yale University Press. Gamson, W. A., Croteau, D., Hoynes, W., & Sasson, T. (1992). Media Images and the Social Construction of Reality. Annual Review of Sociology, 18(1), 373–393. https://doi.org/10.1146/annurev.so.18.080192.002105 269 Giddens, A. (1984). The Constitution of Society: Outline of the Theory of Structuration. University of California Press. Gillespie, T. (2010). The politics of ‘platforms.’ New Media & Society, 12(3), 347–364. Gillespie, T. (2017). Governance of and by platforms. In J. Burgess, T. Poell, & A. Marwick (Eds.), The SAGE Handbook of Social Media (pp. 254–278). Gillespie, T. (2018). 
Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press. Ging, D. (2017). Alphas, Betas, and Incels: Theorizing the Masculinities of the Manosphere. Men and Masculinities, 1097184X17706401. https://doi.org/10.1177/1097184X17706401 Girard, A. L. (2009). Backlash or Equality? Violence Against Women, 15(1), 5–23. Gitlin, T. (2003). The whole world is watching: Mass media in the making and unmaking of the new left. Univ of California Press. Gotell, L., & Dutton, E. (2016). Sexual Violence in the “Manosphere”: Antifeminist Men’s Rights Discourses on Rape. International Journal for Crime, Justice & Social Democracy, 5(2), 65–80. https://doi.org/10.5204/ijcjsd.v5i2.310 Griffiths, M. (2017, April 19). Who Are the Women of Red Pill? Retrieved from Vice website: https://www.vice.com/en_nz/article/xyjyk7/who-are-the-women-of-red-pill Han, B.-C. (2017). In the Swarm: Digital Prospects (E. Butler, Trans.). Cambridge, MA: The MIT Press. Hardt, M., & Negri, A. (2017). Assembly. New York, NY: Oxford University Press. Harkinson, J. (2017). How Ann Coulter and the racist “alt-right” are using the lefty playbook to troll Berkeley. Retrieved May 22, 2019, from Mother Jones website: https://www.motherjones.com/politics/2017/04/ann-coulter-alt-right-berkeley-saul- alinsky-left-tactics-rules-for-radicals/ Hartley, J. (2015). Stories tell us? Political narrative, demes, and the transmission of knowledge through culture. Communication Research and Practice, 1(1), 5–31. https://doi.org/10.1080/22041451.2015.1042424 Hauser, C. (2018, January 20). Reddit Bans ‘Incel’ Group for Inciting Violence Against Women. The New York Times. Retrieved from https://www.nytimes.com/2017/11/09/technology/incels-reddit-banned.html 270 Hennessy-Fiske, M., Pearce, M., & Jarvie, J. (2018). Texas school shooter killed girl who turned down his advances and embarrassed him in class, her mother says. Retrieved March 30, 2019, from latimes.com website: https://www.latimes.com/nation/la-na-texas-shooter- 20180519-story.html Hill, K. (2015). The Disturbing Internet Footprint Of Santa Barbara Shooter Elliot Rodger. Retrieved March 9, 2019, from Forbes website: https://www.forbes.com/sites/kashmirhill/2014/05/24/the-disturbing-internet-footprint-of- santa-barbara-shooter-elliot-rodger/ Hine, G. E., Onaolapo, J., De Cristofaro, E., Kourtellis, N., Leontiadis, I., Samaras, R., … Blackburn, J. (2016a). Kek, Cucks, and God Emperor Trump: A Measurement Study of 4chan’s Politically Incorrect Forum and Its Effects on the Web. ArXiv:1610.03452 [Physics]. Retrieved from http://arxiv.org/abs/1610.03452 Hine, G. E., Onaolapo, J., De Cristofaro, E., Kourtellis, N., Leontiadis, I., Samaras, R., … Blackburn, J. (2016b). Kek, Cucks, and God Emperor Trump: A Measurement Study of 4chan’s Politically Incorrect Forum and Its Effects on the Web. ArXiv:1610.03452 [Physics]. Retrieved from http://arxiv.org/abs/1610.03452 Hockenson, L. (2015, July 9). What is Voat, the site Reddit users are flocking to? Retrieved March 2, 2019, from The Next Web website: https://thenextweb.com/insider/2015/07/09/what-is-voat-the-site-reddit-users-are- flocking-to/ Ingram, M. (2018). First it was Milo and Alex Jones, now platforms are being de-platformed. Retrieved May 7, 2019, from Columbia Journalism Review website: https://www.cjr.org/the_new_gatekeepers/gab-godaddy-deplatforming.php Jaffe, A., Koven, M., Perrino, S., & Vigouroux, C. B. (2015). 
Introduction: Heteroglossia, performance, power, and participation. Language in Society, 44(2), 135–139. https://doi.org/10.1017/S0047404515000019 Jarrett, L. (2017). “Pizzagate” shooting suspect pleads guilty. Retrieved March 8, 2019, from CNNPolitics website: https://www.cnn.com/2017/03/24/politics/pizzagate-suspect- pleads-guilty/index.html Jenkins, H., Ito, M., & boyd, danah. (2015). Participatory Culture in a Networked Era: A Conversation on Youth, Learning, Commerce, and Politics. John Wiley & Sons. Jensen, L. A. (1997). Different Worldviews, Different Morals: America’s Culture War Divide. Human Development, 40(6), 325–344. https://doi.org/10.1159/000278737 Jeong, S. (2015). The internet of garbage. New York, NY: Forbes Media. 271 Jin, J. (2017). Digital Platform as a Double-Edged Sword: How to Interpret Cultural Flows in the Platform Era. International Journal of Communication, 11, 3880–3898. Johnson, C. M. (2001). A survey of current research on online communities of practice. The Internet and Higher Education, 4(1), 45–60. https://doi.org/10.1016/S1096- 7516(01)00047-1 Kassam, A. (2018, April 26). Woman behind “incel” says angry men hijacked her word “as a weapon of war.” The Guardian. Retrieved from https://www.theguardian.com/world/2018/apr/25/woman-who-invented-incel-movement- interview-toronto-attack Kelly, A. (2017, August 15). The alt-right: reactionary rehabilitation for white masculinity [Text]. https://doi.org/info:doi/10.3898/136266217821733688 Kelty, C. M. (2012). From Participation to Power. In A. Delwiche & J. Jacobs Henderson (Eds.), The Participatory Cultures Handbook (1st ed., pp. 22–32). https://doi.org/10.4324/9780203117927-10 Kennedy, K. (2018). Here’s what we know about Nikolas Cruz, the Florida school shooting suspect. Associated Press. Kilgore, E. (2018, November 3). The Toxic Cycle That Keeps Republicans Focused on Culture War. Retrieved May 7, 2019, from Intelligencer website: http://nymag.com/intelligencer/2018/11/the-toxic-cycle-keeping-republicans-focused-on- culture-war.html Kitada, A. (2012). Japan’s cynical nationalism. Fandom Unbound: Otaku Culture in a Connected World, 68–84. Koebler, J., & Maiberg, E. (2018, August 10). Social Media Bans Actually Work. Retrieved October 3, 2018, from Motherboard website: https://motherboard.vice.com/en_us/article/bjbp9d/do-social-media-bans-work Koleva, S. P., Graham, J., Iyer, R., Ditto, P. H., & Haidt, J. (2012). Tracing the threads: How five moral concerns (especially Purity) help explain culture war attitudes. Journal of Research in Personality, 46(2), 184–194. https://doi.org/10.1016/j.jrp.2012.01.006 Koulouris, T. (2018). Online misogyny and the alternative right: debating the undebatable. Feminist Media Studies, 18(4), 750–761. https://doi.org/10.1080/14680777.2018.1447428 Koven, M. (2015). Narrative and Cultural Identities: Performing and Aligning with Figures of Personhood. In A. De Fina & A. Georgakopoulou (Eds.), The Handbook of Narrative Analysis (1st ed.). Chichester, West Sussex; Malden, MA: John Wiley & Sons. 272 Kraus, R. (2018). 2018 was the year we (sort of) cleaned up the internet. Retrieved May 7, 2019, from Mashable website: https://mashable.com/article/deplatforming-alex-jones-2018/ Lang, N. (2016). Trolling in the name of “free speech”: How Milo Yiannopoulos built an empire off violent harassment. Retrieved April 2, 2017, from Salon website: http://www.salon.com/2016/12/19/trolling-in-the-name-of-free-speech-how-milo- yiannopoulos-built-an-empire-off-violent-harassment/ Larkin, B. (2013). 
The politics and poetics of infrastructure. Annual Review of Anthropology, 42, 327–343. Latour, B. (1999). Pandora’s Hope: Essays on the Reality of Science Studies (1 edition). Cambridge, Mass: Harvard University Press. Latour, B. (2005). Reassembling the Social: An Introduction to Actor-Network-Theory. OUP Oxford. Lavin, T. (2018). The Neo-Nazis of the Daily Stormer Wander the Digital Wilderness | The New Yorker. Retrieved April 24, 2019, from The New Yorker website: https://www.newyorker.com/tech/annals-of-technology/the-neo-nazis-of-the-daily- stormer-wander-the-digital-wilderness Lee, D. (2016). Activist Archives: Youth Culture and the Political Past in Indonesia. Duke University Press. Lees, M. (2016, December 1). What Gamergate Should Have Taught Us About the “Alt-Right.” The Guardian. Retrieved from https://www.theguardian.com/technology/2016/dec/01/gamergate-alt-right-hate-trump Levine, C. (2015). Forms: Whole, Rhythm, Hierarchy, Network (1 edition). Princeton ; Oxford: Oxford University Press. Lewis, B., & Marwick, A. (2017). The Online Radicalization We’re Not Talking About. Retrieved October 31, 2017, from Select/All website: http://nymag.com/selectall/2017/05/the-online-radicalization-were-not-talking-about.html Lewis, R. (2018). Alternative Influence: Broadcasting the Reactionary Right on YouTube (p. 61). Retrieved from Data & Society website: https://datasociety.net/output/alternative- influence/ Lorde, A. (1984). The Master’s Tools Will Never Dismantle The Master’s House. In Sister Outsider: Essays and Speeches (pp. 110–114). Berkeley, CA: Crossing Press. Lorenz, T. (2018, October 27). The Pittsburgh Suspect Lived in the Web’s Darkest Corners. Retrieved March 30, 2019, from The Atlantic website: https://www.theatlantic.com/technology/archive/2018/10/what-gab/574186/ 273 Lyons, M. N. (2017). Ctrl-Alt-Delete: The Origins and Ideology of the Alternative Right. Retrieved from Political Research Associates website: http://www.politicalresearch.org/2017/01/20/ctrl-alt-delete-report-on-the-alternative- right/ Mair, J. (2012). Cultures of belief. Anthropological Theory, 12(4), 448–466. https://doi.org/10.1177/1463499612469588 Makuch, B., Koebler, J., & Mead, D. (2019, May 8). The Far Right Has Found a Web Host Savior. Retrieved May 8, 2019, from Vice website: https://www.vice.com/en_us/article/gy4yg9/the-far-right-has-found-a-web-host-savior Marantz, A. (2018, March 12). Reddit and the Struggle to Detoxify the Internet. The New Yorker. Retrieved from https://www.newyorker.com/magazine/2018/03/19/reddit-and-the- struggle-to-detoxify-the-internet Marques, I. S., & Koven, M. (2017). “We are going to our Portuguese homeland!”: French Luso- descendants’ diasporic Facebook conarrations of vacation return trips to Portugal. Narrative Inquiry, 27(2). Marshall, G. (2012). Whatever happened to Digg? Retrieved March 8, 2019, from TechRadar website: https://www.techradar.com/news/internet/web/whatever-happened-to-digg- 1093422 Marwick, A. E., & Caplan, R. (2018). Drinking male tears: language, the manosphere, and networked harassment. Feminist Media Studies, 0(0), 1–17. https://doi.org/10.1080/14680777.2018.1450568 Marwick, A. E., & Lewis, R. (2017). Media manipulation and disinformation online (pp. 1–106). Retrieved from Data & Society website: http://centerformediajustice.org/wp- content/uploads/2017/07/DataAndSociety_MediaManipulationAndDisinformationOnline .pdf Massanari, A. (2015). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. 
New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807 Massanari, A., & Chess, S. (2018). Attack of the 50-foot social justice warrior: the discursive construction of SJW memes as the monstrous feminine. Feminist Media Studies, 18(4), 525–542. https://doi.org/10.1080/14680777.2018.1447333 Massanari, A. L. (2015). Participatory culture, community, and play.New York, NY. Peter Lang. Matney, L. (2015). Reddit Finally Bans Racist r/CoonTown And Other Hateful Communities, Updates User Policies. Retrieved March 2, 2019, from TechCrunch website: 274 http://social.techcrunch.com/2015/08/05/reddit-finally-bans-rcoontown-and-other- hateful-subreddits-updates-user-policies/ McGuire, P. (2014, May 26). Elliot Rodger’s Online Life Provides a Glimpse at a Hateful Group of “Anti-Pick-up Artists.” Retrieved May 8, 2018, from Vice website: https://www.vice.com/en_us/article/znwz53/elliot-rodgers-online-life-provides-a- glimpse-at-a-hateful-group-of-pick-up-artists McHenry, J. (2018, November 18). Constance Wu Responds to ‘Asian Incels’ Who Target Her Online. Retrieved April 1, 2019, from Vulture website: https://www.vulture.com/2018/11/constance-wu-responds-to-asian-incels-who-target- her.html McKay, T. (2018). Amazon Removes 9 Roosh V Books from Kindle Marketplace: Report. Retrieved May 13, 2019, from Gizmodo website: https://gizmodo.com/report-amazon- takes-down-nine-books-self-published-on-1828958245 McLaughlin, E. C., & Cullinane, S. (2018). Florida yoga studio shooter was arrested for groping women and for trespassing on FSU campus. Retrieved March 30, 2019, from CNN website: https://www.cnn.com/2018/11/05/us/florida-yoga-studio-shooting/index.html Menegus, B. (2017). Goodbye and Good Riddance to Voat, Reddit’s Gross Clone. Retrieved March 8, 2019, from Gizmodo website: https://gizmodo.com/goodbye-and-good- riddance-to-voat-reddits-gross-clone-1795337099 Mettler, K., & Selk, A. (2017). GoDaddy — then Google — ban neo-Nazi site Daily Stormer for disparaging Charlottesville victim. Retrieved April 24, 2019, from Washington Post website: https://www.washingtonpost.com/news/morning-mix/wp/2017/08/14/godaddy- bans-neo-nazi-site-daily-stormer-for-disparaging-woman-killed-at-charlottesville-rally/ Mezzofiore, G. (2018). “Incel rebellion”: The Toronto suspect apparently posted about it. Here’s what it means. CNN. Retrieved from https://www.cnn.com/2018/04/25/us/incel-rebellion- alek-minassian-toronto-attack-trnd/index.html Milan, S. (2015a). From social movements to cloud protesting: the evolution of collective identity. Information, Communication & Society, 18(8), 887–900. Milan, S. (2015b). When algorithms shape collective action: Social media and the dynamics of cloud protesting. Social Media+ Society, 1(2), 2056305115622481. Mills, C. (2015). It Seems Reddit Ex-CEO Ellen Pao Isn’t to Blame for Site’s Meltdown. Retrieved March 8, 2019, from Gizmodo website: https://gizmodo.com/it-seems-reddit- ex-ceo-ellen-pao-isnt-to-blame-for-sit-1717645639 Mortensen, T. E. (2016). Anger, Fear, and Games: The Long Event of #GamerGate. Games and Culture, 1555412016640408. https://doi.org/10.1177/1555412016640408 275 Nakamura, L. (2002). Cybertypes: Race, Ethnicity, and Identity on the Internet (1 edition). New York: Routledge. Neiwert, D. (2017). Alt-America: The Rise of the Radical Right in the Age of Trump. London ; New York: Verso. Newton, C. (2019, February 25). The secret lives of Facebook moderators in America. 
Retrieved April 21, 2019, from The Verge website: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator- interviews-trauma-working-conditions-arizona Ng, C. (2018, October 12). When Asian Women Are Harassed for Marrying Non-Asian Men. Retrieved November 26, 2018, from The Cut website: https://www.thecut.com/2018/10/when-asian-women-are-harassed-for-marrying-non- asian-men.html Nissenbaum, A., & Shifman, L. (2017). Internet memes as contested cultural capital: The case of 4chan’s /b/ board. New Media & Society, 19(4), 483–501. https://doi.org/10.1177/1461444815609313 Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press. Ohlheiser, A. (2016). Just how offensive did Milo Yiannopoulos have to be to get banned from Twitter? - The Washington Post. Retrieved March 9, 2019, from https://www.washingtonpost.com/news/the-intersect/wp/2016/07/21/what-it-takes-to-get- banned-from-twitter/?noredirect=on&utm_term=.a1a4e302c371 Olick, J. K. (1999). Collective memory: The two cultures. Sociological Theory, 17(3), 333–348. Papacharissi, Z. (Ed.). (2010). A Networked Self: Identity, Community, and Culture on Social Network Sites (1 edition). New York: Routledge. Parks, L. (2015). Stuff You Can Kick: Toward a Theory of Media Infrastructures. In P. Svensson & D. T. Goldberg (Eds.), Between Humanities and the Digital (pp. 355–373). MIT Press. Phillips, W. (2018). The Oxygen of Amplification (pp. 1–128). Retrieved from Data & Society website: https://datasociety.net/output/oxygen-of-amplification/ Plantin, J.-C. (2017). Mapping platforms as infrastructures: Participatory cartography, enclosed knowledge”. International Journal of Communication. Plantin, J.-C., Lagoze, C., Edwards, P. N., & Sandvig, C. (2018). Infrastructure studies meet platform studies in the age of Google and Facebook. New Media & Society, 20(1), 293– 310. https://doi.org/10.1177/1461444816661553 276 Polletta, F. (1998a). Contending stories: Narrative in social movements. Qualitative Sociology, 21(4), 419–446. Polletta, F. (1998b). “It was like a fever…” narrative and identity in social protest. Social Problems, 45(2), 137–159. Polletta, F. (2008). Culture and movements. The Annals of the American Academy of Political and Social Science, 619(1), 78–96. Polletta, F. (2009). It Was Like a Fever: Storytelling in Protest and Politics. University of Chicago Press. Polletta, F., & Jasper, J. M. (2001). Collective Identity and Social Movements. Annual Review of Sociology, 27(1), 283–305. https://doi.org/10.1146/annurev.soc.27.1.283 Postigo, H., & O’Donnell, C. (2017). The sociotechnical architecture of information networks. In The handbook of science technology studies (4th ed., pp. 583–608). Cambridge, MA. Postill, J., & Pink, S. (2012). Social Media Ethnography: The Digital Researcher in a Messy Web. Media International Australia, 145(1), 123–134. https://doi.org/10.1177/1329878X1214500114 Proferes, N. (2016). Web 2.0 user knowledge and the limits of individual and collective power. First Monday, 21(6). Reddit Announcements. (2018). r/announcements - Revamping the Quarantine Function. Retrieved March 8, 2019, from reddit website: https://www.reddit.com/r/announcements/comments/9jf8nh/revamping_the_quarantine_f unction/ RedditMetrics. (n.d.). /r/Incels metrics (“Incel” - Involuntary Celibacy). Retrieved May 25, 2019, from http://redditmetrics.com/r/Incels Renzi, A. (2015). Info-capitalism and resistance: how information shapes social movements. 
Interface: A Journal for and about Social Movements, 7(2), 98–119. Robertson, A. (2015, June 10). Reddit bans “Fat People Hate” and other subreddits under new harassment rules. Retrieved February 28, 2019, from The Verge website: https://www.theverge.com/2015/6/10/8761763/reddit-harassment-ban-fat-people-hate- subreddit Rogers, J. (2016, May 16). Reddit administrators accused of censorship [Text.Article]. Retrieved March 8, 2019, from Fox News website: https://www.foxnews.com/tech/reddit- administrators-accused-of-censorship Rogers, R. (2006). Information Politics on the Web. Cambridge, Mass.: The MIT Press. 277 Rothstein, A. (2015). How to See Infrastructure: A Guide for Seven Billion Primates | Rhizome. Retrieved May 16, 2019, from Rhizome website: http://rhizome.org/editorial/2015/jul/02/how-see-infrastructure-guide-seven-billion- primate/ Sacks, B. (2017). Reddit Is Removing Nazi And Alt-Right Groups As Part Of A New Policy And Some Users Are Confused. Retrieved March 9, 2019, from BuzzFeed News website: https://www.buzzfeednews.com/article/briannasacks/reddit-is-banning-nazi-and-alt-right- groups-as-part-of-a Salinas, S. (2018, September 6). Twitter permanently bans Alex Jones and Infowars accounts. Retrieved October 4, 2018, from https://www.cnbc.com/2018/09/06/twitter-permanently- bans-alex-jones-and-infowars-accounts.html Schafer, J. A. (2002). Spinning the Web of Hate: Web-based Hate Propagation by Extremist Organizations. Journal of Criminal Justice and Popular Culture, 9(2), 66–88. Schwartz, M. S. (2019). Facebook Bans Alex Jones, Louis Farrakhan And Other “Dangerous” Individuals. Retrieved May 7, 2019, from NPR.org website: https://www.npr.org/2019/05/03/719897599/facebook-bans-alex-jones-louis-farrakhan- and-other-dangerous-individuals Serradell, O., Cruz, I. S., & Mondejar, E. (2015). Can the men’s movement attract young men? The men in dialogue association. Journal of Gender Studies, 24(6), 677–688. https://doi.org/10.1080/09589236.2013.872556 Shackelford, T. K., Goetz, A. T., Guta, F. E., & Schmitt, D. P. (2006). Mate guarding and frequent in-pair copulation in humans. Human Nature, 17(3), 239–252. https://doi.org/10.1007/s12110-006-1007-x Shaw, A. (2014). The Internet Is Full of Jerks, Because the World Is Full of Jerks: What Feminist Theory Teaches Us About the Internet. Communication and Critical/Cultural Studies, 11(3), 273–277. https://doi.org/10.1080/14791420.2014.926245 Sherzer, J. (1987). A Discourse-Centered Approach to Language and Culture. American Anthropologist, 89(2), 295–309. Silverman, C. (2016, December 5). How The Bizarre Conspiracy Theory Behind “Pizzagate” Was Spread. BuzzFeed. Retrieved from https://www.buzzfeed.com/craigsilverman/fever- swamp-election Silverman, J. (2019, April 23). What’s the Best Way to Keep Incendiary, Violent Content Offline? The New Republic. Retrieved from https://newrepublic.com/article/153656/whats-best-way-keep-incendiary-violent-content- offline 278 Southern Poverty Law Center. (2018). The alt-right is killing people. Retrieved March 17, 2018, from Southern Poverty Law Center website: https://www.splcenter.org/hatewatch/2018/02/05/alt-right-killing-people Southern Poverty Law Center. (2019). A Problem of Epik Proportions. Retrieved May 15, 2019, from Southern Poverty Law Center website: https://www.splcenter.org/hatewatch/2019/01/11/problem-epik-proportions Spingarn, A. (2010). When “Uncle Tom” Became an Insult. 
Retrieved April 1, 2019, from The Root website: https://www.theroot.com/when-uncle-tom-became-an-insult-1790879561 Squirrell, T. (2017). Linguistic data analysis of 3 billion Reddit comments shows the alt-right is getting stronger. Retrieved March 21, 2019, from Quartz website: https://qz.com/1056319/what-is-the-alt-right-a-linguistic-data-analysis-of-3-billion- reddit-comments-shows-a-disparate-group-that-is-quickly-uniting/ Star, S. L. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3), 377– 391. Star, S. L. (2010). This is not a boundary object: Reflections on the origin of a concept. Science, Technology, & Human Values, 35(5), 601–617. Star, S. L., & Ruhleder, K. (1996). Steps toward an ecology of infrastructure: Design and access for large information spaces. Information Systems Research, 7(1), 111–134. Starbird, K. (2017). Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter. International AAAI Conference on Web and Social Media, 230–239. Retrieved from https://www.aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15603 Statt, N. (2017, February 1). Reddit bans two prominent alt-right subreddits. Retrieved March 9, 2019, from The Verge website: https://www.theverge.com/2017/2/1/14478948/reddit-alt- right-ban-altright-alternative-right-subreddits-doxing Statt, N. (2019, January 23). Facebook says it will expand take downs of networks of pages that break its rules. Retrieved May 7, 2019, from The Verge website: https://www.theverge.com/2019/1/23/18194766/facebook-pages-punishments-fake-news- policies-publisher-crackdown Stern, K. S. (1999). Hate and the Internet. American Jewish Committee New York. Sterne, J. (2003). Bourdieu, Technique And Technology. Cultural Studies, 17(3–4), 367–389. https://doi.org/10.1080/0950238032000083863a Stoler, A. L. (2002). Colonial Archives and the Arts of Governance: On the Content in the Form. In Refiguring the Archive (pp. 83–102). https://doi.org/10.1007/978-94-010-0570-8_7 279 Stoler, A. L. (2008). Epistemic Politics: Ontologies of Colonial Common Sense. The Philosophical Forum, 39(3), 349–361. https://doi.org/10.1111/j.1467-9191.2008.00303.x Sundén, J., & Paasonen, S. (2018). Shameless hags and tolerance whores: feminist resistance and the affective circuits of online hate. Feminist Media Studies, 18(4), 643–656. https://doi.org/10.1080/14680777.2018.1447427 Swidler, A. (1986). Culture in Action: Symbols and Strategies. American Sociological Review, 51(2), 273–286. https://doi.org/10.2307/2095521 Tan, J., Wang, Y., & Gomes, D. (2016). Building National Resilience in the Digital Era of Violent Extremism: Systems and People. In Combating Violent Extremism and Radicalization in the Digital Era (pp. 307–327). IGI Global. Thiesmeyer, L. (1999). Racism on the Web: Its rhetoric and marketing. Ethics and Information Technology, 1(2), 117–125. Thorson, K., & Wells, C. (2016). Curated Flows: A Framework for Mapping Media Exposure in the Digital Age. Communication Theory, 26(3), 309–328. Tolentino, J. (2018, May 15). The Rage of the Incels. The New Yorker. Retrieved from https://www.newyorker.com/culture/cultural-comment/the-rage-of-the-incels Trouillot, M.-R. (2015). Silencing the Past (20th anniversary edition): Power and the Production of History. Boston, MA: Beacon Press. van Dijck, J. (2013). The Culture of Connectivity: A Critical History of Social Media. OUP USA. Velasquez, A., & LaRose, R. (2015). 
Youth collective activism through social media: The role of collective efficacy. New Media & Society, 17(6), 899–918. https://doi.org/10.1177/1461444813518391 Virilio, P., & Bratton, B. H. (2006). Speed and Politics (M. Polizzotti, Trans.). New York: Semiotext. Weill, K. (2018, October 4). The Alt-Right Outlives the Trolls Who Created It. Retrieved from https://www.thedailybeast.com/the-alt-right-outlives-the-trolls-who-created-it Whine, M. (1999). Cyberspace-A New Medium for Communication, Command, and Control by Extremists. Studies in Conflict & Terrorism, 22(3), 231–245. https://doi.org/10.1080/105761099265748 Wojcieszak, M. (2010). ‘Don’t talk to me’: effects of ideologically homogeneous online groups and politically dissimilar offline ties on extremism. New Media & Society, 12(4), 637– 655. https://doi.org/10.1177/1461444809342775 280 Ybarra, M. L., Mitchell, K. J., Palmer, N. A., & Reisner, S. L. (2015). Online social support as a buffer against online and offline peer and sexual victimization among U.S. LGBT and non-LGBT youth. Child Abuse & Neglect, 39, 123–136. https://doi.org/10.1016/j.chiabu.2014.08.006 281