SURVEILLANCE CAPITALISM, NEUROECONOMICS, AND THE USER EXPERIENCE: CHARTING INTERSECTIONS

By Sydney Keenan

A THESIS Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Rhetoric and Writing – Master of Arts

2025

ABSTRACT

This thesis uses a case study of Spotify Wrapped—a year-in-review (YIR) feature of the Spotify app made possible by the data Spotify passively collects from its users—to explore how surveillance capitalism, social psychology, neuroeconomics, and user-centered UX research and design intersect, overlap, and converge to create complex situations the modern consumer must now navigate. Informed by scholarship, methods, and frameworks from multiple disciplines, I designed and distributed a 20-question survey to a variety of participant demographic groups, the data from which I describe, analyze, discuss, and contextualize in this thesis. Throughout this process, one core point of inquiry guided my efforts: why do self-identified surveillance-opposed consumers actively and knowingly engage with products and services fueled by surveillance capitalism? Preliminary findings based on an application of neuroeconomics’ value-based decision-making framework reveal that consumers often value most other things—such as building social capital, ensuring social inclusion, and satisfying their curiosity about their music streaming habits—more than they value protecting their personal data and digital privacy. Results and their implications suggest a need for additional research that further examines situational complexities identified in this research and explores ways in which UX designers can build experiences that encourage users to protect themselves from surveillance capitalism.

ACKNOWLEDGEMENTS

It remains a mystery to me why the myth of the lonesome, solitary writer persists; plenty of us romanticize sequestering ourselves in a hidden nook to feverishly transcribe our slippery thoughts, I’m sure, but eventually we must emerge—one’s own mind is not a good or safe place to take up permanent residence. Thus, I share the accomplishment of this thesis with many people, some of whom helped me hand-carve the pieces of this puzzle, and others who simply, quietly, and unknowingly kept wind in my sails. I will try, but likely fail, to adequately thank all of you here. First and foremost, I would like to acknowledge my wonderful committee members: Dr. Casey McArdle, Dr. Stuart Blythe, and Dr. Kate Birdsall. Each of you played a unique, instrumental part in making this research worth something. Casey, I thank you for your unwavering support and forceful reminders about scope creep. Stuart, I thank you for always offering a listening ear and sharing your wisdom. Kate, I thank you for the now-recurring nightmares about to-be verbs and your willingness to join me in the existential ether. This project would have been weaker—if not impossible—without you, and I will always cherish your mentorship. Next, I say thank you to my cohort and fellow grads—my friends and peers. You are all brilliant, your friendship a salve that soothed the worst of the stress. I’m excited to watch you all flourish, even if only from afar. To my family, thank you for your love and support: my mom, a champion survey distributor still looking forward to the day she can call me “Doctor;” my dad, who spent plenty of his lived-for lunches listening to my anxious rambling like a podcast; and my sister, who tells it like it is and makes me a better person.
(Mac too, I suppose, with all two of his brain cells.) To Ronald, who became a regular character in every story, made sure to disrupt my most important nights of sleep, and cannot even read this: you’re lucky you’re cute, you gremlin. And last but certainly not least, thank you to my partner, Gus, whom this entire endeavor would have been impossible without. Not only did you follow me to this concrete desert with a smile on your face, you’ve eased every burden, taken up sword and shield against my imposter syndrome, acquiesced to my hostile takeover of every once-blank wall in our apartment, and supported me in ways I didn’t even know I needed. Es tevi mīlu. Es nevaru dzīvot bez tevis. iii There are far more people I could thank—the dozens of WRAC faculty and staff who have helped me along the way (you know who you are), Ben and Jerry, and all of my mom’s friends that took the time to fill out my “fancy” survey, just to squeeze in a few—but I must wrap this up. To anyone who has offered me support, assistance, or chocolate in the last two years, I remember, and you have my gratitude. iv TABLE OF CONTENTS INTRODUCTION…….…………………………………………………………………………..1 METHODS………………………………………………………………………………………..5 RESULTS………..………………………………………………………………………………..8 DISCUSSION…...…………….…………………………………………………………………36 CONCLUSION….……………….………………………………………………………………42 REFERENCES….…………………….………………………………………………………… 45 APPENDIX A: FULL SURVEY QUESTIONS LIST……………..…………………………… 49 APPENDIX B: SURVEY DISTRIBUTION LOG………………………………………………56 APPENDIX C: DESCRIPTION OF AFFINITY DIAGRAM PROCESS WITH IMAGES…… 57 v INTRODUCTION Discussions about the pragmatic consumer have dominated recent data privacy scholarship, and the existence of individuals exhibiting resigned pragmatism in the face of surveillance capitalism is well-documented. These individuals believe that a lack of privacy is inherent to internet use and that the collection and “misuse of their personal information is inevitable and unavoidable” (Draper, 2017, p. 246), becoming apathetic and resigned to the collection of their data as a result (Marwick & Hargittai, 2019; Afriat et al., 2021). Earlier findings expound upon this idea by describing a neutral-at-best response from consumers to “tradeoffs,” a marketing tactic used by companies to portray the consumer as powerful and willing to accept the collection of their data in exchange for discounts or other benefits (Turow et al., 2015). In reality, consumers are not responding positively to tradeoffs; they are simply experiencing resigned pragmatism, not responding to tradeoffs at all, and generally considering discounts an unfair cost for their nonconsensually-collected data. This—the overwhelming apathy people present in the face of economic surveillance— speaks to the heart of the privacy paradox, which acknowledges that consumers disapprove of companies’ surveillance practices yet do nothing about it (Hargittai & Marwick, 2016; Draper, 2017). In other words, these consumers are all bark, no bite—they directly contradict their disapproval of surveillance capitalism with a distinct lack of behavior. Some studies even point to a rise in “privacy cynicism,” an extension of resigned pragmatism and privacy apathy that shows individuals purposefully neglecting to take protective measures as a retaliative coping mechanism, rationalizing their lack of control as defiance (Hoffmann et al., 2016). 
As a result, our understanding of the privacy paradox and how it manifests in human behavior and digital interaction is this: people are uneasy, but also made to feel powerless to protect themselves or change their situation. Thus, they spend their time interacting with the online world fraught with fear, frustration, or indifference, maintaining a carefully curated mask of apathy that allows them to ignore the threat access to digitally mediated spaces poses to the security of their personal data. However, our current understanding of the privacy paradox and the resigned pragmatic individual does not explain a certain reciprocity now observable in users—reciprocity that shows individuals consenting to, participating in, and even asking for additional violations of their privacy. Consumers are now recognizing certain byproducts of economic surveillance and 1 assigning significant value to them—not in a way that validates the effectiveness of tradeoff tactics, but in a way that allows consumers agency they did not have before. By assigning new value to these byproducts, they are also—by extension—accepting, welcoming, and thinking positively of the surveillance practices by which those desirable byproducts are made possible. In other words, they feel that volunteering more of their personal information may somehow improve the overall experience or cause a specific desired outcome. Initial research that I conducted on this phenomenon situates this behavior within the definition of what I conceptualize as reactive opportunism, which is the idea that users “[react] to companies’ unavoidable collection and use of their data by leveraging their personal information into desirable opportunities for reward” (Keenan, 2024, p. 27). In essence, users are beginning to succeed the unactionable apathy we understand to be resigned pragmatism with opportunistic, actionable behavior that allows them to find alternative uses for the results of economic surveillance, or even to demand something of value in return for their personal data. Preliminary Steps I sought an explanation for the rise of reactive opportunism in users and thus conducted an exploratory case study of Spotify Wrapped, a year-in-review feature of the Spotify mobile app that annually furnishes each Spotify user with a unique synopsis of their listening data. Spotify Wrapped, as a service and feature, was of particular interest due to Spotify’s overwhelming presence in the music streaming space. According to Spotify Newsroom, “[Spotify is] the world’s most popular audio streaming subscription service with more than 675 million users, including 263 million subscribers in more than 180 markets” (“About Spotify,” 2025; “Spotify Reports,” 2025). As such, 675 million people worldwide receive a unique Spotify Wrapped—an in-app slideshow that summarizes their most listened to songs, artists, genres and more by distilling personal app usage data into eye-catching cards like those pictured in Figure 1. Spotify users can easily post these cards to social media to share and compare their results with others, an activity enabled by the design of Spotify Wrapped: the simplicity and uniformity of the summaries allows for direct comparison between the stats of individuals, and the “share” button integrated into the slideshow at various points, when clicked, imports the image into a social media post editor of the user’s choice for immediate sharing. 
Such a feature is made possible by continuous data collection by Spotify via surveillance- based methods, a fact that Spotify users are aware of and generally unconcerned about. Based on 2 the preliminary case study examination of a sample of Spotify- and Spotify Wrapped-related posts from multiple social media sites, attitudes toward Spotify Wrapped remain positive, with users even going so far as to accept and praise those surveillance practices for allowing them their Spotify Wrapped (and, by extension, the social participation and relevance it enables). Figure 1: A set of screenshots from Spotify’s blog “For the Record” presenting a sample Spotify Wrapped. This particular sample showcases the My Top Artist part of the yearly summary, which reports that this hypothetical Spotify user listened to this artist for 6,342 minutes total, making them part of the “top 0.1% of listeners worldwide.” Additionally, as I observed in my previous analysis, some Spotify users will alter the functionality of the Spotify app, their listening habits, or even the slideshow images themselves in order to optimize their Spotify Wrapped results and the social “buying power” of those results (Keenan 2024). It appears that people are beginning to treat their Spotify Wrapped results as a form of social capital, using it to gain membership to desired social groups or subcultures. Furthermore, individuals who do not use Spotify have been known to lament their lack of such currency, going to surprising lengths to possess something analogous, something that will give them the means to participate, even peripherally, in the Spotify Wrapped discourse and social moment (Keenan, 2024). Even more interestingly, I observed one last notable trend across the behavior of users, Spotify and non-Spotify alike: individuals—often in the same digital breath in which they 3 express their appreciation for Spotify Wrapped—claim to desire additional surveillance- facilitated rewards, both from Spotify and from platforms or contexts not related to Spotify in any way. For example, users claim to want more of their Spotify data collected and reported in their Spotify Wrapped, which often harmonizes with a similar expressed desire for Wrapped- esque summaries of emoji usage, miles driven, or other Spotify-unrelated and generally infeasible things. This speaks not only to users’ acceptance of economic surveillance but to the new value assigned to the products of that surveillance, therefore further contributing to the conceptualization of reactive opportunism. The observation of these trends prompted me to substantiate, quantify, and expand on my initial findings. I identified and characterized these observed trends in user behavior based on a small sample of social media posts that may or may not represent the true inclinations and behaviors of a more generalized sample. Thus, are these findings representative of a user majority, or simply a snapshot curated by social media? The latter is more likely, yes, but even so, a minority of users are exhibiting this behavior—why, specifically, are they doing so? What decisions are they making? How much are they willing to sacrifice, and for what? In response to these questions, I developed hypotheses informed by an interdisciplinary understanding of the nature of social inclusion and what we will do to achieve it. Current Study The next natural step was to test these hypotheses by gathering more information. 
This thesis describes how I conducted research in service of that objective and discusses my findings, their implications, and their limitations. I have described the contextual landscape and impetus of this research above, and will next discuss my data collection methods and intentional protocol design. Following the description of methods is a detailed summary of collected data, complete with introductory conclusions based on each section of the survey and the core topics they address. Lastly, I discuss all current findings in relation to each other as well as findings from my previous research, providing an updated and rearticulated understanding of consumer behavior as it relates to reactive opportunism by applying the relevant neuroeconomic concept of value-based decision making, ultimately concluding with a description of limitations and opportunities for future research. 4 Data Collection and Protocol Design METHODS Data was collected via an anonymous five- to ten-minute survey consisting of 22 questions on average per instance—a number made variable by the availability of multiple question paths triggered by certain answers to key questions (henceforth known as trigger questions). To reduce the likelihood of survey fatigue in respondents, the majority of these questions were multiple-choice questions, with a maximum of three questions total per instance (across all branches and pathways) requesting a unique qualitative response. These three free-response questions asked participants to elaborate on their reasoning behind engaging or not engaging in the three key behaviors identified in previous research: users’ proactive manipulation of their year-in-review results, users’ retroactive manipulation of their year-in-review results, and users’ willingness to use third party sites to retrieve more data than their year-in-review results alone could provide. Derived from the trends identified in my previous research (Keenan 2024), the answers to these questions became the heart of the case study, and thus were of paramount interest to this ongoing research. As a result, I designed the survey around them. In order to ensure opportunities for external application of this case study to broader concepts and issues, also included in the survey were sets of questions focused on participant engagement with social media trends and attitudes towards data privacy, resulting in a survey with four distinct sections: Trend Engagement, Music Streaming, Data Privacy, and Demographic Information. I organized and designed these sections strategically in a way that made the survey as short and approachable as possible while still providing the key data points needed for meaningful analysis. The survey sections are listed above in the order they appeared in the survey, with Demographic Information last. Asking respondents to answer demographic questions at the end of the survey (as opposed to at the beginning) was a purposeful rhetorical choice I made with the intent to build trust with respondents and avoid the perception that they were being reduced to their age and gender from the beginning of the survey.1 1 Wolf et al. (2016) suggest that survey questions should be worded in a way that makes them “non-threatening” to the participant, and Braun et al. (2020) suggest that, in social and/or qualitative research, potentially “threatening” survey questions should be placed at the end of the survey (as opposed to the beginning, where they could dissuade participants from continuing). 
I placed the demographic questions at the end of my survey because I consider them the most threatening—they are the ones most likely for participants to perceive as a violation of the survey’s promise of anonymity.

My decision to situate the Music Streaming section—the section that housed the three free-response questions—between the exclusively quantitative Trend Engagement and Data Privacy sections was also intentional. The goal was for participants to encounter these specific questions approximately midway through the survey’s duration, early enough that respondents would still be willing to give their free responses time and attention, but not so early that these questions set an artificially high expectation of participation for respondents. However, in retrospect, this placement might not have been optimal, as the Data Privacy section immediately following Music Streaming saw a small but notable number of respondents abandoning their survey after the third qualitative question, resulting in more unanswered questions in the Data Privacy section than any other. Based on this experiential knowledge and research findings that show improvement in data reliability from surveys where question order is randomized in some way, I infer that designing the survey in a way that shuffled the order in which respondents encountered the four main groups of questions could have reduced the sparseness of the data collected by the Data Privacy section, increased the survey completion rate, and generally strengthened the data collected (Wilson et al., 2021; Loiacono and Wilson, 2020). Additionally, I chose to construct parallel and alternate question pathways and to implement trigger questions, allowing respondents the opportunity to partially customize their survey experience. At several points throughout the survey, based on their responses to deliberately placed questions, participants diverted themselves onto the pathway most relevant to them. This allowed certain participants to skip large portions of irrelevant questions about social media or streaming habits, minimizing the risk of survey fatigue.2 More notably, I provided two different question sets for the Music Streaming section of the survey, only one of which participants would answer based on whether they reported Spotify as their primary music streaming platform. The distinction of these two separate but parallel pathways was meant to prevent the exclusion of certain individuals from the participation criteria, provide comparative data points between Spotify and non-Spotify users, and prevent respondents from needing to answer questions not relevant to them.

2 Per its entry in the Encyclopedia of Survey Research Methods, survey fatigue is the phenomenon that occurs when survey respondents tire or become bored during survey completion, causing the quality or reliability of the data they provide to decrease. Survey fatigue can cause respondents to skip questions, stop reading all response options, default to responses like “NA,” select the first answer in every list, or abandon the survey altogether. As such, a complicated, poorly-worded, or time-consuming survey has the potential to harm the quality of the data it collects, and therefore researchers should strive to design their surveys in ways that minimize potential survey fatigue. For more information on survey fatigue and strategies for measuring or controlling its effects, see the following: (Lavrakas, 2008; Jeong et al., 2023; Rolstad et al., 2011; Le et al., 2021).
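To make the pathway logic concrete, the short sketch below approximates how the two trigger questions routed respondents. The function name, answer strings, and pathway labels are hypothetical stand-ins of my own (the survey itself was built in a survey platform, not in code), so this is an illustration of the routing idea rather than the actual instrument.

def assign_pathway(streaming_frequency, preferred_platform):
    """Illustrative routing based on the two trigger questions (labels are hypothetical)."""
    if streaming_frequency == "Never":
        # Respondents who never stream music skip the streaming-habits questions entirely.
        return "skip Music Streaming section"
    if preferred_platform == "Spotify":
        # Spotify users see the Spotify Wrapped question set.
        return "Spotify pathway"
    # Everyone else sees the parallel, generic year-in-review question set.
    return "non-Spotify pathway"

# Example: a respondent who streams daily on Apple Music
print(assign_pathway("Multiple times a day", "Apple Music"))  # -> non-Spotify pathway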
See Figure 2 below for a survey pathway visualization, and see Appendix A for the full list of survey questions and their response options.

Figure 2: A visual representation of survey pathways with key. Numbers with dashed black outlines represent trigger questions.

Survey Distribution

The survey was distributed to several personal and professional networks in two primary waves, made available to potential participants through an anonymous link or QR code. During the initial distribution in November of 2024, the survey was sent to two departmental email lists at a large university in the Midwest, posted to the personal LinkedIn and Instagram pages of several individuals, and shared via flyer at two social gatherings and one in-person workplace.3 Halfway through the survey’s five-week lifespan, it was redistributed along these channels accompanied by a reminder that participation was still possible. In total, across all channels, I estimate that roughly 3,700 people were solicited as participants (see Appendix B for details).4

3 We (my mentors and I) chose how and where to distribute the survey based on a) which channels we had easy and continuous access to, and b) which channels presented the greatest potential for variability in participant age and technological literacy level. Heterogeneity in these demographic categories (as opposed to categories such as participant location or gender) would provide more relevant, comparable, and widely-applicable insights across generational groups. As such, please note that sample scope is likely limited to the Midwest United States.

4 I calculated this number based on the estimated maximum number of potential participants each channel had the potential to solicit. For a detailed breakdown of my calculations and notes, see Appendix B.

RESULTS

The survey was open for six weeks in November and December of 2024, collecting a total of 175 responses. Of those 175 surveys, 154 (88%) were completed in their entirety, 4 (2.28%) were mostly completed (i.e. 80-99% complete), and 11 (6.28%) were submitted or abandoned at various stages of significant incompleteness (i.e. 0-79% complete). It should be noted that, of those 11 partially completed surveys, 6 yielded no viable data. Due to the volume of data and the limitations of data analysis tools used, these 6 responses remain part of the sample, therefore slightly affecting calculations made based on total sample size (avg. ±1.8% error). However, calculations included herein are not always based on the total sample size—specific analyses often saw the factoring out of incomplete or irrelevant responses and response types, as reflected in the variable sample sizes presented throughout the following analysis.

Demographics

Considering survey distribution methods that—by nature of conducting this research in a university setting—favored a student audience, the representation of generational groups in the data is more diverse than expected. The 18-24 age group represented the largest percentage (39.6%) of responses as expected, but they were not the overwhelming majority. 37% of respondents reported being over the age of 35, accounting for almost the same volume of responses (see Figure 3). The same cannot be said for gender diversity among respondents, as 61% of respondents reported identifying as female, 31% as male, and 7% as nonbinary; the other 2% abstained from providing such information.
This disproportionate split may have been the result of survey distribution through certain personal networks of predominantly female composition.5

5 As the demographic information of participants is not of primary interest or importance to the overall findings about user behavior and serves only as a basis for identifying and comparing generational leanings, this uneven gender distribution is not cause for concern. For more on gender distribution in surveys like mine, see findings in Becker (2022) and discussion of methods in Akhter et al. (2022).

Figure 3: Visual breakdown of total survey participant sample by number and percentage of respondents per age group and self-identified gender.

Trend Engagement

As the discourse surrounding Spotify Wrapped is largely facilitated and perpetuated by social media and online social trend phenomena, this first section of the survey was designed to assess participants’ frequency of social media use, type of social media use, and value assignment to the awareness of and participation in social and online trends. When asked about the frequency of their social media use, the majority of respondents—104 out of 163, or 63.8%—reported using social media “daily” (i.e. several hours a day). The next-largest groups bookend this majority, reporting “constant” (i.e. most hours of most days) or “weekly” (i.e. several times a week) social media use. In this digital day and age in which using social media—problematically or otherwise—is as easy and as normalized as it’s ever been, these numbers are unsurprising. A 2021 meta-analysis by Cheng et al. reveals that the overall prevalence of social media addiction across 32 nations is 24% based on the Bergen Facebook Addiction Scale (BFAS), which assesses social media addiction based on salience (the extent to which social media use dominates thinking and behavior) and several other criteria derived from clinical research (Cheng et al., 2021; Bowman & Clark-Gordon, 2019).6 While Cheng et al.'s findings do not align perfectly with mine, they serve to illustrate my point that social media use is frequent to the point of clinically-defined addiction in about a quarter of the population assessed in Cheng et al.’s research. Furthermore, Pew Research Center data from the past few years reports around 70% of Americans use social media, with adults under 30 “far more likely than their older counterparts to use many of the online platforms” (Gottfried, 2024, para. 12; Auxier, 2021). Pew’s social media use report from 2021 found that 84% of American adults between the ages of 18 and 29 self-report ever using social media—12 percentage points above the 72% average of all Americans (Auxier, 2021). More than half of the 19 participants who responded “constantly” belonged to the 18-24 age group, with the remainder—interestingly—distributed evenly across all other age groups (except 55+). This is interesting because the distinction between “daily” and “constantly” is largely subjective and based on the participant’s judgement of how “constant” their social media usage is. Based on research by Ernala et al. (2020) and Verbeij et al.
(2021) that shows a tendency for younger age groups to overestimate when self-reporting social media use, this may indicate a comparatively elevated level of self-awareness or increased willingness on the part of 18- to 24-year-olds—who are, per my new data, engaging the most with the behavior trends observed in the preliminary case study—to admit their frequent use of social media as opposed to participants in other age groups. I will continue to develop this idea in subsequent sections as the analysis progresses. Otherwise, a more balanced demographic split occurs across participants who reported using social media daily, weekly, monthly, rarely, and never. The same can be said for responses to the section’s question about social media activity, which suggests that most participants do not engage actively with social media sites, preferring to “lurk” instead: an average of almost 60% of each age group reported that they were “not very active” on social media (i.e. posting and commenting infrequently, instead spending most of their time simply scrolling their feed). Respondents’ preference for using social media only in an observational capacity is also reflected strongly in the responses to the question asking about the importance respondents placed on keeping up with and participating in social trends, particularly online trends on social media. The majority of respondents reported little (66%) or no (19%) interest in participating in “online trends and social moments,” as phrased by the survey.7 Overall, this data shows that the group of participants surveyed during this research does not feel strongly about actively participating in social media—they use it, yes, and quite abundantly, but do not place significant value in social media content generation or trend engagement. This could potentially represent the sentiments of a larger, more general population beyond the bounds of the participant pool, but is—at the very least—relevant to the rest of the data and analysis discussed here, and therefore is worth mentioning.

6 Published in 2012, an article by Cecilie Schou Andreassen, Torbjørn Torsheim, Geir Scott Brunborg, and Ståle Pallesen titled “Development of a Facebook Addiction Scale” introduces the Bergen Facebook Addiction Scale (BFAS), the original purpose of which was to assess addiction to Facebook specifically as a subtype of general internet addiction. Since then, scholars—including Cheng et al., cited here—have begun using the BFAS to assess internet addiction generally, not just Facebook addiction, and additional scholarly discussion about such co-opting of the framework continues to circulate. To read more about the BFAS and discourse surrounding it, see Andreassen et al. (2012), Griffiths (2012), Andreassen and Pallesen (2013), and Bowman and Clark-Gordon (2020).

Music Streaming

The second section of the survey—as the biggest and most robust—produced the most varied results. I intended for this section of the survey to most directly address the key hypotheses produced by the initial case study, its purpose being to collect information about participants’ music streaming habits: the platform they preferred and why; what things they prioritized in a music streaming platform and whether or not a year-in-review feature was one of them; and to what extent they prioritized those year-in-review results, assuming their preferred platform offered such a feature.
This is also the section in which I asked participants three pairs of questions intended to assess their engagement in each of the three key behavior trends identified in my earlier research: proactive year-in-review results manipulation, retroactive year-in-review results manipulation, and risk-taking with the specific intent to reap additional surveillance-based rewards. For each type of behavior, participants answered a yes-or-no multiple-choice question about whether they had ever engaged in such behavior and a qualitative free response that asked them to elaborate on their motivations for participating or abstaining. However, participants first encountered a question asking them about the frequency of their music streaming platform use. I included this question to gather data that would allow me to compare music streaming platform use among participants to other variables (like social media use more generally) and to assess participants’ general familiarity with functionalities and features of music streaming platforms (such as Spotify Wrapped).

7 Only one respondent out of the 162 that answered the question reported that this participation was “very important” to them, and—contrary to what is likely a popular assumption—this person was part of the 45-54 age group, not the 18-24 age group. This is notable only because of its status as an outlier, one whom—had the research protocol and timeline allowed for it—I would have liked to interview individually to gather further insights.

The majority of 162 respondents reported using their preferred music streaming platform with great frequency—multiple times a week (59%) or multiple times a day (30%)—indicating that this sample of participants possesses technological literacy pertaining to music streaming platforms. This question also served as a trigger question that allowed the two participants who self-reportedly “never” streamed music (i.e. used CDs, vinyl records, or the radio exclusively) to skip the next portion of the survey about specific streaming habits that did not apply to them, thus excluding them from this section’s sample size. Following this initial question, respondents encountered a second trigger question about the music streaming platform they preferred; their answer to this question would redirect them along one of the two major branches of the survey that best applied to their experience as a Spotify user or as a user of any other platform except Spotify (i.e. Apple Music, YouTube Music, Amazon Music, Tidal, etc.). Based on the 160 total answers to this trigger question, 71% of survey participants use and prefer Spotify as their music streaming platform, and 29% do not (see Figure 4). Of the 47 non-Spotify users, the majority (74%) of them use either Apple Music (53%) or YouTube Music (21%), which is relevant because both of those platforms offer their own year-in-review feature (Apple Music’s “Replay” and YouTube Music’s “Recap”) similar to Spotify Wrapped.

Figure 4: Visual breakdown of preferred music streaming platform by number and percentage of survey participants.

I draw attention to this split because a) participants navigated the remainder of the music streaming section of the survey based on their preferred platform, and b) it is relevant to my upcoming discussion and comparison of the results from the Spotify-user and non-Spotify-user branches of the survey.
To simplify the remaining discussion of music streaming results—made more voluminous and varied by the multiple pathways available to participants at this point in the survey—I will prioritize findings of interest to the case study. As Spotify Wrapped is the focus of the case study, I will center results and data from Spotify users, and my presentation of the results and data gathered from non-Spotify users will primarily serve to create a point of comparison against which I can further examine the Spotify user data.

Potential Financial Dimensions

Once the 113 Spotify users were diverted from the section’s landing platform and sequestered in their own unique branch of the survey, they were asked whether they used the free version or paid for Spotify Premium. The inclusion of this question was purposeful and important because the free and paid versions of Spotify differ greatly in their functionality, with the Premium version offering considerably more features and freedom of use. As seen in Table 1, a Spotify Premium subscription allows users to (most notably) listen offline, remove ads, and customize their listening queue—all things that cannot be done with the free version.

Feature | Feature Description | Free | Paid
Ad-free music listening | Listen to music without ad breaks | 🗴 | 🗸
Download to listen offline | Download your albums and playlists and take them anywhere your internet can’t go. You can download up to 10,000 different songs each on up to 5 different devices. | 🗴 | 🗸
Play songs in any order | Full control over your listening: repeat songs, play your favorite parts again, and listen to albums in order. | 🗴 | 🗸
High audio quality | Free Plan: Basic quality audio (~160kbit/s). Premium Plans: High quality audio (~320kbit/s). | 🗴 | 🗸
Listen with friends in real time | Everyone’s invited, no matter where they are. Control what plays together. Friends can listen together and control what plays on a speaker. | 🗴 | 🗸
Organize listening queue | Add to, remove, or reorder what’s playing next. | 🗴 | 🗸

Table 1: Features comparison between Spotify Free and Spotify Premium (“Spotify Premium,” n.d.).

Knowing specifically what Spotify Premium users value in the performance and functionality of their music streaming platform—enough that they would pay a subscription when a free version exists—was of interest to this research because I was curious to see if there was a notable preference for features that allowed more freedom of use, thus making the proactive manipulation of Spotify Wrapped results more feasible. Features such as specific song selection and queue customization allow users to more purposefully curate their listening experience, and therefore their year-in-review results, which makes them important tools for users seeking to optimize, curate, or maximize the value of their year-in-review results. In other words, I wanted to know if the appeal of a perfectly curated Spotify Wrapped was in any way influencing the financial decisions of consumers. As such, the survey asked respondents to answer a multiple-choice question about their reason(s) for paying for Spotify Premium. This type of multiple-choice question allowed respondents to select any number and combination of the following options: “Ad-free listening,” “Specific song selection (as opposed to only shuffle play),” “Discounted rate (i.e. student or family pricing),” “Offline listening,” “Improved sound quality,” “Queue customization,” “Audiobook access,” and “Other” (fill in the blank).
When building this survey question, I shuffled reasons of particular interest to this research—namely “Specific song selection” and “Queue customization”— randomly among other common reasons a participant might maintain a Premium plan, making sure to avoid putting them all next to each other at the top of the list. I did so to minimize the likelihood of the serial positioning effect creating artificial positive correlation trends in the data.8 Based on the data, the top two reasons for maintaining a Premium subscription among Spotify user participants were “Ad-free listening” and “Specific song selection.” This is unsurprising, as they are two of the most noticeable ways to increase freedom of use and active listening time. “Queue customization” found itself fifth on the list, after “Offline listening” and “Discounted rate.” Such results suggest no support for the idea that users are paying for a Spotify subscription with the specific intention to access Premium features that allow them to more easily proactively manipulate their Spotify Wrapped results, only that paying for Spotify Premium is simply a quality-of-life improvement. However, given the loose wording of the question and lack of specific data to support any concrete findings for or against the hypothesis, it is still within the realm of possibility that some users may value certain Premium features for their Wrapped-curation affordances. Attitudes Toward Spotify Wrapped: A Comparison At some point throughout their experience with the survey, almost every participant (with the exception of a handful of those who ended up on the “never streamed music” pathway) encountered a multiple-choice question that required them to choose the statement that most 8 First described by Hermann Ebbinghaus in the results of his 1913 self-administered memory research and since substantiated by countless other studies, the serial positioning effect refers to the tendency for individuals to pay less attention to and remember less well list items or words beyond the first four due to natural limitations of human memory and attention (Ebbinghaus, 1913; Jensen, 1962; Murdock, 1962). Modern findings from web-based eye- tracking research relate to this idea, showing that readers often skim text in an F-shaped pattern that neglects all quadrants of the screen except the top left (Pernice et al., 2021; Pernice, 2024). Based on this, I inferred that survey participants might more frequently and disproportionately select the first few options in the list of response options. To prevent this phenomenon from artificially supporting my hypothesis that reward-seeking Spotify users pay for Spotify Premium specifically to have access to reward value-maximizing features, I thus ensured the response options supporting that hypothesis did not appear exclusively at the top of the list, instead choosing to spread them evenly throughout. For more information on survey question response randomization strategies to reduce bias in survey data, see Wilson et al. (2021). 15 closely represented their feelings about Spotify Wrapped. Response options ranged from “I don’t know what it is” to “I eagerly await its release each year” with the intention to quantify and generalize the entire sample’s attitude towards Spotify Wrapped. 
As discussed earlier in this piece, my initial conclusions about the generally positive sentiments towards Spotify Wrapped were based on a sample of social media posts collected from X (formerly Twitter) and therefore had the significant potential to be skewed or otherwise curated. By asking every survey respondent this same question about their feelings towards Spotify Wrapped regardless of their age, gender, or platform of choice, I would gather more even, unbiased results that would allow me to substantiate or nullify this initial claim. Thus, I intentionally included this question in all possible branches and pathways of the survey, which resulted in three distinct groups of participants reporting their feelings about Spotify Wrapped: Spotify users, non-Spotify users with access to a year-in-review summary from their preferred platform, and non-Spotify users without access to a year-in-review summary from their preferred platform. Non-Spotify users—those to whom it was applicable—were also asked a similar question about how they felt towards their own preferred platform’s year-in-review service (provided it offered one), but they were asked to give their thoughts on Spotify Wrapped first. Please note that, while all survey participants answered the same base question about their attitude towards Spotify Wrapped, the response options varied depending on pathway. For example, one of the non-Spotify versions of the question (Question 50) offered respondents a fifth option that allowed them to self-identify feeling excluded from the online Spotify Wrapped discourse due to only having year-in-review results received from their preferred platform, not specifically a Spotify Wrapped. This option was not relevant to other user subsets, and therefore was not made available to them. For analysis purposes here, data points from this Q50 response option were condensed with data points from another similar Q50 response option, creating one data point that represents both, functionally and thematically, and evening out the number of response categories across all versions of the question. For more specifics on the differences between question pathways and response options, see Appendix A. Across the survey’s entire sample population, 157 people gave their assessment of Spotify Wrapped—112 Spotify users, 27 non-Spotify users with access to their own platform’s year-in-review results, and 18 non-Spotify users without access to personalized year-in-review results. Overall, and as illustrated by Figure 5 below, Spotify users look forward to the release of Spotify Wrapped the most, with interest in Spotify Wrapped decreasing among non-Spotify user groups; this is consistent with expected findings, as well as findings gleaned from the initial social media post collection.

Figure 5: Participant attitudes toward Spotify Wrapped visually summarized and broken down by music streaming platform user type.

What is not consistent with the previous social media-based findings is that most participants reported only being curious about their Spotify Wrapped results (or the Spotify Wrapped results of their friends and online mutuals), but not investing much interest beyond that. Findings from my previous research suggest that a more even portion of individuals—especially among those who use Spotify—would report excitement and anticipation related to the release of Spotify Wrapped (Keenan, 2024), but these new results show a mild interest among a majority of participants.
This is interesting because it a) speaks to how social media can and does highlight extreme examples of behavior, and b) makes an interesting backdrop against which to examine more closely any trends that may arise in the data gathered from the qualitative questions about the three specific observed behaviors of interest. Will the survey participants that reported only a mild interest in their Spotify Wrapped results be the same participants spending the calendar year prior to the release of those results optimizing their listening habits in order to get the perfect year-in-review summary? Are the survey participants reporting no interest in Spotify Wrapped the same participants openly condemning the practice of editing one’s results before posting? I will address these questions, among others, in subsequent sections as analysis compounds.

Examining Value-Maximizing Behavior Trends of Interest

At this point in the survey, participants were presented with three pairs of questions intended to uncover crucial information about their behavior related to three areas of interest: proactive year-in-review results manipulation, retroactive year-in-review results manipulation, and use of third-party sites. The first question in each pair was a multiple-choice question with simple yes-or-no answer options, and the second question in each pair was a free-response question that asked participants to elaborate on the motivation behind their action or inaction, depending on how they answered the previous yes-or-no question. Table 2 below displays the three value-maximizing user behaviors of interest and the quantitative-qualitative question pairs used to assess each of them respectively. Participants on both major survey pathways—Spotify user and non-Spotify user—were asked these questions in the same order, phrased almost identically; the only difference between the questions in Table 2 and those included as part of the non-Spotify user survey pathway is that I replaced “Spotify Wrapped results” with “year-in-review results” to accommodate a broader participant pool. Doing so allowed me to gather a dataset from both Spotify and non-Spotify users about the behavior they engage in with their year-in-review results, creating a basis on which to assess and compare how much value individuals assign to curated Spotify Wrapped results versus curated year-in-review results from a platform that isn’t Spotify. My goal in comparing the datasets this way was to determine whether all year-in-review results are created equal, or if Spotify Wrapped has a unique influence over user behavior.

Behavior Quantified | Quantitative Question Text (Spotify Pathway Version) | Qualitative Question Text (All Versions)
Proactive YIR results manipulation | “Have you ever altered your Spotify music streaming habits in order to curate a certain Spotify Wrapped result? (Examples of alteration include maintaining two accounts, retroactively removing songs from listening history, excluding playlists from taste profile, turning off smart suggestions, etc.)” Answer Choices: [Yes] [No] | “Please elaborate on the motivation behind your actions (or lack thereof):” [Fill in the blank]
Retroactive YIR results manipulation | “Have you ever edited your Spotify Wrapped results before posting them to social media or otherwise sharing them with friends or followers?” Answer Choices: [Yes] [No] | “Please elaborate on the motivation behind your actions (or lack thereof):” [Fill in the blank]
Third-party site use | “Have you ever used a third-party website like Fanalytics or volt.fm to retrieve more of your Spotify listening data than is provided by Spotify Wrapped?” Answer Choices: [Yes] [No] | “Please elaborate on the motivation behind your actions (or lack thereof):” [Fill in the blank]

Table 2: Question pairs used to quantitatively and qualitatively assess participant engagement in the three value-maximizing behaviors of interest to this research, shown as exact survey question text used.

Overall, across all three behavior types, Spotify-user participants generally reported engagement in more value-maximization behavior than non-Spotify user participants in relation to their year-in-review results. As described in Figure 6 below, Spotify-user participants exceeded non-Spotify user participants—both in number and percentage—in self-reported participation in proactive results manipulation and third-party site use. This data suggests that, in general, Spotify users are more likely than non-Spotify users to attempt to optimize the perceived value of their Spotify Wrapped results by changing their listening habits throughout the year and giving third-party data-reporting sites access to their personal information (and potentially their Spotify account). Zero survey respondents reported having ever altered their year-in-review results retroactively, which was unlikely given the sample size and therefore surprising.

Figure 6: Visual summary of the number of participants by user type engaging in value-maximizing behaviors of interest.

Even more interesting, however, than respondents’ participation in or abstinence from these value-maximization behaviors is the reasoning behind their choice, which was captured by the six qualitative free-response questions in this part of the survey—three from the Spotify pathway, and three from the non-Spotify pathway. These six questions alone yielded a considerable amount of qualitative data that I analyzed via the creation of affinity diagrams that combined Spotify and non-Spotify user responses to each core question, which allowed me to group all responses by keywords and themes in order to distill out several key areas of reasoning for each type of behavior. Thus, I created three affinity diagrams total, pictured in Appendix C. I will now discuss the results of each in turn.

Reasons for Proactive YIR Results Manipulation: A Qualitative Analysis

Overall, as seen in Figure 6, the majority of survey participants reported that they had never taken proactive action to optimize their year-in-review results; combining the data from both pathway versions of the question, 102 out of 124 (82%) survey participants answered “no” to their respective question on this topic. Furthermore, when asked to explain why they did not participate in such behavior, most respondents said they simply saw little point in doing so.
Specific reasons why varied—some expressed that they didn’t care about their results enough, some felt it took too much time or effort, and some had never even realized it was something they could do at all—but the general sentiment was that changing one’s music streaming habits enough to significantly affect one’s year-in-review results was more trouble than it was worth. The most notable responses that fell into this category of reasoning were those that said something to the effect of “I don’t need to curate it because I don’t plan to post it,” indicating that, for some, the primary way to use something like Spotify Wrapped is to turn it into social media content, and for those who do not post regularly on social media, year-in-review results therefore serve no purpose and do not deserve attention. Additionally, almost all responses to this question from non-Spotify user participants fell into this category, while Spotify user participant responses spanned a greater breadth of reasoning types. These other reasoning types ranged from neutral opposition on the basis of preserving authenticity to vehement opposition on the basis of morality. For some, the idea of changing one’s listening habits with the specific intent to manipulate the year-in-review results was borderline abhorrent, and several participants seemed to take offense at the survey’s perceived insinuation that they might ever behave this way. Multiple participants condemned the behavior, saying it was “dishonest,” “pathetic,” “sad,” “disingenuous,” and “fake” to attempt to alter the results in this way. Several participants in this group pushed back even further, taking pride in their commitment to accuracy and authenticity by claiming they had “nothing to hide.” This subset of participants seemed fixated on preserving their honesty and the integrity of their results with more passion than expected, sometimes by disparaging those who might not agree with them. One participant went so far as to say “I also think it is pretty vain/vapid/lame to change your listening pattern just so that you look a different way than you are.” This was particularly intriguing to me because this response and others like it felt more antagonistic than the majority of others; based on the phrasing and accusatory language these participants used, they seemed intent on othering themselves from the hypothetical individual that would participate in this alteration behavior, and did so unprompted. I also identified a subset of users with this attitude in the affinity diagram charting responses about retroactive results manipulation, which I will elaborate on shortly. This desire for accuracy and authenticity manifested in less inflammatory ways as well. Most other participants that claimed to have no reason or desire to take proactive results alteration action remained more neutral by expressing curiosity about their “real” or “true” results.
These respondents seemed to place significant value in the ethos of statistics and data, their statements about wanting “authentic results” and “the true data” suggesting that they inherently assume accuracy and validity of the data gathered by a corporation via this economic surveillance practice.9 Thus, their resistance to purposefully taking any action that could alter their year-in-review results is simply a way of preserving this perceived accuracy, with multiple participants even suggesting that manipulating the results (and therefore the data) in any way would “defeat the purpose” of such a year-in-review feature. One of the most intriguing intersections of this analysis occurs here within this conversation about authenticity. Those wishing to preserve the authenticity of their year-in- review results rejected the idea of changing their listening habits in any way that would jeopardize or “ruin” that authentic representation of their music taste, but for others, that alteration was necessary to ensure their year-in-review results were an authentic representation of their identity. Several participants said they avoided “taking [song] requests” from their children or friends to prevent music outside of their taste from being included in their history or “ruining their algorithm.” Others claimed that music they listened to while working or studying (such as lo-fi or instrumental music for focus) was not an accurate reflection of what they listened to “by choice,” and therefore took steps to prevent it from appearing in their year-in- review results. Notably, the respondents participating in this kind of alteration behavior were almost exclusively Spotify users, so accomplishing this alteration was as easy as making use of a functionality of the Spotify platform that allows users to exclude certain genres or playlists from their “taste profile.”10 However, some participants described going so far as to use a completely different music streaming platform for their work or study music to prevent it from overlapping with their “real” music taste at all. One participant explained their use of a separate platform for listening to “outlier” music like show tunes and lo-fi by saying they did not want their “random phases” to “corrupt” their Spotify Wrapped, indicating that their listening habits as captured by 9 This adds another dimension to the privacy paradox and the privacy cynic’s role in it: the idea that consumers do not trust companies to ensure their privacy or responsibly use and protect their personal information, but do trust those companies to collect and return accurate usage data. 10 There are features built-in to the functionality of the Spotify platform that make customizing one’s listening experience (and therefore proactive manipulation of one’s Spotify Wrapped results) easier, such as “Exclude from taste profile” and “Private Session.” More research is needed to determine if the intended use of these features is being subverted more often than abided. 22 the platform did not represent their “real” music taste, despite being based on their natural inclinations and preferences. Participants seemed to do all these things in service of achieving year-in-review results that they felt accurately and authentically represented what they perceived as their true music taste or identity as a music consumer, which suggests that these individuals have a clear, strong understanding of who they are and what they enjoy and want their data to reflect that in a satisfactory way. 
They still place value in their year-in-review results, but do not trust the natural functionality of the platform or algorithm to give them results they feel represent them. This is in direct contrast to those who refuse to alter their listening habits due to the trust they place in the algorithm and the value they place on the empiricism of the data. This contrast gives insight into how participants’ views on identity might differ—whether they feel it is assigned from external sources and internalized, determined internally and projected, or some combination of those. Other themes observed among participant responses indicated that engagement in proactive results-altering behavior is not only internally motivated, as seen in participants’ desire to preserve or ensure identity-representative year-in-review results, but externally motivated as well. A handful of participants—all Spotify users—admitted in their responses that they removed “embarrassing” or “weird” music from their taste profile and avoided listening to “problematic” artists to avoid being shamed or excluded socially by others. One participant even identified their motivation for this behavior specifically as “virtue signaling,” which the Oxford English Dictionary defines as “the action or practice of expressing one's views or acting in a way thought to be motivated primarily by a wish to exhibit good character, social conscience, political convictions, etc., or to garner recognition and approval” (Oxford University Press, n.d.). Whether this participant knowingly assigned a defined term to their behavior or simply chose a pair of words coincidental in meaning is unclear, but my assessment based on the rest of their response is that they knew the meaning of this phrase and chose it intentionally. The remainder of participants who indicated a willingness to alter their listening habits to effect change in their year-in-review results cited motivations slightly more neutral than those already discussed (identity-based motivation and social pressure-based motivation); they fell somewhere in between, showing what I assess to be a muddled mix of both internal and external motivation. This group of respondents seemed to have more specific goals, such as getting one specific song to rank highest on their year-in-review’s “top songs” podium, or trying to keep their favorite band or musician as one of their “most listened” artists. These goals do not provide definitive insight into what—perhaps subconsciously—these users were trying to achieve with their modifications, creating more questions than answers. Is an individual’s self-proclaimed “need” to keep, for example, Taylor Swift at the top of their “most played artists” motivated more by a desire for identity authenticity or social inclusion? Are there additional motivations at play, such as simply having strong preferences? If so, what are those preferences rooted in—enjoyment of the music, a parasocial relationship with the artist, or something else—and what encourages the desire for them to be represented in such a clear, empirical way? More research is needed to answer these questions.

Figure 7: Summarized heatmap of themes in participant reasoning identified during the affinity diagram cross-analysis of responses to survey questions 9, 10, 51, and 52.
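The theme-by-group tallies that Figures 7 through 9 summarize come out of the affinity-diagram cross-analysis described above. For readers who prefer a computational analogue, the same kind of summary matrix could be assembled in a few lines once responses have been hand-coded with themes; the sketch below is purely illustrative, its column names and example values are invented, and it is not the tooling used in this study.

```python
# Illustrative sketch only: tallying hand-coded response themes by
# participant group into a heatmap-style count matrix, analogous to the
# summaries in Figures 7-9. The column names and values are hypothetical.
import pandas as pd

coded_responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5, 6],
    "user_type": ["Spotify", "Spotify", "Non-Spotify",
                  "Spotify", "Non-Spotify", "Spotify"],
    "theme": ["preserve accuracy", "not worth the effort",
              "not worth the effort", "identity curation",
              "not worth the effort", "preserve accuracy"],
})

# Each cell counts how often a theme appeared within a participant group,
# which is the information a summarized heatmap visualizes.
heatmap_counts = pd.crosstab(coded_responses["theme"],
                             coded_responses["user_type"])
print(heatmap_counts)
```

Shaded more darkly where counts are higher, a matrix like heatmap_counts corresponds cell-for-cell to the kind of summarized heatmap shown in Figure 7.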
Reasons for Retroactive YIR Results Manipulation: A Qualitative Analysis

Unlike the mixed responses associated with the proactive results manipulation analysis, participants were unanimous in their inaction with regard to retroactive results manipulation—100% of participants answered “no” when asked if they had ever edited their year-in-review summary in anticipation of sharing it with friends or on social media. Furthermore—and likely as a result—there was less variation in explanations for this behavior, with respondents often reporting that they saw no point in editing their results for one reason or another: it wasn’t worth the effort, they didn’t plan to post them, they don’t care that much about what others think, or they’d just never thought to do so. Acceptance of as-is results in the form of an “it is what it is” attitude was one of the most common responses to this question, though a notable split occurs within this group. Some participants indicated feeling this way due to self-assuredness; “I listen to what I want to,” said one participant, implying that they cared more about enjoying the music they listen to than their results. Other participants with this “it is what it is” attitude, though, seemed to become defensive when asked this question. As observed with responses to the question preceding this one, a subset of users chose to engage in hostility towards the hypothetical “other” implied by the question, making comments that read as passive-aggressive. One response reads: “I like what I like lol, but what a wild concept.” Had this participant not included that second clause, this response would have a neutral connotation, but instead they chose to make known their irritation with those who might not be as confident—or, potentially, with the survey for questioning that confidence. Several other responses offer similar sentiments, with participants stating that editing year-in-review results is “so silly” and “dishonest,” or implying that someone would have to be “insecure” to edit their results. Another response reads “That’s crazy, if you’re that embarrassed keep it to yourself,” reinforcing the idea that some participants feel it is easier and more logical to simply refrain from posting sub-optimal results than to go through the effort of editing them. The prevailing thinking among participants offering these prickly responses seems to be that, even if one’s results are embarrassing or unsatisfying, that person should “just own it,” staying true to themselves instead of taking the time and effort to edit their results. These participants value authentic self-representation more than an optimized year-in-review summary, and their defensive “it is what it is” mentality implies a certain level of implicit trust in the accuracy and authenticity of the year-in-review results. I will continue to discuss this in subsequent sections. Also present in the responses to this question is participant concern for preserving the accuracy of their year-in-review results. Several respondents reported a desire to preserve their “authentic” and “valid” results, citing it as their reason for not editing their year-in-review summaries. In a similar vein, one participant says: “Editing my results feels outside of the spirit of Spotify Wrapped.
If my data is being used, I want the real, raw findings to laugh at with my friends.” This response is interesting because it is one of the few that directly states the participant’s awareness of—in this case—Spotify’s data collection practices that fuel Spotify Wrapped, providing additional support for my claim that consumers are aware of economic surveillance and are now seeking ways to benefit from it. This response shows that this participant has not only awareness of economic surveillance, but also expectations of it; they, in their own words, want something (i.e. the “real, raw findings”) as payment for the use of their data. However, despite the similar nature of the two questions asking about proactive and retroactive results manipulation, the volume of retroactive-manipulation responses that mentioned this concern was significantly lower than among the proactive-manipulation responses. This is incongruent with my expectations, given my assumption that the two types of results manipulation are simply two different ways for users to accomplish the same thing: to possess year-in-review results that they feel are optimal according to their values, priorities, and goals. How users reach that goal is not as important as why that is their goal in the first place. As discussed, findings from this survey about proactive results manipulation suggest that accuracy and authenticity are primary concerns among participants—they avoid altering their natural listening habits to preserve the accuracy of their results, or they choose to curate their listening habits to ensure authentic results congruent with their identity—and are therefore potential explanations. However, these concerns are not mirrored in the results associated with retroactive results manipulation. Why do individuals engage more willingly in proactive manipulation than in retroactive manipulation, despite both courses of action having the same outcome? Why is authenticity more carefully considered proactively, when it could be achieved more easily retroactively? What makes editing one’s year-in-review summary more “stupid” than spending months carefully avoiding listening to “embarrassing” music on one’s primary music streaming platform? The answers to these questions are outside the scope of the present research and will require additional study to address. Finally, there is one outlying response generated by this question about retroactive results manipulation worth discussing, as it hints at the existence of a subset of users not well represented by this data but still potentially present in general: individuals who wish to edit their year-in-review summaries but lack the skills, knowledge, or tools to do so. This response reads “I’m not that good at Canvas :/,” implying that this individual might edit their results if they knew how to convincingly use some sort of image editing tool like Canva, which I assume is the tool they meant to reference before being foiled by autocorrect. While my interpretation of this response is circumstantial at best and based on several assumptions, I believe it is still worth mentioning as it identifies the possible presence of an interesting subgroup of users that may be worth seeking out and examining in future research.

Figure 8: Summarized heatmap of themes in participant reasoning identified during the affinity diagram cross-analysis of responses to survey questions 11, 13, 54, and 55.
Reasons for Third-Party Site Use: A Qualitative Analysis

As seen in Figure 6, 21% of participants—all of whom were Spotify users—reported seeking additional information about their music streaming habits from third-party sites (like Volt.fm or Fanalytics) that offer data-reporting services as their exclusive function. These sites request direct access to the user’s Spotify account, with which they can read that person’s Spotify usage data and use it to compile reports similar to Spotify Wrapped—only more in-depth. According to my previous research, the appeal of sites like this is the ability to generate additional, more descriptive Spotify Wrapped-like reports whenever and about whatever desired; for some, receiving one Spotify Wrapped slideshow each year is not enough, and these sites give users the opportunity to rectify the dissatisfaction they experience as a result. The minutiae of what exactly these third-party reports look like is less important than the general understanding that they are more detailed, more customizable, and more readily available than the original data summary offered by Spotify (i.e. Spotify Wrapped). Of interest here is the observed trend in user behavior that shows individuals willingly offering up additional personal information to these third-party sites, essentially subjecting themselves to unnecessary economic surveillance beyond that which they already experience from Spotify in order to achieve a more satisfactory year-in-review result. Reasons for doing this, as per the qualitative responses to the survey question about third-party site use, center around curiosity. Most participants cited intrinsically-motivated interest in what the third-party sites could offer, wanting nothing more than to visualize their personal tastes in a new way or learn more about themselves. People gravitate to these third-party sites for personal enjoyment more than anything else; only one participant mentioned using third-party sites for an extrinsically-motivated reason—participation in Instagram trends. Thus, not only are consumers playing an active role in the violation of their own privacy, they enjoy doing it. It is fun for them. The gratification they receive from the products of economic surveillance often outweighs the risks. Even among participants who reported never engaging with third-party sites in this way, only two respondents cited a concern for their personal privacy as their reason for abstaining from such behavior. Of the rest, some felt third-party site use was not worth the effort, provided little to nothing of additional value, or was simply “silly to do,” and the majority did not know these third-party sites and services existed at all. However, notably, two participants claimed that they had been unaware of third-party sites and their function prior to taking the survey, but learning about them from the survey made them curious and inclined to use them in the future. I found this interesting as an unintended consequence of my research—in studying the phenomenon, I may have played a small part in perpetuating it.

Figure 9: Summarized heatmap of themes in participant reasoning identified during the affinity diagram cross-analysis of responses to survey questions 15, 16, 57, and 58.

Data Privacy

After spending several minutes in the Music Streaming section of the survey, all participants (regardless of the diverging pathway previously occupied) were rerouted to the last major set of survey questions: the Data Privacy section.
This section consisted of five questions intended to assess participants’ awareness of economic surveillance, determine their attitudes towards digital privacy and its achievability, and identify any actions they take to secure their personal information.

Assessing the Privacy-Mindedness of the Participant Sample

Generally, as seen in Figure 10, the majority of participants viewed the security of their personal data as “somewhat important,” which the survey quantified as having a middling amount of diligence regarding safety measures. This is congruent with expectations based on previous data privacy research (Draper, 2017; Marwick & Hargittai, 2019; Afriat et al., 2021). Participants who claimed their data security was “very important” made up the second-largest group at 22%, though it is unclear whether they were reporting on the importance of their data security or their personal diligence. In hindsight, adding action-based quantifiers to the answer options may have muddied the data I intended to gather with this question. By forcing participants to mentally couple the importance of their data privacy with actions taken to secure it, I made it impossible for participants to report a static, action-independent attitude about their personal privacy. Because of this, I cannot accurately identify any discrepancies between participant attitudes (their claims) and behavior (their actual actions, or lack thereof) pertaining to the security of their personal data as I intended to in this analysis.

Figure 10: Count and percentage of participants per personal data security quantifier subdivided by age.

However, I maintain that this data still allows me to draw the conclusion that this sample of participants was—overall—more privacy-conscious than not; the majority of them (88%) indicated an actionable desire to protect the security of their digital data, reporting moderate to significant diligence based on the quantifiers. These findings are corroborated by data (represented in Table 3) collected by a subsequent question in this section of the survey that asked respondents to identify the steps they took as internet users to ensure the security of their personal data. Of the eight static options (i.e. all but “Other,” which was a fill-in-the-blank option), five were selected by more than half of participants: “I avoid giving my personal information to untrustworthy websites,” “I make strong passwords,” “I only allow websites to use necessary cookies,” “I set up two-factor authentication whenever possible,” and “I opt out of automatic data or issue reporting programs.” Of these, “I avoid giving my personal information to untrustworthy websites” was selected by the most participants—132 out of 155, or roughly 85%.
Security Action Taken (SS = 155) | Check Count | % of Sample
“I avoid giving my personal information to untrustworthy websites” | 132 | 85.2%
“I make strong passwords” | 112 | 72.3%
“I only allow websites to use necessary cookies” | 111 | 71.6%
“I set up two-factor authentication whenever possible” | 109 | 70.3%
“I opt out of automatic data or issue reporting programs” | 82 | 52.9%
“I use a VPN” | 32 | 20.6%
“I don’t allow websites to use any cookies” | 21 | 13.5%
“I pay a company to remove my information from data brokerage sites” | 5 | 3.2%
Other [FITB] | 12 | 7.7%

Table 3: Number and percentage of participants (out of 155 total sample size) that selected each of the internet security actions presented by Q32, which allowed participants to “select all that apply.”

However, please note that the survey did not specify what constituted an “untrustworthy” website—that was left up to participant discretion, as was what constituted a “strong” password or “necessary” cookies. This leaves the data more open to interpretation, yes, but it also allows for a clearer understanding of how participants perceive their own efforts. Whether participants are—by objective best-practice standards—creating strong passwords matters less than whether they believe they are making strong passwords, as capturing that assumption of success, competence, and ability to secure one’s personal data is also worth quantifying. As such, these results support my assertion that the privacy-conscious among this participant sample far outnumber the privacy-unconscious and privacy-apathetic.

Assessing the Privacy Cynicism of the Participant Sample

However, as established by previous research, a privacy-conscious person is often also a privacy-cynical person; this connects back to the idea that consumers would prefer to achieve digital privacy and security, but do not feel it is possible (Hoffmann et al., 2016). I designed a specific matrix of questions to assess this in survey respondents. Participants were presented with three statements about data privacy and asked to rate their agreement with them from “strongly agree” to “strongly disagree.”

Figure 11: Comparative visualization of number and percentage of participants that agreed, disagreed, and felt neutral about the three data privacy statements presented by question 33.

The first of these three statements was “I am okay with companies collecting and using my personal information,” which participants met with general disagreement; as seen in Figure 11, 55 participants selected “disagree” and 40 selected “strongly disagree,” bringing the overall disagreement percentage to 62%. 15 participants reported agreeing with the statement, indicating they were “okay with” the collection and use of their personal information by companies. This group is the minority, but its existence is interesting, if not surprising, and of these 15, 9 (60%) belonged to the 18-24 age group—more on this later. 42 participants claimed to be “neutral” towards the statement; this group is of interest firstly because of its similarity in size to the “strongly disagree” group, and secondly because this statement is the one among the three that received the strongest neutral reaction. However, I would need more information to determine how participants self-defined neutrality; I suspect that the “neutral” option might have become the default answer for respondents who had conflicted feelings they could not sort out during the short duration of the survey.
As such, it is difficult to say whether these participants felt truly neutral about these statements, or were unsure upon encountering questions like this for the first time. The second statement was “There isn’t much I can do to keep companies from collecting and using my personal information,” which was met with general agreement; the “agree” and “strongly agree” groups accounted for 64% of participants. This aligns with scholarship’s understanding of the resigned pragmatist and privacy cynic as discussed early on in this paper, especially when considered in tandem with responses to the previous statement: people disagree with economic surveillance, but also feel powerless to protect themselves from it. Among participants in this research, however, a notable 25 respondents (16%) rejected this powerlessness by disagreeing or disagreeing strongly with this second statement. When cross-analyzed with responses from an earlier question that asked participants to rate the importance of their personal privacy based on actionable protection measures they felt they could take, we can see that participants who disagreed with the second statement’s assertion that nothing could be done about one’s personal security more often reported valuing—and, because of the way the response options were worded, staying diligent about—the security of their personal information. In fact, based on the visualization of this overlap seen in Figure 12, very few people (two, to be exact) who disagreed with the second statement in any way rated the security of their personal information anything other than “important.” This visualization also shows that, yes, the majority of respondents could safely label themselves resigned pragmatists or privacy cynics, but I assert that those occupying the second quadrant of this graph—those who do not feel entirely hopeless—should not be ignored.

Figure 12: Visualization of cross-analysis between Q33b (“Please rate the extent to which you agree with the following statement: There isn’t much I can do to keep companies from collecting and using my personal information”) and Q31 (“How important to you is the security of your personal digital data?”).

The third and final statement participants were asked to assess their agreement with was “Companies collecting my personal information is the trade-off for using their products or services.” Among the three statements, assessments of this one were the most mixed; almost equal portions of respondents reported that they agreed (25%), remained neutral (27%), or disagreed (28%) (see Figure 11 above). This is again an instance where I cannot with any confidence determine how respondents defined their own agreement or neutrality, or why they chose to assess the statement the way they did, but Turow et al.’s understanding of the tradeoff fallacy may offer an explanation for this even split. The authors assert that “a majority of Americans are resigned to giving up their data—and that is why many appear to be engaging in tradeoffs” (Turow et al., 2015, p. 3), meaning consumers do not actively respond to tradeoffs so much as they simply resign themselves to the fact that the collection of their data is inevitable and, as a result, do not engage with or even notice tradeoff opportunities.
This conceptualization of the tradeoff fallacy applied to the lack of harmony among assessments of this third statement leads me to believe that most participants in this sample had not considered the existence or benefits of tradeoffs as much as they had considered their feelings about the collection and use of the data in general. The remaining participants agreed that a lack of data privacy was a tradeoff for using products and services; by my assessment, this makes these participants the ones most likely to engage in reactive opportunism. Their awareness of their personal data as something they possess and can use as a bargaining chip in these digital privacy tradeoffs primes them to take advantage of those tradeoff opportunities—with a small shift in an individual’s intentionality, their personal data could go from something idly lost to currency willingly given, especially if that individual identifies the relinquishing of their personal data as a means to achieve something specific of value to them. A very weak correlation exists between survey respondents who agreed that their personal data was the tradeoff for online access to products and services and survey respondents that engaged in third-party site use, but additional research is needed to substantiate that connection. Taking the responses to all three of these statements into consideration at once, results indicate that this participant sample disagrees with economic surveillance practices, experiences resignation about their perceived lack of ability to do anything about it, and inconsistently considers the tradeoff value of their data, firmly cementing them within the definitional bounds of both the resigned pragmatist and the privacy cynic. Based on this, I consider this group of participants representative of a larger, more general consumer base—at least where consumer sentiments about data privacy and economic surveillance are concerned.

DISCUSSION

Following such a lengthy description of findings, I find it necessary to recenter and reiterate the primary purposes of this research. As a reminder, my three core research questions were as follows:
1. Are music streaming platform users proactively and/or retroactively altering the results generated by the year-in-review feature of their preferred platform as my previous findings suggest? If so, what motivates this behavior? If not, why are they abstaining?
2. Are music streaming platform users volunteering their personal information to third-party sites that offer additional data reporting services as my previous findings suggest? If so, what motivates this behavior? If not, why are they abstaining?
3. To what extent are consumers exhibiting reactive opportunism, and how does such behavior present itself within the context of the Spotify Wrapped social trend?
In working to answer the first two questions in the above list prior to carrying out the survey protocol, I developed one general hypothesis applicable to all three behavior types: the behavior was likely not as common as suggested by social media’s sensationalist algorithm, but still occurred often enough to warrant examining. I understood that the number of consumers appearing to participate in opportunistic behavior was inflated artificially by information collection methods, but I underestimated the discrepancy between what I’d learned from my initial scrape of social media and reality.
In reality, based on the data collected from real people by the survey, consumers are altering their year-in-review results and using third-party sites significantly less freely and frequently than I expected. For example, this new survey data shows that less than 25% of all respondents reported participating in any of the three value-maximizing behaviors examined (see Figure 13). This is particularly unexpected in the case of retroactive year-in-review results manipulation, which not a single survey participant reported engaging in. I did not come across an overwhelming number of examples of this type of behavior in my initial scouring of social media, but I did identify a non-zero number of occurrences, making this complete absence of the behavior among survey participants surprising. However, it makes sense in the context of these new findings, which suggest that individuals value their time, energy, and enjoyment of music much more than they value optimized year-in-review results.

Figure 13: A visual representation of the percentage of all participants engaging in each value-maximizing behavior of interest. An alternative visualization of the same data from Figure 5.

Additionally, these new findings have encouraged me to rethink my understanding of how and why individuals are exhibiting reactive opportunism. In previous work, I suggested that one’s desire to alter their Spotify Wrapped results or further violate their own privacy with third-party site use was attributable to an instinctually-driven and biologically-encoded need for belonging and social acceptance, which is made easier and more likely by one’s by-any-means-necessary participation in online social trends. I also called on social identity theory and a disciplinary understanding of digital personas to explain the identity-based elements of the phenomena, claiming that “the motivation behind the changes in behavior—some users desire to improve the representation of their true identity, others wish to hide it entirely, and some just want to fit in—[varies], but the ultimate goal of social acceptance and upward mobility remains the same” (Keenan, 2024, p. 32). This idea informed my previous conceptualization of Spotify Wrapped (and other tangible year-in-review outcomes) as an identity artifact that individuals could use as social capital with which to “prove” or “purchase” membership in certain groups. I believe these initial conclusions are still—to a certain extent—relevant to the present analysis, but I now posit that the application of neuroeconomics’ value-based decision-making theory allows for a more succinct, relevant, and foundational understanding of these observed behavior trends within the context of reactive opportunism.

A Revised Understanding: Applying Value-Based Decision Making to Reactive Opportunism

Centuries of psychology, neuroscience, and social science scholarship focused on understanding how humans make decisions and prioritize outcomes have shaped the concept of value-based decision making. At its core, value-based decision making refers to the series of mental calculations an individual undertakes when making a decision in order to maximize the expected benefits and minimize the potential costs of a decision-based outcome (Suri et al., 2020; von Neumann & Morgenstern, 2004).
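To make the cost/benefit logic of that definition concrete, it can be rendered schematically: the subjective value of a candidate course of action is the probability-weighted sum of its anticipated benefits, scaled by how much the individual values each, minus the probability-weighted sum of its anticipated costs, and the individual tends toward whichever available action carries the highest subjective value. The formula below is my own simplified illustration of this logic, not an equation reproduced from the sources cited above.

$$
V(a) \;=\; \sum_{i} p_i\, w_i\, b_i(a) \;-\; \sum_{j} q_j\, v_j\, c_j(a),
\qquad
a^{*} \;=\; \arg\max_{a \in A} V(a)
$$

Here $b_i(a)$ are the anticipated benefits of action $a$ (for example, social capital, social inclusion, or satisfied curiosity), $c_j(a)$ are its anticipated costs (for example, time, effort, or surrendered privacy), $p_i$ and $q_j$ are the perceived probabilities of those outcomes, and the weights $w_i$ and $v_j$ capture how heavily an individual’s core values, beliefs, and preferences amplify each benefit and cost.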
Variables involved in these calculations include expected net benefits, outcome probabilities, objective economic costs, potential intangible costs, personal goals and preferences, individually-held core values, and more (Rangel et al., 2008; Brosch & Sander, 2013). Applied value-based decision-making research suggests that individually-held core values or beliefs are among the most important variables, and that they are “an integral part of an individual's behavior, particularly early in the decision-making process because they form the foundation of an individual's perspective” (Hall & Davis, 2007, p. 1589; Inzlicht et al., 2009; Jost & Amodio, 2011). Neuroeconomics is the discipline tasked with charting and understanding these complex interactions and their neurobiological implications, and the potential applications of neuroeconomic value-based decision-making frameworks are seemingly endless (Rangel et al., 2008). However, the idea most important to the present discussion is an (admittedly simplified) understanding that a) human decision making involves the complicated evaluation and comparison of an ever-changing and situationally-dependent number of variables, and that b) chiefly considered among those variables are a cost/benefit analysis of any given outcome and an individual’s core values, beliefs, and preferences. Applying this conceptualization of value-based decision making to my initial conclusions about the increase in reactive opportunism exhibited by consumers allows me to develop a deeper and more meaningful understanding of their implications generally, but especially within the context of this new data. These new survey results seem to entirely “debunk” my initial assessment that consumers are reacting opportunistically to economic surveillance by engaging consistently in behavior intended to maximize the social buying power of resultant identity artifacts (like year-in-review summaries)—the data clearly denotes not only a lack of engagement in such behavior among survey participants, but also a lack of desire to engage in it. However, in examining these new findings through a value-based decision-making lens, I now conclude that consumers are reacting opportunistically to economic surveillance (or any other unwelcome occurrence) by engaging consistently in whatever behavior allows them to minimize risk and maximize benefit, with risks and benefits being determined by their personal values. Attitude, behavior, and reasoning patterns in survey participant responses take on new meaning. Those who chose to use third-party sites appear to value access to additional data more than the security of their personal information. Those who took offense at the idea that anyone would edit their year-in-review results appear to value honesty and integrity more than the potential social capital of optimal year-in-review results. Those who avoided doing anything that could “ruin the accuracy” of their year-in-review results appear to value the perceived “truth” of surveillance-based products more than the opportunity to curate an identity artifact, and vice versa: those who altered their music streaming habits in a way that excluded “embarrassing” or “problematic” music from their year-in-review summary appear to value the ability to do so more than the raw data.
The survey results are rife with examples like this, and although I would need more information—ideally in the form of direct interviews with participants—to determine the accuracy of my interpretations, I find them robust enough to build on further. So consumers are reacting opportunistically in situations brought about by economic surveillance, just not often with the explicit intention to “get something” in return for their data as I previously thought. Instead, most of them are simply navigating economic surveillance as they would any other uncertain, unfamiliar, or less-than-ideal situation: guided by their personal goals, values, and beliefs. Generally, their behavior was not exploitative or retaliatory, or even particularly mindful of economic surveillance. For example, most of the survey participants who chose to use third-party sites did so because they were curious about themselves, not because they felt the need to bleed a surveillance-provided opportunity dry of its potential benefits. The self-violation of privacy in service of achieving additional, desired, surveillance-reliant outcomes characteristic of reactive opportunism is there, yes, but not in focus. What is in focus is what that individual cares about, and it is very seldom their digital privacy. This makes sense in the context of my introductory discussion of the privacy paradox, resigned pragmatism, and privacy cynicism: consumers, in general, resign themselves to a lack of data privacy, often because they feel there is not much they can do to stop it or otherwise protect themselves from economic surveillance. My survey data corroborates these findings as well. As summarized by Figure 11, survey participants—in general—disapproved of economic surveillance while simultaneously reporting a perceived lack of ability to do anything about it: participants who disagreed/disagreed strongly with economic surveillance and agreed/agreed strongly with a feeling of hopelessness accounted for almost 40% of the sample, as observed in Figure 14. In other words, more respondents reported experiencing resigned pragmatism than not.

Figure 14: A four-quadrant chart visually representing responses to two of the three agreement Likert-scale Data Privacy questions: “I am okay with companies collecting and using my personal information” and “There isn’t much I can do to keep companies from collecting and using my personal information.”

In their research on the privacy paradox and privacy cynicism, Marwick and Hargittai also discuss the previously-identified concept of “privacy calculus,” which Hoffmann et al. define as “a rational decision individuals take about disclosing personal information on the Internet when weighing benefits against costs and potential risks” (Hoffmann et al., 2016, p. 2; Marwick & Hargittai, 2019). This definition, notably, is exceptionally similar to that of value-based decision making, which describes a similar calculus involved when making any decision, not just ones about privacy. This comparison serves to further illustrate my conclusion that reactive opportunism is a manifestation of value-based decision making—one involving a certain amount of privacy calculus as well as value calculus—in which individuals value surveillance-based rewards more than they value their own privacy.

CONCLUSION

Limitations

A tight schedule limited almost all aspects of this research.
As I noted at multiple points in discussing my results, several aspects of this research could have benefitted from follow-up interviews with participants, which would have allowed me the opportunity to gather more specific, tailored data and prompt participants to expound further on their attitudes and other more subjective responses. Such additional information would have provided clarity in instances where room for interpretation of survey questions and responses resulted in muddied or unclear findings. Additionally, as I touched on in the methods section, the design of the survey potentially limited its effectiveness in collecting relevant, valid, and accurate data. In future research, I plan to explore additional research-supported survey-building strategies—such as item and group randomization—as potential ways to reduce survey abandonment rates and generally optimize the survey experience. Furthermore, the shortened timeline also limited the completeness of my analysis. I would have liked to complete some deeper and more intricate cross-analyses between additional variables or specific survey responses, which may have allowed for slightly more salient findings and discussion.

Implications

Despite its limitations, I believe this research makes important interdisciplinary connections that can, I hope, continue to benefit all of the discourse communities implicated. At its core, this research seeks to understand human behavior—namely how individuals assess and navigate threats to their personal privacy and security—and I assert that such a practice never fails to produce interesting insights, especially when one prioritizes human-centricity in research and design. Such inquiry is malleable and nearly unlimited in application; I would argue that any effort to gain a deeper understanding of how individuals think, make decisions, and prioritize the things they value could benefit a vast variety of humanities- and social science-based human-subject research—namely user experience research and design. Striving to learn more about how individuals approach complex situations with innumerable variables to negotiate holds great promise as a strategy for facilitating ethical research and design; the more we know, the more humanistic our efforts become.

Opportunities for Future Research

As I mentioned in previous sections, interviews present the most obvious opportunity for a direct continuation of this case study, but the scope limitations of this research have also generated a multitude of related research avenues untouched by this thesis but still of great interest. For example, above I introduced the idea of human behavior research as a facilitator for ethical user experience research and design, and from there I arrive at the following unknown: do UX designers have a responsibility to build experiences that allow or encourage users to protect themselves from surveillance capitalism? If a designer makes choices that conceal the economic surveillance practices at work, is that deceptive design? No matter how one answers those questions, one could still analyze more closely how best practices in UX research and design currently handle and present surveillance capitalism to users and how those design choices direct users into or away from surveillance situations. Additionally, I see many opportunities for related research centered around authenticity, social capital, the evolving purpose of digital personas, active versus passive social media use, and more.
Table 4 below outlines several topics of interest, findings from this case study that sparked additional inquiry, and a handful of associated starting-point questions. Answering any of those questions would require considerable additional research, and I look forward to conducting that research in the future.

Topic: Changes in how people are using social media and engaging with trends
Generative finding: The majority of survey participants accessed social media daily yet remained mostly inactive on the platform (i.e. scrolling and consuming content as opposed to making or interacting with it).
→ Does this point to a shift in the primary purpose of social media platforms from active social connection to inactive consumption or entertainment?
→ Do “rewards” like Spotify Wrapped have the power to encourage active (as opposed to inactive) social media use?
→ If so, does surveillance capitalism have anything to do with this change? How, and to what extent?
→ If the purpose of social media use has shifted from active social connectivity to passive content consumption, why do people still feel the need to curate digital personas? How much does it have to do with surveillance avoidance (versus protecting interpersonal privacy or ensuring authentic presentation of self)?
Generative finding: Less than 15% of survey participants reported an interest in engaging with or participating in social media trends.
→ Is this finding representative of a larger phenomenon? If so, what allows trends to remain “trends” with such low engagement? Does the participation of the few serve as entertainment for the many?

Topic: Cognitive dissonance associated with surveillance capitalism
Generative finding: Some qualitative responses implied that some participants trusted and/or expected their platform of choice to return accurate data and still valued that data despite disagreeing with the surveillance of the service.
→ How can trust in the accuracy of the data and distrust of the entity that provided it exist simultaneously?

Topic: Authenticity
Generative finding: Participants showed more willingness to proactively manipulate their YIR report to ensure authentic results, with some associating the alternative (i.e. retroactively altering the report to ensure results) with moral transgression.
→ Why is authenticity more carefully considered proactively, when it could be achieved more easily retroactively?

Table 4: A summary of potential research avenues I identified based on findings of interest.

REFERENCES

Afriat, H., Dvir-Gvirsman, S., Tsuriel, K., & Ivan, L. (2021). “This is capitalism. It is not illegal”: Users’ attitudes toward institutional privacy following the Cambridge analytica scandal. The Information Society, 37(2), 115–127. https://doi.org/10.1080/01972243.2020.1870596 Akhter, S., Robbins, M., Curtis, P., Hinshaw, B., & Wells, E. M. (2022). Online survey of university students’ perception, awareness and adherence to COVID-19 prevention measures. BMC Public Health, 22(1). https://doi.org/10.1186/s12889-022-13356-w Andreassen, C. S., & Pallesen, S. (2013). Facebook addiction: A reply to Griffiths (2012). Psychological Reports, 113(3), 899–902. https://doi.org/10.2466/02.09.pr0.113x32z6 Andreassen, C. S., Torsheim, T., Brunborg, G. S., & Pallesen, S. (2012). Development of a Facebook Addiction Scale. Psychological Reports, 110(2), 501–517. https://doi.org/10.2466/02.09.18.pr0.110.2.501-517 Auxier, B. (2021, April 7). Social Media Use in 2021. Pew Research Center. http://www.pewresearch.org/internet/2021/04/07/social-media-use-in-2021/ Becker, R. (2022).
Gender and Survey Participation: An Event History Analysis of the Gender Effects of Survey Participation in a Probability-based Multi-wave Panel Study with a Sequential Mixed-mode Design. Methods, data, analyses: a journal for quantitative methods and survey methodology, 16(1), 3-32. https://doi.org/10.12758/mda.2021.08 Bowman, N. D., & Clark-Gordon, C. V. (2019). Bergen Facebook Addiction Scale. In Graham, E. E., & Mazer, J. P. (Eds.), Communication Research Measures III: A Sourcebook (pp. 187–189). https://doi-org.proxy1.cl.msu.edu/10.4324/9780203730188 Braun, V., Clarke, V., Boulton, E., Davey, L., & McEvoy, C. (2020). The online survey as a “qualitative” research tool. International Journal of Social Research Methodology, 24(6), 641–654. https://doi.org/10.1080/13645579.2020.1805550 Brosch, T., & Sander, D. (2013). Neurocognitive mechanisms underlying value-based decision- making: From core values to economic value. Frontiers in Human Neuroscience, 7, 1–8. https://doi.org/10.3389/fnhum.2013.00398 Cheng, C., Lau, Y., Chan, L., & Luk, J. W. (2021). Prevalence of social media addiction across 32 nations: Meta-analysis with subgroup analysis of classification schemes and cultural values. Addictive Behaviors, 117, 1–8. https://doi.org/10.1016/j.addbeh.2021.106845 Draper, N. A. (2017). From privacy pragmatist to privacy resigned: Challenging narratives of rational choice in digital privacy debates. Policy & Internet 9(2), 232-251. https://doi.org/10.1002/poi3.142 Ebbinghaus, H. (1913). On memory: A contribution to experimental psychology. Teachers College, Columbia University. 45 Ernala, S. K., Burke, M., Leavitt, A., & Ellison, N. B. (2020). How well do people report time spent on Facebook? Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3313831.3376435 Gong, X., Ye, Z., Liu, B., Yu, S., & Yan, Y. (2021). How does social currency influence prosocial behavior? The role of collective self-esteem and communication network heterogeneity. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.672505 Gottfried, J. (2024, January 31). Americans’ Social Media Use. Pew Research Center. http://www.pewresearch.org/internet/2024/01/31/americans-social-media-use/ Griffiths, M. D. (2012). Facebook addiction: Concerns, criticism, and recommendations—a response to Andreassen and colleagues. Psychological Reports, 110(2), 518–520. https://doi.org/10.2466/01.07.18.pr0.110.2.518-520 Hall, D. J., & Davis, R. A. (2007). Engaging multiple perspectives: A value-based decision- making model. Decision Support Systems, 43(4), 1588–1604. https://doi.org/10.1016/j.dss.2006.03.004 Hargittai, E., & Marwick, A. (2016). “What can I really do?” Explaining the privacy paradox with online apathy. International journal of communication, 10, 21. https://ijoc.org/index.php/ijoc/article/view/4655 Hoffmann, C. P., Lutz, C., & Ranzini, G. (2016). Privacy cynicism: A new approach to the Privacy Paradox. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 10(4), 1–18. https://doi.org/10.5817/cp2016-4-7 Inzlicht, M., McGregor, I., Hirsh, J. B., & Nash, K. (2009). Neural markers of religious conviction. Psychological Science, 20(3), 385–392. https://doi.org/10.1111/j.1467- 9280.2009.02305.x Jensen, A. R. (1962). An empirical theory of the serial-position effect. The Journal of Psychology, 53(1), 127–142. https://doi.org/10.1080/00223980.1962.9916559 Jeong, D., Aggarwal, S., Robinson, J., Kumar, N., Spearot, A., & Park, D. S. (2023). 
Exhaustive or exhausting? Evidence on respondent fatigue in long surveys. Journal of Development Economics, 161, 1–20. https://doi.org/https://doi.org/10.1016/j.jdeveco.2022.102992 Jost, J. T., & Amodio, D. M. (2011). Political ideology as motivated social cognition: Behavioral and neuroscientific evidence. Motivation and Emotion, 36(1), 55–64. https://doi.org/10.1007/s11031-011-9260-7 Keenan, S. (2024). Defining reactive opportunism: An exploration of the social phenomenology of Spotify wrapped amidst growing digital privacy concerns. Proceedings of the 42nd ACM International Conference on Design of Communication, 26–36. https://doi.org/10.1145/3641237.3691648 46 Lavrakas, P. J. (Ed.). (2008). Respondent Fatigue. In Encyclopedia of Survey Research Methods (pp. 743–743). essay, SAGE Publications, Inc. https://doi.org/10.4135/9781412963947.n480 Le, A., Han, B. H., & Palamar, J. J. (2021). When National Drug Surveys “Take too long”: An examination of who is at risk for survey fatigue. Drug and Alcohol Dependence, 225, 108769. https://doi.org/10.1016/j.drugalcdep.2021.108769 Loiacono, E., & Wilson, V. (2020). Do we truly sacrifice truth for simplicity: Comparing complete individual randomization and semi-randomized approaches to survey administration. AIS Transactions on Human-Computer Interaction, 12(2), 45-69. https://doi.org/10.17705/1thci.00128 Marwick, A., & Hargittai, E. (2019). Nothing to hide, nothing to lose? Incentives and disincentives to sharing information with institutions online. Information, Communication & Society, 22(12), 1697-1713. https://doi.org/10.1080/1369118X.2018.1450432 Murdock, B. B. (1962). The serial position effect of free recall. Journal of Experimental Psychology, 64(5), 482–488. https://doi.org/10.1037/h0045106 Pernice, K. (2024, February 2). F-shaped pattern of reading on the web: Misunderstood, but still relevant (even on mobile). Nielsen Norman Group. https://www.nngroup.com/articles/f-shaped-pattern-reading-web-content/ Pernice, K., Whitenton, K., & Nielsen, J. (2021). How people read online: The eyetracking evidence (2nd ed.). Nielsen Norman Group. Rangel, A., Camerer, C., & Montague, P. R. (2008). A framework for studying the neurobiology of value-based decision making. Nature Reviews Neuroscience, 9(7), 545– 556. https://doi.org/10.1038/nrn2357 Rolstad, S., Adler, J., & Rydén, A. (2011). Response burden and questionnaire length: Is shorter better? A review and meta-analysis. Value in Health, 14(8), 1101–1108. https://doi.org/10.1016/j.jval.2011.06.003 Spotify. (n.d.). 2023 wrapped. For the Record. https://newsroom.spotify.com/media-kit/2023- wrapped/ Spotify. (n.d.). Spotify Premium - Spotify (US). Spotify. http://www.spotify.com/us/premium/ Spotify. (2025, February 4). Spotify Reports Fourth Quarter 2024 Earnings. For the Record. http://newsroom.spotify.com/2025-02-04/spotify-reports-fourth-quarter-2024-earnings/ Spotify. (2025, February 7) About Spotify. For the Record. https://newsroom.spotify.com/company-info/ 47 Suri, G., Gross, J. J., & McClelland, J. L. (2020). Value-based decision making: An interactive activation perspective. Psychological Review, 127(2), 153–185. https://doi.org/10.1037/rev0000164 Turow, J., Hennessy, M., & Draper, N. (2015). The tradeoff fallacy: How marketers are misrepresenting American consumers and opening them up to exploitation. SSRN Electronic Journal, 1–24. https://doi.org/10.2139/ssrn.2820060 Oxford University Press. (n.d.). Virtue Signalling | Virtue Signaling, n. In Oxford English Dictionary. 
https://doi.org/10.1093/OED/9463796881 Verbeij, T., Pouwels, J. L., Beyens, I., & Valkenburg, P. M. (2021). The accuracy and validity of self-reported social media use measures among adolescents. Computers in Human Behavior Reports, (3), 1–11. https://doi.org/10.31234/osf.io/p4yb2 von Neumann, J. & Morgenstern, O. (2004). Theory of Games and Economic Behavior. Princeton: Princeton University Press. https://doi.org/10.1515/9781400829460 Wilson, V., Srite, M., & Loiacono, E. (2021). The effects of item ordering on reproducibility in Information Systems Online Survey Research. Communications of the Association for Information Systems, 49(1), 760–799. https://doi.org/10.17705/1cais.04940 Wolf, C., Joye, D., Smith, T. W., Fu, Y., & Smyth, J. D. (2016). Designing Questions and Questionnaires. In The SAGE Handbook of Survey Methodology (pp. 218–235). SAGE Publications Ltd. https://doi.org/10.4135/9781473957893.n16 48 APPENDIX A: FULL SURVEY QUESTIONS LIST The table below includes all potential versions of all survey questions as encountered by participants on all pathways of the survey, separated by survey section. Questions were assigned unique numbers by Qualtrics, the data collection and analysis tool used, which is why some numbers are missing or out of order. I have chosen to keep Qualtrics’ numbering scheme to avoid confusion. Assume the participant proceeds through these questions in the order they appear in this list unless otherwise noted by a [jump to:] indicator. Abbreviations KEY: Question types • MC(s) = multiple choice, single answer (participants choose 1 of ≥2 options) • MC(m) = multiple choice, multiple answer (participants choose ≥1 of ≥2 options) • LAT = long answer text (participants type a unique response ≤20,000 character) • LM = Likert matrix (participants give each variable a rating on a 1-5 Likert scale) Other • FITB = fill in the blank (participants type a unique response ≤20,000 characters) Survey Pathway KEY: Spotify User Non-Spotify User w/ YIR Non-Spotify User w/o YIR SECTION 0: Consent Q# & Type Question & Response Option Text Notes 61 MC(s) Are you 18 years or older? • Yes • No [jump to: End Card] 62 MC(s) Please select: • Yes, I consent—take me to the survey! • No, I do not consent—take me home (this will end the survey) [jump to: End Card] SECTION 1: Social Media Use & Trend Engagement Q# & Type Question & Response Option Text Notes 1 MC(s) How often do you use social media? • Constantly—I use social media most hours of most days • Daily—I use social media for several hours daily • Weekly—I use social media several times a week • Monthly—I use social media several times a month • Rarely—I use social media when I need it, which is not often • Never 49 2 MC(s) When you do use social media, how active are you? • Very active—I post, comment, or otherwise actively interact with the content of both friends and strangers frequently • Somewhat active—I post and comment on my friends’ posts often • Not very active—I don’t post often, but spend lots of time scrolling my feed and liking any interesting posts I come across • Not active—I exclusively scroll my feed, never posting, liking or commenting on anything 3 How important is it to you to participate in online trends and social moments? 
• Very important—It is often one of my top priorities • Somewhat important—I keep track of trends, but only participate occasionally • Not very important—I notice trends, but never care enough to participate • Not important at all—I have minimal awareness of trends and/or lack the desire participate in them SECTION 2: Music Streaming Q# & Type Question & Response Option Text Notes 4 MC(s) How often do you use your preferred music streaming platform? • Very often—multiple times a day • Often—multiple times a week • Occasionally—less than once a week • Rarely—once a month or less • Never—I don’t stream music (i.e. you exclusively use sources that do not require internet access—like CDs, vinyl, the car radio, etc.—instead) [jump to: Q31] 5 MC(s) Is Spotify your primary and/or preferred music streaming platform? • Yes • No [jump to: Q18] 6 MC(s) Do you pay for Spotify Premium? • Yes, I have a subscription • No, I use the free version [jump to: Q8] 50 7 MC(m) Please select the reason(s) you maintain a Premium subscription: • Ad-free listening • Specific song selection (as opposed to only shuffle play) • Discounted rate (i.e. student or family pricing) • Offline listening • • Queue customization • Audiobook access • Other (FITB) Improved sound quality 8 MC(s) Which statement most accurately represents your feelings about Spotify Wrapped, Spotify’s year-in-review feature? Compare to Q50 • • • • I eagerly await its release each year—I can’t wait to see and share my results I think it’s cool to look at when it comes out, but I don’t give it much thought beyond that I don’t care about it at all—it doesn’t affect me I don't know what it is—I’ve never heard of or interacted with it 9 MC(s) Have you ever altered your Spotify music streaming habits in order to curate a certain Spotify Wrapped result? (Examples of alteration include maintaining two accounts, retroactively removing songs from listening history, excluding playlists from taste profile, turning off smart suggestions, etc.) Compare to Q51 • Yes • No 10 LAT Please elaborate on the motivation behind your actions (or lack thereof): 11 MC(s) Have you ever edited your Spotify Wrapped results before posting them to social media or otherwise sharing them with friends or followers? • Yes • No 13 LAT Please elaborate on the motivation behind your actions (or lack thereof): 15 MC(s) Have you ever used a third-party website like Fanalytics or volt.fm to retrieve more of your Spotify listening data than is provided by Spotify Wrapped? Compare to Q51 Compare to Q54 Compare to Q55 Compare to Q57 • Yes • No 51 16 LAT Please elaborate on the motivation behind your actions (or lack thereof): Compare to Q58 18 MC(s) Which music streaming platform do you primarily use instead? • Apple Music • YouTube / YouTube Music • Amazon Music • SoundCloud • Pandora • Tidal • Other [FITB] 19 MC(s) Does your preferred music streaming platform offer a year-in- review feature or end-of-year breakdown similar to Spotify Wrapped? (Examples include Apple Rewind, YouTube Rewind, etc.) • Yes • No [jump to: Q36] 49 MC(s) Which statement most accurately represents your feelings about the year-in-review feature of your preferred music streaming platform (i.e. Apple Rewind, YouTube Rewind, etc.)? 
Q50 | MC(s) | Notes: Compare to Q8, Q36
Which statement most accurately represents your feelings about Spotify Wrapped, Spotify's year-in-review feature?
• I eagerly await its release each year—I can't wait to see and comment on everyone's results, regardless of what music streaming platform they use
• I like seeing everyone else's results, but I often find myself feeling left out because I don't have Spotify Wrapped results of my own to share
• I sometimes compare my results from my preferred platform to my friends' Spotify Wrapped results for fun, but I don't give it much thought beyond that
• I don't care about it at all—it isn't relevant or important to me
• I don't know what it is—I've never heard of or interacted with it

Q51 | MC(s) | Notes: Compare to Q9
Have you ever altered your music streaming habits in order to curate a certain year-in-review summary result? (Examples of alteration include maintaining two accounts, retroactively removing songs from listening history, using separate music streaming platforms for separate purposes, turning off smart suggestions, etc.)
• Yes
• No

Q52 | LAT | Notes: Compare to Q10
Please elaborate on the motivation behind your actions (or lack thereof):

Q54 | MC(s) | Notes: Compare to Q11
Have you ever edited your year-in-review summary results before posting them to social media or otherwise sharing them with friends or followers?
• Yes
• No

Q55 | LAT | Notes: Compare to Q13
Please elaborate on the motivation behind your actions (or lack thereof):

Q57 | MC(s) | Notes: Compare to Q15
Have you ever used a third-party website like Fanalytics or volt.fm to retrieve more of your music streaming and listening data than is provided by your year-in-review summary?
• Yes
• No

Q58 | LAT | Notes: Compare to Q16
Please elaborate on the motivation behind your actions (or lack thereof):

Q36 | MC(s) | Notes: Compare to Q8, Q50
Which statement most accurately represents your feelings about Spotify Wrapped, Spotify's year-in-review feature?
• I eagerly await its release each year—I can't wait to see and comment on everyone's results, regardless of what music streaming platform they use
• I like seeing everyone else's results, but I often find myself feeling left out because I don't have Spotify Wrapped results of my own to share
• I sometimes compare my results from my preferred platform to my friends' Spotify Wrapped results for fun, but I don't give it much thought beyond that
• I don't care about it at all—it isn't relevant or important to me
• I don't know what it is—I've never heard of or interacted with it

SECTION 3: Data Privacy

Q31 | MC(s)
How important to you is the security of your personal digital data?
• Very important—I'm very diligent about making sure my privacy is protected
• Somewhat important—I take some steps to protect myself, but I could be more diligent
• Not very important—I take some precautions, but only when I am forced to (i.e., two-factor authentication)
• Not important at all—I don't think about or act in the interest of the security of my personal data
Q32 | MC(m)
If any, what steps do you take as an internet user to ensure the security of your personal data? (Select all that apply.)
• I only allow websites to use necessary cookies
• I don't allow websites to use any cookies
• I opt out of automatic data or issue reporting programs
• I use a VPN
• I set up two-factor authentication whenever possible
• I pay a company to remove my information from data brokerage sites
• I make strong passwords
• I avoid giving my personal information to untrustworthy websites
• Other (FITB)

Q33 | LM
Please rate the extent to which you agree with the following statements:
• "I am okay with companies collecting and using my personal information"
• "There isn't much I can do to keep companies from collecting and using my personal information"
• "Companies collecting my personal information is the trade-off for using their products or services"

SECTION 4: Demographics

Q34 | MC(s)
What is your age?
• 18-24
• 25-34
• 35-44
• 45-54
• 55+

Q35 | MC(s)
What is your gender?
• Male
• Female
• Nonbinary
• Prefer not to say

Table 5: Full list of survey questions and answer options by section, complete with notes.

APPENDIX B: SURVEY DISTRIBUTION LOG

The table below illustrates the survey's stages, dates, and channels of distribution. Also included is the estimated maximum number of potential participants reached via each channel (EMP).11 All digital distribution was done with the survey URL link, and all in-person distribution was done with a printed flyer that included a QR code to the survey.

Phase/Action | Where & How | Date & Time | EMP
OPENED | Link and QR code activated in Qualtrics | 11/8/24 @ 3:00pm | —
Test Dist.12 (11/9/24) | Link shared to personal group chat #1 | 11/9/24 @ 9:30am | 6
Test Dist.12 (11/9/24) | Link shared to personal group chat #2 | 11/9/24 @ 10:00am | 8
Mass Dist. (11/11/24–11/15/24) | Link shared in university classroom | 11/11/24 @ 8:00am | 24
Mass Dist. (11/11/24–11/15/24) | QR code posted in staff lunchroom | 11/11/24 @ 11:00am | 80
Mass Dist. (11/11/24–11/15/24) | Link emailed to university listservs | 11/11/24 @ 12:30pm | 450
Mass Dist. (11/11/24–11/15/24) | QR code posted at local yoga studio | 11/12/24 @ 6:00pm | 25
Mass Dist. (11/11/24–11/15/24) | QR code posted at local gym | 11/13/24 @ 5:00pm | 40
Mass Dist. (11/11/24–11/15/24) | Link posted to three personal LinkedIn pages (one post reposted by two others) | 11/13/24 @ 10:00pm | 900
Mass Dist. (11/11/24–11/15/24) | Link posted to one personal Instagram page | 11/14/24 @ 2:00pm | 125
Mass Dist. (11/11/24–11/15/24) | QR code shared with staff of local business | 11/15/24 @ 4:00pm | 10
Follow-Up Dist. (12/8/24–12/9/24) | Link posted to one personal Facebook account | 12/8/24 @ 3:00pm | 350
Follow-Up Dist. (12/8/24–12/9/24) | Link re-emailed to university listservs | 12/9/24 @ 1:00pm | 450
CLOSED | Link and QR code disabled in Qualtrics | 12/13/24 @ 6:45pm | —

Table 6: Breakdown of distribution timeline, channels, and estimated participants reached.

11 I calculated the EMP for social media-based distributions using the number of people "following" each social media page the survey was posted to, as this was the only semi-reliable way to quantify these digital social networks. Because it is unlikely that every single follower sees each post, the true number of people reached this way is likely much lower, but as EMP is an estimated maximum, these numbers are still relevant.

12 This small initial distribution to my friends and family served as a "dry run" that allowed me to troubleshoot any major issues with the survey before mass distributing it to less manageable channels. To preserve anonymity, I made sure to send it to several groups of people (as opposed to specific individuals) and did not check the incoming responses until after the first mass distribution. No issues were reported, and I proceeded with the mass distribution as planned.
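Footnote 11 describes how the EMP estimates were derived; totaling the EMP column gives a loose upper bound on how many people could possibly have encountered the survey. The sketch below (purely illustrative Python; channel labels abbreviated from Table 6) sums the column and, because the December re-email went to the same listservs as the November email, subtracts that overlap for a rough ceiling on unique reach. True reach was almost certainly lower still, as the footnote notes, since followers do not see every post and some people were reachable through multiple channels.

    # EMP values copied from Table 6; channel names abbreviated.
    emp = {
        "group chat #1": 6, "group chat #2": 8, "university classroom": 24,
        "staff lunchroom": 80, "university listservs": 450, "yoga studio": 25,
        "gym": 40, "LinkedIn pages": 900, "Instagram page": 125, "local business": 10,
        "Facebook account": 350, "listserv re-email": 450,
    }
    total = sum(emp.values())                          # 2468 channel impressions at most
    unique_ceiling = total - emp["listserv re-email"]  # the re-email reached the same ~450 people
    print(total, unique_ceiling)                       # 2468 2018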
APPENDIX C: DESCRIPTION OF AFFINITY DIAGRAM PROCESS WITH IMAGES

Process Notes

In order to identify trends and themes across the qualitative responses to questions 10, 13, 16, 52, 55, and 58, I created an affinity diagram for each of the three behaviors I wanted to examine: proactive YIR results manipulation, retroactive YIR results manipulation, and third-party site use. In making each diagram, I did the following (an illustrative sketch of steps 4 and 5 appears after the figures below):
1. Isolated the relevant qualitative responses from the spreadsheet and formatted them for easy printing
   a. Diagram 1 → responses to fill-in-the-blank questions Q10 and Q52
   b. Diagram 2 → responses to fill-in-the-blank questions Q13 and Q55
   c. Diagram 3 → responses to fill-in-the-blank questions Q16 and Q58
2. Printed and cut responses apart so that each response had its own slip of paper (or "card")
   a. White cards → Spotify user survey pathway responses
   b. Orange cards → non-Spotify user survey pathway responses
3. Labeled all cards with their corresponding participant response numbers from the spreadsheet, color coding each number based on each participant's answer to the corresponding multiple choice question
   a. Diagram 1 → coded participants who answered "no" to Q9 or Q51 in orange, coded participants who answered "yes" to Q9 or Q51 in green
   b. Diagram 2 → coded participants who answered "no" to Q11 or Q54 in pink
   c. Diagram 3 → coded participants who answered "no" to Q15 or Q57 in blue, coded participants who answered "yes" to Q15 or Q57 in yellow
4. Categorized the cards by similarities in verbiage, repeated keywords, and common themes in subject matter, making sure to keep cards only with other cards coded in the same participant number color and making theme subcategories by labeling groups with small Post-it notes
5. Organized the newly-created groups in relation to each other based on potential connections between groups, loosely following a "yes-neutral-no" scale when negotiating the visual layout and making theme categories by labeling groups of small groups with large Post-it notes
6. Tagged and took notes on any intersections of interest

Diagram 1: Proactive YIR Results Manipulation
Figure 15: A composite image of the affinity diagram I made to chart themes in qualitative responses from Q10 (Spotify Users) and Q52 (Non-Spotify Users).

Diagram 2: Retroactive YIR Results Manipulation
Figure 16: A composite image of the affinity diagram I made to chart themes in qualitative responses from Q13 (Spotify Users) and Q55 (Non-Spotify Users).

Diagram 3: Third-Party Site Use
Figure 17: A composite image of the affinity diagram I made to chart themes in qualitative responses from Q16 (Spotify Users) and Q58 (Non-Spotify Users).
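A rough computational analogue of steps 4 and 5 above might look like the sketch below (purely illustrative Python; the theme labels and responses are hypothetical stand-ins, not actual survey data or the themes reported in the Results chapter): responses are bucketed under a theme whenever they share one of its keywords, much as cards were grouped under Post-it labels.

    from collections import defaultdict

    # Hypothetical theme labels and keyword cues (not the study's actual codes).
    themes = {
        "authenticity": ("accurate", "real", "honest"),
        "image management": ("embarrass", "curate", "impression"),
        "curiosity": ("curious", "interested", "fun"),
    }

    # Hypothetical stand-ins for Q10/Q52-style elaborations.
    responses = [
        "I want my Wrapped to be accurate, so I never alter anything.",
        "I was embarrassed by one artist, so I removed it from my history.",
        "I was just curious what my top songs were.",
    ]

    # Group each response "card" under every theme whose keywords it contains.
    groups = defaultdict(list)
    for response in responses:
        for theme, keywords in themes.items():
            if any(keyword in response.lower() for keyword in keywords):
                groups[theme].append(response)

    for theme, cards in groups.items():
        print(theme, "->", len(cards), "card(s)")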