A changing relationship

The relationship between media and audiences continues to change. New technologies and media convergence have changed the nature of media production, distribution, consumption and reception. Although new and traditional forms of media continue to exist side by side in a hybridised landscape, new technologies and platforms have changed the power dynamic between media and audiences, challenged power hierarchies, given increased agency to audiences and, as Henry Jenkins notes, changed the way audiences fit into “networks of capital, both economic and cultural.” In The Culture Industry and Participatory Audiences, Emma Keltie notes that in a potential future of participatory culture “the space between producer and consumer lessens and the positions of power become closer.”

It is useful to consider this changing relationship in terms of changes in production, distribution, reception and consumption.

Production: Institutions, individuals and crowds

The way media is produced is changing. In the twentieth century, media production was dominated by large commercial media institutions. As Eli Noam notes in Media Ownership and Concentration in America, broadcast radio and television were dominated from the outset by large commercial companies. Even during this period, however, motivated individuals and communities were creating their own media, including newspapers, fanzines, community radio and community television.

The rise of new media technology and social media platforms means that the tools to produce media are available to more people than ever before, enabling individuals to create their own media. While audiences have always exerted some agency over what media they use and how they use it, social media provides greater opportunities for creating and sharing media. Platforms such as Facebook and YouTube provide individuals with the ability to publish and distribute, potentially reaching an audience of millions. In Confronting the Challenges of Participatory Culture, Henry Jenkins calls this participatory culture and writes that it is characterised by “relatively low barriers to artistic expression and civic engagement, strong support for creating and sharing one’s creations, and some type of informal mentorship whereby what is known by the most experienced is passed along to novices.” According to Jenkins, the practice of creating and sharing media is nothing new. In the nineteenth century, the Amateur Press Association allowed people to create and share their own publications through a national distribution network. This tradition continued through the twentieth century with science fiction fanzines, punk rock publications and the riot grrrl movement. Recent developments in technology have simply amplified these practices. Dr Axel Bruns of Queensland University of Technology coined the term ‘produsage’ to describe the power audiences now have to create their own content. He argues that the distinction between producers and users is now virtually insignificant: “Users are always already necessarily also producers of the shared knowledge base,” he explains, “regardless of whether they are aware of this role – they have become a new, hybrid, produser.”

In 2008, writer and lecturer Clay Shirky released the book Here Comes Everybody: The Power of Organizing Without Organizations. In the book, Shirky explores how individuals use online tools to work collaboratively without organisational structures. In 2009, he explained to WNYC radio:

“What we’re increasingly seeing—with models like Wikipedia, the collaboratively created encyclopaedia, or open source software like the Linux operating system—is that groups of people operating without financial motive and outside of an institutional framework that directs their work, are able to create an enormous amount of value.”

Tara Brabazon, professor of media studies at the University of Brighton, has criticised Shirky’s argument for omitting any discussion of the barriers that exist to creating content. In ‘Nothing to lose but our mobiles’, she writes: “The great absence in the book is any mention of information literacy, or the excluded “everybodies” that may not have the latest phone or the time and ability to join, gather and collectivise online because they are managing analogue injustice. His assumption that “we” can learn about technology from technology – without attention to user-generated contexts rather than content – is the gaping, stunning silence of Shirky’s argument.”

Distribution: Contagion, stickiness and spreadability

During the twentieth century, large media organisations exerted control over the distribution of media products. The equipment and infrastructure required to distribute media to a mass audience was expensive: television, for example, required costly broadcast towers. While ideas of contagion, stickiness and spreadability all help to explain how modern media circulates, it is important to remember that media institutions continue to play an important role in media distribution. By early 2021, Netflix had 207 million paid subscribers across the globe. New media platforms, such as Facebook and TikTok, also exert significant control, shaping the flow of content on their sites through algorithmic curation and content moderation. In the contemporary media landscape, distribution is a hybrid of top-down institutional control and grassroots sharing.

Several ideas and theories help to describe how content circulates on social networks: contagion, stickiness and spreadability.

Contagion refers to the idea that media messages spread like viruses. The metaphor of viral contagion has a long history. A number of theorists and writers have promoted the idea, including Douglas Rushkoff in Media Virus!: Hidden Agendas in Popular Culture and Seth Godin in Unleashing the Ideavirus. Rushkoff argues that media messages that capture audience attention and spread are viruses: they have a protein shell—“an event, invention, technology, system of thought, musical riff, visual image, scientific theory, sex scandal, clothing style or even a pop hero”—that catches the attention of the audience. In Unleashing the Ideavirus, Godin explores how messages spread from the perspective of marketers and advertisers, suggesting that they should “ignite consumer networks and then get out of the way and let them talk.” Godin suggests that smoothness is required for the successful spread of an ideavirus. Smoothness could refer to the click of a button or a particular phrase, because a product that is “easy to recommend is often a product that’s easy to get hooked on.” In the book, he suggests that ‘sneezers’ are people who are more likely to spread a message and that marketers should actively try to target these people. Sneezers have credibility because they are trusted friends who wouldn’t recommend something that doesn’t make their friends’ lives better.

Stickiness is an idea, primarily used in advertising and marketing, that describes media products that engage audiences. As Malcolm Gladwell notes in The Tipping Point, stickiness is “primarily a property of the message.” In the book, Gladwell calls this the Stickiness Factor. He suggests that some content creators have become “specialists in stickiness” and are capable of creating engaging content that will spread on social networks. Gladwell emphasises that there is a difference between stickiness and contagion: stickiness describes the product, whereas contagion is a “function of the messenger”.

In Spreadable Media, Henry Jenkins criticises the concept of stickiness, as interpreted by advertisers and marketers, because it devalues the role that audiences play in spreading media:

“It privileges putting content in one place and making audiences come to it so they can be counted. Such “destination viewing” often conflicts with both the dynamic browsing experience of individual internet users and, more importantly, with the circulation of content through the social connections of audience members.”

Instead, Jenkins promotes the idea of spreadability. He dismisses notions of virality because the metaphor suggests that audiences are reduced to “involuntary “hosts” of media viruses” while promoting the notion that “media producers can design “killer” texts which can ensure circulation by being injected directly into the cultural ‘bloodstream.’” According to Jenkins, in this model “audiences play an active role in spreading content: their choices, their investments and their actions determine what gets valued.” Jenkins, Ford and Green go on to explain that media can be reinterpreted and re-contextualised when it is spread:

“As content spreads, then, it gets remade, either literally through various forms of sampling and remixing, or figuratively via its insertion into ongoing conversations and interactions. Such repurposing doesn’t necessarily blunt or distort the original communicator’s goals. Rather, it may allow the message to reach new constituencies where it would otherwise have gone unheard. Yet by the same token, it is also not necessarily reproduced uncritically, since people have their own varied agendas for spreading the content. No longer “hosts” or “carriers”, consumers become grassroots curators and advocates for personally and socially meaningful materials.”

Spreadability represents a shift in the power dynamic between media and audiences. As Jenkins notes in ‘Henry Jenkins: Spreadable content makes the consumer king’: “It’s not the agency of the network that’s pushing the content, it’s the consumer that’s engaging other consumers with that content.” In Spreadable Media, Jenkins explains that peanut butter is a good metaphor for how media spreads: it is inherently sticky but still requires users to spread it.

Reception and consumption

Surveillance capitalism

Surveillance capitalism refers to the extraction and commodification of personal data in order to create prediction products that anticipate our current and future behaviour. The concept, coined by Shoshana Zuboff, highlights the growing influence of digital platforms like Facebook, Instagram and TikTok, which now possess the power to shape and manipulate our actions for profit.

  • Asymmetries in knowledge and power. One of the key aspects of surveillance capitalism is the significant power asymmetry between these platforms and their users. While these companies possess extensive knowledge about us, their operations are intentionally designed to remain opaque and unknowable to us. They amass vast amounts of data from us, but this knowledge is not used to benefit us; instead, it is leveraged to enhance their own profitability.
  • The extraction imperative. The extraction imperative is another crucial component of surveillance capitalism. We, as users, are treated as raw materials for the extraction, prediction and sale of our behavioural data. When platforms invest in new technologies or innovations, it is not simply a far-sighted investment in the future; rather, it is an opportunity to gather even more data about our behaviours. As Zuboff points out, the notion that “if it’s free, you are the product” is misleading. In reality, we are the abandoned carcasses from which surplus value is extracted.
  • The shadow text. The concept of the shadow text emerges as a result of surveillance capitalism. It represents the ever-growing accumulation of behavioural surplus and its analyses, revealing more about us than we could possibly know about ourselves. Moreover, it becomes increasingly challenging, if not impossible, to avoid contributing to this shadow text, as it automatically feeds on our everyday social interactions and routines.
  • Threats to individual sovereignty. Individual sovereignty is eroded under surveillance capitalism in two ways. Firstly, it challenges our right to shape the future. Zuboff argues that authority over the digital future should lie with the people, their laws, and democratic institutions, rather than being concentrated in the hands of powerful corporations. Secondly, it erodes our right to privacy and sanctuary. Privacy serves a crucial psychological function, allowing us to engage in contemplation, creativity, and personal growth. Without the space for these experiences, we cannot flourish as individuals or meaningfully contribute to our communities and society.
  • Behaviour modification. Behaviour modification is a fundamental goal of surveillance capitalists. These companies have learned that the most effective way to ensure certainty for advertisers is to actively intervene in our lives. They employ strategies such as nudging, coaxing, tuning and herding our behaviour towards outcomes that are profitable for them. Numerous examples illustrate the real-world impact of behaviour modification by surveillance capitalists. The Facebook Mood Experiment in 2012 manipulated users’ emotions to examine their effect on online behaviour. In 2016, Pokémon Go utilised augmented reality and location data to drive users to specific physical locations. A leaked document in 2017 revealed that Facebook was aware of users’ emotional vulnerability and exploited it for targeted advertising. In 2018, FBLearner Flow, a Facebook tool, predicted users’ intentions to switch to competing brands. The Cambridge Analytica scandal in the same year demonstrated how personal data was used to influence political campaigns.

Algorithmic culture

Algorithmic culture—the algorithmic curation and personalisation on sites like TikTok, Spotify and Netflix—is changing media consumption and the very nature of culture itself. The term was coined by media theorist Ted Striphas, whose work examines how algorithms shape the way culture is produced, distributed and consumed. He suggests that over the last fifty years, humans have increasingly delegated the work of culture to “computational processes”.

In the book Filterworld, Kyle Chayka explores the impact of algorithms on media consumption and the way audiences experience culture. He argues that the nature of consumption has been fundamentally changed by algorithmic curation. As Chayka puts it, “Today, it is difficult to think of creating a piece of culture that is separate from algorithmic feeds, because those feeds control how it will be exposed to billions of consumers in the international digital audience.” Some of the changes to consumption that he explores in the book include:

  • Flattening of culture. Chayka discusses the outcome of such algorithmic gatekeeping, which he refers to as ‘flattening’. By this, he means a homogenisation and reduction into simplicity: “The least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most. Flatness is the lowest common denominator, an averageness that has never been the marker of humanity’s proudest cultural creations.” This flattening leads to the fragmentation and atomisation of a shared cultural experience: “Culture is meant to be communal and requires a certain degree of consistency across audiences; without communality, it loses some of its essential impact.”
  • Fast, passive consumption. The speed and passivity of consumption on algorithmic feeds mean that audiences don’t linger on cultural artefacts long enough to develop their own personal taste. Chayka notes: “We make constant decisions to listen to, read, or wear one thing instead of another. These choices are intimate, reflecting our ephemeral moods and the slow building of our individual sensibilities—of our senses of self.” When consuming media on platforms with recommendation algorithms, however, if audiences find something boring they “just keep scrolling, and there’s no time for a greater sense of admiration to develop—one is increasingly encouraged to lean into impatience and superficiality in all things.”
  • Pigeonholing. In Filterworld, Chayka notes that every interaction on a platform like Netflix pushes audiences “deeper into a pigeonhole” and makes them less likely to see material that is challenging or confrontational. As Niko Pajkovic wrote, “Feedback loops reinforce a user’s pre-existing preferences, diminishing their exposure to a diverse range of cultural offerings and denying art, aesthetics and culture of its confrontational societal role.” A simple sketch of this feedback loop appears after this list.
  • Collection. The rise of streaming services means that audiences no longer collect cultural products, such as books, albums and films. In many cases, the possession of these cultural artefacts helps to form a person’s identity. Chayka is highly critical of a media environment that exists so relentlessly in the “present tense”. He notes that extensive freedom of choice and an endless array of options “instils a sense of meaninglessness”. The act of collecting a cultural artefact, he suggests, “etches it a little deeper into our hearts and creates a context around the artifact itself, whether text, song, image, or video.”
  • Ambient culture. Algorithm-driven culture often becomes ambient, designed to be consumed passively or ignored. Chayka suggests that the sort of culture audiences consume on digital platforms like TikTok, Spotify and Netflix essentially becomes background noise “designed to produce a sensory void or to be flattened into the background of life, an insidious degradation of the status of art into something more like wallpaper.”
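
To make the feedback loop that Chayka and Pajkovic describe more concrete, the short Python sketch below simulates algorithmic pigeonholing. It is a toy model, not any platform’s actual recommendation system: the genres, the starting weights and the five per cent reinforcement rule are illustrative assumptions.

    # Toy model of an algorithmic feedback loop ("pigeonholing").
    # Illustrative assumptions only: these genres, weights and the
    # reinforcement rule do not describe any real platform's system.
    import random
    from collections import Counter

    random.seed(42)

    # The user starts with only a slight preference for drama.
    weights = {"drama": 1.2, "comedy": 1.0, "documentary": 1.0,
               "horror": 1.0, "anime": 1.0}

    def recommend(weights):
        """Pick a genre with probability proportional to its weight."""
        return random.choices(list(weights), weights=list(weights.values()))[0]

    feed = Counter()
    for _ in range(500):
        genre = recommend(weights)
        feed[genre] += 1
        # Each recommendation the user watches nudges its weight upward,
        # making the same genre more likely to be served again.
        weights[genre] *= 1.05

    print(feed.most_common())
    # The initially favoured genre comes to dominate the feed while
    # exposure to every other genre dwindles: the "pigeonhole".

Running the sketch shows how even a slight initial preference, compounded on every interaction, crowds the other genres out of the feed. This rich-get-richer dynamic underlies both pigeonholing and the flattening of culture described above.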

Enshittification

Cory Doctorow’s theory of enshittification describes a process by which digital platforms evolve in ways that ultimately degrade the user experience. The theory outlines a three-step cycle that digital platforms go through:

  • Initial user attraction. Platforms start by prioritising user satisfaction to attract a large user base. This phase involves offering valuable, user-friendly services that attract and retain users.
  • Monetisation shift. Once a critical mass of users is achieved, the platform begins to focus more on monetising its user base. This often involves prioritising the interests of advertisers and other paying customers over users’ needs and preferences.
  • Degradation of user experience. As the platform increasingly caters to paying customers, the quality of the user experience declines. Users may be subjected to more intrusive ads, lower-quality content and manipulative practices designed to maximise revenue rather than enhance user satisfaction. This shift ultimately leads to user dissatisfaction and attrition.

Facebook provides a good demonstration of what Doctorow labels enshittification or platform decay:

  • Initial user attraction. When Facebook was launched in 2004, its primary goal was to create a user-friendly social networking platform. The site focused on connecting people, facilitating communication and building communities. Features like the News Feed, introduced in 2006, were designed to enhance user engagement by showing relevant content from friends and family. During this phase, Facebook prioritised user experience, resulting in rapid growth and widespread adoption.
  • Monetisation shift. As Facebook’s user base expanded, the platform began to shift its focus towards monetisation. In 2007, Facebook introduced its first advertising system, Facebook Ads, which allowed businesses to target users based on their profile information and activity. It also lured media companies onto the site with the promise that they could share content with large audiences.
  • Degradation of user experience.
    • Increased ad load. Users started seeing more ads in their News Feeds and throughout the platform. This not only cluttered the user interface but also interrupted the social experience that initially attracted users.
    • Algorithmic manipulation. Facebook’s algorithms began prioritising content that would generate more engagement, often at the expense of content quality. This led to the proliferation of clickbait, sensationalist news and divisive content, which, while engaging, also contributed to user dissatisfaction and the spread of misinformation.
    • Data privacy concerns. Facebook’s aggressive data collection practices for targeted advertising raised significant privacy concerns. The Cambridge Analytica scandal in 2018 highlighted how user data was being exploited, leading to a massive backlash and a loss of trust among users.
    • Manipulative practices. Features designed to maximise time spent on the platform, such as endless scrolling and frequent notifications, began to feel manipulative. These tactics were intended to keep users engaged for longer periods, increasing ad impressions and revenue, but often at the cost of user well-being.
