Platform Desires
Fetishizing the Algorithm
Sorry for the delay in posting my lecture this week—I had some connectivity issues with my VPN yesterday that took most of the morning to resolve, as well as a personal family issue that is still ongoing.
The persistent problem is the fetishization of “algorithms” themselves without widening the perspective to include the many ways in which algorithms are rarely stable and always in relation with people: that is, both in flux and embedded in hybrid spaces.
—Kate Crawford, “Can an algorithm be agonistic? Ten scenes from life in calculated publics.” Science, Technology, & Human Values, 41(1), 77–92.
This move “fetishizes” the algorithm, and data-related expertise …, yet the ideology of advice is ultimately not disruptive or based on data from original experiments. Rather, this advice speaks to algorithmic lore, or stabilizing logics, designed to manage risk by homogenizing content, fitting within set genres and ascribing to the logics of marketing calendars.
—Sophie Bishop, “Algorithmic Experts”: 8.
What does it mean to describe the algorithm as a “fetish”, or to speak of “fetishization” in relation to it?
Sophie Bishop and Tanya Kant, “Algorithmic autobiographies and fictions: A digital method”
Sophie Bishop, “Algorithmic Experts: Selling Algorithmic Lore on YouTube”
Kyle Chayka, “Introduction” (Filterworld)
Kyle Chayka, “The Rise of Algorithmic Recommendations” (Filterworld, ch. 1)
Taylor Lorenz, “Internet ‘algospeak’ is changing our language in real time, from ‘nip nops’ to ‘le dollar bean’” (Washington Post, 8 April 2022)
Although in popular culture the term “fetish” has strongly sexual connotations, this is not what we mean here. Historically, the term originates in anthropology, where it is closely related to animistic thinking and a belief in the magical properties of objects, animals, and so on, that makes them worthy of veneration or even worship. The concept was adapted by Karl Marx in his critique of capitalism and what he called the fetishization of commodities: the phenomenon whereby commodities just seem to appear in markets as if by magic, completely separated from their original process of production. (It’s important not to confuse this meaning with the popular understanding of commodity fetishism in consumer society as referring to the almost erotic desire projected onto manufactured objects, encouraged of course by advertising.)
So from this standpoint, to speak of the algorithm as a fetish of contemporary technoculture points to a different kind of separation from the one Marx described: the technical mechanisms of algorithms remain separate from both the users and the producers who experience their effects, and are therefore shrouded in mystery. Even the common metaphor of the “black box” is symptomatic of the kind of fetishism that Kate Crawford and Sophie Bishop are referring to in the quotations above.
The project of both researchers is in a sense to demystify algorithms, but not in the way that the male algorithmic “experts” (I use the quotation marks deliberately) who are the subject of Bishop’s article (“Algorithmic Experts: Selling Algorithmic Lore on YouTube”) purport to do so: that is, by claiming to have unlocked the magic box and revealed the algorithmic secrets hidden inside. As so often turns out to be the case, these secrets amount to nothing more than strategies for optimizing visibility, based solely on claimed knowledge of the technical mechanisms and principles by which the algorithm prioritizes such visibility. As you will have seen if you did the reading, Bishop critiques this technical knowledge, as it is framed by the male “algorithmic experts” ubiquitous on YouTube, as no more than a kind of folklore (or “folk theories,” as she also calls them).

What is problematic about these theories is how they reduce understanding of algorithms to a kind of technical optimization (how the visibility of a video depends on its length, for example) rather than attending to the cultural biases (around language, gender, race, ethnicity, etc.) that are inscribed within and inherent to the design of the algorithms themselves. Ultimately, Bishop argues that algorithmic experts reveal no secrets beyond the marketing imperatives of the platform that the algorithms implement; what they are really doing is training content producers to conform more closely to those imperatives and to the economic desires of the platform and the advertising companies that sustain it. The article thus amounts to a feminist critique of the cultural biases inscribed within the discourses of algorithmic experts, and of how both the male experts themselves and the algorithms they claim to understand reinforce the hegemonic values of neoliberalism.
Bishop’s article on algorithmic experts thus stages a fascinating encounter between the two types of theory that I described in the first lecture last week: it is a critique of the instrumental theory of algorithmic experts from the standpoint of cultural studies, with its very different emphasis on representation and socio-political inequality. It frames “algorithmic visibility” in terms other than simply how to optimize it on a technical level.
Although Bishop’s article focuses only on algorithms, it’s interesting to think about how her critique is also applicable in other areas of today’s social mediascape. Computer programming itself is riven with gender ideologies, as detailed in the documentary Coded Bias (Shalini Kantayya, 2020; available on Netflix). Another area that has seen a proliferation of self-styled (and, as Bishop would point out, overwhelmingly male) cultural intermediaries (in Pierre Bourdieu’s term) is generative AI, another field shrouded in mystery that is currently developing a new generation of folklorists. A few channels to check out (there are, of course, many more):
Theoretically Media
Pixaroma
Sneaky Robot
Nerdy Rodent
As I’m sure you know, this concept of cultural intermediaries seems to describe a vast proportion of today’s YouTube content, with self-appointed middlemen (and we are talking about men) seeking to insert themselves everywhere in the relationship between technologies and users, providing a range of expertise from programming a website to everyday “tips” on how to use your iPhone.
Connecting the Dots: The Algorithmic Self
I’ve focused here only on the second reading assignment for the week (Sophie Bishop’s article on algorithmic experts) rather than on the first article, co-written with Tanya Kant, about the “algorithmic selves” workshop that they’ve been running for the past few years. I’m very curious to hear what you thought about this workshop and whether you found it to be a useful exercise in trying to reclaim some agency in the face of the surveillance capitalism (in Shoshana Zuboff’s widely used term) that we’re all aware of now but can’t seem to do much about. Do you think the approach that Bishop and Kant describe—constructing what they call an “algorithmic autobiography” from their advertising profiles on Facebook, Instagram, and Google—is a useful activity? Or should the workshop have gone further by taking a more net-activist approach, for example by providing information about how to be anonymous online, or about open-source, zero-data apps and services?
It’s interesting, of course, that the one platform that isn’t included in the workshop is the elephant in the room: TikTok. For those of you more familiar with TikTok than I am, how would you go about constructing a portrait of your algorithmic self from your TikTok data? Is it possible, in ways similar to the profiles on the older platforms that the workshop focused on? More generally, does the concept of the algorithmic self as a kind of commercial alter ego constructed from our data points make sense in itself?
If you like, one idea for the first Commentary assignment due at the end of next week would be to explore this idea of the algorithmic self and construct your own, based on a study of your TikTok profile (or a profile from another platform). What did you learn about the operations of that platform’s algorithm from studying your profile? I’ll be interested to see what you come up with!
I haven’t discussed Kyle Chayka’s chapters from his recent book Filterworld here because they are from a popular book rather than academic sources and are pretty readable. If you’re interested in hearing my take on them, though, I’m including a link to a different lecture about the chapters from an earlier course.
Online sources (referenced in “Algorithmic Experts” article)
Fine Brothers Entertainment. (2017). “FBE PODCAST—MatPat reacts: The YouTube algorithm hour! (Ep #9).”
The Game Theorists. (2014). “Game theory: Yes, PewDiePie. YouTube IS Broken.”
The Game Theorists. (2016a). “Game theory: Is YouTube killing Pewdiepie and H3H3 . . . and everyone?.”
The Game Theorists. (2016b). “Game theory: The REAL reason YouTube is broken.”
The Game Theorists. (2017). “Game theory: Beyond fidget spinners—How to create a YouTube trend.”
TubeFilter. (2017, June 22). “Cracking YouTube in 2017: The new research that cracks the code on YouTube’s algorithms.”