The Algorithms of Nostalgia

Nostalgia has become a template for the serial production of more content, a new income stream for copyright holders, a new data stream for platforms, and a new way to express identity for users. And there’s so much pop culture in the past to draw from that platform capitalism will seemingly never run out. We’re told our data is collected in an attempt to predict what we want, but this isn’t quite true. In attempting to predict our tastes, streaming services work to produce them in their own image. Since algorithms are trained on the past, they aren’t merely transmitting nostalgia through neutral channels; they’re cultivating nostalgic biases, seeking to predispose users to crave retro.

Even as Silicon Valley positions itself as progressive, its algorithms are stuck in the past.

Grafton Tanner, writing in Real Life Magazine

The Algorithmic Feedback Loop

Users keep encountering similar content because the algorithms keep recommending it to them. As this feedback loop continues, no new information is added; the algorithm is designed to recommend content that affirms what it construes as your taste.

Reduced to component parts, culture can now be recombined and optimized to drive user engagement. This threatens to starve culture of the resources to generate new ideas, new possibilities. 

If you want to freeze culture, the first step is to reduce it to data. And if you want to maintain the frozen status quo, algorithms trained on people’s past behaviors and tastes would be the best tools.

The goal of a recommendation algorithm isn’t to surprise or shock but to affirm. The process looks a lot like prediction, but it’s merely repetition. The result is more of the same: a present that looks like the past and a future that isn’t one. 

Grafton Tanner, writing in Real Life Magazine
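The loop Tanner describes can be made concrete with a toy sketch. This is a hypothetical illustration, not any real platform’s system: the catalog, its made-up “nostalgia/novelty” feature vectors, and the update rule are all assumptions chosen to show how a recommender that folds its own picks back into the user profile adds no new information.

```python
# Toy feedback-loop sketch (hypothetical catalog and features, not a real system).
catalog = {
    "retro_synth_pop":  (1.0, 0.0),  # (nostalgia, novelty)
    "80s_reboot":       (0.9, 0.1),
    "new_experimental": (0.1, 0.9),
}

def similarity(a, b):
    # Dot product as a stand-in for whatever similarity score a platform uses.
    return sum(x * y for x, y in zip(a, b))

def recommend(profile):
    # Always pick the item most similar to the inferred taste profile.
    return max(catalog, key=lambda item: similarity(profile, catalog[item]))

profile = [0.6, 0.4]  # a user who starts only mildly nostalgia-leaning
history = []
for _ in range(5):
    pick = recommend(profile)
    history.append(pick)
    # Fold the recommendation back into the profile: the only "signal"
    # entering the loop is the system's own prior output.
    profile = [0.8 * p + 0.2 * f for p, f in zip(profile, catalog[pick])]

print(history)
```

Every iteration nudges the profile toward what was just recommended, so the picks converge immediately and stay there: prediction collapses into repetition.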

Junk Algorithms

Despite the weight of scientific evidence to the contrary, there are people selling police forces and governments algorithms that they claim can ‘predict’ whether someone is a terrorist or a pedophile based on the characteristics of their face alone. Others insist their algorithm can suggest changes to a single line in a screenplay that will make a movie more profitable at the box office. Others boldly state — without even a hint of sarcasm — that their algorithm is capable of finding your one true love.

There's a trick you can use to spot the junk algorithms. I like to call it the Magic Test. Whenever you see a story about an algorithm, see if you can swap out any of the buzzwords, like ‘machine learning’, ‘artificial intelligence’, and ‘neural network’, and swap in the word magic. Does everything still make grammatical sense? Is any of the meaning lost? If not, I'd be worried that it's all nonsense. Because I'm afraid — long into the foreseeable future — we are not going to ‘solve world hunger with magic’ or ‘use magic to write the perfect screenplay’ any more than we are with AI.

Hannah Fry, Hello World
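Fry’s Magic Test is a word-substitution heuristic, so it can be sketched in a few lines. This is a throwaway illustration, not anything from the book: the `magic_test` function name and the buzzword list are assumptions; the point is only that the swap is mechanical.

```python
import re

# Hypothetical helper illustrating Fry's Magic Test: replace AI buzzwords
# with "magic" and see whether the claim reads just as plausibly.
BUZZWORDS = ["machine learning", "artificial intelligence", "neural network", "AI"]

def magic_test(claim):
    # \b keeps "AI" from matching inside ordinary words like "maintain".
    pattern = r"\b(?:" + "|".join(re.escape(b) for b in BUZZWORDS) + r")\b"
    return re.sub(pattern, "magic", claim, flags=re.IGNORECASE)

print(magic_test("Our AI will solve world hunger."))
# If the sentence survives the swap intact, treat the claim with suspicion.
```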

The Optimization Mindset

Computer science courses often define the goal of developing an algorithm as providing an optimal solution to a computationally specified problem. And when you look at the world through this mindset, it’s not just computational inefficiencies that annoy. Eventually, it becomes a defining orientation to life as well. As one of our colleagues at Stanford tells students, everything in life is an optimization problem.

The desire to optimize can favor some values over others. And the choice of which values to favor, and which to sacrifice, is made by the optimizers, who then impose those values on the rest of us when their creations reach great scale.

Rob Reich, Mehran Sahami and Jeremy M. Weinstein, System Error