19 Comments

Jason, I've been a fan of yours since before John Dies was published for the very first time, and I've loved everything you've written. Your insights into our bizarre modern digital world really get me thinking. Keep doing what you're doing, sir, and I'll keep reading.

Also, fuck youtube, and fuck algorithms.

Lol, saying "fuck algorithms" is a bit of a strange way of phrasing things. An algorithm is nothing more than a series of instructions for achieving a certain goal. Long division is an algorithm. A recipe for paella is an algorithm. The rules of a football game are an algorithm. There's nothing wrong with algorithms: without them, literally nothing would get done. Saying "fuck algorithms" because YouTube's recommendation engine is a bit out of control is like saying "fuck water" because some reckless driver ran over your cat. After all, from biology we know the driver was mostly (about 60%, if I recall correctly) water.
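Long division makes a nice concrete example of "a series of instructions". Here's a minimal sketch in Python (the function and its digit-by-digit approach are just my illustration, nothing from the thread):

```python
def long_divide(dividend, divisor):
    """Long division as an explicit, fixed series of steps: an algorithm."""
    quotient = 0
    remainder = 0
    for digit in str(dividend):  # work through the dividend one digit at a time
        remainder = remainder * 10 + int(digit)  # "bring down" the next digit
        quotient = quotient * 10 + remainder // divisor
        remainder = remainder % divisor
    return quotient, remainder

# Following the same fixed steps always yields the same answer:
print(long_divide(1234, 7))  # → (176, 2)
```

Nothing sinister there; it's just a recipe written down precisely enough for a machine to follow.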

What is an issue, though, is what the media today prematurely calls "AI", but which is nothing more than applied statistics or iterative refinement of parameters in a neural network. We are now capable of writing programs that, when fed pictures of cats and dogs, can neatly sort them into their respective categories. But we don't know how they do it. Nor can we ask them, because they're not intelligent like us. They're just single-purpose things that receive pictures on one end and spit out "cat" or "dog" on the other, even when fed a picture of a moose.
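A toy illustration of that single-purpose-ness (all the features, numbers, and labels here are invented): a classifier that only knows "cat" and "dog" will confidently answer one of the two for any input, moose included.

```python
# Invented (weight_kg, ear_length_cm) "features" for each class.
CENTROIDS = {"cat": (4.0, 6.0), "dog": (20.0, 10.0)}

def classify(features):
    """Nearest-centroid classifier: it can only ever say 'cat' or 'dog'."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CENTROIDS, key=lambda label: sq_dist(CENTROIDS[label], features))

print(classify((3.5, 5.5)))     # cat-like input → 'cat'
print(classify((450.0, 25.0)))  # a moose: wildly out of range, but still → 'dog'
```

The moose gets a confident answer not because the program "understands" anything, but because those are the only two outputs it has.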

And for such frivolous applications, it doesn't really matter that we don't know how the darn thing arrives at its conclusion. But that's where the problem with "algorithms" lies: the moment you start using machine learning to decide things that have a real impact on the world and the individuals in it, it's a real fucking problem if you don't know how it arrives at its decisions. Which I assume is what you were able to express so much more succinctly than me by just condensing it to "fuck algorithms". Someone should make that into a bumper sticker. :)

Okay, fair point. It was a bit of a blunt and tactless response. I really meant YouTube's algorithms, and engagement algorithms in general. A lot of YouTubers have to work really, really, really hard to get any money from YouTube. From what I've heard, the algorithm is difficult to understand and seemingly randomly demonetizes videos.

Cracked was never the same once you left... I've read all of your articles there and loved them. You really shaped my view on the world. Thanks from the bottom of my heart!

I unfortunately was born too late to witness the golden age of Cracked. By the time I had gotten a copy of JDATE, the great Firing of Everyone had already happened. But now I can kind of understand why text articles used to go viral. Shame that doesn't seem to happen anymore. Engagement algorithms are absolutely horrifying.

I feel bad. I went through high school during the Cracked golden age. Hell, I shamelessly used the interesting talking points to kickstart my first actual relationship. Also to feed my ever-hungry, information-addicted brain.

Fascinating, if completely unsurprising, revelation. Thank you, Jason.

i guess you know what that means . . .. jason . .. . ;)

I spend a lot of time on guitar-guy YouTube, and videos started popping up of a woman playing. But unlike the male videos, where there's a crafted thumbnail with clickbaity text ("$5,000 Custom Shop Guitar vs $100 Budget Guitar, Is There REALLY A Difference?!?!?!?!?!" 30 times a day), hers is just her playing with her guitar strap between her boobs, or playing in a position where a boob has to sit on the guitar. Tons of views. Tons of dirty comments. Nothing relating to her actual playing (which is great).

The algorithm is obviously pushing content creators in weird, bizarrely horny directions, but the algorithm wasn't created in a vacuum. It doesn't take a genius to see why a flash of cleavage or bare thigh might drive viewership, but forcing it into every video is surely not what viewers wanted, or what the writers of the algorithm originally intended.

We are collectively in a bizarre feedback loop that feels similar to the current crisis with Western diets: big companies identified that customers preferred a certain type of product. For fast food, it was high-fat, high-sugar food that is easy to digest. Companies that pushed that sort of food drove others out of the market, but also helped set the norm for consumers. 50+ years later, they're serving the types of meals that the company founders surely didn't envision, and that early consumers wouldn't have been able to tolerate.

YouTube and social media took the easy (read: financially rational) route to driving traffic: they found the easiest ways to trigger the little hits of dopamine that keep us watching, and trimmed away everything else. Now it's the lion's share of the media diet we consume.

Yeah, somewhere along the line we forgot that technology was supposed to serve us, not the other way around.

The dystopian sci-fi notions of machines ruling over their pet-like humans may seem a bit far-fetched... But aren't we seeing the beginnings of something that may very well lead us there? We're already letting computers make more and more decisions that impact the real world. And these are dumb computers, not the intelligent sort you often encounter in sci-fi settings. But already we're getting used to the notion that these machine learning algorithms are black boxes that we can't scrutinise, but that somehow seem to achieve good results most of the time. How long before that turns into "no need to understand; the machine knows best", while only the techies who build those systems have a vague notion of how they work? And assuming we ever achieve sentient AI, and those techies are no longer needed for technical progress (nor an efficient way of achieving it), what then?

It's not really a matter of us serving the technology; it's just the age-old problem of pandering to the lowest common denominator. That's been the case with modern advertising for a century and a half (although the earliest examples go back to the Roman Empire), and with film and television for decades. It's no different on the Internet.

"Sex sells" is one of the oldest and best-known adages in the entertainment and marketing industries, and it's just as true today as it was a century ago. YouTubers have simply rediscovered that principle, and the YouTube algorithm fully embraces it.

In this particular case, sure, the reasons are "sex sells" and pandering to the lowest common denominator. But the effect is still that the people behind that channel have altered their behaviour in order to give the "algorithm" something it'll be more willing to promote. The channel operators are not pandering to the audience. They're changing their content to "please the algorithm", which functions as a gatekeeper between them and the audience they're trying to reach. "Sex sells" may be part of the cause, but "they're serving the algorithm" is still the effect.

That's just an overly-abstracted way of saying "pandering to the lowest common denominator". The algorithm promotes sexualized content, because sex sells. No other reason. "Pleasing the algorithm" simply means sex sells. It's adding complications where none are necessary to explain the phenomenon.

Literally the only difference between what the Youtube algorithm is doing vs. what marketers used to do is that Youtube uses a computer instead of pen-and-paper spreadsheets and slide rules. That used to be how algorithms were calculated. It's the same process, same algorithm, just bigger and faster.

Knowing which cause led to a certain effect does not invalidate the effect. I say "the apple is on the ground", and you respond "no, the apple is not on the ground, because I happen to know for a fact that it fell from the tree".

And keep in mind, I'm not just talking about this one particular case described in the article. The notion that we're becoming servants to technology is not exactly new. Think about the clerk who informs you he can't help you because the computer won't let him, phrasing it as if he requires the computer's permission. Or your car's stability control kicking in when you want to drift around the corner. Again, the machine is limiting the actions available to you, not because it's incapable of doing what you want (there are plenty of horses under the hood), but because it decides you're not allowed to.

And these are just examples of traditional technology: rules put in place by other humans who didn't foresee all possible use cases, or deemed them invalid. But with machine learning algorithms, the rules aren't being made up by humans anymore. All we did was decide on some parameters, e.g. how many layers our neural network needs, how many nodes per layer, and which training data to feed the algorithm. After that, the thing takes on a life of its own, especially the ones that keep evolving after initial training is done. In the end, you get a world where decisions that affect the real world are being made by black boxes that we can't scrutinise, based on rules that they came up with themselves and that are unknown, and in fact unknowable, to us.
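A sketch of that division of labour, with invented layer sizes: the humans choose the architecture, and the machine fills in the weights (here faked with random numbers standing in for training):

```python
import random

# What the humans decide: the shape of the network (and the training data).
LAYER_SIZES = [4, 8, 8, 2]  # 4 inputs, two hidden layers of 8 nodes, 2 outputs

# What the machine ends up with: a pile of numeric weights that encode its
# "rules" but mean nothing to a human reader. Real weights come out of
# training; random.seed(0) here is just a stand-in for that process.
random.seed(0)
weights = [
    [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
    for n_in, n_out in zip(LAYER_SIZES, LAYER_SIZES[1:])
]

n_params = sum(n_in * n_out for n_in, n_out in zip(LAYER_SIZES, LAYER_SIZES[1:]))
print(n_params)  # → 112 opaque numbers, even in this toy network
```

Even at this tiny scale there's no way to point at one of those 112 numbers and say which "rule" it represents; real networks have millions or billions of them.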

And where does that leave the end user? Jumping through hoops enforced on them by the system in hopes of achieving the desired result, in cargo-cult fashion, because they don't know the rules of the game. This end user is no longer the master of the machine. It's the other way around.

That's a long-winded way of saying you don't know how any of this works.

In every one of those cases, the processes and limitations were designed by humans, and modified by humans to serve the desires of a small number of humans; and in some cases, humans using machines as excuses for the limitations of policies enacted by humans.

This "slaves to machines" idea is, and always has been, little more than technophobia and "golden age" conservatism. Machines do what humans tell them to do, and if humans don't like what the machines do, then humans can quite easily change them.

It's not the machines that are in charge, it's the humans who own the machines. The Youtube algorithm is the way it is because it makes a hell of a lot of money for a small number of humans, and a small amount of money for a larger number of humans. It can be changed in degrees large or small, or turned off entirely, at the whim of humans.

Machines are not masters, they're tools, nothing more. It's the humans who design the tools, and use the tools for their own personal enrichment, who are the masters. The likes of Page and Bezos and Ellison and Gates and Musk and Koch are the "enemies", not the tools they create. Those are the people we should be looking toward in our desire to change the situation we're currently in, because they design the algorithms and run the machines.

I guess I knew, in the back of my mind, that this is how it worked, but thank you for laying it out to show just how disgusting the whole thing is. That the algorithms are training us -- including the most vulnerable -- to give in to the base desires of the internet is terrifying. So much money is changing hands that I'm sure there's no incentive to change the way it works. And, to your point, this lady's channel would still be obscure if she'd never "cracked the code," so it might be easy for people to argue that she's just tricking horny idiots into clicking on her videos. It's scary that children are growing up in a world that will shape them this way, making them think their worth is based on whatever they have to do to get those like/view numbers up.

I don't know, the whole premise seems suspect. I have seen several other channels where this doesn't hold up. All the vids will have a pitiful number of views, and one or two random ones (no cleavage thumbnails or otherwise) will have over a million or more. I don't think it's as simple as Jason wants it to be. The algorithm seems to like videos that are years old, as well. You can see this trend in the comments sections, where quite a few people wonder whether the algorithm brought them all there, because the video was otherwise unrelated to content they normally consume.

Those are cases where the video has gone viral for some other reason, usually a mention or link from a more popular YouTuber, Instagram influencer, and so on. Most commonly because they've become part of some Internet dramafest, or are linked to some current fad. These are the outliers, the exceptions. For the most part, the article is accurate. YouTube even has a series of presentations explaining how to structure content and presentation to maximize views. The presentations don't directly address sexualizing content, but it's implied, and anyone who is familiar with creating content is fully aware of just how effective sexualized thumbnails and clickbait titles are at gaining views. There are articles about it from more established journalists and media critics, including big names like Wired.
