
Micah Speaks To Kyle Chayka About The Filter World

2024/1/31

On the Media

People
Kyle Chayka
Micah Loewinger
Topics
Micah Loewinger: During his years working at record stores in New York, the personalized music recommendations from his coworkers were more revelatory than Spotify's algorithmic recommendations are today. The convenience of apps like Spotify comes at the cost of a deeper experience, and that feeling led him to read Kyle Chayka's Filterworld. Kyle Chayka: We should be skeptical of the idea that algorithms can truly know an individual's taste. Most recommendation algorithms are black boxes whose workings companies deliberately keep secret. Many of them decide what to promote by measuring variables about content (clicks, favorites, retweets, watch time, and so on). Algorithmic recommendation is based not on quality but on attention; it cannot judge that Bach is better than Mozart. Netflix's "personalized" recommendations are not truly personal but a way for the platform to push particular content, for example by manipulating a film's thumbnails to influence what users choose to watch. Spotify's algorithm tends to recommend what a user has already listened to repeatedly in order to keep them engaged, which limits musical exploration. Compared with algorithmic recommendation, human curation (a DJ, for instance) can build meaning into cultural content and guide the listener, producing a more revelatory and deeper cultural experience; the "ambient" quality of algorithmic feeds makes them easy to ignore, which works against deep engagement with art. The human gatekeepers of the past (radio DJs, record label executives) had their own biases and flaws, but algorithmic recommendation can sometimes route around those biases, as in the revival of Japanese city pop. Algorithms should be regulated to increase transparency and prevent the spread of harmful content; possible measures include restricting what kinds of content can be recommended, mandating transparency, and letting users opt out of recommendations. The EU is ahead of the U.S. on algorithm regulation. Fear of algorithms is part of a broader human fear of technological change; we should step back from the algorithmic ecosystem somewhat and find ways to directly support the creators we love.

Chapters
Micah Loewinger reflects on his experience working at record stores and how personal recommendations from experts were more impactful than algorithmic suggestions from platforms like Spotify.

Transcript

This episode is brought to you by Progressive Insurance. Whether you love true crime or comedy, celebrity interviews or news, you call the shots on what's in your podcast queue. And guess what? Now you can call them on your auto insurance too with the Name Your Price tool from Progressive. It works just the way it sounds. You tell Progressive how much you want to pay for car insurance and they'll show you coverage options that fit your budget. Get your quote today at Progressive.com to join the over 28 million drivers who trust Progressive.

Hey, it's Micah Loewinger. You're listening to the On the Media Podcast Extra. Before I landed a job at this show, I worked for a few years, on and off, at a couple of record stores around New York City. Some of my favorite albums to this day were recommended to me by my coworkers, men and women I consider to be archivists, not just of old formats like vinyl records, cassettes, and CDs, but of underappreciated artists and niche genres. Theirs is a knowledge of music history that can only come from a lifetime of obsessive listening, research, and curation.

Nowadays, I pay for Spotify. I try to learn about music off the app and then save it for later listening on Spotify. But sometimes I find myself just letting its recommendation algorithm queue up the next track and the next. And it definitely works. Spotify has helped me discover great music, but it's never been as revelatory as a personal recommendation from a friend or an expert at a record store or an independent radio station.

This feeling that I've traded convenience for something deeper is what made me want to read Filterworld: How Algorithms Flattened Culture by Kyle Chayka, a staff writer at The New Yorker.

Chayka says apps like Spotify and TikTok are great at studying user behavior, but that we should be suspicious of the idea that they can really know your taste. The algorithm as your best friend, as the intimate knower of your innermost secrets is definitely what Spotify, Facebook, X, TikTok would love for it to be.

You forget what you do on these platforms. You are not aware of every time you click into a Spotify track. You're not aware of when you favorite an album. On TikTok, you're absolutely not aware of every microsecond that you flip up a video, or of what you pay attention to a tiny bit longer than something else. But the platform does know exactly what you're doing. And it doesn't forget that one time that you...

lingered too long on a shower tiling video. And it's like, you remember those shower tiling videos? Let me give you some more. Exactly. That happens all the time for me on Spotify, where it will recommend something to me and it'll be based on patterns that I'm not perceiving. It would be good to define our terms a little bit. I know Spotify's algorithm is a trade secret. We don't know exactly how it works.

But as you write in your book, there are clues based on literature about the development of recommendation algorithms that might tell us how it likely works. Most recommendation algorithms are black boxes, not because they're impossible to figure out, but because the company itself does not want you to know how it works, because you might game it, because that might disrupt how it works, and that would ruin their product.

But a lot of them work along the same lines, essentially measuring a bunch of variables about the content that's on their platform. How many times people have clicked it, what the faves are, what the retweets are, what the time watched is, and then using that to figure out what to promote more and what to kind of push off to one side or another. You referenced this 1995 MIT Media Lab paper that described

social information filtering. Can you describe that concept? - Social information filtering is this thing where, let's say the tastes of two users are compared. So for example, there was another recommendation engine called Ringo that works with music recommendations, and this was another early mid-90s thing. It asked all of its users to build lists of their tastes in music and name 100 bands that you like,

And then the recommendation algorithm compared the profiles of all of those people and determined which people were more like each other. So who has similar tastes to you?

And then using that comparison, social information filtering says, since your taste is like this person and this person likes this band, but you don't yet, you may be likely to enjoy this band. And you can see that kind of scaled up in the digital platforms that we use now. So really grouping us with other users like that's the solution because Spotify doesn't know what's good or bad.
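The social-information-filtering idea described here can be sketched in a few lines of code. This is a hypothetical illustration, not Ringo's or Spotify's actual method; the user names, ratings, and the particular similarity formula are invented for demonstration. Each user rates some artists, similarity between two users is measured only on the artists both have rated, and the most similar user's highly rated, as-yet-unheard artists become recommendations.

```python
# Minimal sketch of social information filtering (collaborative filtering),
# in the spirit of the mid-90s Ringo system: recommend what similar users like.
# All names, ratings (1-5), and the similarity formula are made up for illustration.

def similarity(a, b):
    """Agreement between two users, measured only on artists both have rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    # Smaller average rating gap -> higher similarity (max gap on a 1-5 scale is 4).
    gap = sum(abs(a[x] - b[x]) for x in shared) / len(shared)
    return 1.0 - gap / 4.0

def recommend(target, others):
    """Suggest artists the most similar user rates highly that the target hasn't rated."""
    best = max(others, key=lambda u: similarity(target, u))
    return sorted(x for x, rating in best.items()
                  if x not in target and rating >= 4)

me     = {"Bach": 5, "Mozart": 4, "Coltrane": 5}
user_b = {"Bach": 5, "Mozart": 4, "Monk": 5, "Nickelback": 1}
user_c = {"Bach": 1, "Mozart": 2, "Drake": 5}

print(recommend(me, [user_b, user_c]))  # -> ['Monk']
```

Scaled up, the same comparison runs across millions of users, and implicit engagement signals (clicks, saves, watch time) stand in for explicit ratings, which is part of why, as Chayka notes below, the system optimizes for attention rather than quality.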

Right? That's a fundamental thing. Algorithmic recommendation is not about quality. There is no essential metric of quality. There is only attention. What is listened to more and what is listened to less? And that's all the data it can take in. It can do thumbs up, thumbs down, but it can't be like, oh, Bach is better than Mozart. Spotify's recommendation algorithm is just one part of what you call Filterworld. It's the name of your book, but it's also a concept. Can you describe it?

Yeah, Filterworld for me was this single term to describe the entire ecosystem of algorithmic feeds that we exist in. So when we're on the internet today, we are moving across all these different platforms, whether Facebook or TikTok or Instagram, that are all driven by algorithmic recommendations that are constantly trying to guess what we might like and put the next piece of content in front of us based on what we've consumed before.

I mean, I personally felt totally enclosed by this kind of sphere almost of algorithms. And I couldn't find something or listen to something without facing that surveillance and recommendation of what I was doing. I want to dig into some examples of this feeling of being boxed in

by algorithms at the same time as feeling that they provide us with the things that will fill our time and our hearts, you know, TV shows, movies, albums. Let's talk about Netflix, for instance. When I open up the app on my TV or laptop, it feels like I'm being given a wide range of shows and movies tailored to me. But what's really happening there? The homepage is supposed to be a thing that reflects your taste and filters through things that you're going to like.

But more often, these categories are so broad and the kind of labels are so vague that they don't actually promise personalization. There's like a top 10 or there's a popular right now.

And those shows are just what's convenient for Netflix to promote at a given time, what's popular with a certain segment of the users, and what they can most conveniently convince you to watch in a way. Like Netflix has this algorithmic system to change the thumbnail of a show. Yeah, this is so creepy. When you go on Netflix...

the images of every TV show and movie are tailored to your preferences. So, for instance, in 2018, there was, like, a controversy where a bunch of people were being promoted the film Love Actually, a pretty safe film to promote to a lot of people. Very popular. But...

It turns out some people were being shown the film with prominent imagery of the Black actor Chiwetel Ejiofor, who plays only a minor part in it. It's so manipulative: if I, the Netflix algorithm, know that you watch a lot of movies with Black actors,

then I am going to present every movie as if it focuses on Black actors. So in the case of Love Actually, which absolutely does not focus on Black actors, I will highlight one of the few scenes that has this man in it in an effort to get you to watch it. Not because you definitely like Love Actually, not because you're going to love Hugh Grant dancing through the halls of the government or whatever, but...

but because it would be convenient for Netflix if you watched this movie. - This is the breakdown of the illusion of the recommendation algorithm, which claims to be shaping the viewing options for you, but is really just convincing you that you should like something it wants to show you anyway.

Right, that you might not actually like or that might not fall into your taste categories at all. There's this other academic term, corrupt personalization, which I think is exactly what's going on with Netflix. And corrupt personalization is the image of personalization, the idea of personalization without the actual reality of that. That's an egregious example of the bait and switch.

I guess I want to talk about another theme in your book. You're talking about something slightly more pernicious, which is a recommendation algorithm like Spotify's through which, in maybe the largest library of legal music ever created, I am subtly encouraged to listen to the same stuff that I like over and over.

How is that happening? There are a lot of knobs and variables that can change in these formulas. But for Spotify, it feels pretty conservative a lot of the time, I think, if you put on an album and then let it go. Usually within a few songs, I think it serves you up something that you've listened to

constantly, that it knows you are not going to turn off, in order to lull you into that hypnotic state of just listening to the infinite playlist and not thinking about it. And this gets into what you want out of a listening experience or what you want out of a library of culture. My own sort of leaps forward in music curiosity have come from listening to the radio. Big shock, you know,

I work for a public radio show and I really like radio, but I think of WFMU, the independent radio station in New Jersey, or WKCR, Columbia's radio station. I remember for the first time hearing the Indian music show and hearing a 30-minute raga, and then somebody explained why it was interesting at the end. And I had never encountered that kind of music before. It opened some avenues of inquiry. There is something that you argue in your book that is lost when we take curation away from humans.

Yeah, human curation and that idea of a DJ, a human person who has selected this raga. And even though it's 30 minutes, that person is like, you are going to like this. It's important. You should listen to it with me in a way. I will guide you into this culture. I will show you that it's important. I'll explain it to you after that.

And that's such a different encounter with a piece of culture than what you get on Spotify or what you get in a YouTube recommendation. The job of human curators, like a DJ, like a museum curator or a librarian, is to build meaning through juxtaposition and then guide the consumer into it in a way that helps them kind of

broaden their own horizons, as you said, or bring them to a new place of taste or thought. And we just don't get that from a machine. That said, I'm sure listeners right now are like, but Discover Weekly has delivered some great stuff to me, or I keep a close eye on some of the high-profile Spotify playlists that are curated by humans. So this is not a pure either/or, right? Right.

The internet is not the same thing as algorithms. There are many digital platforms that are not algorithmic. There are also ways of using Spotify that are not guided by algorithms.

We can't blame algorithms so much. They fulfill a really important function in sorting information. But I think we can take back some of our agency. There's this pernicious dynamic with recommendation algorithms: we feel like we have agency when we really are having it taken away. One famous example of this is TikTok's recommendation algorithm. The For You page, TikTok's great innovation, you say can have this kind of numbing effect on you. The feed...

is so hypnotic. It slots one piece of content after the next, after the next. They're not too different, but they're a little different. They're ephemeral. They're over within a minute or two. You compared it to Brian Eno's 1978 album Music for Airports. ♪

Which a lot of people consider to be one of the first ambient music albums. Yeah, he literally came up with the phrase ambient music. And what he meant by ambient was music that you could either ignore or pay attention to. So Music for Airports, you can just let it fade into the background. It's completely ignorable. It's a kind of nice wash of sound. But if you do pay attention to it, there is something there. You can think about his conceptual gestures. You can

listen to the details of the synth washes or how things move in and out. I think the problem with the ambient quality of algorithmic feeds is that they're too often just ignorable. It's the "lo-fi chill hip-hop beats to study/relax to" problem: content that is designed to be ignored all the time.

The idea that you'd never have to think about who made something, or go deeper into the musician who made a song or the artist who made a painting, and you just treat the feed itself as the producer of the work, that is pernicious, because you almost forget the idea of the artist. But in comparing past ways of consuming music, say through the radio, to what you call Filterworld, we do run the risk of being overly nostalgic.

The tastemakers of old, the radio DJs, the record store clerks, the critics, they had their own blind spots and biases. DJs at Top 40 radio stations were swayed by money, pressure from labels, whatever they thought the public at large would respond to. That's not exactly the pure love of music, right?

The old algorithms were human gatekeepers who made decisions about what culture should be promoted and what shouldn't. Magazine editors, record label executives, the DJs who might be influenced by, you know, payola. So I don't think that's like inherently good culture.

I do think in the best examples, like an indie radio DJ who's not overseen by corporate overlords, that can create really beautiful moments of curation and the transmission of culture. But so can a YouTube recommendation. I've gotten really interesting stuff from YouTube recommendations, stuff I wouldn't have known a person who could give me, and that I wouldn't have known to seek out. Give me an example of the algorithm...

serving up something that got around the calcified biases of the old gatekeepers. An example of something that I personally like is the Japanese genre of city pop. ♪

which was this kind of music that was made in the 70s and 80s mostly. It's this very ebullient R&B, big orchestra, propulsive beats, big, bold, crazy music, and it's really fantastic.

And it was hidden away for a long time. Japanese people were not listening to it much after the 80s. And then in the 2000s, some record DJs brought it up and then it hit YouTube where it just blew up because for some reason it worked for the recommendation engine. A lot of people were listening to this music. They were liking it. They were engaging with it. They were seeking out more of it.

And so City Pop became associated with YouTube. And that way, the kind of mathematical quality of it did circumvent a lot of human tastemakers. YouTube registered that this music was getting popular with an American audience long before a record label executive could do anything or even a radio DJ. It was a kind of democratic revival of the genre of music online, which I think is really cool.

But you also argue that what comes with the so-called democratization promised by social media is amplification and all of the problems that it introduces:

algorithms picking things up to go viral that otherwise might not have, and that any regulation of algorithms, which you explore in your book, should mandate greater transparency around what gets pushed into people's feeds. Tell me a little bit about why we should regulate algorithms and what you see as the potential avenues for that.

Well, right now, there are essentially no rules about what an algorithmic feed can recommend to you or how it can interact with you. You can regulate what kinds of content gets algorithmically recommended. You could say that problematic content that promotes violence or self-harm cannot be subject to an algorithmic recommendation. And if that was blanket illegal, as it

may soon be in the European Union, then social networks would be much less likely to even touch that kind of material in its feed. All of a sudden, you could only find that stuff if you opted into it. It would not get pushed out to more people.

So there's regulation about what kinds of content can be recommended or promoted. There's regulation around transparency for algorithmic feeds, which means that we could see how something works and know what variables are being taken into account when something is promoted to us.

And there's regulation that mandates you be able to opt out of recommendation. And when you say regulation, you mean that you are seeing lawmakers, academics, and so forth propose these ideas. But are they likely to pass and be implemented? Well, in the European Union, they have passed the General Data Protection Regulation, which has caused that wave of pop-ups that say, please let me give you cookies. And the Digital Services Act,

more recently, which does mandate things like algorithmic transparency and opting out of feeds. In the U.S., we're way, way, way behind that. Some of these companies, like Facebook, are changing how their feeds work based on the European regulations. But in the U.S., we don't actually have any of those rules.

And the few efforts that have been made in government have just not gotten very far at all. In your conclusion, you acknowledge that the intersection between art and culture and technology has always been fraught. Cameras and radios sparked fear. So did the telephone. In fact, you cite someone who was lamenting what was lost when streetlights were introduced in Tokyo in the late 1800s and early 1900s.

Are algorithms fundamentally upending how our world works, or is this part of a larger fear we have about change?

We do always fear what technology does to culture. Culture is threatened by a thing like the camera, like recorded music, like the radio, and then artists find a way to carry on and make great things, and then we adapt and reframe our ideas. No one's going to say that recorded music is a sin,

or that we should go back to only live music because that would be more authentic. There's no pure culture. But I do think pendulums swing. We've gone so far into this algorithmic ecosystem that I think we desire to retreat from it a little bit, the same way that we had to make up regulations for seatbelts and car safety rather than letting people floor it down the road with no safety checks in place. The financial exchange is so different too: how art is sustained has changed.

Before, a musician would sell you their album and they would make money. Now it's mediated by this huge algorithmic platform of Spotify and they only make money based on certain metrics, based on streams.

So I think one way to retreat from that and to go back a little ways is just finding ways to directly support the voices that you like. A designer, even a curator or a DJ who makes cool playlists. The best way we can ensure the survival of those kinds of voices

and relationships is just to pay them money. It's more expensive than a Spotify subscription, and you're not going to get an infinity of music. But getting that infinity of music for $10 a month means that musicians have a really hard time making a living. Even though it's nice to be on the TikTok feed and see who you like to see, that's ultimately a hard way for your favorite creators, of whatever kind, to make a living. Kyle, thank you very much.

Thank you for having me. Kyle Chayka is a staff writer for The New Yorker covering technology and culture on the internet. His latest book is Filterworld: How Algorithms Flattened Culture. That's it for the midweek Podcast Extra. Tune in to the Big Show this weekend to hear my interview with MSNBC's Chris Hayes about his counterintuitive approach to covering Donald Trump.

Subscribe to our subreddit, r/onthemedia, and follow us on Instagram and Threads to keep up with what we're reading throughout the week. Thanks for listening.