
The Man Who Predicted the Downfall of Thinking

2025/3/6

Your Undivided Attention

People
Lance Strate
Sean Illing
Tristan
Topics
Tristan: This episode explores the ideas of the late media theorist Neil Postman, who predicted technology's impact on our society, particularly the fragmentation of attention, a culture of negativity, and the decline of democracy. He argued that technology is not merely a tool but a force that shapes culture and society.

Sean Illing: I used to believe technology brought only benefits, but Postman's work made me aware of its negative effects, especially the attention fragmentation and culture of negativity brought by social media.

Lance Strate: The media ecology perspective changed my understanding of politics, making me realize that changes in media technology bring new rhetorical forms, habits, and ways of thinking, which disrupt society and the established order. Postman's books made me see that media are not merely tools but forces that shape culture.

Neil Postman: Proposed seven core questions for evaluating a new technology: What problem does the technology claim to solve? Whose problem is it? What new problems will be created by solving the old one? Which people and institutions will be most harmed? What changes in language are being promoted? What shifts in economic and political power are likely to result?

Sean Illing: Postman's Amusing Ourselves to Death is a critique of television, arguing that it shifted American culture from a print culture to a television culture, with negative effects on public discourse, participation, democracy, and education. The television era emphasized entertainment, while the internet and social media era emphasizes attention. In the television era, politicians needed to be attractive and likable; in the internet and social media era, they need to be able to capture and hold attention, which leads to a reliance on spectacle, provocation, and performative outrage or virtue.

Lance Strate: Modern democracy took shape in a print media environment, and television is reversing many of its characteristics. The telegraph and photography accelerated the spread of information and made image and personality the dominant modes of communication, leading to a focus on celebrity and appearance over substance.

Tristan: Information overload is a new problem that arose after the problem of information scarcity was solved. We need to distinguish information from knowledge: knowledge comes mainly from books, while information comes from electronic media and need not be either true or false. We need to pay attention to technology's effects on language, because changes in language change the way we think.

Lance Strate: Postman's Technopoly examines the surrender of culture to technology, arguing that technological innovation comes to be seen as an end in itself rather than a means. Efficiency becomes the only value, which makes it very hard for us to refuse new technological innovations. We need to reintroduce the concept of "ought": when evaluating a technology, we should ask not only whether we can, but whether we ought.

Sean Illing: The explosive growth of the internet has brought many benefits, but it has also led to the complete deconstruction of the information environment and the erosion of authority and trust. We need to distinguish engagement-based business models from the internet itself, and explore designing different internet protocols and networks that reward media which enrich and elevate human nature.

Tristan: We need to assess the appropriate and inappropriate uses of each medium, and consciously design social structures, norms, and culture that reward appropriate uses and discourage inappropriate ones. We need a society that can look critically at technology, asking Postman's seven questions before adopting any new one. Inventors and the public alike should think critically about technology and make conscious choices about the direction of its development.


Transcript


Hey everyone, it's Tristan, and welcome to Your Undivided Attention. The late, great media theorist Neil Postman liked to quote Aldous Huxley, who once said that people will come to adore the technologies that undo their capacity to think. He was mostly talking about television. This was before the internet or personal computers ended up in our homes or rewired our societies. But Postman could have just as easily been talking about smartphones, social media, and AI.

And for all the ways television has transformed us in our politics, our eating habits, our critical thinking skills, it's nothing compared to the way that today's technologies are restructuring what human relationships are, what communication is, or how people know what they know.

As Postman pointed out many times, it's hard to understand how the technology and media we use are changing us when we're in the thick of it. And now, as the coming wave of AI is about to flood us with new technologies and new media forms, it's never been more important to have critical tools for interrogating technology's influence on our society. Postman had seven core questions that we can and should ask of any new technology, and I'll let him tell you in his own words.

What is the problem to which a technology claims to be the solution? Whose problem is it? What new problems will be created because of solving an old one? Which people and institutions will be most harmed? What changes in language are being promoted? What shifts in economic and political power are likely to result?

Now, I think about these questions often, and it may not surprise you to hear that today's episode is one I've been wanting to do for quite a long time, since Neil Postman has been by far one of the most influential thinkers for my own views about technology. His ideas, starting in the 1980s, have been so clear-eyed and prescient about the role of technology in shaping society that I wanted to dedicate a full hour to exploring them.

So today we invited two guests who've thought deeply about Neil's work. Sean Illing is a former professor who now hosts the Gray Area podcast at Vox and has often written and discussed Postman's relevance to our current cultural crisis. We also have Lance Strate, a professor of communication at Fordham University. He was actually a student of Postman's at NYU and spent his career developing the field of media ecology that Postman helped create. Sean, Lance, thanks for coming on Your Undivided Attention.

Glad to be here. Thank you. So I'm just curious, you know, for me, Neil Postman has been such a profound influence on our work. In 2013, when I was kind of having my own awakening at Google, there was just something wrong in the tech industry. There was something wrong about the way we were going to rewire the global flows of attention, and something wrong with the scrolling, doom-scrolling culture that I saw on the Google bus. And I, you know, used to be someone who really deeply believed in this kind of,

you know, tech is only good. We can only do good with it. It's the most powerful way to make positive change in the world. And it was this friend of mine, Jonathan Harris, who is an artist in Brooklyn, who first introduced me to Neil Postman's work and, you know, his books, Technopoly and Amusing Ourselves to Death.

And I just could not believe how prescient and precise he was in his analysis. And I have been wanting to bring Neil Postman's really critical insights to our audience, which includes a lot of technologists, for such a long time. So I'm just very grateful to have both of you on, and I hope we can have a really rich conversation. So just to sort of open with that. That's great. I think I got Postman-pilled back in

2016 or 2017. I came up as a political scientist, political theorist. That was my education. And we didn't really encounter any of this stuff. But once I sort of internalized the media ecological way of seeing things, it really kind of changed how I understood all of politics. It's pretty profound. What was your entree into Postman's work and what you see as kind of his critical insights?

In 2016, I was invited by a former classmate of mine to give a talk at Idaho State. This was sort of right at the beginning of the Trump era and all the chaos involved with that. And I gave my little talk and then I went for a hike with my buddy, who's a media theorist. And we got to talking. And at the end of that, he sort of introduced me to Postman and media ecology. And that was sort of the germ of the book that we ended up writing together, The Paradox of Democracy, which came out in 2022.

But before that, I had never really encountered media ecology, Neil Postman. And for me, the value of these great media ecologists is that you really...

force us to stop looking at media as just a tool of human culture and instead to see it as much more as a driver of human culture. And this changed the way I looked at the political world. I mean, what you discover when you look at the history of democracy and media is that all of these revolutions in media technology, the printing press, the telegraph, radio, film, TV, the internet, it's not so much that these technologies are bad,

It's that they unleash new rhetorical forms and new habits and new ways of thinking and relating to the world and each other. And that's very disruptive to society and the established order. And we're sort of living through that. I could go on, but I'll pause and let Lance speak. Lance, how about you? How did you first get into this work, starting with your being a student of Postman's?

Well, I mean, I could go back to the 70s as an undergraduate in a class on educational psychology. Postman's first big book, Teaching as a Subversive Activity, was on the reading list. And that was when he was still following McLuhan with the argument that we need to adjust ourselves to the new media environment.

Just a note for the audience, Marshall McLuhan is another very influential media ecology thinker from Canada who famously coined the idea that the medium is the message. And you'll hear his name throughout this conversation. But I first read him...

I guess in '79 with Teaching as a Conserving Activity, which was also when I first met him. And that's where he did his about-face, although maintaining the media ecology outlook, but arguing that we needed to counter the biases of television because we're inundated with it.

When Postman introduced the idea of media ecology, he gave it a very simple definition that it's the study of media as environments.

And once we understand that, then it's no longer just a tool that we choose to use or not use and we have complete control over, but rather it's like the environment that surrounds us and influences us and changes us. And we look at democratic society and democratic politics differently.

Modern democracy was shaped by a typographic media environment, and television is reversing so many of those characteristics. And it's really a question of what will survive of the various institutions that grew up within the media environment formed by print culture.

So there's just already so much to dig into here. Let's set the table a little bit for listeners. Let's start by talking about Neil Postman's book Amusing Ourselves to Death, which is really a critique of television: how the medium of television, in taking over society and transitioning us, Lance, as you were just saying, from a typographic culture to a television culture, would completely shift and transform public discourse, participation, democracy, education. Does one of you want to take a stab at kind of the

CliffsNotes version of Postman's argument before we dive into specifics? Well, I mean, it really is the shift from the typographic era to the television era, and that that has undone a lot of key elements of American culture. As you may know, I did a book that followed up on Amusing Ourselves to Death called Amazing Ourselves to Death. And I don't think Postman quite

made it overt in Amusing Ourselves to Death, but he has four case studies, you know, and they're the news, politics, religion, and education, and how each one has been transformed in a negative way by television. And what I tried to explain is that what Postman hit upon there are the four legs that the table of American culture stands on. Politics, democratic elections, obviously, change.

Journalism as the First Amendment and the way that makes possible democratic participation, absolutely. Often overlooked, but religion forms the kind of moral and ethical basis that our republic was founded upon. And then education as the promise that people will be literate. Like the bottom line of education is reading, writing, and arithmetic, that people will be literate enough

to be able to govern themselves, to get access to information and think rationally and make good decisions.

Sean, do you want to add to that? There's so much here. I mean, when people talk about, you know, typographic culture versus televised culture, let's just zoom into what do we really mean? Because so much of Postman and Marshall McLuhan is essentially a kind of holding up a magnifying glass to the invisible. When we say it structures, you know, the way that we think, like, what do we actually mean by that? What's the phenomenology of reading text on a page that's so different from watching this podcast in a video right now? Well, for me, I mean,

The point in all of this is to get us to really see how every medium of communication is acting on us by imposing its own biases and logics. And they are different. Postman talks about the printed word. What is it to read a book? What is the exercise of reading? It's deliberative. It's linear. It's rational. It's demanding.

What is TV? It's visual, it's discursive, it's entertaining. It's all about imagery and action and movement. What is social media? It's reactionary, it's immediate, it's algorithmic. It kind of supercharges the discontinuity of TV and monetizes attention in new and powerful ways, right?

Once you have this media ecology framework, you look at the eras of politics that coincided with these communication technologies. You can see it in the language of politics. You can see it in the kinds of people that win elections, how they win those elections, how they appeal to people. You can see it in the movements and the dominant forces at the time. Like I was saying, it still blows my mind that I made it through a graduate education in political theory and we never managed to

read any media ecology, because it really is, especially in a free and open society where people can speak and think and persuade one another, a kind of master variable. It's not often seen as that, but it should be.

So let's dive into that just for a second. So, you know, people think, okay, we live in a democracy, you have candidates, those candidates debate ideas, they have their platforms, they talk about themselves, and then voters are sitting there and they kind of take in all those arguments and they make a rational choice. And that's just democracy. And democracy is democracy. It doesn't change over the last 200 years. So let's just explain, Sean, what you were just saying. Maybe, Lance, you want to do this. In what way does media define the winners and losers of our political world? I mean, Postman gives so many examples, but...

Well, I think we have to start with the fact that democracy was founded on the idea that

people can have enough access to information to make decisions. But it also presupposes that people will talk to one another and be able to speak in a rational way. Postman's kind of wonderful illustration is of how people went to listen to the Lincoln-Douglas debates for hours on end...

And I can imagine there was a carnival-like atmosphere, but still, people were willing to sit and listen to, whatever, six hours of debating, whereas today everything is reduced to these sound bites, you know, these 10-second sound bites. And, you know, Postman points to two key technologies in the 19th century that start the ball rolling away from typography and ultimately

come together with television. One is the telegraph, because just by speeding things up, we have no time to think and reflect. And that really is harmful. So just the speed at which we're moving, right? Which we see today, where in this moment we feel overwhelmed and there's a new story every few hours, some new thing happening.

And we don't know what to do. And the other thing is the image: the photography of the 19th century becomes the dominant mode of communication. So between the two, it's all about appearance and personality, communicated over the televised image, and this rapid turnover that favors celebrity and fame over substance. Yeah. Can I just say something real quick? Yeah.

The telegraph is such a good example, a practical example, of McLuhan's the medium is the message: how the medium itself, the technology itself, doesn't just influence content. It really dictates what it actually means. You know, I was going back and reading Thoreau when I was researching my book, and Thoreau was talking about the telegraph as a kind of

proto-social media. He's arguing that it was actually changing what constituted information: with the telegraph, it became a commodity to be bought and sold. We get the birth of the penny presses and tabloid journalism, and for him that was sort of the end of the idea that information was something definitionally important or actionable. It just became another source of entertainment, a consumer product.

So much of our work in this podcast and at CHT, obviously, it's like this question of why does any of this matter? Why are we here talking about this? And it's because technology and media are having a bigger and bigger influence on constituting our culture. People always say, if culture is upstream from politics, then now technology is constituting the culture that is upstream from politics.

I was just at Davos in Switzerland, and I would say the most popular question being asked, and it was right on Inauguration Day, January 20th, at basically every dinner I was at, was: what do you think will matter more in the next few years, the choices of political leaders or the choices of technology leaders and companies? Especially when you tune into AI. And so I just want to ground this for listeners: why are we even talking about this? It's because technology is going to structure

what a human relationship is, what communication is, how people know what they know, the habits of mind. So I just want to make sure we're setting the stakes of why this is so important, because so often, I think, the thing that's problematic for me about Postman is it just feels so abstract. McLuhan, the medium is the message: it doesn't hit you how significant that idea is. So I just want to return, Lance, to the thing you were saying about the Lincoln-Douglas debates in the 1800s.

I think most people don't know, we kind of breezed past it: they debated for three hours each. I believe one guy took three hours, then the next guy took three hours, then there was like an hour rebuttal. Can you imagine seven hours of political debates that are long-form speeches in front of live audiences?

And just what a different notion of the phrase democratic debate. So here we are, using this phrase, democratic debate, but what constitutes those two words has completely shifted between the year 1858 and the year 2025. And so let's dive into, I think, another aspect of why this matters, which is the power that media confers in the way it sets up what kinds of people win or lose. Sean, you look like you're trying to jump in.

What's interesting is that for Postman, the TV era was all about entertainment, right? Everything that unfolded on or through that medium had to be entertaining because that's what the laws of TV demand. But this era, where TV...

is still around. It still matters, but not nearly as much. There's much more of a convergence with other mediums like the internet and social media, which are now more dominant, really, culturally and politically. And on these mediums, it's not about entertainment so much as attention. The attention economy is mastered now, right? So in the TV era, politicians really had to be attractive and likable. They had to play well on TV. Now they just have to

know how to capture and hold attention, which means leaning into spectacle and provocation and performative outrage or virtue, as the case may be. They dictate a different kind of political skill set to win.

One of the reasons why both Postman and McLuhan are so prescient, or at least why people think of them that way, is that what they were talking about was largely television, and yet it seems to apply so well to today. And for many people, really, it seems to better fit today. That's because their analysis was based not just on the specific medium of television

but on the idea of electronic media generally. But I think entertainment was Postman's way of getting at the larger point, which is that it's trivial, it's not serious. What catches our attention is a larger set of dynamics, and entertainment was just a way of pinpointing it. But it really is that non-serious trivialization. It's a different kind of entertainment.

So I just want to name a pushback that I got when I remember speaking to these arguments in the tech industry when I was at Google in 2013, which is people, some people might say, well, why is that a problem? If people like to amuse themselves, people like amusement, don't we all need some amusement in the world? What would you say to that? Or what would Postman's argument be against that?

Well, Postman wasn't against amusement. You know, he said television is great. The best thing about TV is junk. He loved TV, especially sports.

You know, we actually bonded as Mets fans, although his real love was the Brooklyn Dodgers. But, you know, in their absence, it was the Mets. He also loved basketball and all of that. I mean, sports is one of the great things that television can provide. It's awful for politics. It's awful for religion. And it really has degraded religious

participation and presentation by putting it on television and also through social media and all of the other advances that we've seen. And it's bad for education. Sean? I would go back to what you were saying earlier about distraction, which is a really important word. I think that's more closely pegged to the role of technology here, fragmenting our attention, pulling us around.

like greyhounds chasing around a slab of meat. I mean, I was talking to Chris Hayes the other day, who was on my show, and he has a new book out about attention and the fragmentation of attention and really sort of the death of mass culture in any meaningful sense, right? And I was asking him, well, I mean, isn't democracy on some level a kind of mass culture? And if we can't pay attention together, if we can't focus on anything together, then what the hell...

does that make of our democratic politics? Right. I mean, that's what concerns me. I remember reading McLuhan, who would talk about media and time. He was so obsessed with electric media because it flattened time and it made everything instantaneous.

And he would argue that this sort of scrambled society's relationship to time. And, you know, like radio and TV and now the internet create this landscape where everything unfolds in real time. But, you know, in a print-dominated culture where you're consuming weekly or monthly magazines or quarterly journals...

or books, that facilitates a kind of deliberation and reflection that you don't get when everything is so immediate and frenzied. And in a democracy where the horizon of time is not even the next election but the next news cycle, that kind of discourse makes it very hard to step back and think beyond the moment. It makes it very difficult to solve collective action problems. And all the most important problems are collective action problems. Totally. Yeah.

Yeah, I think to sort of my interpretation of what you're both saying is that there isn't a problem with people having amusement in their lives or having entertainment. It's about whether the media systemically structures the form of all information in terms of its amusing capability or its entertainment capability.

And that systemic effect makes us confused about whether we're actually consuming information or getting educated, versus really just being entertained. And he says, you know, the basic quote: television is transforming our culture into one vast arena for show business.

And that was for the television era. When I think about the social media era, and I think about Twitter or X, I think social media is transforming our culture into one vast gladiator arena for, basically, drama and throwing insults and salacious tweets back and forth.

Another key concept of Postman's is the information-action ratio. I remember in the tech industry that so many people believed, and I used to really believe, that so many problems really had to do with people just not having access to the appropriate information. It was just all about information access. I had a tiny startup called Apture that

was talent-acquired by Google, and it was all about giving people contextual access to more information. I remember it. Do you remember that? Okay. Yeah, yeah, it was good. Yeah, well, thank you. I mean, it was motivated by, I think, the good-faith version of this.

Imagine, right when you're encountering something that you have no reason to be interested in, the perfect, most engaging professor, guide, lecturer, museum curator showed up, held your hand, and suddenly told you why this thing you're looking at is the most fascinating thing in the world. And that's what this little

Apture thing was. It was basically providing instant contextual rich information that was supposed to entrance you and deepen your curiosity and understanding about everything. And it was driven by my belief, which is very common in the tech industry, that it's all about driving so much more information access. And if we only just gave people more information, then that would suddenly make us respond to climate change or respond to poverty or do something. And so I'd love for you two to articulate what was Postman's kind of critique

of information glut and the information-action ratio he speaks of. Well, you know, what he would say is that in the 19th century, not having enough information was a problem. But we solved it. We solved it long ago. And that

creates new problems because we just keep going and going and going. I mean, I would say, think about how most of human history not having enough food was a problem. And today we are wrestling with issues of obesity because

We solved that problem a long time ago. We've got plenty of food, but we just keep going and going and going. So, I mean, this was actually one of McLuhan's points is that you push things far enough and you get the reverse. You get it flipping into its opposite.

So information scarcity: by solving it, we create a new problem of information glut. And since, as you said, we're powerless to do anything about most of it, it leaves us with irrelevant information, leaving us feeling impotent, powerless. Which, again, I think a lot of people are feeling particularly right now. Yeah, I always found, with those types of arguments...

There's a tendency to conflate information and truth as though they're the same, and they are not the same. I don't know how anybody can look at the world right now and say that this superabundance of information has been a boon for truth. And to the point that Lance was just making, it's this combination of being constantly bombarded with information, some of it true, a lot of it bullshit, a lot of it terrible, and also the simultaneous experience of complete impotence in the face of it. We've also engineered an environment that elevates the lies, elevates the falsehoods, elevates the distractions, elevates the things that stimulate our more base primal impulses. And in the contest between diversions, amusements, provocations, and dispassionate truth, I think we all know who's going to win that fight 99 times out of 100.

And I would think it's really important to distinguish between information and knowledge. And knowledge is something that we largely got from books. And information is something that we are inundated through the electronic media. And it doesn't really have to be true or false information.

And that's why, while valuable in some contexts, the distinction between misinformation, disinformation, and just plain information is not that important,

because when we have information glut, anything goes. You can't tell what's what because it's not relating to anything out there. I think it's a critical point that you're making, because even if, let's say, we solve the misinformation and disinformation problem, boom, it's gone, it's all gone from the airwaves, you're still just bombarded by information glut, by information that doesn't give you agency over the world that you're seeing. The companies profit from mistaking, reorienting, or restructuring what agency means in terms of posting more content on social media. So I see the social cause that's driving me to emotion, and then I hit reshare and think that I've done my social action for the day. I think Malcolm Gladwell wrote about this like 10 years ago.

So that's the kind of failure of the tech solution: I'm going to reshare this content, but what I'm really doing is actually driving up more things for people to look at and keeping them addicted to social media. So I'm perpetuating the money-printing machine that is the social media company. I want to actually get us to AI, because so much of this conversation was really motivated for me by the question of

how we become a more technology-critical culture, which I think is what Postman was all about. What does it look like to have a culture that can adopt technology in conscious ways, aware of the ways it might restructure community, habits of mind, habits of thought, education, childhood development,

and then consciously choose and steer or reshape that technology impact dynamically such that you get the results you would want by adopting that technology. And in doing that, I think I want to turn at this point in the conversation to his other book,

Technopoly, which he wrote several years later, and which is subtitled The Surrender of Culture to Technology. And I think this is actually the heart of what I'm... I mean, I think that Amusing Ourselves to Death is a very accessible thing for most people, and the race to the bottom of the brainstem in social media is an extension of TV. I think Technopoly really gets to the heart

of what does it mean to have a society consciously adopt technology in ways that it leads to the results that it wants? And what does that relationship look like? So how would we set the table of the argument that Postman is making in Technopoly? Either of you. Yeah, I mean, I...

This book was very interesting. In a lot of ways, his idea of technopoly is really like a more accessible expression of Heidegger's critique of technology. Technologies are things we use in the world to get things done or improve our experience in the world. And then gradually, as we move into the modern world, technology becomes almost a way of being. As Postman says, we became compelled by the impulse to invent. Right.

It's innovation for the sake of innovation. It is a blind mania for progress, disconnected from any fixed purpose or goal. And that's sort of what Postman is calling technopoly, where our whole relationship to the world is defined by and through technology. Technology is this autonomous, self-determining force that's both undirected and independent of human action. And we're almost a tool of it rather than the other way around.

Here's Postman in his own words. Well, in the culture we live in, technological innovation does not need to be justified, does not need to be explained. It is an end in itself because most of us believe that technological innovation and human progress are exactly the same thing, which of course is not so.

Postman was talking about the personal computer as a quintessential technology of technopoly. I mean, my God, what would he make of AI, which by any measure is and will be far more immersive and totalizing than personal computers? I just want to briefly add the quote that Postman cites from Thoreau, since we've mentioned him multiple times: that our inventions are but an improved means to an unimproved end.

I think this really speaks to what you're speaking about, Sean, which is Postman's critique that we deify technology. We say that efficiency and productivity and all the new capabilities, whatever they are that technology brings, are the same thing as progress, that technology progress is human progress. And it's never been more important to interrogate the degree to which that's true and not true. And this is not an anti-technology conversation, but it's about how do we get critical about it. Lance, you were going to jump into that?

First, I'd say that Postman would say that Heidegger was a Nazi and should not be mentioned anymore, but that the big influences on Technopoly were Lewis Mumford, who was one of the great intellectuals of the 20th century and a key media ecology scholar, and then Jacques Ellul.

But it definitely is this argument that particularly in America, it's not about the stuff. It's not about the gadgets. It's about a whole way of looking at the world and that efficiency becomes the only value.

that we make any decisions on, which means that it's almost impossible to say no when somebody goes, here's a more efficient way to do this. You can do it faster, do more with it. And we almost never say no. And you must've seen this new thing about mirror genes or whatever. Mirror bacteria?

Yeah, they can create organisms with mirror-image DNA, against which our immune systems would have absolutely no defense. And so we shouldn't do it. Well, somebody is going to do it. I mean, you know that somebody is going to do it, because once we have that capability, nobody puts a stop to it. Postman did know about AI, because that's been around for decades.

much longer than this sudden emphasis on it. And Joseph Weizenbaum, who was somebody that Postman knew, was one of these pioneers in artificial intelligence. He did the ELIZA program.

And in his book, Computer Power and Human Reason, he introduces the word ought that we've forgotten to use, O-U-G-H-T. Ought we do this? Not can we do this, but ought we do it? And that that has just vanished from our vocabulary. And he argues that we need to reintroduce it.

You know, I always think of that hilarious Jon Stewart joke, you know, that the last words a human being will ever utter will be, you know, some dude in a lab coat who says, it worked. I mean, like, Tristan, I would ask you a question. I mean, you were part of this world in a way I am not. You talked to these people, the people who are building this.

who want to build AGI and whatever else. I mean, they are acutely aware of how potentially destabilizing it can be.

Why do they persist in that? Is it just a simple, well, if we don't do it, China's going to do it, or whoever's going to do it, and so therefore we've got to be first? Same thing with the nukes. It's actually related to what Lance is speaking about: we don't have a collective ability to choose which technology roads we want to go down and which ones we don't. We just say it's inevitable, someone's going to do it, and better that we, the good guys who we think have better values than the other guys, do it first, because we at least know what the dangers are and can try to defend against the bad guys.

And I think the thing that, you know, Lance, you were just speaking about with the mirror bacteria is a perfect example, because the reason Postman's questions matter here, about how we consciously make decisions about what technologies we should and shouldn't pursue rather than doing something just because we can, is that AI is about to exponentially multiply the introduction of new capabilities into society. It's going to be a Cambrian explosion of brand new technologies.

Text and media and generative everything that you can make. You can make law, you can make new religions. As we say, language is the operating system of humanity, from code to law to democracy to conversation. And now generative AI can synthesize and decode and hack language: hack conversation in the form of misinformation, hack code in the form of attacking cyber infrastructure, hack law by overwhelming our legal systems or finding loopholes in the law.

And so as we're unleashing all these new capabilities, it is more important than ever that we get an ability to consciously choose, do we want to do mirror bacteria? But then the challenge is, as technology democratizes the ability for more people to do more things everywhere beyond global boundaries, our problems are international, but our governance is not international. We have national governance responding to global interconnected issues.

And then we can see the political headwinds are not really trending in the direction of global governance, which is looked upon as a kind of conspiracy of people who are out of touch with the national interests of the people, which is a very valid critique. So, yeah, Sean, I'm sort of wanting to play with you here on what's your relationship to this question that you're laying out? I don't know. You know, I mean, I'm just constantly thinking of what the trade-offs are going to be. I mean, you just think about the explosion of the Internet and the trade-offs involved there.

You know, one consequence of that, there are a lot of incredible benefits. I love the interwebs. I use them every day. But one of the consequences of that is the complete destruction of gatekeepers, of any kind of boundaries at all on the information environment. So...

We lost the capacity, society lost the capacity to dictate the stories society was telling about itself. And, you know, digital just exploded all that. You know, the internet is like this choose your own adventure playground and it unsettles and undermines trust. And a lot of people might say, well, good, these institutions, the elites were corrupt and untrustworthy to begin with. Okay, fine.

But we tend to underappreciate how much of what we take to be true is really just a function of authority. Most of us haven't observed an electron or a melting glacier. We take it to be true because we believe the experts who tell us these things are real. And we believe the video clips on the evening news of glaciers melting. But if that trust is gone and the info space is this

hopelessly fragmented thing riddled with deep fakes and misinformation and consensus reality isn't possible anymore, then where does that leave us?

I will say, I think there's actually a way to get to a good world. We just have to distinguish between the internet being the problem and the engagement-based business models that profited from drama derivatives, the amusement culture, the tweetification culture, and personalized information bubbles, all of which are incentivized. It's important to recognize why we have personalization. It's not just that you can choose your own adventure, though that's also true. The mass reinforcement of personal information bubbles

is actually incentivized by the business models because it's better to keep you coming back if I give you more of the thing that got you interested last time.

And so we can split apart the toxic thing, the engagement-based business models, from the internet itself. And then I think you could ask: is there a different design of internet protocols, and of these Metcalfe monopolies, meaning these network-effect-based social media platforms where there's only a handful of them? Could they be designed in a different way that actually rewards the kinds of mediums that enrich and bring out the better angels

of human nature. And that's still the optimist in me that believes that it's possible to do that. Lance, I see you sort of nodding and also maybe skeptically nodding your head here. So feel free to jump in.

Well, I mean, I think Postman would question whether more technology is the answer. And every new innovation solves some problems but creates many more, which we then solve by more technologies. And it just keeps expanding and expanding and expanding that way. You know, when I teach my students media ecology, I try to emphasize, let's think about what are the appropriate uses for

this particular medium, and then what's inappropriate. And if we can start with that: the internet, or various aspects of it, was great for certain things. It empowered people who were in minorities and brought together people who were

having difficulties in a lot of ways. I can speak just in terms of my own family with having raised an autistic child that parents of autistic children were largely unable to go to a self-help group in person because your hands are full and being able to communicate over a discussion list or group

online was very valuable. So this is where we face this problem of trying to evaluate the costs and benefits. I actually feel like there is a vision of a world that would work. And I agree with you, Lance, that it actually

it takes asking what the appropriate uses of a technology are, and the actively inappropriate uses, and then consciously shaping our social structures, our social norms, our cultural values so that we reward the appropriate uses and anti-reward the inappropriate uses. Now, I want to move a little bit beyond admiring the problem, because there's a tendency to just rehash all these things. And I think Postman is unique in offering,

I don't know if I'd call it solutions, but a form of taking an active and agentic stand on technology. And he has this famous lecture series where he outlined seven questions that we can ask of any new technology. And he said that these questions are a kind of permanent armament

with which citizens can protect themselves from being overwhelmed by technology. You know, the first is: what is the problem to which this technology is the solution? What is the actual human or social problem

for which that technology is the solution. It's a very basic question, but it's a very powerful one. So anyway, we can go into some of the others, but I'm just curious if either of you has a reaction to this, as we move into more of a solutions-oriented posture. You know, Sean, what's your sense of this? I think it's a great question. I just go back to what we were saying a minute ago: how do we answer it? What is the mechanism for having that conversation?

Science is very good at giving us more of what we want. It cannot tell us what's worth wanting in the first place. And the problem is I don't know how as a society we have that conversation together about what's worth wanting and then have a conversation about how to go about getting it. I just don't know. And the problem with some of these new technologies like AI is it's not even clear what they're going to do.

You know, so it's very hard to talk about the trade-offs that might be involved. But I don't know, it's not a very good answer because I don't have one, I guess.

Well, it's interesting, because one of the things that actually excites me about AI is the ability to use it to more quickly augment society's ability to see the downsides and externalities, and to play out simulations of various new technologies. One of the things that we have to get incredibly good at is foreseeing the negative unintended consequences before they happen.

So imagine inventing plastics, but actually knowing about forever chemicals and then taking a left turn so we don't go down the road of creating more pollution than we have the capacity to clean up. And the same thing with social media. And that's one of Postman's other questions is whose problem is it?

So if it's, you know, the problem of not being able to generate content at scale, whose problem was that? That's his basic second question. The third question is: what new problems will be created by solving this problem with this technology? In the case of generative media, we will create a new problem where people have no idea what's true, because now anybody can create anything and flood the information airwaves.

And then he asks which people and institutions will be most harmed by the adoption of this technology. So, for example, gatekeepers, or the idea of trustworthiness, of having any kind of authority or expertise, is suddenly going to be eliminated by the fact that there's a flood of information, a kind of denial-of-service attack on democracy through all this stuff that's coming.

And then he has this really important subtle question that he asks, what changes in language are being promoted by this technology? And I'm curious, Lance, if you have some examples that Neil has given on that one, because I think it's such a crucial one that's very subtle. Well, sure. And I think it's actually a very important one. And you're right that it does...

sort of take a left turn from the other questions. But what's often missed when folks just look at Amusing Ourselves to Death and Technopoly is that Postman's grounding was in the study of language.

And he started out in English education and he was also very much associated with general semantics, which in a large part is about our use of language and trying to understand our misuse and how that changes our thinking. I mean, I think for me, a great example is community.

And when you think about the use of the word community: in a real community, people are together and they don't all share the same interests and viewpoints. That's not what we mean when we talk about online community, virtual community, and that's where you get that siloing effect. In a real community, people have to negotiate with people who are very different from themselves

and find a way to live together. And you can't just like pick up and leave, you know, where you live. Whereas on the internet, you can just, you know, click a button and you've left that community and you find one that's more to your liking. So that meaning of the word community has changed drastically by that usage. And that's also, you know, you could also connect that back to a kind of Orwellian quality because that was something

You know, the idea in 1984, and it's expressed in the appendix, that we can change the meaning of words and change the way people think. That may not be happening

all that intentionally as it was under a totalitarian system. And it actually did happen under Nazi Germany and in the Soviet Union. But it's still happening and it's still changing the way we think. I think it's an excellent point. And it feeds back into real community. So when people are in real community, their expectations have been formed by these online experiences and these new definitions for words. Sean?

I guess I've done a lot of technology bashing here. And I just want to say, all of our problems cannot be laid at the feet of technology. I mean, it is also true that over the last three, four decades, we have stopped as a society investing in social infrastructure, community centers, libraries, third spaces, where people can actually get together and talk and be with one another and engage their community and not just be

home alone ordering pizzas with the app so that they don't have to engage with another human being in the entire process, right? So my worry is that these technologies have pushed society in a more solipsistic direction. It's pulling us more inward, right?

a la the movie Her. I feel like that's where we're going, where people are going to be in relationships with chatbots. They're going to be at home using VR technology or whatever, and they're going to stop going outside and doing things with other people. So we have failed on both fronts. And there are policy solutions that could counterbalance some of this if we invested in those things, and we haven't, or we stopped and we should again. Yeah.

I agree. I just wanted to name one other example of language change that is happening without us really reckoning with it: Elon's redefinition of saving free speech when he takes over Twitter, to mean protecting people's ability to reach millions of people anonymously inside of a newsfeed that rewards the most salacious inflammation of cultural fault lines in the culture war.

And a system that just rewards the toxicity of inflammation on cultural fault lines everywhere. Then saying that that's about free speech is a kind of Newspeakian turn on what freedom of speech was really meant to protect in its original essence, as defined by the founding fathers. And it had nothing to do with, or it certainly did not foresee, a world where a single person could reach

200 million people every day with their thumb as many times as they wanted to. And that's a different thing than the deeper ideas. And so I just think that this question of

language. Just imagine a society that is actually asking that question, a sort of Postman-informed society. Every time there's a new technology rolling out, instead of being entranced by it, deifying the technology, and welcoming it with excitement, their immediate first thoughts are to ask: What is the problem for which this technology is the solution? Whose problem is that? What new problems are going to be created by this technology? What are the changes in language that it's hiding from us about the way it's reconstituting things?

So I just, I feel like that's a vision of society that I'm reminded of the, I think it's the opening chapter of Technopoly where he talks about

the story of, was it Socrates and Thamus, the judgment of Thamus. And it's really about what a conscious adoption strategy for technology looks like. In that story, they're actually talking about whether to adopt the written word. They're talking about that as a choice, noticing all the things it's going to give, what it's going to do, and also which things it's going to undo in the society. And I just feel like

that's so within reach: to have cultures that are actually critical of technology, in which Postman is part of the curriculum of political science courses at every university and part of undergraduate education. And it's all the more important because technology is so central among the fundamental shaping forces of the entire world. So maybe I'm just a dreamer, but this is the, uh,

Can I ask you a question? Do you think it's the responsibility of the people building these technologies to ask themselves these questions, or do you think it's the responsibility of the public to ask and answer these questions and then impose...

Well, it's all the more important that the people building it have a critical understanding of what it will do, because their being in the driver's seat, at the control panels of how it's going to roll out, means it's even more important that they're contending with these questions than it is for the general public. And I think the general public needs to contend with them as much as possible. Lance? Well, I mean, the history of invention shows that inventors...

pretty much are wrong about what their technology is going to do. And so they're the last people. I think Arthur Koestler called them sleepwalkers. Television's a great example, because when television was introduced, especially in the post-war period,

all of the write-up is that it's going to bring culture into everyone's home. They'll have opera and ballet and classical music. And it's going to be wonderful for democratic politics, because we'll be able to televise political conventions and people will see debate and discussion on political issues. And they couldn't have been more wrong. And, you know,

I think there's a great spirit of play that comes with invention, just to see what can be done, what we can do. But I don't even know if an AI program, and you mentioned this before, Tristan, can adequately

foresee all of the consequences, because you introduce a change into a highly complex, interdependent system and something is going to change other things, and they're going to interact with one another. And it's a complex system for sure. Yeah. And to be clear, I want to say a couple of things. I agree that, um,

inventors don't have a good track record of foreseeing the consequences of their inventions. But I do think there are tools one can use to much better foresee what those consequences will be. In 2013, how could I foresee that the attention economy would lead to a more addicted, distracted, sexualized society? It's because the incentives at play help you predict the outcome. And I think, in an incentive-literate culture that follows the Charlie Munger quote, show me the incentive and I'll show you the outcome,

if we can understand what the incentives are, you can get a very good sneak preview of the future. I don't think it's an easy thing to reach for, but I think it's something that we need more of if we're going to be a technology-enhanced society and actually make it through, because we're quite in danger now. Sean? Yeah, look, even if the answer to these questions is, you know, in the words of Nate Bargatze, nobody knows, we should still be asking them. That would at least be a start. And

That's just not something that we've done or are doing. I think one of the real needs is to really reinforce literacy and that this is ultimately what's being threatened because that is the foundation of democracy and it's the foundation of the Enlightenment. In Postman's last book,

It was Building a Bridge to the 18th Century, which wasn't saying that we should go back to the 1700s, but that we should retrieve from that era the literacy, the typography, the Enlightenment,

and the respect for science and democracy that existed back then. We need to reinforce those elements of the media environment that the electronic media are really doing away with. And when you ask what problem AI is going to solve, I actually mentioned it before: information glut is one of the problems that it's

there to solve. But I think one of the problems is that reading and writing are hard. They're hard to do. Anyone who has written a book will tell you that. What could be more unnatural than sending a

five-year-old to sit still for hours on end? But that's what you need to learn how to read and write. And so what are we doing? We've been doing this for a long time now: developing technology to read for us and to write for us. That's what AI, voice synthesis, and voice recognition are all doing, so we don't have to do it ourselves.

So the way to at least try to mitigate this is by reinforcing those aspects of the media environment that we still have that are under assault today.

I would just say that in a lot of ways, the problem of our time is this misalignment between our interest and our incentives. And the tragedy really is that we have built tools that have undermined our capacity to alter our incentive structures in healthy ways. That is it. If our whole damn problem could be distilled, that's it.

I don't know what to do about that, but that's the challenge ahead of us and we've got to figure it out. Completely agree. If incentives can control the outcome, then governance is normally the ability to change what those incentives are. You pass a law or a policy and you build social norms and consensus in order to get that law or policy passed to change and say, "Hey, you're not allowed or you can't profit from this thing that would be highly profitable." Whether it's underage, drugs, sex trafficking, whatever the thing is.

So I completely, completely agree. I know we're basically here out of time and just want to close with, um,

this quote that "No medium is excessively dangerous if its users understand what its dangers are. It's not important that those who ask the questions arrive at my answers or Marshall McLuhan's. This is an instance in which asking the questions is sufficient. To ask is to break the spell." And that just feels like what we're arming here is let's arm ourselves with the questions to protect ourselves from getting further overwhelmed. And also, let's be honest about the nature of what's coming. Questions are our most important medium.

They're made of language. And that's the way that we start to think about things critically and deeply. Well, no one's going to listen to a three-hour Lincoln-style speech to save us. So we just need a kick-ass meme that's going to bring us all together. It's your job to find a tweet for this one and create some memes that are going to go viral. We'll tweet our way through it. No worries. Sean and Lance, just wanted to thank you for coming on Your Undivided Attention. My pleasure. Thank you.

So, a thought I'd like to leave you with. There's a quote from the introduction of Amusing Ourselves to Death that has always stuck with me, where Postman compares two dystopian visions for the future: the first presented by George Orwell in 1984, of surveillance and Big Brother, and the other presented by Aldous Huxley in Brave New World. Postman wrote: What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one.

Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we'd be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us, while Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared that we would become a captive culture, while Huxley feared we would become a trivial culture. As Huxley remarked, the civil libertarians and rationalists who are ever on the alert to oppose tyranny

failed to take into account man's almost infinite appetite for distractions. And it was Postman's fear that it would be Huxley, not Orwell, whose prediction would come true. Your Undivided Attention is produced by the Center for Humane Technology, a nonprofit working to catalyze a humane future. Our senior producer is Julia Scott. Josh Lash is our researcher and producer, and our executive producer is Sasha Feagan. Mixing on this episode by Jeff Sudakin, original music by Ryan and Hayes Holliday.

And a special thanks to the whole Center for Humane Technology team for making this podcast possible. You can find show notes, transcripts, and much more at humanetech.com. And if you like the podcast, we'd be grateful if you could rate it on Apple Podcasts, because it helps other people find the show. And if you made it all the way here, let me give one more thank you to you for giving us your undivided attention.