
EP33:Tech's Republic: Rebuilding Humanity in the Digital Age

2025/4/29

Deep into the Pages

AI Deep Dive Transcript
People
Host 1
Topics
Host 1: I think the book's core argument is that software has become the bedrock of modern life, and that those who control software hold a new kind of power. Silicon Valley's early collaboration with government gradually gave way to a focus on consumer technology, which led to the neglect of major societal challenges. The development of AI carries a similar risk: fascination with the technology itself can obscure its potential dangers. We need to stay alert to potential threats and not grow complacent, especially on national security, where Western hesitancy over military AI development could be dangerous. Moreover, modern culture is eroding our capacity for deep thought and firm conviction, and many Silicon Valley leaders are agnostic about deeper values. An excessive appetite for optionality can block real progress and meaningful change. The rejection of old narratives has produced a thinner sense of national identity. It is ironic that a revolution that began as anti-establishment ultimately created enormously powerful tech giants. The engineering mindset is a valuable way of thinking, and the collective decision-making of honeybees and starlings can teach us how to build organizations that trust collective intelligence. Creative freedom is one of the secrets of Silicon Valley's success, but it can also leave talent disconnected from broader society. We need to think independently and resist group pressure; adaptable technology is critical, and a disconnect between technology and its users causes problems. An excessive focus on comfort can stifle real innovation; the market is not always wise, nor always the best allocator of resources. In technology and law enforcement, a balance must be struck between safety and liberty. Raising public-servant pay could attract more top talent, and sometimes outcomes matter more than strict adherence to rules. Building a larger society requires shared narratives and values, and building great, innovative technology requires an aesthetic sense. Rebuilding the technological republic means rethinking the rules, decentralizing power, and ensuring technology serves the public good. Host 2: I agree — the book raises many thought-provoking questions. It highlights the enduring tension between individual freedom and the collective good, and how national identity is changing in the tech age. Especially notable is that even innovation needs an aesthetic point of view. We should think about how to build a more deliberate relationship with technology, and how to contribute to a more intentional, more humane technological republic for the future.




Welcome to the Deep Dive. Our goal here is always to cut through all the noise and really get you the core insights from important sources fast. Exactly. Making sense of complex stuff. Precisely. And today we're diving deep into Alexander C. Karp's The Technological Republic. It's

It's a book that just came out this February, and it really tries to grapple with, you know, technology's huge impact on everything. It's got a really wide scope. It does. And this deep dive is especially for you, the learner, someone who wants to grasp these big ideas without feeling, well,

Totally overwhelmed. Right. We're going to hit on a whole bunch of themes from the book, looking at the early days of Silicon Valley, some surprising stuff there about national security connections. Yeah, that part's big. All the way up to like the future of national identity itself. What does that even mean now? So, yeah, buckle up, The Learner. We've got a lot to unpack. OK, so where do we start? I mean, the first thing that hits you with Karp's book is just how ambitious it is. It's not just about gadgets or code. No, it's bigger than that.

It's about the whole system technology is creating and how that system is fundamentally reshaping, well, society.

Part I kicks things off. It's called The Software Century. And it basically argues that software isn't just a tool anymore. It's like the bedrock, the foundation of modern life. The Software Century. Yeah, that's a really strong opening statement, isn't it? The book saying the 21st century is software. It drives economies, shapes our daily lives. Everything. And that leads straight into this core idea of technology.

the technological republic, this notion that, well, the people who control the software, they're the new power brokers. Right. That's the crux of it. So for you, the learner, the big questions Karp

raises right away are who is really in charge in this new republic and what are the implications of us relying so much on these digital systems? And to figure that out, the book actually takes us back way back to the start of Silicon Valley. Chapter one, Lost Valley. Lost Valley. OK. And what's maybe surprising is how much collaboration there used to be between tech companies and the U.S. government, like back in the mid 20th century. You mean funding and stuff?

Yeah, funding flowed into really key areas. Pharmaceuticals, rockets, satellites, even the very precursors to AI. Wow. Like Fairchild Semiconductor.

Their work on semiconductors was actually crucial for CIA reconnaissance tech. It shows this time when tech progress was really tied to national goals. It's almost like a forgotten history of the valley, that deep connection to government work. But Karp argues that somewhere along the way, things shifted dramatically towards consumer tech, away from those big collective national goals towards just the next gadget, the next social media platform. Right. The focus changed.

And the insight here for you, the learner, is Karp's argument that maybe this pivot led to a kind of loss of ambition. We stopped tackling the really huge society-wide challenges. The big problems. Yeah. And the big tech companies now, they mostly steer clear of government work.

Karp even hints at a cultural thing: maybe younger engineers just don't have that experience of, like, a major national crisis that might have pushed earlier generations towards that kind of work. That reluctance to tackle the grand challenges, it's explored even more in Chapter 2, Sparks of Intelligence. And here, Karp draws this really compelling, maybe even unsettling parallel. Between what?

between how the atomic bomb was developed and how AI is developing now. Okay. How so? Well, the book points to Oppenheimer's initial view maybe a bit detached of the atomic bomb as just a gadget. Just a technical problem to solve. Exactly. And Karp suggests a similar mindset might exist among some AI developers today. This kind of fascination with the tech itself, maybe without fully grappling with the implications. That is, yeah. That's a chilling thought, this detached view of something so powerful. And the book,

Gives examples, right? Like GPT-4 stacking objects or drawing a unicorn from a description. Things that show surprising capabilities. Capabilities that hint at what some are calling sparks of artificial general intelligence. And for you, the learner, the key point is even the creators don't totally understand how these models do what they do. That black box problem. Right. And that uncertainty, of course, is part of what led to that open letter calling for a pause on giant AI experiments. Though

The book also brings up the counter argument. Which is? That maybe we should focus on building in safeguards, you know, guardrails rather than just stopping everything. It's a huge debate. This whole discussion about potential dangers and thinking ahead leads nicely into chapter three. And this idea Karp calls the winner's fallacy. The winner's fallacy. Okay. What's that about? He uses this really powerful story from the Talmud to make the point. Basically, the danger of getting complacent.

Ah, thinking you've already won. Exactly. Even if you are in a strong position, you just can't afford to let your guard down when you're facing threats. So for you, the learner, the takeaway is a warning.

Don't rest on your laurels. Right. And Karp applies this directly to Western societies. We've had this, you know, relatively long stretch of peace since World War Two, the long peace. And maybe that's inadvertently led us to focus too much on consumer tech. While meanwhile, potential adversaries have been seriously developing their military tech. Like China. Yeah. The book specifically calls out China's big advances in things like facial recognition companies like Cloudwalk.

And it touches on the debates happening here in the U.S. about AI and warfare. It really flags a potential blind spot. And adding to that, Karp argues that a lot of our geopolitical rivals are led by people with a really deep personal investment in their country's future. Unlike...

Well, he contrasts that with what he sees as a kind of hesitancy in the West to really push forward with military AI, maybe stemming from this post-Cold War feeling. He references Fukuyama's end of history idea, that sense that we'd sort of already won. That feeling that history is over. We're permanently on top. Which could be dangerous, right? Potentially very dangerous. And then the book gets into the ethical stuff, why some big tech companies, like Microsoft with certain defense projects, or Google pulling out of Project Maven.

why they've hesitated or backed away from defense work. Right, the employee pushback and ethical concerns. Exactly. And while those ethics are obviously important, Karp argues they might be missing a key point for this century. Hard power is increasingly built on software. Software is the new steel, almost. Kind of. He even quotes Thomas Schelling's point about the power to hurt, that grim reality that the ability to inflict harm is still a crucial factor in international politics.

Which brings us, logically, to the end of Part 1, Chapter 4, End of the Atomic Age. Okay. And it looks back at that initial hope after WWII that nuclear weapons, through deterrence, might actually bring lasting peace. That long peace Gaddis wrote about. Right. But the book worries about complacency setting in.

especially maybe in places like Europe and Japan, which have become really reliant on the U.S. for their security. Yeah. That dependence is a big factor. Yeah. And it's happening while everyone's still focused on, you know, tanks and planes, legacy systems while the next wave is already here. AI warfare. Drone swarms, autonomous targeting. Exactly. A whole different ballgame. And the book just hammers home this growing disconnect between Silicon Valley, where all this cutting edge stuff is happening, and

the actual needs of national defense. Which leads to his call for... A kind of new Manhattan project, but for AI, a focused national effort. Right. Okay, so that wraps up part one. Now, moving into part two, the focus shifts quite a bit. It's titled The Hollowing Out of the American Mind. Oof, that sounds bleak. It is a bit provocative. The core idea here is more internal. Karp's looking at how technology and just modern culture might be

eroding our ability to think deeply, to hold real conviction. - How so? What's eroding it? - Things like fragmentation of attention, you know, constant distraction, maybe leading to more superficial engagement with everything. - Yeah, I think we all feel that sometimes, that hollowing out, it suggests something's being lost internally.

And this sets up chapter five, the abandonment of belief. Right. And this chapter uses some really striking historical examples to illustrate this potential decline in, let's say, principled conviction. Well, he recounts the story of the ACLU defending the Nazi Party's right to march in Skokie, Illinois.

even though they lost tons of members. The director, Aryeh Neier, stuck to the principle of free speech. Wow. That took guts. Absolutely. And similarly, the invitation for the segregationist Governor George Wallace to speak at Yale, and how Pauli Murray argued passionately against the heckler's veto, against shutting down speech you disagree with. For you, the learner, these are examples of people upholding core beliefs, even when it was really hard or unpopular. They stood for something, even at a cost.

And the book contrasts that with what? More recent events? Yeah, it subtly juxtaposes those examples with the more cautious, maybe legalistic responses from the presidents of Harvard, Penn and MIT during that recent congressional testimony about calls for genocide on campus. Suggesting a shift away from those strong principled stands, maybe more fear of saying the wrong thing. That seems to be the implication, a potential fear of expressing firm beliefs in today's climate.

And this isn't just in academia. The book looks at Google changing its motto. Oh, yeah. From don't be evil to what was it? Do the right thing. Yeah. Which sounds positive, but Karp suggests it might reflect a shift away from having a clear moral compass towards just trying to avoid offense. Like avoiding bad is easier than defining good. Kind of.

He quotes the philosopher Pascal Bruckner, who said something like, "When you lack the power to really change things, sensitivity itself becomes the main goal. Avoiding offense takes over." Interesting. Just don't rock the boat. And this connects to Michael Sandel's ideas too, right? About individual rights. Exactly.

Sandel argued that focusing too heavily on individual rights can sometimes weaken our connection to shared goals, to collective moral debate. It's like we're so focused on "me" that "we" gets lost. Hmm. Which leads into chapter six, technological agnostics. What does that mean? This is Karp looking specifically at Silicon Valley leaders.

He argues many were raised in a culture that kind of shied away from deep moral or ideological debates beyond maybe basic fairness. So they're agnostic about deeper values. Potentially, yeah. Leading to a situation where their main loyalty is maybe to their company, their tech, and they might be skeptical of really strong American or Western ideals. For you, the learner, it raises questions about the underlying values or lack thereof driving innovation.

The book mentions Amy Gutmann's idea here too, doesn't it? About primary moral allegiance being to no community. Right. Suggesting these tech elites see themselves more as global citizens, maybe straining their connection to any particular national identity or set of values. It even brings up Eisenhower's warning way back when about a scientific technological elite

potentially gaining too much influence. Yeah, it's quite prescient. And Karp uses a quote from Mark Zuckerberg, something like, I built Facebook because I like building things as an example of this seemingly value neutral approach. Just building for building's sake. And this connects to what the book calls the cult of optionality.

This desire to always keep options open, to never fully commit. Which sounds good. But Karp argues it can be paralyzing. This reluctance to commit can actually hold back real progress and meaningful change. And this isn't just tech, right? The book links it to a bigger cultural trend. Yes, this broader move to kind of push strong values out of the public square.

The argument is that the educated elite have sort of stepped back from championing a clear vision for, say, the American National Project. Leaving a vacuum? Potentially. Yeah. Leading to a society that tolerates almost everything, but maybe doesn't strongly believe in much. He brings Fukuyama back in here, suggesting if all beliefs are equal, we lose the ability to judge things that are truly awful. And that has real consequences, like where top graduates end up working. Right.

The book points to the trend of Harvard grads flocking to finance and consulting. Good jobs, sure, but maybe not always tackling the biggest public problems. Producing skilled people who are maybe a bit disengaged. That's the concern. And he touches on the idea that elite power structures might be hardening, becoming more like fixed castes, referencing the sociologist E. Digby Baltzell.

raises questions about meritocracy. Okay, then Chapter 7 gets into the whole Western Civ course debate. What's the angle there? It highlights arguments like Dorothy Chayette's, that the traditional story of the West is a constructed narrative, not some objective truth. And it brings in figures like Kwame Anthony Appiah and Edward Said. Who challenged that narrative. Exactly. They pointed out how it often ignored or excluded non-Western perspectives and experiences.

So for you, the learner, it's about understanding that history isn't simple. It's shaped by who's telling the story. There's a tension then, isn't there, between wanting a shared story for national identity, like William McNeill argued for. Right. The need for common ground. And needing to honestly confront the exclusionary parts of that traditional story, which Appiah and Said focused on. A very real tension.

The book also mentions Samuel Huntington's clash of civilizations theory as another attempt to define the West, just showing how tricky that concept really is. And Karp argues that rejecting the old, maybe flawed narrative has led to what? A weaker sense of identity. Potentially a thinner one. Yeah.

Maybe one focused more on just individual rights or economic policy, which might not be enough to unify people and could even contribute to division. It also mentions Edward Said's Orientalism briefly. Yes. Highlighting its huge impact in shifting focus towards the speaker's identity and potential biases when analyzing culture or history. OK, moving on to Chapter 8. This connects the counterculture, the 60s and 70s, to the tech world. That seems...

counterintuitive. It does at first glance. But the book talks about people like Lee Felsenstein, who saw personal computers as tools to free people from big institutions. Empowering the individual. Right. And Stewart Brand, founder of the Whole Earth Catalog, who saw the counterculture as basically laying the philosophical groundwork for the Internet.

So for you, the learner, it shows these unexpected, almost idealistic roots. But that idealism didn't last. Well, the book contrasts that early spirit with the big shift in the 80s towards just serving the individual consumer. Like Steve Jobs and that famous Apple 1984 ad. Yeah, fighting Big Brother by selling computers. Exactly. That initial rebellious vibe faded, and now the tech landscape is overwhelmingly focused on consumer stuff. It is kind of ironic, isn't it?

A revolution that started out anti-establishment ends up creating these incredibly powerful tech giants. That's the irony Karp points out. These huge centralized companies were exactly what the early pioneers wanted to get away from. So then the book looks at the first Internet boom.

Chapter nine, Lost in Toy Land. Yeah. It argues that during that period, the main drive was often just consumer convenience. Maybe not always deep systemic innovation. Like the eToys example. Perfect example. Toby Lenk and eToys. Its crazy-fast rise and just-as-fast collapse. It embodied that whole grab-market-share-now, figure-out-profits-later mentality. Right. I remember that era. Pets.com, Boo.com. Kozmo delivering snacks.

The critique is that a lot of these companies weren't really solving fundamental problems. The book even references the movie Before Sunset to suggest there was a certain shallowness of ambition then. But it wasn't all bad. Did anything good come out of that bubble? Well, while acknowledging a lot of wasted money and talent, Karp does credit that era with forging a new kind of organizational culture much faster, more adaptable. That definitely had a lasting impact. Okay, that makes sense. So that leads us into part three, the engineering mindset.

What's the focus here? Here, the book zooms in on this specific way of thinking, the engineering mindset. But it stresses it's not just about technical skills. It's more fundamental. Yeah. It's a systematic, pragmatic approach to solving problems. And Karp argues this mindset is becoming more and more crucial and it's deeply linked to that idea of the technological republic.

For you, the learner, the point is this is a valuable way to think even if you're not an engineer. Can you give an example? Open source software is a great one. Lots of people collaborating, building on each other's work, solving problems together. That's the engineering mindset in action. And it's useful everywhere. Okay. So Chapter 10 talks about the X swarm. What on earth is that? Yeah, sounds weird. It refers to research by Martin Lindauer on honeybees in Munich.

He discovered how they make decisions collectively, like finding a new nest. How do bees make decisions? Through this amazing dance language. It's a completely decentralized process where they share information and somehow arrive at the best spot.

It shows the power of distributed intelligence. Wow. Bees have it figured out. Any other examples? Yeah. The book also mentions Giorgio Parisi's work on flocks of starlings, how they fly in those incredible formations with no leader, just seamless information flow. So the lesson for humans is... Maybe that we can learn from that, how to build organizations that trust collective intelligence, that give people autonomy. Interesting. Okay. Chapter 11, the improvisational startup. What's the link between

improv comedy and starting a tech company? It seems unlikely, but the book talks about Palantir actually using Keith Johnstone's book on improv with new hires. Really? Why? Because there are parallels. Improv is all about embracing uncertainty, being flexible, building on what others give you, which sounds a lot like startup life, doesn't it? Yeah, absolutely. Constant change, having to adapt. And Johnstone talks about status not being fixed, but something you play.

Palantir apparently tried to build a culture focused on outcomes, not job titles, contrasting that with rigid corporate hierarchies. So the argument is that this kind of creative freedom is part of Silicon Valley's secret sauce. That's a big part of it. Yeah. Treating people like they're brilliant, giving them room to run.

But there's a catch. Which is? The book warns that this very culture can also lead to talented people becoming disconnected from broader society. We need to figure out how to spread these positive ideas more widely, not just keep them locked up in tech bubbles. Makes sense. Okay, Chapter 12, The Disapproval of the Crowd. This sounds like social pressure. Exactly. It discusses Solomon Asch's famous experiments on conformity.

You know, the ones where people would agree with obviously wrong answers just because everyone else did. Yeah, those are disturbing. They really are. They show how easily ordinary people can just go along with things, even harmful things, because of group pressure. And it also mentions the Milgram experiments, right? The obedience studies. Mm-hmm.

showing how far people would go in following orders, even to inflict apparent harm. But crucially, the book also highlights the people who refused, like the medical technician who stopped giving the shocks. It shows individual conscience matters. For you, the learner, it's a reminder about thinking for yourself. And resisting that pressure. The book calls it constructive disobedience. Right. And it gives examples like Beethoven composing amazing music after going deaf.

or Monet's late paintings being shaped by his failing eyesight, sometimes limitations, or maybe refusing limitations, sparks creativity. Okay, Chapter 13, Building a Better Rifle. This sounds more military. It is. It tells the story of James Butts and the huge problem of IEDs, roadside bombs in Afghanistan, and how the U.S. military really struggled to use the intelligence they had because their systems were fragmented, their software was outdated.

For you, the learner, it's a case study in why adaptable tech is critical. And there was a disconnect between the people building the tech and the soldiers using it. A huge disconnect. The soldiers desperately needed better ways to analyze data, and they actually started demanding Palantir's platform because it worked. But the Army resisted, trying to push its own, much less effective system. Why? Bureaucracy. Bureaucracy. Culture.

The book talks about the military's longstanding preference for custom-built stuff, even when commercial tech is better and available faster. Think radios in the first Gulf War. Right. So Palantir had to fight to get their system to the troops. They ended up suing the army, citing a law meant to streamline acquisition, and they won. They got the contract.

The book really contrasts Palantir's focus on this urgent national security need with what other hot companies like Zynga or Groupon were doing then. Making games and coupons versus saving lives. Quite a contrast. Okay, chapter 14, A Cloud or a Clock. What's that about? It uses the painters Thomas Hart Benton, very structured, and Jackson Pollock, very chaotic, to talk about the tension between conformity and rebellion in creativity. Order versus chaos.

Kind of. The argument is that too much focus on being agreeable, on things being easy and comfortable, can actually stifle real innovation. Like the discussion about trigger warnings. Yeah. The book touches on that trend of avoiding discomfort and contrasts it with the idea from people like Thomas Friedman that challenge and discomfort are actually necessary for growth. It also brings in René Girard's idea of mimetic desire.

imitating others. Right. Girard argued we often desire things because others desire them. Karp suggests that too much imitation, like maybe everyone chasing the same trends in Silicon Valley, can actually kill creativity.

So we need more nonconformity, like Emerson's self-reliance. Exactly. Trusting your own perspective. The book links this back to the engineering mindset, seeing it as pragmatic, like John Dewey's philosophy. And it says Silicon Valley's early success came from being pragmatic foxes, to use Philip Tetlock's term. But that pragmatism has faded. Karp argues it's been lost somewhat, replaced by too much calculation, too much fear of messiness.

He mentions Toyota's five whys technique, keep asking why, to get to the root cause used at Palantir, too. It's about observing the world as it is to figure out how to fix it, how to rebuild. Rebuild a technological republic.

Which brings us to the final part, part four. How do we actually do that? Right. Part four, rebuilding the technological republic, is about solutions. How do we integrate technology better? It suggests rethinking rules, maybe decentralizing power, making sure tech serves the public good and, you know, actually makes human life better. For you, the learner, this is the what now part. Okay. So chapter 15, Into the Desert, starts with weighing an ox. Huh.

Yeah, the Francis Galton story about guessing the weight of an ox at a fair. The average guess of the crowd was incredibly accurate, the wisdom of crowds. But the book immediately asks, is the crowd the market always wise? Is market allocation always best?

And it brings up Zynga and Groupon again. Are those the best uses of brilliant minds and resources? It questions that market triumphalism, as Michael Sandel calls it, the idea that the market solves everything. Precisely. And then it dives into the really tricky area of technology and law enforcement. DNA technology.

Face recognition, gait analysis, drones. All the new tools. Yes, and all the ethical questions that come with them. Balancing safety and liberty, the age-old dilemma echoing Voltaire and Blackstone. It mentions the Palantir case in New Orleans again, the backlash. Mm-hmm.

highlighting that tension and how companies like Amazon and IBM have stepped back from selling facial recognition to police because of these concerns. What's the solution? The book introduces this idea of luxury beliefs. Yeah. From Rob Henderson. The idea is that certain positions on controversial topics like tech and policing can become status symbols for the educated elite, potentially alienating others who have different priorities like basic public safety. And this alienates people.

Karp suggests it can, particularly in how some on the left discuss these issues, which might alienate half the country. The plea is to focus on actual outcomes that benefit everyone rather than just performative arguments. OK, tricky stuff. Chapter 16, Piety and Its Price. This one starts with the Fed chair's salary. Yeah, an anecdote about David Rubenstein asking Jerome Powell about his pay, which is, you know, relatively low for that level of responsibility.

It highlights this general reluctance in the U.S. to pay top public servants well. And the argument is that this causes problems. The book argues it does. It potentially limits the candidate pool to people who are already rich. And it might create weird incentives, like focusing on lucrative jobs after leaving office. Like with members of Congress, low pay while serving, then big lobbying gigs. That's the kind of thing. It references Matthew Yglesias making the case for making government jobs more attractive.

And it contrasts the American skepticism about high public pay going back to Madison with Singapore under Lee Kuan Yew. What did Singapore do? They deliberately paid top official salaries competitive with the private sector to attract the best talent. A totally different philosophy. Interesting. The chapter also mentions Admiral Rickover again, the nuclear Navy guy. Right. And his habit of accepting gifts. It raises this provocative question. Should results sometimes matter more than sticking strictly to the rules?

Does focusing too much on procedural purity sometimes get in the way of progress? The ends justifying the means, potentially. It's a thorny question. And Karp uses René Girard's scapegoat mechanism here, how societies often blame leaders when things go wrong.

The call is to shift focus away from blame and towards outcomes, towards rebuilding that sense of shared purpose. Which leads into Chapter 17, The Next Thousand Years, starts with Dunbar's number. Yeah, the idea that we can only maintain stable relationships with about 150 people.

So how do we build bigger societies, nations? Through stories. Exactly. Language, storytelling, shared myths, what Benedict Anderson called imagined communities, the bonds that create national identity. And the book looks at debates about national identity today, like Macron in France. And it circles back to that idea of shared identity maybe being hollowed out in the U.S., with consumer culture rushing in to fill the gap. And it critiques the left for maybe abandoning national culture. Yeah, suggesting that leaves the field open for consumerism.

It uses Singapore again as a counterexample, a place that actively built a national identity. The book also notes a growing skepticism towards leaders, heroes, maybe less emphasis on character. So we need new stories, new narratives. That's the argument. We need to rebuild shared narratives, shared values, especially with the decline of organized religion leaving a kind of vacuum. Can consumerism really fill that?

Probably not. So we need to talk more about the good life. Make the public square safe for those kinds of discussions. It mentions a speech by Martin Walser in Germany about needing to build a shared future identity, not just dwell on the past. Ultimately, it's about defining who we are and building communities around that. Okay, final chapter we're covering, 18, An Aesthetic Point of View. Starts with Kenneth Clark's Civilisation. Right, the classic art history series and the backlash against it later for being, you know, elitist, Eurocentric. But the book...

Karp argues something was lost in that backlash. Yeah. Karp argues that in rejecting aesthetic judgment altogether, we might have lost our ability to discern quality, to tell the good from the mediocre. He uses a Peggy Noonan critique of a play as an example of this fear of making judgments. And how does this connect back to technology? Seems like a leap. The connection is that building great innovative technology requires taste.

It requires an aesthetic sense, a point of view about what's good, what works, what's elegant. Like Apple's design focus. Exactly. Silicon Valley's early success, the argument goes, came partly from this insulated confidence, this willingness to make bold claims about what was good, leading to breakthroughs at Apple, Google, etc. But there was a cost to that creative freedom. Yes.

The book argues the cost was often detachment, a willingness to ignore public opinion, maybe an amoral culture focused only on the tech itself, leading to a loss of shared meaning beyond just building things. So the conclusion is a call to? To rebuild that technological republic by finding a balance, balancing that individual creativity and freedom with a renewed sense of collective purpose so we can innovate without losing our humanity. Wow. Okay.

That is a lot to think about. This deep dive into Karp's The Technological Republic really offers a challenging perspective on tech, society, our beliefs. It definitely does. For you, the learner, maybe some key takeaways are that constant tension, individual freedom versus collective good, or how national identity is changing in this tech age, and maybe that surprising idea about needing an aesthetic point of view, even for innovation.

Yeah, definitely food for thought. And maybe that leads to the final question for you, the learner. How can we as individuals actually cultivate more thoughtful relationship with technology? How can we contribute to building a more intentional, more humane technological republic for the future? It's the big question, isn't it? It really is. Something to chew on after listening. Thanks for joining us on this deep dive.