
Night of Ideas: Author Laila Lalami on her Dystopian 'Dream Hotel'

2025/5/2

KQED's Forum

AI Deep Dive Transcript
People
Laila Lalami
Mina Kim
Topics
Laila Lalami: My novel The Dream Hotel is set in a near-future world in which artificial intelligence can scan our dreams and use them to judge whether we have criminal tendencies. This is not pure invention but a logical extrapolation of systems that already exist in our society, such as global surveillance. I began conceiving the novel in 2014, when a Google notification on my phone made me realize how hidden and potentially invasive data collection is. In the novel, the protagonist, Sara, uses an implanted dream-scanning device in order to get enough sleep, and the system flags her as a potential criminal and detains her. This reflects the reality that we sacrifice privacy for convenience. I drew on works like Minority Report, but my novel is more concerned with the risk assessment algorithms already present in our society. These algorithms weigh a range of factors, including a person's criminal record, education, neighborhood, and income, and even dream data. An algorithm appears neutral, but in practice it reproduces society's existing biases about crime and deepens inequality. In the novel, Sara is sent to a so-called "retention center" rather than a prison, which shows the government using gentle language to mask the nature of its surveillance; the language is deliberately inoffensive so that people accept the system more readily. Sara comes to question the algorithm's judgment and begins to monitor herself, which reflects the psychological effects of constant surveillance. She eventually builds connections with the other detainees, which shows the importance of human relationships in a surveillance society. Writing this novel made me realize that every surveillance system has gaps, and that human creativity and complexity can never be fully monitored. We have boundless creativity and the capacity for change, and that gives me hope. Mina Kim: Laila Lalami's novel The Dream Hotel depicts a near-future world in which government surveillance, detention without charges, and AI tools have become reality. The protagonist, Sara, uses an implanted dream-scanning device to get enough sleep and is detained after the system flags her as a potential criminal. This reflects how we trade privacy for convenience and raises ethical questions about the uses of AI. Sara is detained at an airport, which highlights airports as sites of extreme surveillance. Constant monitoring leads people to surveil themselves and to doubt themselves, and the novel explores algorithmic bias and its effects on social fairness. Even when treated unjustly, Sara retains her ability to connect with others and her faith in human goodness, underscoring the importance of human relationships in a surveillance society and our hold on our humanity.


Shownotes Transcript


Support for KQED Podcasts comes from San Francisco International Airport.

Did you know that SFO has a world-class museum? Get ready to be wowed by art, history, science, and cultural exhibitions throughout the terminals. Learn more at flysfo.com slash museum. Switch to Comcast Business Mobile and save hundreds a year on your wireless bill. Comcast Business, powering possibilities. Restrictions apply. Comcast Business Internet required. Compared to unlimited intro lines and lowest-price 5G plans of top three carriers. Taxes and fees extra. Reduced speeds after 30 gigabytes of usage. Data thresholds may vary. ♪

From KQED in San Francisco, I'm Mina Kim. Coming up on Forum, Laila Lalami's new novel, The Dream Hotel, imagines a dystopian future where even our dreams are under surveillance. AI tools can scan dreams to determine whether we're likely to commit a crime. Then we're sent away to so-called retention centers to be monitored in the name of safety.

I sat down with Lalami last month at Night of Ideas in San Francisco to talk about the timeliness and inspiration behind her story about a Los Angeles mother caught in a web of government surveillance, detainment without charges, and AI tools. We'll hear that conversation right after this news. ♪

Welcome to Forum. I'm Mina Kim. L.A.-based writer Laila Lalami, known for The Moor's Account and The Other Americans, has a new novel called The Dream Hotel about a not-too-distant future where AI tools can scan our dreams and use them to determine whether we're likely to commit a crime. In Lalami's novel, the technology ensnares Sara Hussain, a California mom of toddlers who is detained and monitored.

I talked to Lalami about the connections between her fictional world and our present day in front of an audience at Night of Ideas in San Francisco on April 5th. Oh, welcome. Welcome to Forum at Night of Ideas, Forum from KQED. I hope there are some listeners out there tonight. And how incredibly lucky are we to have Laila Lalami with us? Hi.

Thank you. Thank you so much. So without further ado, I want to get into it, because this is a near-future world, Laila. But there is government surveillance. There is detainment without charges. There's even a wildfire in Los Angeles. So I'm wondering, did you mean a near-future world? Because it sure sounds like the present. Yeah.

I did mean a future world, and I guess it feels like the present for a number of reasons that I'll go into. But it did feel like a future world to me when I started conceiving the novel in 2014. So I started working on it 10 years ago. And at the time, my original idea was to set it at a tech startup company

And it was following an engineer within that tech startup whose product is a device that helps you sleep. I'm an insomniac, and so I thought this would be a product that I would definitely be interested in using. And I was so desperate for sleep that I'm sure that even if I knew that that device was collecting data in my sleep, I probably would have sacrificed that privacy for the convenience of having a good night's sleep.

And so I started thinking through that 10 years ago, and I ended up setting the book aside, because the setting of a tech startup just felt a little sterile to me, and it felt like it had been done, and I just didn't see where I was going with it. And I worked on another novel, which came out. We talked about it five, six years ago, is it? Six years ago. Yeah, The Other Americans. Yeah.

And so I didn't really pick up the idea again until 2020, in March 2020.

And then picked it up again and started working on it. And then it came out in March. I mean, I turned it in in March 2024. So I had no idea that the things that are happening right now would be happening in quite the way that they have happened. And I think the reason it feels very present to readers is because I was really focused on systems, right?

that are already in existence in our society, and kind of following them to their logical conclusion, right? So we have been living under what is essentially a global surveillance system for the last 25 years. I think if you time traveled to the past and went to somebody like Enver Hoxha in Albania and you said, I have a way of telling you where every citizen in Albania is at this precise moment, right?

Every communication that they have with their loved ones, every letter that they send, every picture that they take, I can deliver this to you. I'm sure Enver Hoxha would have been very interested in this, and everybody else would have thought, what is this tyranny? But because it took place so slowly, we never really apprehended how large it could be and how invasive it could be. And because the sacrifices happened step by step by step, we didn't really notice the large distance that we've crossed.

And so the idea was just to follow this to its logical conclusion. I heard one of the things that really freaked you out early on and helped cement the idea for the novel was that you woke up one morning, looked at your phone, and it told you that if you left at this very moment, you'd make it to an appointment, right? Yes. Yes.

Yes. Yeah. So I remember I said I was an insomniac and basically I tend to fall asleep finally in the wee hours of the night. So by the time I woke up, I was in a panic. Oh, my God, am I running late? So I picked up my phone to look at the time and saw a Google notification. This is in 2014. And it said, if you leave right now, you will make it to the name of my yoga studio.

at 7:28. Now, of course, I never told Google what day of the week, what time of day, or even that I went to yoga. But because it followed my movements, it knew that every Tuesday and every Thursday at 7:30, I went to a particular location that its mapping software knew was a yoga studio. So it quite helpfully wanted to remind me that I was running late.

And that I should get going. Well, I didn't go to yoga that day. Oh, you were so weirded out by it. You're like, forget this. I'm going to change it up on Google. Yeah.

But it really was a very disturbing moment. It's one of those moments where, of course, we all know that data is being collected about all of us. But by design, that data collection is meant to be invisible, right? It's meant to be very smooth. You get to have the services that you signed up for, the convenience, the speed. And you're supposed to forget that you signed the terms and conditions that allow these companies to collect

all of that data. And that morning, it felt as if the curtains had parted and I had a peek inside this sort of surveillance. It had made itself visible just briefly. And, you know, I was disturbed. And I remember I turned to my husband and I said, you know, pretty soon the only privacy any of us will have left will be in our dreams.

And then, of course, being a novelist, I thought, oh, well, what if we continue along these trends of data collection? What if we very soon have the ability to collect data about dreams? What uses would these companies find for this data? And how would they make money off of it?

Well, like in the book, obviously, I come up with a range of services and uses that they have for this data. And of course, I picked one of the ones that I thought was most disturbing, which is pre-crime. I think there's also something so intrusive, that feels so intrusive if someone can extract and interpret your dreams.

Because there's something about dreams that just feels so special or personal to us, right? Absolutely. Yeah. I mean, I think that dreams are probably the most private part of ourselves because dreams are the only thing that you engage in completely privately. You can...

You can share them with a loved one, like when you wake up in the morning and you ask, how did you sleep? And you start talking about your dreams. But that person doesn't have access to the images that you have in your mind. And they're very...

Of course, most dreams are going to be very banal, right? So that you're running late for an appointment or you miss the train or, you know. But every once in a while, you have dreams that are very frightening or very exciting or very revealing or that you feel have some kind of weight, some predictive power. And it's those dreams that I think have...

that essentially carry the weight of the book because those are the kinds of dreams that you might use to predict what might happen in the future. Yeah, which then raises the question of

Why would somebody be willing to allow a company access to those dreams? And actually, it's making me think of a couple of questions that I would love to have our audience chew on and answer. In addition to questions you might want to ask Laila about the conversation, I do wonder what

you would trade your privacy for, or how you already do, and why. What do you do it for? I also would love folks to think about how they behave differently when they know they're being watched, and in what ways. So in the case of the main character of this book, who is Sara Hussain,

What is she willing to give up the total intimacy of her dreams for, right? And it's what you were talking about earlier. Yeah, sleep. It is sleep. And really, you know, if you are very blessed and you don't have insomnia, it's very difficult to understand what continued sleeplessness

can do to a person. I mean, I think it's not a coincidence that sleep deprivation can count as torture under certain circumstances, right? There is a reason for that. We biologically need to sleep, and when something happens that prevents us from sleeping, it can make us very desperate. And my main character, she's a museum archivist.

But she's also a very busy mom because she has twin toddlers and she's trying to balance that and she's trying to be a good mother to these children.

They're 13 months old when the action starts. And she's just, you know, desperate for sleep and trying to manage two kids and her job. And so because this device exists on the market and her husband already has it and loves it, she decides, well, I'm going to get one too. And it's like an implant. Yeah, it's a tiny little implant. You get it. It's, you know, two hours in and out. And then you get to sleep and...

One benefit of this device is that if you are in bed for four hours, you get the benefit of eight hours of sleep. So, I mean, who wouldn't sign up for that? Almost everybody that read the book before it came out told me that, you know, if a device like that existed on the market, that they would get it. Because imagine, the idea is that you basically...

would finally control time because you would have an extra four hours in the day. You could go to school or you could get another job or you could spend more time with your grandmother, whatever you see fit to do in that time. Because you could feel so restful and blessedly, yeah. And also embedded in the fine print of this device, they say they're not going to share your data with anybody, but...

Unless there are certain circumstances. Yes. So who reads the fine print? That's also what makes it feel very plausible, those little details, those little exceptions. Yeah. And I think the idea is basically to literalize something that all of us do all the time. You know, we're trying to get somewhere and we need the app to make a reservation or we need the app to do the thing. And it's like, okay, quick, quick, quickly. And you're just...

clicking and agreeing and not reading. And of course, for the rest of time, you're being followed by this app. You're listening to my onstage conversation last month with writer Laila Lalami about her new novel, The Dream Hotel. I'm Mina Kim. Stay with us.

Support for Forum comes from Rancho La Puerta, a health resort with 85 years of wellness experience, providing summer vacations centered on well-being. Special rates on three- and four-night August vacations include sunrise hikes, water classes, yoga, and spa therapies, all set in a backdrop of a dreamy summer sky.

A six-acre organic garden provides fresh fruits and vegetables daily. Learn more at rancholapuerta.com. Greetings, Boomtown. The Xfinity Wi-Fi is booming! Xfinity combines the power of internet and mobile. So we've all got lightning-fast speeds at home and on the go. That's where our producers got the idea to mash our radio shows together. ♪

Through June 23rd, new customers can get 400-megabit Xfinity Internet and get one unlimited mobile line included, all for $40 a month for one year. Visit Xfinity.com to learn more. With paperless billing and auto-pay with stored bank account. Restrictions apply. Xfinity Internet required. Taxes and fees extra. After one year, rate increases to $110 a month. After two years, regular rates apply. Actual speeds vary.

This is Forum. I'm Mina Kim. We're listening to my onstage conversation with Laila Lalami about her novel The Dream Hotel, which describes a future where even our dreams are under surveillance. It was recorded at Night of Ideas on April 5th. And here's a question from audience member Aizen. Is your book based on the film Minority Report, or is the film with Tom Cruise based on your book? Because it sounds like there are a lot of parallels.

Oh, very flattering. The Tom Cruise movie, based on my book? That would be nice, wouldn't it? No, no. That film is based on "The Minority Report" by Philip K. Dick.

And it was made into the movie that many people have seen. And in Minority Report, it's precogs that are basically interpreting... I forget even what it is that they're interpreting. But basically, they're able to predict that a person is going to commit a crime. So...

So that kind of story, which I ended up going back to, because I had seen the movie back when it came out. And then when I had the idea for this book in 2014, the first thing I did was go and actually look at these movies and read these books. And Minority Report was one of the ones that I went to. Yeah.

And what's striking about that kind of sci-fi is that it feels very distant into the future. So there's these three precogs and they have these, whatever, supernatural powers. And so it feels almost... And there are flying cars and all kinds of things. So it feels very comfortable, right? Because it's so far into the future. It's not going to happen in your lifetime. You can sit back, relax, and enjoy this simulation.

The way that I think of writing this speculative fiction is to actually bring it a little bit closer to us, setting it in a future that feels like it could happen in our lifetimes. So in my book, it's really not supernatural. It's not far into the future. There are no flying cars. It's a world very much like ours. It's just...

It's just a little bit distant into the future. And it uses essentially risk assessment algorithms that basically assess the likelihood of every person in this room, every citizen committing a crime. And this is based on a range of factors.

the first of which obviously is whether you've already committed a crime, you know, that would be a huge one, but also a range of other things, you know, like your educational level, the neighborhood that you live in, your income level, whether you've been evicted, all of these things that are kind of class-based, essentially.

And then they throw in other things, like your reputation level, your use of social media, and your dreams. If you're willing to give over that data, they use that too. Any data the system can access, it uses. And the reason that this sort of system of surveillance comes into being has to do with the fact that in the book there is this mass shooting event which is televised, and instead of

doing the sort of logical thing, which is to address gun violence through things like gun control, it's much easier to kind of control the citizens and assign each citizen a crime score, a crime prediction score, than it is to do that. So that's what happens. I modeled it a little bit on the Patriot Act. Yeah. Well, you know, it also reminded me a little bit of China's social credit score. Was that an inspiration?

I did. Once I had the idea for that, I went back and I was like, well, how is it done? And I ended up reading this book, which I would recommend. I think it's called, and I hope I'm not mangling this, In the Camps. It's by a scholar named Darren Byler. And it's basically about what's happening to the Uyghur minority in China. And it definitely looks at surveillance and credit score systems and all that. Yeah. Were you also inspired by our credit score system? Yes. Yes. Absolutely. A foundational way to think about it. And I think what's exciting about that for me is the idea that when you tell people, you know,

FICO scores have only been around for about 40 years, right? Before that, you had to walk into a bank and make your case to somebody before you could borrow money. And now it's all automated through a credit score. And it's become so woven into every financial transaction that you make that you don't question: why is this number that I don't control controlling decisions about my life that I should get to make?

So, you know, this is what I meant when I was trying to make it plausible: I was trying to base it on things that we already have, but just sort of extrapolating them to the world of dreams and the future. Yeah. Do tell the story of Sara first being...

Basically detained at the airport. Yeah, yeah. So the book opens when she's already detained, but immediately after that we flashback and find out how she ended up there. And basically she's returning home from a conference abroad, a trip that she has done many times because it's the same conference. She goes there every year.

And this time, as she's going through customs, she is told, hang on, you have to go into secondary because, you know, your risk score is a little high. And of course, like anybody, you know, you hear that something is wrong at the airport. You're annoyed because you have somebody waiting for you. You know, you have plans, you have dinner reservations. You just want to go about your life. So you're frustrated.

So she gets pulled into secondary. And from there, you know, there's all these questions about her life and about everything that she's doing. And it just kind of goes downhill.

Anyone who's been pulled aside at the airport because there was some kind of a passport issue or a paper issue probably knows exactly what it feels like. It's happened to me before. Has it happened to you? Of course. Yeah, it's happened. I mean, I've had the full range of experiences, the sort of, you know, you get your boarding pass and you get the SSSS, the 4S code, on your boarding pass, which means you're going to get

pulled aside before you get onto the plane, at the very last minute. So it's the full range of searches, including right before getting on the plane.

So basically airports for me are sites of anxiety in general. Not that anything's happened, anything at all, because I've never done anything illegal, but it still is something that fills me with anxiety. And so it was something that I enjoyed writing because I had experience of it. So it was fun to be able to kind of

tighten the screws and put more and more tension on this character as she's going through customs. Yeah. I heard you worked at a tech startup. I did. So did you use that for this? I did. I did. But part of the reason that I couldn't write the whole novel in a tech startup was because I was getting PTSD. I was getting PTSD from those years. I was like, I can't spend several years

in 400 pages in a tech startup. And so in the final version of the book, there is one chapter that takes place in a tech startup. And that was good, just one chapter. I understand one of the audience members, Chris, does risk assessment. Are we able to get to Chris? Hi, you just perked up my ears, because my job is to do risk assessment for people who are getting into jail. I work for a behavioral health department.

The problem that we have is that people who are suffering from schizophrenia are constantly being booked into our jail. Very often they're already in a behavioral health program with the county. They go into the jail. My job is to get them out of the jail and back into services as quickly as possible.

The way that we do this is we have to use risk assessment to make the judge comfortable that this person can be released. We have to have lots of data because we have to reconnect them back to their programs. So all I'm saying is that there are good things that you can do with this data. I very much respect your perspective on this because I share your fears on what can be done with this. So it's an incredible challenge. But I will say that

Your book is going to make my job that much more difficult because part of my job is that I have to convince people to share data with me so that we can do this risk assessment. But I think there is a key difference here and that is the fact that it's Chris doing it, right? And in your book it's an algorithm and there is a lot of trust in that algorithm. Yeah, I mean I think that we all constantly just as a species, we're continually assessing risk, right? For survival, right? So as a woman, and I'm sure this is

This is true of many women or the women in the room. I'm continually, when I walk down the street, I'm assessing like, what is my safety level? Like, is it safe for me to get from point A to point B?

And so just as a species, we're continually doing that. And in your job, you are assessing that risk for people in jail. The problem is that in automating these systems, the technology is not able to compare to human beings in terms of how human beings assess risk. Not risk in general, but how other human beings are going to behave.

And so when you automate it, oftentimes that risk assessment is treating everybody the same. If it's not looking holistically at the person's life experiences and what might happen to them, there's a danger, basically, because it's automated in a way that isn't necessarily true with human beings, I feel like. Yeah. And by the way, these risk assessment tools have actually been tested in certain police departments. You know about that, right? So certain police departments have used

crime predictive tools. And the results have not been positive. The results have been terrible. Because again, when you're relying on a tool that is automated, you're not necessarily getting the whole environment the way that a human being would.

Another reason I think your book feels much closer to the present, unlike Minority Report, as we were discussing earlier, is that this idea of

stopping people, or predicting crime, pre-crime, intervening pre-crime, is something that has already been attempted. Yeah. And one thing I will say about that is, whenever people think about pre-crime, they think about the far future, but pre-crime is happening right now.

And it has also happened in our past. For example, New York for a long time had a policy of stop-and-frisk. That is a form of pre-crime, right? Because police officers are stopping people who have committed no crime based on the suspicion that they might commit a crime. So that is an assessment about the future. And they're getting

patted down and searched, essentially a Fourth Amendment situation. You have things like the no-fly list. That's a form of pre-crime. And then before that, in the past, there were laws like the Wayward Minors Act in the 1920s, where people, particularly young women, were criminalized for things as simple as, you know,

carousing, or laughing too loud, and then being sent to things like reformatories for three years, right? So that is a form of incarceration. So pre-crime has a basis in reality; it isn't something that just happens in Minority Report. It is happening right now. Yeah, it's interesting in your book,

Because it's almost there's an awareness that there isn't a charge here yet. The language for how they treat people who they feel like their risk assessment score is a little high and they need to be watched for a while, they call them retainees. Mm-hmm.

They say they're being sent to a retention center temporarily. And your book is called The Dream Hotel. So talk about language and how you used it here. Yeah. So for a book about this sort of surveillance, which is kind of dystopic, I felt it was necessary for the language to be very inoffensive.

Because it's much easier to convince people that this system is in place to protect them if the language reflects that. So because this book is about Sara getting taken away because her dreams suggest she will commit a crime, I knew that

she couldn't be taken to a jail. She hasn't actually committed a crime. So it had to be a kind of in-between place. And so I thought, well, it can't be a detention center, but maybe it can be a retention center. And it's not filled with guards; it's filled with attendants. So making the language inoffensive, anodyne, in that way

to me seemed appropriate for this kind of situation where no crime has happened. And so these people are being kept for their own safety, right? Because they're being held so that a crime will not be committed. So they're being held for their safety and for the safety of others. And then it's the Dream Hotel because one of the attendants, the senior attendant,

jokingly calls the place where they are held the Dream Hotel, because he aims to make every woman's stay memorable. Let's go to audience member Jenny.

I already currently give my dreams to AI, because it is actually very good at dream interpretation. There are chatbots that can tell you psychologically what's going on in your dreams. And so I wondered what you thought about that. And in the novel, is it actually good at dream interpretation, or is it bad?

And also, you know, is that something that you've played around with or know about? So one of the things that strikes me, listening to you talk about that, is how important dreams have been to us as a species over the course of humanity, and how we have always felt that they have weight.

They have scriptural weight, right? So in the Bible and the Quran, you know, the story of Joseph and so on and so forth. And they also have secular weight. So, you know, "I have a dream": the speech happened before the Civil Rights Act. So we have a sense that we need to dream before things will happen, right? So we have a sense that there is a predictive power in dreaming. And we invest dreams with a lot of

like I said, with a lot of power. And listening to you talk about giving your dreams to, was it ChatGPT or something like that, it's very similar to what human beings have done for centuries, which is to keep a dream journal and to reflect and ask people, what do you think this means? And throughout history, there have been many examples of dream collections,

And people attempting to interpret what those dreams meant. So this is just a new version of that. So to just give you a couple of examples, there's a sultan in Mysore in India who in, I believe, the 18th century kept a dream journal scrupulously because he was convinced that his dreams were helping him

plan his military strategy to keep the British out of his sultanate. So that was one example. Another example, and this is a book that I actually cite in the acknowledgments, is The Third Reich of Dreams, which is a collection of dreams collected by a German writer named Charlotte Beradt, who started collecting dreams in the 1930s in Germany before she left

Nazi Germany in 1939. So she collected dreams from people that she knew and these dreams are incredible because they really show the dreamers struggling with authoritarianism in their country and having these extremely revealing dreams about the effect of authoritarianism on their own lives.

So there are so many examples like this. I could talk forever about these dream collections. So what you're doing is essentially a new form of that: instead of collecting it and sharing it with others or with an analyst, you're just sharing it with ChatGPT. So I think it is going to happen. We'll hear more of my Night of Ideas conversation with writer Laila Lalami about her new book, The Dream Hotel, right after the break. This is Forum. I'm Mina Kim.

This episode is brought to you by Indeed. When your computer breaks, you don't wait for it to magically start working again. You fix the problem. So why wait to hire the people your company desperately needs? Use Indeed's sponsored jobs to hire top talent fast. And even better, you only pay for results. There's no need to wait. Speed up your hiring with a $75 sponsored job credit at indeed.com slash podcast. Terms and conditions apply.


You're listening to Forum. I'm Mina Kim. I sat down with L.A. writer Laila Lalami last month at Night of Ideas in San Francisco to talk about dream surveillance and her new novel, The Dream Hotel, all about Sara, an ordinary California mom who is detained and monitored because algorithms suggest she's likely to commit a crime. But there is a moment when Sara sort of starts to question, you know,

Are the algorithms interpreting me correctly? Am I somebody with the propensity to do these kinds of things? And this sort of questioning of the self, which I think also comes when you're being monitored. Yes. Was that something you wanted to explore too? Yes, yes, yes. Yeah, I mean, I really think that that's the effect of total surveillance. Like if you are continually surveilled,

I think there comes a moment where that surveillance kind of moves inside and you start to surveil yourself because you're so afraid, right? And you internalize it. And then you become paranoid. Am I really, you know, I better not do this, you know, because what if I might actually be this terrible person that surveillance is supposed to catch?

So, yes, because she is told that she's there because of her dreams, she starts to wonder: maybe it is true. Maybe I did dream something. Maybe I did really intend to commit this crime.

And in the book, as we get to know her as a character, we discover things about her in her childhood that really have given her a feeling of guilt that she has carried into her adult life. And then the question becomes, like, is the algorithm using that latent sense of guilt that she has carried? You know, is it sensing that? Is that why, you know, her dreams are being interpreted under such a negative light?

Can I ask you what you think happens when whole societies are being surveilled? If we ourselves are internalizing and starting to contain ourselves to some degree, or not allowing ourselves to express or engage or interrogate different parts of ourselves because we're afraid of what we'll find, what do you think happens when that's happening on a broad scale to a society or a people,

potentially? Yeah, I mean, for me, I think what's really interesting about this moment is that what's happening in the United States is not unique. I think we're dealing with global corporations, right? These are companies that operate not just in the US; they're also operating in many other countries.

And yes, you know, in certain parts of Europe there's a bit more regulation, but by and large, across all of the places where these companies, where big tech, is operating, data is being collected.

And I think that the level of data collection that we are seeing right now is unprecedented in human history. In a piece that I just finished and turned in for The Nation, I liken it to what the Bible calls the Book of Life,

and what the Quran calls your record of deeds, which is basically everything you've done in your life, right? There comes a moment on judgment day where that gets opened up and you have to answer for everything you've done. That's the level of power and control that we are really talking about, where everything that you do is leaving a trace in an archive that you do not have control over.

That archive is in the hands of big tech, and it has laid claim over it. It has laid dominion over it. So it is a global surveillance system. And what I thought was interesting in the book is to try and connect this global surveillance system to other surveillance systems that we have in our lives that we might maybe not question as much, which is things like, for example...

Patriarchy is a global surveillance system, right? Because as women, you know, we surveil ourselves. We decide:

am I wearing the right clothes? Is my hair done the right way? What is fashion? What am I supposed to do this week? Next week I'm supposed to do something else. So we really are surveilling ourselves continually, and I thought it was interesting to draw connections between the two. Let me go next to Tamara and then to Ryan.

Hello, thank you so much for this talk. I've actually written and thought about pre-crime a lot, so it's pretty amazing to hear it. Cannot wait to read the book. My question is about race. So you named the character Sara Hussein for a reason, and of course in criminalizing,

race, ethnicity, and religion carry weight in how people are perceived as potentially criminal or already criminal. So I'm wondering if you can highlight a little bit of your decision-making behind the naming of your character, and whether race plays a role in your book. Yeah, no, that's a great question. So...

I think the reason it is a good question is because, when I was talking about surveillance being a global system, technological surveillance at this point is universal. Yes. But universal does not mean neutral. And so in the book, the character is like me. She's Moroccan.

All my books are about Moroccan characters because that's what I'm interested in writing about. And race does come into it. And this is why the book starts in an airport because it is a place of extreme surveillance for people who are of Muslim background.

And so when she gets pulled aside and she gets asked all these questions, she starts to lose her temper and she says, "Am I, is this why?" And of course the officer gets really offended. He's like, "Of course not, you know, we are just doing our jobs." But you know, as readers, you know that this is obviously why she got pulled aside.

And one of the things that I hint at in the way in which this algorithm is calculating your risk score is that on the surface, all of these elements that go into assessing your risk score seem the same, seem universal, right? So like everybody has...

I don't know, like an eviction status. Have you ever been evicted? Yes or no, right? So it seems like it's a question that everybody has to answer. But realistically, those numbers are not going to be the same across races. So it's essentially looking at...

race and class, but it's asking the questions in ways that seem neutral. And all of those things go into the algorithm. So the algorithm is actually reproducing

the predictions about crime, the way that crime is already being policed in society, if that makes sense. So the short answer to your question is that yes, race does play a role, but it's being done through these risk scores. One of the things that's so interesting: so, you know, Sara is retained, of course, and much of the book is about her experience.

But even though so much has happened to her and she's been treated so terribly, she maintains this ability to connect with the other people who are being held. And also, I think, a willingness to believe in the capacity of humans to be kind, to be people you can connect with, despite what is happening to her.

And I was wondering if you could explore that a little bit, because one of the things that, well, is the theme of this whole event is on common grounds. And one of the ideas and questions that this event challenged us to think about is what binds us together in an increasingly digital world.

And so I'm wondering about her ability. Not only is she in a very digital near-future world, but she's also been treated terribly by other humans who are interpreting that digital data. How is she able to maintain that? And do you think that maintaining that is also something that's really important for us to do as a way to push back on things? Well, the short answer is yes. But I think that in the beginning of her story, which is not the beginning of the book,

but when she gets pulled aside at the airport and gets sent to this retention center, which is called Madison. Of course, when she arrives, she's still in shock. She doesn't believe that she should have been sent there. She's innocent. She hasn't done anything. And so she thinks, she hangs on to this idea that there has been a mistake.

And of course, because she's innocent, there can only be a mistake. There's some mistake in the paperwork, in the way that this crime risk score has been computed. Some way to get her out. So when she first arrives, she...

sort of keeps herself at a remove from everybody else in the facility, because she's like, well, I'm not like them. I'm not here because I'm at risk of committing any crime. I've done nothing wrong. There's got to be a mistake somewhere. And it takes her time.

Of course, the book opens when she's been there some time. But it takes her that time to really realize that she's not all that different from everybody else in the facility, and that, actually, she does have certain things in common with them. And so during the book, she really comes into that realization fully, that she is part of this community, which is what it is. It is a community, right?

and that she needs to nurture that community and

find ways through that community to figure out a way out of that place. I don't know if that answers your question. It does. And I guess I'm also just wondering about, because I think it's really hard for us to connect, especially when we have determined that people are who they are in a certain way about them. Like if I'm a mistake, other people here are accurate. You know what I mean? And what it is, how do we move past that?

We're talking with Laila Lalami, and you're listening to Forum. I'm Mina Kim. I promised we'd go to Ryan next, so where is Ryan? So you talked a little bit earlier about some of the incentives that these companies and governments and other entities provide to make you so willing to give vast amounts of your deeply personal data to them. It could be convenience, it could be some sort of

addiction, it could be a bunch of different things. And it's good to hear that people are increasingly skeptical of all of these types of entities. But I think there's also this kind of pervasive attitude of: I have nothing to hide because I'm a good person, a well-behaved citizen.

What's the difference if I just give these people all my information, because there's nothing they could use against me, since I'm an upstanding citizen? So what is your response to that general attitude? I'd be curious. Thanks. Well, I think the answer is what we're seeing now already, which is that the problem with that attitude is that

The notion of crime is essentially artificial, right? What I mean by that is that the line between what is legal and illegal is not immutable. It is not fixed. It can move with time. There are things that used to be crimes, for example, smoking pot, that are no longer crimes. And there are things that didn't use to be crimes that are now crimes. So crime can shift.

And the problem with having all of this data in the hands of essentially just five companies is that, depending on where that line moves, they have enough data to determine on which side of that line you might be. And so imagine if you have a government that is on, you know,

a massive purge of, I don't know, academia, for example. Imagine. Then it makes it possible, right? Because at this point, all of this data is available. And so it's just a question of deciding where that line is. You might think that the First Amendment means

that you have the right to protest and peaceful assembly and all of that. Well, then somebody else can come into power and say, actually, we're going to go fishing into a law from 1956. And suddenly that is not the definition of the First Amendment that is going to be applied. So,

Just because you are a law-abiding citizen and you have nothing to hide doesn't mean that that data should be in the hands of somebody else. The bottom line is this: you have ownership over your body, and your body is producing that data. Therefore that data is yours. It does not belong to another.

And this book is set in the future. And two books ago, I wrote a book that was set in the 16th century and was about Spanish conquistadors who landed in Florida and claimed Florida for the Spanish crown.

And can I just tell this anecdote? This is very brief. One of the things that the conquistadors did, and this was true, they did this starting in 1513, every Spanish expedition to the New World had to do this: they had to read a document called the requerimiento, which basically informed indigenous people that they were henceforth subjects

of his holy imperial majesty, that they had to obey the cavaliers who had just descended upon their lands, that they had to convert, that they had to obey orders, and that if they should offer up any resistance, that resistance would be met with swift punishment.

In writing this book, what I really saw is that big tech is essentially doing the same thing. It has laid claim to all of our data, the data that is produced by our bodies and belongs to us,

and it essentially has a document that says that all of that data belongs to them. That is the terms and conditions of today. It's not the requerimiento, but it is the terms and conditions. And yet, Laila, I think I read that you said in an interview that writing this helped you shed fears about technology. Yes. So how so?

It did. It really did. Because it made me realize a few things. First of all, I mean, I've been spending like 10 years thinking through this, and I just think that with every surveillance system you can think of, there's always a loophole, something that the surveillance system misses, which is that human beings have this infinite capacity for creation.

And creativity. And so it's just very difficult to fully control and surveil, at all times, 24/7, every thought. Do you know what I mean? Especially if you're doing it at scale, over a society. So maybe with one person you can, but not over a society.

So that's one of the things. And then I also realized that no amount of data about us can ever really capture how we think, how we behave, how we feel, how we love. Like, we are so much more complex than the data that is collected about us. And so those two things, it just...

It just gave me hope, to be honest with you, because I felt that we always have that capacity to create and to change. Yeah, I think that's a good place to end it. And I just really want to thank you, Laila, for talking with us today. And I want to thank our audience for all your great questions and for listening to this conversation. So thank you for coming. Thank you.

Thank you.

Jennifer Ng and Ashley Ng were our engagement producers. Susie Britton is our lead producer. Our engineers are Danny Bringer, Christopher Beal, Catherine Monaghan, and Christopher Greiley. Our interns are Brian Vo and Jesse Fisher. Katie Sprenger is the operations manager of KQED Podcasts. Our vice president of news is Ethan Toven-Lindsey, and our chief content officer is Holly Kernan. I'm Mina Kim. Have a great weekend.

Funds for the production of Forum are provided by the John S. and James L. Knight Foundation, the Generosity Foundation, and the Corporation for Public Broadcasting. Support for KQED podcasts comes from Berkeley Rep, presenting Aves, an intriguing new play about memory, forgiveness, and unexpected transformation,

playing May 2nd through June 8th. More info at berkeleyrep.org.

Hey.

I'm Jorge Andres Olivares and I'm hosting a new show, Hyphenación. Unlike many other hyphenated Latinos in the U.S., our cultures and our communities inform our choices, like with money. We had that pressure to be the breadwinner. Religion. I just think Jesus was what we would now define as Christ.

and family. We're not physically close and we're not like that emotionally close either. So join me and some amigas as we have easy conversations about hard things. Catch Hyphenación from KQED Studios wherever you get your podcasts and on YouTube.