From privacy concerns to limitless potential, AI is rapidly impacting our evolving society. In this new season of the Brave Technologist podcast, we're demystifying artificial intelligence, challenging the status quo, and empowering everyday people to embrace the digital revolution. I'm your host, Luke Mulks, VP of Business Operations at Brave Software, makers of the privacy-respecting Brave browser and search engine, now powering AI with the Brave Search API.
You're listening to a new episode of The Brave Technologist, and this one features Debbie Reynolds, also known as the Data Diva. She's a globally recognized technologist, thought leader, and advisor in data privacy and emerging technology. With over 20 years of experience, she's delivered keynote talks for major organizations like TikTok, Johnson & Johnson, Coca-Cola, PayPal, and Uber.
Debbie hosts the number one global award-winning podcast, The Data Diva Talks Privacy, and Identity Review has named her one of the top global privacy experts. In this episode, we discuss how consumer expectations for data privacy are changing, new risks families should be aware of, especially parents with kids online, simple habits users can adopt to take back control of their privacy, and how companies can scale a human-centric approach to privacy. And now for this week's episode of The Brave Technologist.
Debbie, welcome to The Brave Technologist. How are you doing today? I'm great. Happy to be here and glad to chat with you again. Yeah, likewise, likewise. Do you want to give a little background on your current role, what your focus area is in the privacy space, and how you got there and made a career out of it? So I'm Debbie Reynolds. I work at the intersection of emerging technology and privacy, and I'm a technologist by trade. I actually had a personal interest in privacy starting around 1995, when I read a book called The Right to Privacy, co-authored by Caroline Kennedy. I was shocked when I read it. It was about privacy in the US, the laws, and the gaps in privacy back then. Remember, this was around the time the commercial internet was brand new, and we didn't really know what data was going to be like on the internet.
And I was shocked because I think in the U.S., we think we're the land of the free, home of the brave, and I just assumed privacy was part of that. When I found out that it wasn't, that privacy isn't an explicit constitutional right, I was shocked. So as technology continued to evolve, I worked a lot in digital transformation with a lot of big companies, around multinational data transfers and things like that. And people who knew me from that work started to call me once the European Union started updating its privacy regime, as it was moving toward the General Data Protection Regulation. The first big company to contact me was McDonald's Corporation. They asked me to talk with them about privacy, and this was well before the GDPR came out. I was telling them, this is a big deal, and this is how it's going to impact everybody, not just people in Europe. And over the years, as people learned more about it, I got contacted by PBS. They asked me to come on and talk about why privacy is a big issue, around 2018.
Yeah, so I've been doing that ever since. I'm a data privacy officer for many different companies. I do a lot of speaking and writing about privacy. I work with companies you've probably never heard of: PayPal and Uber and Coca-Cola and TikTok. The little ones, yeah. Companies you've never heard of. And I also work on standards. I run a group for IEEE, where we're endeavoring to create a privacy labeling regime that is technology- and legal-agnostic. Cool. Very cool. It must be quite a journey, going from looking at privacy on the internet back in the 90s to now. I mean, I was around when advertising went from kind of a site-by-site thing with a cookie to programmatic, and that was such a huge shift. But from then to now, it's just crazy.
That brings me to my next question, too. Big tech often promises convenience in exchange for data. How can everyday users spot when that tradeoff starts to tip against them? I think it's hard. Part of it is that companies are very good at making offers that are very enticing in the moment, and you don't really read the fine print or understand that, down the line, it's maybe not that big of a benefit to you. The relationship between consumers and these big companies is asymmetrical by nature, right? They get more back than what you get back. But especially in the internet age, the asymmetry is astronomical at this point. So people should think, for example, if they want to give their biometrics to Amazon for a $10 coupon: is that really worth it? If my biometrics are breached, was it worth $10? Just thinking about that long-term risk, and not just the immediate benefit, will help consumers make better choices.
Yeah, that makes sense. You've got an interesting vantage point because you work both with companies and with the public. Are you seeing consumer expectations for data privacy changing? And what role should companies like Brave and others play in that? I'm definitely seeing a change. Especially around the time the General Data Protection Regulation came out, I felt like I was on a horse yelling, "The British are coming," trying to warn people who weren't really listening. But now, because of all the things we're seeing in the news around data breaches, I'm seeing consumers holding their data closer to the vest, right? Not giving it out to everybody, and really thinking about it. I think that's probably the best first step. But what you all do in education, having people understand what an internet or browser experience should be like, is very important.
I'll actually tell you a really good story. I was recently on the internet, on a different browser, trying to read an article. You know how an article pops up and you're like, oh, let me see that? So I went to the site, and first of all, it was slow, and there were so many ads on it that I couldn't read the article. You're trying to press buttons to close things just so you can look at stuff. And I was like, that's it. So I took that site and opened it up in my Brave browser, and I was like, oh, this is so much easier. It was faster. I didn't get all these crazy pop-ups. It was telling me, hey, this is how many ads we've blocked, and different things like that. To me, it was just a better experience than I'd had, and I've told a lot of people about it. So you all are doing a great job. I want to see a lot more people use it, because people kind of default to whatever everyone typically uses, and they don't really understand that, in addition to not being as fast as others, it's just very annoying. So I was pleased that I could read the article and not have to swat those ads away like flies. I know, it's really bad. I notice it on the recipe sites, where I just want to get to the ingredients and cook time, and there are just 50 other things in the way.
It's turned into a bad game of annoyance, like you framed it there. As AI gets more advanced at profiling and targeting individuals, from your point of view, what risks should people and families be aware of, especially parents with kids online? That's a great question. Well, kids online is obviously a very sensitive issue. I think it's very important for parents to know what their kids are doing online: having conversations with them about what they're doing, maybe even monitoring their phone, things like that. And that may seem kind of intrusive to kids, but I'll give an example from the real world. Let's say you have a kid, and they have a friend that comes over that you don't know.
And they decide, okay, I'm going to take this friend that you don't know up to my room. And you're like, what? Oh, hell no. That's not going to happen, right? But that's what happens when kids are online. They're interacting with people you don't know. They're probably doing things that you don't know about. You don't know who they're meeting up with. So it's about understanding what your kid is doing. When I was growing up, we didn't have those issues. A friend or someone my parents didn't know couldn't just burst into my house. Now, in the internet age, it's not that same physical thing of a stranger in the house, because the strangers are on the phone and on the internet. I think parents really need to be more connected with their kids and what they're doing online, and make sure they understand what those dangers are. Kids may think, oh, it's fun, I have all this freedom, but they may not know who they're talking to: whether it's a human pretending to be someone their age, or someone older, or a bot, or someone trying to manipulate them in some way.
You know, it makes a parent's or guardian's job much harder than it was in my parents' day. Yeah. And it's kind of a different balance, too, because I feel this personally, you know, as a parent with two young kids under 10. I really care about privacy, of course, for myself and for my kids too. But at the same time, like you said, it's a whole different level of stranger danger you're dealing with online. Sometimes even a harmless query can take your kids to a very, very dark place before you can get to it. It's a weird dynamic where you kind of have to be surveilling your kids despite being very anti-surveillance in a general context. But there really aren't a lot of safeguards out there, at least in the default modes on devices. I haven't looked into these monitoring apps and things like that. Have you spent any time looking at those, for parents to monitor their kids' devices? Just curious.
I haven't looked at them in a long time, and I haven't heard of many. I know parents are still kind of struggling with that. I know Apple has something they put together where you say, okay, I'm the parent and guardian of this person, and if we share this account, then I can get updates and things like that. But most people don't do that.
A lot of parental controls that are in place now, a lot of parents don't use, because it's just a lot of work. I actually had a friend, it's very interesting, who decided for his family that the computers had to be in the common areas of the house, so the kids couldn't have a phone or a computer upstairs, and at night they had to give their phones to their dad. That's how he did it, and it was actually a good idea. Even when I was growing up, and this makes me sound like I'm 100, we had one TV, so my parents always knew what we were watching, because we were all watching the same thing. Now we have kids watching on tablets, phones, or computers; they're at school, at home, away. There are just so many touch points, so many entry points, for them to interact. It's just hard to manage that flow of information.
Yeah, totally. And it feels like you almost have to assume every device connects to the internet now, which is kind of freaky in a different light. But that's a really good point you bring up in your example: some of this is technological, like a monitoring app, but a lot of it is also just practices, right? Keeping the devices within eye's reach, and kind of limiting use at home to certain areas or certain time frames. That's what we do. We're just pretty vigilant about not giving them too much time, or making it more of a reward thing with a limited time span. But it seems like it's still very Wild West on that front. I don't know, maybe people are studying it, but it doesn't seem seatbelt-ready yet, you know, the way cars are.
Switching gears a little bit: you're starting to hear a lot about human-centric approaches to privacy. I'm wondering if you can shed a little light on what some of the key elements of human-centric privacy are, and how they can be scaled across different industries.
Great question. Well, that's part of the work I'm doing with IEEE. We're working on next-generation connectivity systems for human control of the flow of data. Part of that human centricity is making sure the human has more control of their data. A lot of the ways we interact with the internet were shaped by the early internet, when most of the computing power sat with the big incumbents, right? Your phone wasn't very powerful back then; iPhones didn't even exist. Now we have much more computing power on our devices, so our devices may have enough power to do things we don't necessarily have to do in the cloud, and we shouldn't have to share our credentials with everybody. So a lot of this is thinking about ways of sharing just in time, right? You're not throwing a cauldron of your data into some big bucket that gets shared around with different people. Instead it's, okay, I want to do this transaction, and instead of sharing everything, I just share what you need to know. So thinking about data in that way, and then also being able to take the data back. That's been the hardest part, I think, in privacy: once you decide you don't want to use a service anymore, well, these data systems are made to remember things, not to forget them. And that is totally in opposition to what privacy should be, which is that people should have agency. If they decide they don't want to use something, or they want their data removed, they should be able to do that. Right now, the architectures are built in a way that makes that extremely hard.
Yeah, and that's a great point. People don't really grasp just how long the data collection has been going on. It's 2025, and programmatic advertising started scaling up around 2011, 2012, so people's data has been getting profiled for well over a decade now. And it's a huge amount of information these companies have. I just remember from working in the space, in ad tech, I couldn't tell you where the data ends up. It ends up cookie-synced with a bunch of other stuff, with a bunch of other companies people haven't heard of. It's a really strong point you bring up, this ability to have your data forgotten. How does that look from your point of view? You work with a lot of big companies on this, and I feel like people's trust in a lot of companies is waning because of all the breaches, and because of what they find out about how their data has been used and shared around. I wonder whether people actually trust that their information is gone when a company says it's gone. What do you say to that? You've worked with companies on this; how are they thinking about it? Should people be concerned? Yeah, I think people should be concerned about that.
People should be concerned. You know, I had an experience, I thought about this the other day. I had previously lived in Washington, D.C., and D.C. had a lot of CVS stores. Then I moved back to Chicago, which is more of a Walgreens type of town. Somehow I was in some other neighborhood and ended up at a CVS. I hadn't been to a CVS in like 20 years. I walked in, and they were like, oh, we have your account on file. I'm like, from 20 years ago? You still have it? Oh my God, right? So it is concerning that a lot of companies don't delete your data, don't get rid of it. It just makes the risk higher for you as a consumer, and for the company. What companies are trying to figure out is this: a lot of them are super hot on personalization, and when you hear personalization, that just means they want more data about you, right? Because they want to tailor experiences to you. But then how do they lower their own risk, and lower the breach risk for the individual, by not keeping stuff too long? It's the trillion-dollar question, and it's being made harder with AI, where AI is not as transparent as some of the ways companies have used data before, so things can really get out of control. Part of the answer is not just gathering someone's data and figuring out what to do with it later. Some of it is: give less, ask less, put less stuff in. It's a multi-pronged approach. Ask for less. For example, someone wants to buy alcohol, and you scan their ID. We don't even know what information is tied to your ID when they scan it. Do they need all that information? You just need to know: is this person over 18, or over 21? So part of it is figuring out how to broker an answer to a question without creating more risk for the individual. Almost like you go up to a bar, right? You flash your ID, and that's it. There's no data collected. It's that person saying, okay, I agree you're over this age, and we let you in.
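That "broker an answer, not the data" idea can be sketched in a few lines. This is only an illustrative sketch; a real system would use verifiable credentials or zero-knowledge proofs, and the `attest_over_21` name is invented for the example:

```python
from datetime import date

def attest_over_21(birth_date: date, today: date) -> bool:
    """Answer the single question 'is this person 21 or older?'
    The verifier only ever receives the boolean, never the birth date."""
    # Subtract one year if this year's birthday hasn't happened yet.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= 21

# The store receives only the answer, not the underlying data:
print(attest_over_21(date(2005, 6, 15), date(2025, 1, 1)))  # False: not yet 21 on that date
print(attest_over_21(date(1990, 6, 15), date(2025, 1, 1)))  # True
```

The point is the interface: the sensitive record stays on the holder's side, and only a minimal yes/no crosses the boundary, just like flashing an ID at the door.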
So thinking about it in a digital way: how can we do that in a digital sense without creating more risk for the individual? Yeah, I think that makes sense. I remember the incentives used to be so skewed toward collecting as much data as possible that you'd see dark patterns and things like that, where people were getting bait-and-switched into giving away more and more of their information without even knowing it. But now it seems like, and I'd love to hear your take on this, companies might be starting to see that collecting so much data is becoming a liability. Do you feel like that's the case? Has there been a shift? Yeah, there has been a shift, and a lot of it is because of the data breaches that are happening. I follow these very closely, because I'm always interested in how a company got breached and what got breached. A lot of the data that gets breached is legacy data: data that has aged out, that doesn't have super high business value, but has a high cyber or privacy risk. Having companies hold on to that data just creates more risk for them and for the individual. So I think some companies are waking up to that. They're trying to get rid of old accounts, and they're limiting how long they keep data. Those are all good things, and I'm hoping companies do more of it.
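The retention idea described above, aging out data whose business value has lapsed but whose breach risk remains, comes down to a simple policy check. The window and field names below are invented for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical policy: purge account data untouched for ~3 years.
RETENTION = timedelta(days=3 * 365)

def expired(last_active: datetime, now: datetime) -> bool:
    """True if a record has outlived the retention window and should be purged."""
    return now - last_active > RETENTION

accounts = [
    {"id": 1, "last_active": datetime(2010, 5, 1)},   # legacy data: high breach risk, low value
    {"id": 2, "last_active": datetime(2024, 11, 3)},  # recently active
]
now = datetime(2025, 1, 1)
keep = [a for a in accounts if not expired(a["last_active"], now)]
print([a["id"] for a in keep])  # [2]
```

The stale account from 2010 is exactly the kind of record that shows up in breach dumps; dropping it shrinks the blast radius without touching anything the business still uses.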
Awesome, awesome. Every day, like we've been talking about, people are handing over more and more bits of their personal data. You mentioned a couple of things here, but is there one simple habit you'd recommend our listeners adopt to take back their privacy, or limit some of that collection? Yeah. Well, I guess I'll make a shameless plug here, and it's true: I used your browser even before we knew each other.
And I really love it, and I tell a lot of people about it. Just doing that, I think, will help people a lot, because they don't know what sites they're interacting with, or whether the things they click on may be leading them down some path or increasing their breach risk somehow. Also, certain states now have laws or regulations requiring companies to honor what's called Global Privacy Control, and we talked about this on my podcast too. Global Privacy Control is a way for someone to set their preference once, in the browser, and then when they go to websites, the sites aren't supposed to ask them all these questions, because the answer has already been expressed in the browser. Not all companies are on board with that yet, but I feel it will help consumers and make their journey on certain websites a lot easier, where they're not always having to go to every site and click yes, no, accept, manage cookies, and things like that.
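For the curious: the Global Privacy Control signal mentioned here is technically very small. Per the GPC proposal, a participating browser attaches a `Sec-GPC: 1` header to its requests (and exposes `navigator.globalPrivacyControl` to page scripts). A site that honors it might check the header roughly like this; the helper is a sketch, not code from any particular site:

```python
def gpc_opt_out(headers: dict) -> bool:
    """True if the request carries the Global Privacy Control signal.

    A participating browser sends `Sec-GPC: 1` on its requests; absence
    of the header means no preference was expressed, and any other
    value is not a valid opt-out under the proposal.
    """
    return headers.get("Sec-GPC", "").strip() == "1"

# A site honoring GPC treats such a request as an opt-out of
# sale/sharing, instead of showing yet another consent banner:
request_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
if gpc_opt_out(request_headers):
    print("GPC present: do not sell or share this visitor's data")
```

That one-header design is why the browser can answer the consent question once, on the user's behalf, for every site.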
I think that makes sense. People are so blind to those cookie consents; they've just become another thing jumping out at them on the webpage. With AI powering everything from recommendations to surveillance, what top privacy and governance challenges lie ahead from your point of view, and how do you recommend companies prepare? I think the top privacy challenge companies have to think about is transparency, number one. A lot of the data uses companies had in the past, they did not have to be transparent about. The AI age is making that even harder, right? A lot of companies already don't know where data goes once it comes into the organization: it's duplicated, it's split up, it's put into different places. And now you're creating more complexity by bringing in AI. So the challenge for them is to find a way to be transparent about how they're using people's data, and to think about it not just from a collection point of view, but all the way through the data life cycle. Data has a life cycle from cradle to grave, so there should be an end of life for data. A lot of that plays into privacy: if you're keeping data too long, you're creating more risk for you and for the user. And so I think companies are trying to adapt.
Especially with AI now, they have to think more holistically about that data lifecycle, because a lot of the problems we see companies have are in the secondary data uses. Here's a good example. Twitter, many years ago, had a situation where they wanted people to use multi-factor authentication, and they wanted them to opt into it. In order to do that, they needed to collect more information, and some people agreed. That was fine, right? That's what we call data provenance: you got someone's data, and you have the right to use it, because they agreed and you're performing some type of service for them. But then somehow the marketing people got their paws on the data and started using it for advertising. And that was bad, because nobody asked those people; that was not the intended purpose for the data. So Twitter got in trouble for it. A lot of that happens when you keep data too long, or you don't understand where it comes from, don't understand the lineage of the data; it's easy for companies to run afoul, to run off the rails, at some point. Unfortunately, a lot of companies right now are very concerned about the provenance part, the collection at the front end, but they aren't looking very much at the lineage part. And that's mostly where companies fall short on the privacy end of things. No, that makes sense. And I think we're definitely seeing some concerns around company data, too, when people are putting things into these bots and chat prompts, making sure they're not leaking company data and information like that. It's interesting. Looking ahead five years down the road, where do you see the biggest battleground between privacy and emerging technology?
Well, I think the biggest battleground between privacy and the future will be around decentralization of data. Right now, say we want to log into a particular thing, say Google. We log into Google, and we maybe use 10 or 12 different services, so that data is shared with all those services. In the future, I think people's devices will be almost like a bank of their own information, where they only share what they need to do a certain transaction. It won't be that they go to the mothership, and the mothership has all their information, and they're kind of logging in. People will have a way, on their devices, of brokering information. Like in the example where you want to buy alcohol: you don't need to know my name, you don't need to know where I live. You need to know one thing: am I over 21? Yeah.
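That "bank of your own information" framing can be illustrated with a toy data wallet that discloses only the claims a transaction asks for. Real systems would use verifiable credentials with cryptographic proofs; every name and field here is invented for the sketch:

```python
# Toy "personal data wallet": the full record never leaves the device;
# only the specific claims a transaction requests are disclosed.
wallet = {
    "name": "Jane Doe",
    "address": "123 Main St",
    "birth_year": 1990,
    "over_21": True,  # a derived claim, computed on-device
}

def disclose(wallet: dict, requested_claims: list[str]) -> dict:
    """Return only the requested claims that exist in the wallet."""
    return {claim: wallet[claim] for claim in requested_claims if claim in wallet}

# An alcohol purchase needs one derived claim, nothing else:
print(disclose(wallet, ["over_21"]))  # {'over_21': True}
```

The power shift she describes falls out of this interface: the service asks a question, the device answers it, and the name, address, and birth year never go to any mothership.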
So I think in the future there's going to have to be some type of intermediary, and I hope it's decentralized in a way where people don't have to share everything to answer one or two things. That's what I think is going to be the biggest battle, because what it will mean is that a lot of power shifts to the individual, away from these bigger services. That tension is actively playing out already: a lot of these companies want you on a more central service, but the technology has advanced to the point where the type of decentralization I'm talking about is possible. So we'll see how that works. Yeah, no, I think that's a great point. We've seen cases here and there where it seems kind of siloed, like, oh, my YouTube channel was taken down, and now I have no way of making money anymore, because I had invested all my effort into this one platform or service. Or, I think I was seeing something where somebody got a false positive flagged for something, was locked out of their Google account, and all their photos of their kids were stored in the cloud; they couldn't access those memories anymore. So I think you've got a great point there, because it also seems like we're getting into a time where regulators and lawmakers are realizing they've got to start looking at things a little differently. The more they try to regulate these things, the more data they just end up collecting, and they miss the ball anyway. So it seems like technology will need to do a lot of the lifting there. But I couldn't agree more on the decentralization part, because it's really how you could have control over it. So I think that's a great point.
Debbie, you've been super gracious with your time and covered a lot of ground here. If people want to follow you, you know, you got your podcast, where can they take a look and see where you're posting online?
Sure. Well, I'm always on LinkedIn, so type in Data Diva, Debbie Reynolds, and my name will pop right up. I also have a website, DebbieReynoldsConsulting.com, with a lot of my articles and videos. And then I have my podcast, The Data Diva Talks Privacy. It's been the number one data privacy podcast in the world for five years, and we have listeners in over 123 countries. Cool.
Awesome. Awesome. Well, Debbie, thank you so much for coming on today. I'd love to check back in later and see how things are going too. And yeah, yeah, really, really appreciate you making the time. Oh, thank you. It's a pleasure to be on your show. I really appreciate it. Awesome. Thanks. We'll talk soon. Bye-bye. Thanks for listening to the Brave Technologist podcast. To never miss an episode, make sure you hit follow in your podcast app.
If you haven't already made the switch to the Brave browser, you can download it for free today at brave.com and start using Brave Search, which enables you to search the web privately. Brave also shields you from the ads, trackers, and other creepy stuff following you across the web.