
Episode 73

Privacy & Parenting: Your Child’s Digital Footprint

Debbie Reynolds, Chief Data Privacy Officer and host of “The Data Diva Talks Privacy” podcast, discusses the challenges parents face in protecting their children online, and simple habits users can adopt to take back control of their privacy. She also explores the growing recognition among companies that excessive data collection poses significant risks.

Transcript

Luke: [00:00:00] From privacy concerns to limitless potential, AI is rapidly impacting our evolving society. In this new season of the Brave Technologist Podcast, we’re demystifying artificial intelligence, challenging the status quo, and empowering everyday people to embrace the digital revolution. I’m your host, Luke Mulks, VP of business operations at Brave Software, makers of the privacy-respecting Brave browser and search engine.

Luke: Now powering AI with the Brave Search API. You’re listening to a new episode of The Brave Technologist, and this one features Debbie Reynolds, also known as “The Data Diva.” She’s a globally recognized technologist, leader, and advisor in data privacy and emerging technology. With over 20 years of experience, she’s delivered keynote talks for major organizations like TikTok, Johnson & Johnson, Coca-Cola, PayPal, and Uber.

Luke: Debbie hosts the number one global award-winning podcast “The Data Diva Talks Privacy,” and Identity Review has named her one of the top global privacy experts. In this episode, we discussed [00:01:00] how consumer expectations for data privacy are changing, new risks families should be aware of (especially parents with kids online),

Luke: simple habits users can adopt to take back control of their privacy, and how companies can scale a human-centric approach to privacy. And now for this week’s episode of The Brave Technologist.

Luke: Debbie, welcome to The Brave Technologist. How are you doing today? I’m great. Happy to be here and glad to chat with you again. Yeah, likewise, likewise. You wanna give a little bit of background on what your current role is, what your area of focus is in the privacy space, and how you got there and made a career out of it?

Debbie: So I’m Debbie Reynolds. I work at the intersection of emerging technology and privacy; I am a technologist by trade. I actually had a personal interest in privacy starting around 1995, when I read a book called The Right to Privacy, and at that point I was shocked by what I read.

Debbie: Caroline Kennedy was a co-author [00:02:00] of this book, and it was about privacy in the US, about the laws and the gaps in privacy back then. Remember, this was around the time when the commercial internet was brand new, and we didn’t really know what data was gonna be like on the internet.

Debbie: I was shocked because I think in the US we think we’re the land of the free, home of the brave, and I just thought privacy was part of that. When I found out that it wasn’t, that it’s not part of a constitutional right, I was shocked. So as technology continued to evolve, I worked a lot in digital transformation with a lot of big companies, around multinational data transfers and things like that, and people who knew me from that work started to call me once the European Union started updating its privacy regime.

Debbie: So when they were moving towards the General Data Protection Regulation, the first big company that contacted me was McDonald’s Corporation, and they asked [00:03:00] me to talk with them about privacy.

Debbie: And this was way before the GDPR came out. I was like, this is a big deal, and this is how it’s gonna impact everybody, not just people in Europe. And over the years, as people learned more about it, I got contacted by PBS, and they asked me to come on to talk about why privacy is a big issue, around 2018.

Debbie: And yeah, so I’ve been doing that since. I am a data privacy officer for many different companies. I do a lot of speaking and writing about privacy. I work with companies you’ve probably never heard of: PayPal and Uber and Coca-Cola and TikTok.

Luke: those little ones. Yeah.

Debbie: So yeah, companies you’ve never heard of. And I also work on standards.

Debbie: So I run a group for IEEE. We are endeavoring to create a privacy labeling regime that is technology- and legal-agnostic.

Luke: Cool. Very [00:04:00] cool. It must be quite a journey going from looking at privacy on the internet back in the nineties to now. I mean, I was around when advertising went from kind of a site-by-site thing with a cookie to programmatic, and that was such a huge shift.

Luke: But from then to now, it’s just crazy. Which brings me to my next question. Big tech often promises convenience in exchange for data. How can everyday users spot when that trade-off starts to tip against them?

Debbie: I think it’s hard. I think part of it is that companies are very good at making offers to people that are very enticing in the moment, and you don’t really read the fine print or understand that down the line it’s maybe not that big of a benefit.

Debbie: I think the relationship between consumers and these big companies is asymmetrical by nature, right? They get more than what you get back. But especially in the internet age, the asymmetry is astronomical at this point. So people should [00:05:00] think, for example, if they want to give their biometrics to Amazon for a $10 coupon: is that really worth it?

Debbie: If my biometrics are breached, was it worth $10? Just thinking about that long-term risk, and not just the immediate benefit, will help consumers make better choices.

Luke: Yeah, that makes sense. Do you see, I mean, you’ve got an interesting vantage point because you work both with companies and with the public.

Luke: Are you seeing consumer expectations for data privacy changing? And what role should companies like Brave and others play in that?

Debbie: I’m definitely seeing a change. Especially around the time when the General Data Protection Regulation came out, I felt like I was on a horse yelling, “The British are coming!” trying to warn people who weren’t really listening.

Debbie: But now, because of all the things we’re seeing in the news around data breaches and things like that, I’m seeing consumers holding their data closer to the vest, right? Mm-hmm. Not giving it [00:06:00] out to everybody, and really thinking about it. I think that’s probably the best first step. But what you all do in education, having people understand what an internet experience or browser experience should be like, is very important.

Debbie: And I’ll actually tell you a really good story. I was recently on the internet, on a different browser, trying to read an article. You know how an article pops up and you’re like, oh, let me see that? So I went to this site, and first of all, it was slow, and there were so many ads on it that I couldn’t read the article. You’re trying to press buttons to close things so that you can look at stuff. And I was like, that’s it.

Debbie: So I took that site and opened it up in my Brave browser, and I was like, oh, this is so much easier. It was faster. I didn’t get all these crazy popups. It was telling me, hey, this is how many [00:07:00] things we blocked, and different things like that. To me it was just a better experience, and I’ve told a lot of people about it.

Debbie: So you all are doing a great job. I wanna see a lot more people use it, ’cause I think people kind of default to what people typically use, and they don’t really understand that, in addition to not being as fast, it’s just very annoying. So I was pleased that I could read the article and not have to swat these ads away like flies.

Luke: I know, it’s really bad. I notice it too on recipe sites, where I just want to get to the ingredients, and I can’t get to the ingredients or cook time. There are just 50 other things in the way. It’s turned into a bad game of annoyance. You know, you framed it up there: AI is getting more advanced at profiling individuals and targeting them, et cetera.

Luke: From your point of view, what risks should people and families be aware of, especially parents with kids online?

Debbie: That’s a great question.

Debbie: Well, kids online, obviously, is a very sensitive issue. I think it’s very important for parents to know what kids are [00:08:00] doing online, so having conversations with them about what they’re doing online, maybe even monitoring their phone, things like that.

Debbie: And that may seem kind of intrusive to kids, but think about it. I’ll give an example from the real world. Let’s say you have a kid, and they have a friend that comes over that you don’t know, and they decide, okay, well, I’m gonna take this friend that you don’t know up to my room.

Debbie: And you’re like, what? Oh, hell no. That’s not gonna happen, right? That’s what happens when kids are online. They’re interacting with people you don’t know. They’re probably doing things that you don’t know about; you don’t know if they’re meeting up with people. So it’s about understanding what your kid is doing.

Debbie: When I was growing up, we didn’t have those issues. A friend or someone my parents didn’t know couldn’t just burst into my house. Now, in the internet age, it’s not that same physical thing, where the [00:09:00] stranger’s in the house, because the stranger’s on the phone or on the internet.

Debbie: I think parents really need to be more connected with their kids and what they’re doing online, and make sure they understand what those dangers are. Because kids may think, oh, it’s fun, they have all this freedom, but they may not know who they’re talking to, or whether it’s even a human.

Debbie: Mm-hmm. It could be a person who’s pretending to be someone their age when maybe they’re older; maybe they’re talking with a bot, or someone who’s trying to manipulate them in some way. It makes a parent’s or a guardian’s job much harder than it was in my parents’ day.

Luke: Yeah, and it’s kind of a different balance too, because I feel this personally, you know, as a parent, right?

Luke: With two young kids under 10. I really care about privacy, of course, for myself and for my kids too. But at the same time, like you said, it’s a whole different level of stranger danger you’re dealing with online. And sometimes even a harmless query or something like that can take [00:10:00] your kids to a very, very dark place.

Luke: So it’s a weird dynamic, where you kind of have to be surveilling your kids despite being very anti-surveillance in a general context. And there really aren’t a lot of safeguards out there, at least in the default modes on devices. I haven’t looked into these monitoring apps and things like that. Have you spent any time looking at those, for parents to monitor their kids’ devices?

Luke: Just curious.

Debbie: I haven’t looked at them in a long time. You know, I know parents are still kind of struggling with that. I know Apple has something they put together where you can say, okay, I’m a parent and I’m the guardian of this person, and if we share this account, then I can get updates and things like that.

Debbie: Most people don’t do that. A lot of parental controls that are in place now, a lot of parents don’t use them, ’cause it’s just a lot of work.

Luke: Mm-hmm.

Debbie: I actually had a friend, and it’s very interesting: so [00:11:00] he decided for his family that the computers in the house had to be in the common areas of the house.

Debbie: So the kids couldn’t have a phone or a computer upstairs, and at night they had to give their phones to their dad, or whatever. Mm-hmm. So that’s how he did it. That was actually a good idea. Even when I was growing up, and this makes me sound like I’m a hundred, we had one TV, so my parents always knew what we were watching, ’cause we were always watching the same thing.

Debbie: And now we have people on tablets, on phones, on computers. They’re at school, they’re at home, they’re away. There are just so many touch points, so many entry points for them to interact. It’s just hard to manage that flow of information.

Luke: Yeah, totally. And you almost have to assume every device connects to the internet now, which is kind of freaky in a different light. But I think that’s a really good point you bring up in [00:12:00] your example there, where some of this is technological, right?

Luke: Whether it’s a monitoring app. But a lot of this is also just practices, right? Keeping the devices within eyes’ reach, and limiting the space at home to certain areas or certain timeframes. That’s what we do. We’re just pretty vigilant about not giving ’em too much time, or making it more of a reward thing with a limited time span on it.

Luke: But it seems like it’s still very Wild West on that front. Maybe they’re studying it, but it’s not seatbelt-ready yet, you know, the way cars are. Switching gears a little bit: you’re starting to hear a lot about human-centric approaches to privacy.

Luke: I’m wondering if you might be able to shed a little light on what some of the key elements of human-centric privacy approaches are, and how they can be scaled across different industries.

Debbie: Great question. Well, that’s part of the work that I’m doing with IEEE. We’re working on next-generation connectivity systems for human [00:13:00] control and flow of data. Mm-hmm.

Debbie: Part of that human centricity is making sure that the human has more control of their data. Like, for example, in the early internet, a lot of the ways we interact with the internet came about because back then a lot of the computing power was at these big incumbents, right?

Debbie: Your phone wasn’t very powerful back then; iPhones didn’t exist. Now we have more computing power on devices, so our devices may have enough power to do things we don’t necessarily have to do in the cloud, and we shouldn’t have to share our credentials with everybody, right?

Debbie: So, mm-hmm, a lot of that is thinking about ways of sharing just in time, where you’re not throwing a cauldron of your data into some big bucket [00:14:00] that gets shared around with different people. It’s more like, okay, I wanna do this transaction, and instead of sharing everything, I just share what you need to know.

Debbie: Right.

Luke: Yeah, yeah.

Debbie: So thinking about data in that way, and then also being able to take the data back. That’s been the hardest part around privacy, I think: once you decide that you don’t want to use a service, you don’t wanna do anything more with it, well, these data systems are made to remember stuff, not to forget it.

Debbie: And that is totally in opposition to what privacy should be, which is that people should be able to choose, to have agency. If they decide they don’t wanna use something, or they want their data removed, they should be able to do that. And right now the architectures are built in a way that makes that extremely hard.

Luke: And that’s a great point people don’t really grasp: just how long the data collection has been going on. It’s 2025, and programmatic advertising started scaling up around, what, 2011, 2012? [00:15:00] People’s data has been getting profiled for over a decade now, and it’s a huge amount of information that these companies have.

Luke: I just remember from working in the space, working in ad tech, that I couldn’t tell you where the data ends up. It ends up cookie-synced with a bunch of other stuff, with a bunch of other companies people haven’t heard of. That’s a really strong point you bring up, about having this ability to have your data forgotten.

Luke: From your point of view, you work with a lot of big companies on this. It’s almost like people are questioning their trust in a lot of companies; it’s kind of waning because of a lot of the breaches, and when they find out how their data’s been used and shared around.

Luke: I wonder, with these things, whether people actually trust that their information’s gone when companies say it’s gone. What do you say to that? You’ve worked with companies on this. How are they thinking about it? Should people be concerned?

Debbie: Yeah, I think people should be concerned about that.

Debbie: You know, I had an experience [00:16:00] I thought about the other day. I had previously lived in Washington, DC, and Washington, DC has a lot of CVS stores, right? Then I moved back to Chicago, and Chicago is more of a Walgreens type of town. Somehow I was in some other neighborhood and I ended up at a CVS.

Debbie: I hadn’t been to a CVS in like 20 years. I walked in, and they were like, oh, well, we have your account on file. I’m like, from 20 years ago? You still have that? Like, oh my god, right? So it is concerning that a lot of companies don’t delete your data or get rid of your data. It just makes the risk higher, for you as a consumer and for the company.

Debbie: So I think what companies are trying to figure out is this: a lot of them are super hot on personalization, and when you hear personalization, that just means they want more data about you, right? Because they wanna tailor experiences to you. But then how do they lower their risk, and lower the breach risk for the individual, by [00:17:00] not keeping stuff too long?

Debbie: It’s like a trillion-dollar question we have here, and it’s being made harder with AI, where AI is not as transparent as some other ways companies have used data before. So things can really get out of control. But part of that is, mm-hmm,

Debbie: not just looking at it in terms of, let’s gather someone’s data and then figure out what to do with it later. Some of it is, let’s give less. You know, ask less. Yeah, yeah, put less in. No, I think that’s a good point. So I think it’s a multi-pronged approach. Asking for less: for example, someone wants to buy alcohol.

Debbie: You scan their ID, and I don’t even know what’s on your ID. We don’t even know what information is tied to your ID when it’s scanned, right? Mm-hmm. But do they need all that information? You just need to know, am I over 18, or am I over 21?

Debbie: And so part of that is figuring out how to [00:18:00] broker an answer to a question, mm-hmm, without creating more risk for the individual. It’s almost like you go up to a bar, right? You flash your ID, and that’s it. There’s no data collected. It’s that person saying, okay, I agree that you’re over this age, and we let you in.

Debbie: So thinking about it, mm-hmm, in a digital way: how can we do that in a digital sense, where you’re not creating more risk for the individual?

Luke: Yeah, no, I think that makes sense. I remember the incentives used to be so much toward collecting as much data as possible, to where you see dark patterns and things like that, where people are kind of getting bait-and-switched into giving more and more of their information away without even knowing it.

Luke: But now it seems like, and I’d love to hear your take on this, companies might be starting to see that collecting so much data is becoming a liability. Do you feel that’s the case now? Has there been a shift?

Debbie: There has been a shift, and a lot of that is because of the data breaches that are happening.

Debbie: I follow these very closely, ’cause I’m always interested in how they got breached or what got breached. And a lot [00:19:00] of the data that gets breached is typically legacy data: data that has aged out. It doesn’t have a super high business value, but it has a high cyber or privacy risk.

Debbie: So companies holding onto that data is just creating more risk for them and the individual. I think some companies are trying to wake up to that, so maybe they’re trying to get rid of old accounts, or limiting the time period for how long they keep data.

Debbie: And those are all good things. So I’m hoping companies do more of that.
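In practice, the retention limits Debbie describes often come down to a scheduled job that drops or anonymizes records past a cutoff. A minimal sketch (the record shape and the one-year window are assumptions for illustration, not a recommended policy):

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=365)  # illustrative policy: keep records one year

def sweep(records: list[dict], now: datetime) -> list[dict]:
    """Return only records still inside the retention window; everything
    older would be deleted (or anonymized) by the scheduled job."""
    cutoff = now - RETENTION
    return [r for r in records if r["last_used"] >= cutoff]

now = datetime(2025, 1, 1)
records = [
    {"id": 1, "last_used": datetime(2024, 6, 1)},  # recent: kept
    {"id": 2, "last_used": datetime(2005, 3, 9)},  # 20-year-old account: dropped
]
print([r["id"] for r in sweep(records, now)])  # [1]
```

Aging out old accounts this way shrinks exactly the low-business-value, high-breach-risk data Debbie points to.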

Luke: Awesome, awesome. You know, every day, too, people, like we’ve been talking about, are handing over more and more bits of their personal data. You mentioned a couple of things here, but is there one simple habit you would recommend that our users can adopt to take back their privacy, or limit some of that collection?

Debbie: Yeah, well, you know, I guess I’m gonna make a shameless [00:20:00] plug here, and it is true: I used your browser even before we knew each other, and I really love it. I tell a lot of people about it. Just doing that will help people a lot, because they don’t know what sites they’re interacting with.

Debbie: They don’t know whether the things they may be clicking on are leading them down some path or increasing their breach risk somehow. Also, certain states have laws and regulations around companies having to honor what’s called Global Privacy Control. We talked about this on my podcast too.

Debbie: Global Privacy Control is really a way for someone to set their preferences in the browser, and then when they go to websites, the sites aren’t supposed to ask them all these questions, because the answer should be gotten from the browser. Not all companies are on board with that, but I feel like it will help consumers and make their customer [00:21:00] journey on certain websites a lot easier, where they’re not always having to go to every site and say yes, no, accept, manage cookies, and things like that.

Luke: I think that makes sense. People are so blind to those cookie consents, you know; they’ve just become another thing jumping out at them on the webpage. With AI kind of empowering everything from recommendations to surveillance, what top privacy and governance challenges lie ahead, from your point of view?

Luke: And how do you recommend companies prepare for that?

Debbie: I think the top privacy challenge companies need to think about is transparency. Transparency, number one. A lot of the data uses companies had in the past, they did not have to be transparent about. And the AI age is making it that much harder, right?

Debbie: A lot of companies already don’t know where data goes once it comes into the organization. It’s duplicated, it’s split up, [00:22:00] it’s put into different places. And now you’re creating more complexity when you bring in AI. So the challenge for them is to find a way to be transparent about how they’re using people’s data, and to think about it not just from a collection point of view, but all the way through the data life cycle. Data has a life cycle from cradle to grave, so there should be an end of life for data. A lot of that plays into privacy: if you’re keeping data too long, you’re creating more risk for you and for the user.

Debbie: So I think companies, especially with AI now, have to think more holistically around that data life cycle, because a lot of the problems we see companies have are in those secondary data uses. This is a good example: Twitter, many years ago, had a situation where they decided they wanted people to use multifactor authentication, and they wanted them to opt into that.

Debbie: So some people [00:23:00] opted into it, and in order to do that, Twitter needed to collect more information. Some people agreed to do that, and that was fine, right? That’s what we call data provenance: you give someone data, and they have the right to use it, because you agreed to it, or they’re doing some type of service for you.

Debbie: But then somehow the marketing people got their paws on the data and started using it for advertising. And that was bad, mm-hmm, because they didn’t ask the people; that was not the initial intended purpose for the data, and so they got in trouble for it. But a lot of that happens because when you have data too long, or you don’t understand where it comes from, or don’t understand that lineage of data, it’s easy for companies to run afoul; like I said, they’ve kind of run off the railroad tracks at some point.

Debbie: So unfortunately, I think a lot of companies right now are very concerned about the provenance part, kind of the collection part at the front end. [00:24:00] They aren’t as concerned with, and aren’t looking very much at, the lineage part, and that’s mostly where companies fall short on the privacy end of things.

Luke: No, that makes sense. Yeah. And I think we’re definitely seeing some concerns around even company data, you know, when people are putting things into these models, into these bots and chat prompts and things like that, and making sure you’re not leaking company data and information like that.

Luke: It’s interesting. Looking ahead five years down the road, where do you see the biggest battleground between privacy and emerging technology taking place?

Debbie: Well, I think the biggest battleground between privacy and the future will be around decentralization of data. Right now, let’s say we wanna log into a particular thing.

Debbie: Say we have Google. We go in, we log into Google, and maybe we use 10 or 12 different services, so that data is shared with all those services. I think in the future, [00:25:00] people’s devices will be almost like a bank of their own information. Mm-hmm. And they’ll only share what they need to do a certain transaction.

Debbie: So it’s not as though they’re going to the mothership, and the mothership has all their information, and they’re kind of logging in. I think in the future people will have a way, on their devices, of brokering information. Like in my example where I say, okay, let’s say you wanna buy alcohol: you don’t need to know my name.

Debbie: You don’t need to know where I live. You need to know, am I over 21? Yeah. So I think in the future there’s gonna have to be some type of intermediary, and I hope it’s decentralized in a way where people don’t have to share everything to answer one thing or two things. That’s what I think is gonna be the biggest battle, because what it will mean is that a lot of that power will shift more to the individual, away [00:26:00] from these bigger services.

Debbie: I think that’s actively happening already. A lot of these companies want you to be on a more central service, but the technology has advanced to a point where the type of decentralization I’m talking about is possible. So we’ll see how that works.

Luke: Yeah, no, I think that’s a great point.

Luke: We’ve seen cases here and there where it seems kind of siloed, in a way, where, like, oh, my YouTube channel was taken down, and now I have no way of making money anymore, ’cause I had invested all of my effort into this platform or this service. Or I was seeing something where somebody got, what was it, a false positive,

Luke: flagged for something, and they were locked out of their Google account, and all their photos of their kids and everything were stored in the cloud. They couldn’t access those memories anymore. And I think you’ve got a great point there, because it also seems like we’re getting into a time where

Luke: regulators and lawmakers are realizing they’ve gotta start to look at things a little [00:27:00] bit differently. And the more they try to regulate these things, the more they just end up collecting more data and missing the ball anyway. So it seems like technology will need to do a lot of the lifting there.

Luke: But I couldn’t agree more on the decentralization part, because that’s really how you could have control over it. So I think that’s a great point. Debbie, you’ve been super gracious with your time, and we’ve covered a lot of ground here. If people wanna follow you, I know you’ve got your podcast.

Luke: Where can they take a look and see where you’re posting online?

Debbie: Sure. Well, I’m always on LinkedIn, so you can type in Data Diva Debbie Reynolds and my name will pop right up. I also have a website, debbiereynoldsconsulting.com, where I have a lot of my articles and videos and things. And then I have a podcast, The Data Diva Talks Privacy.

Debbie: It’s been the number one data privacy podcast in the world for five years, and we have listeners in over 123 countries.

Luke: Awesome, awesome. Well, Debbie, thank you so [00:28:00] much for coming on today. I’d love to check back in later and see how things are going too. Yeah, really, really appreciate you making the time.

Debbie: Oh, thank you. It’s a pleasure to be on your show. I really appreciate it.

Luke: Awesome. Thanks. We’ll talk soon.

Debbie: Bye-Bye.

Luke: Thanks for listening to the Brave Technologist Podcast. To never miss an episode, make sure you hit follow in your podcast app. If you haven’t already made the switch to the Brave browser, you can download it for free today at brave.com and start using Brave Search, which enables you to search the web privately.

Luke: Brave also shields you from the ads, trackers, and other creepy stuff following you across the web.

Show Notes

In this episode of The Brave Technologist Podcast, we discuss:

  • How consumer expectations for data privacy are changing
  • The asymmetrical relationship between consumers and companies regarding data privacy
  • How companies can scale a human‑centric approach to privacy
  • The importance of decentralization in future data management

Guest List

The amazing cast and crew:

  • Debbie Reynolds - Chief Data Privacy Officer and host of “The Data Diva Talks Privacy” podcast

    Debbie Reynolds—aka “The Data Diva”—is a globally recognized technologist, thought leader, and advisor in data privacy and emerging technology. With over 20 years of experience, she has delivered keynote talks for major organizations like Coca-Cola, Johnson & Johnson, PayPal, TikTok, and Uber. Her insights have been featured in leading media outlets, including The New York Times, Forbes, Bloomberg, and Wired.

    Debbie hosts the #1 global award-winning podcast, “The Data Diva Talks Privacy,” which reaches listeners in over 120 countries. Identity Review has named her one of the Global Top Eight Privacy Experts, and the European Risk Policy Institute has named her a Top 30 CyberRisk Communicator. She also chairs the IEEE Cyber Security Committee for the Next Generation Connectivity Systems Privacy Labeling Project.

About the Show

Shedding light on the opportunities and challenges of emerging tech. To make it digestible, less scary, and more approachable for all!
Join us as we embark on a mission to demystify artificial intelligence, challenge the status quo, and empower everyday people to embrace the digital revolution. Whether you’re a tech enthusiast, a curious mind, or an industry professional, this podcast invites you to join the conversation and explore the future of AI together.