
Episode 25

The Right to Be Forgotten: Fundamental Human Rights in the Digital Realm

Pat Walshe, Data Protection Officer at Brave, discusses the intersection of artificial intelligence, privacy, and data protection…and how these elements intertwine with fundamental human rights in the digital realm. He also scrutinizes tech industry giants, questioning the authenticity of their privacy claims and the efficacy of their data protection measures.

Transcript

[00:00:00] Luke: From privacy concerns to limitless potential, AI is rapidly impacting our evolving society. In this new season of the Brave Technologist podcast, we’re demystifying artificial intelligence, challenging the status quo, and empowering everyday people to embrace the digital revolution. I’m your host, Luke Mulks, VP of Business Operations at Brave Software, makers of the privacy-respecting Brave browser and search engine, now powering AI with the Brave Search API.

[00:00:29] You’re listening to a new episode of The Brave Technologist. This one features Pat Walshe, a data protection officer here at Brave, who’s been working in the field of data protection for over 28 years. Pat says he uses the term data protection over just privacy because data about people, and especially their digital lives, can impact a range of human rights and freedoms, privacy being but one such right.

[00:00:51] In this episode, we covered the current state of regulation and how well it’s currently working, why tech companies have more influence and power than the law [00:01:00] itself, as well as the ways that big tech companies like Apple are responding to privacy laws being placed on them. And now for this week’s episode of The Brave Technologist.

[00:01:09] Pat, welcome to The Brave Technologist. How are you doing today? I’m all right. Thanks, Luke. Great to be with you. Likewise. You know, I’ve known you from working at Brave, obviously, and from some of your work, you know, posts on Twitter and stuff before then, but let’s give the audience a little bit of background.

[00:01:27] Like, how did you end up kind of getting into the privacy space, and what motivated you to really get involved on this front?

[00:01:33] Pat: I kind of fell into this space really, into what I do. I’d finished a degree in social anthropology in 1995, and saw a job in 1996 to help set up a telecommunications intelligence unit for a law enforcement agency, strangely enough.

[00:01:50] And the anthropologist in me, my curiosity and interest, was piqued because of what I perceived then as the biometric nature of communications [00:02:00] data and what it said about individuals and groups. And I understood from that role, once I started it, how data impacted on a broad range of human rights beyond privacy, because often we talk about privacy.

[00:02:12] And I think I said to you before that I really don’t like the term data privacy, because data doesn’t have privacy. People do. And data impacts on a broad range of human rights. That piqued my interest. I got involved in that, and then I was poached by the telecom industry, and the rest is history really. Here I am.

[00:02:31] Luke: People who don’t follow you on Twitter should follow you on Twitter, or X, whatever we’re calling it these days. One of the things I really remember early on was that you were great at finding these things called dark patterns, right? Things on these websites that people use to bait you into giving them more of your data.

[00:02:47] What’s the story there? I’m just asking out of my own curiosity.

[00:02:51] Pat: It started on Twitter, I guess. It didn’t have my identity on it at all. For a couple of years, I would just be snarky in the nice kind of way. [00:03:00] So I was at an event, I was having dinner with a couple of people, and I suddenly chatted about something and they said, oh my goodness, you are Privacy Matters, aren’t you?

[00:03:10] Just from what I’d said. And so eventually, after about two years, I put my name to it. But I used it because, Luke, we’ve had decades of law in Europe. I’m a global DPO for Brave, so I have to look at this globally, but in Europe in particular, we’ve had decades of law, decades of ePrivacy law. But I became so frustrated

[00:03:30] at the lack of enforcement and no change in the status quo, with people’s privacy being invaded and eroded every single day, that I started to use the Privacy Matters account to highlight some of those issues and those concerns, and to give a little bit of a voice to it, I guess.

[00:03:46] Luke: You mentioned it’s strange, coming from having worked in a law enforcement background.

[00:03:50] I mean, I think it’s one of those things people don’t understand: when you’re working in the space and you see how the data is used, it really can have an impact and make [00:04:00] you more actively want to protect people. I mean, that’s kind of how I felt too, coming in from advertising, right?

[00:04:05] Like, where you’re working and you see the scale at which data is used and how it’s handled and all of that. And a lot of that can be an alarm bell, like for a career change, for a lot of reasons. I think sometimes people put a negative light on that type of thing, but you really want people that have worked in those spaces to be advocates, because they know what’s happening.

[00:04:26] And so much of this just gets put in the abstract, you know?

[00:04:28] Pat: Yeah. Even though I worked in that law enforcement space and then in the telecom sector, where I also used to run a team that was responsible for government lawful interception and disclosure, even in the law enforcement space I was hot on human rights and compliance with human rights.

[00:04:44] I used to challenge things. I was known as a bit of a pain in the derriere, because I wasn’t afraid, even as far back as then, to ask the difficult questions and to challenge something that just seemed wrong. Putting the law to one side, sometimes things just seem wrong. And when you [00:05:00] particularly think about communications data, and when you think about our online digital trails, all of this is bound to us.

[00:05:06] All of it reveals a lot about us and intimate aspects of our lives, and the lives of those that we’re connected to. So for me, it’s always been important, from right back then. So yeah, I’ve been doing it for 28 years and I’m still as passionate today about the issues as I was back then, I think, to be honest.

[00:05:22] Luke: I can attest to that. I think it’s really important too. I mean, people might get frustrated sometimes around people asking the hard questions, but if you’re not asking them, that’s how we get to where we’re at today, right? The exponential growth of bad privacy practices just leads to the need for change.

[00:05:39] And you mentioned enforcement earlier. I was really interested in your take on this, because, you know, obviously GDPR is a really impactful, or at least a large, effort at trying to tackle privacy, right? And you’re seeing all these different regulators in different parts of the world approaching privacy.

[00:05:54] I don’t want to be too blunt about it, but is it effective? Does it matter having a lot of [00:06:00] this regulation if the enforcement’s not there? Is it possible that things will improve from it? What’s your take on the current state of regulation and enforcement?

[00:06:09] Pat: I think it’s definitely worth it.

[00:06:10] You can imagine how bad it would be if we didn’t even have the regulation.

[00:06:14] Luke: Right.

[00:06:14] Pat: So the GDPR was an evolution, not a revolution, in EU law, which has been around since 1995. And in the UK, for example, we’ve had data protection law since 1984. So it’s an evolution. And the good guys will always say, okay, we’re going to do this stuff.

[00:06:30] We’re going to respect our customers, we’re going to respect their right to privacy and these other rights, and we’re going to design and build for it. But then you have some entities that simply will never have respect until a regulator walks in and says, okay, well, here’s some enforcement action, which can take the form of, for example, under GDPR, prohibiting the processing of data, or big fines. But we haven’t had enough

[00:06:54] enforcement, certainly in the online ePrivacy space. You see a lot of complaints, [00:07:00] particularly from those in the ad tech space, about the ePrivacy Directive and, whoa, the impact it’s had on people online. Everybody gets cookie banners, and you can blame that stupid EU cookie law, but actually it’s not the EU cookie law.

[00:07:12] What that law says is that if you want to access somebody’s information that’s stored on their device, or store information on their device, where that isn’t strictly necessary for a service requested by the user, i.e. to technically deliver it, then you need their consent. So rather than blaming the law, I think those that do blame the law should look at the business model, the underlying surveillance economy, that drives that.

[00:07:34] Brave is the antithesis of that, right? It’s proven that you don’t need that model to provide privacy-respectful and privacy-protective services. So to give you an example of whether law is needed, I think so: in the UK, data protection law is being revised. There’s a proposal in there that gives the Secretary of State,

[00:07:55] so the government, the right to introduce secondary legislation [00:08:00] to enable you to signal from your browser your consent or otherwise. So for me, actually, yes, bring that law in, and let me send a GPC signal or a DNT signal from my browser that says I don’t consent, and I otherwise object to your tracking, profiling, and targeting of me.

[00:08:17] Go away. It’s, if you like, the digital garlic to the data vampires that seem to have infested the web today.
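[Editor’s note: for the technically curious, here’s a minimal sketch of how a website’s server could honor the browser signals Pat mentions. The Sec-GPC header (Global Privacy Control) and the legacy DNT header are real; the Express server around them is an illustrative assumption, not something discussed in the episode or anything Brave ships.]

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Treat either browser signal as an objection to tracking and profiling.
function userObjectsToTracking(req: Request): boolean {
  const gpc = req.header("Sec-GPC"); // Global Privacy Control: "1" = opt out
  const dnt = req.header("DNT"); // Do Not Track: "1" = do not track
  return gpc === "1" || dnt === "1";
}

// Middleware: record the user's choice so later handlers can skip
// non-essential cookies, ad calls, and analytics for this request.
app.use((req: Request, res: Response, next: NextFunction) => {
  res.locals.trackingAllowed = !userObjectsToTracking(req);
  next();
});

app.get("/", (_req: Request, res: Response) => {
  res.send(
    res.locals.trackingAllowed
      ? "No objection signal received."
      : "GPC/DNT received: no tracking, no profiling, no targeting."
  );
});

app.listen(3000);
```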

[00:08:23] Luke: It makes me wonder sometimes too, like, okay, you see this emergence of these cookie consent banners to the point where people are blocking them because they’re so annoying. And people blame the regulation for these things, really.

[00:08:36] I mean, it does sound like a higher-level solution is important, but I think you’re totally right: the ethics of it is really what it gets down to, you know? Until there are other ways that you can do business that don’t require it, or that stand out, like what we’re doing, people only knew what they knew.

[00:08:51] I’m glad that that regulation is there, I mean, just for selfish reasons. You know, if you live in the US, it seems like privacy is kind of defined by whatever the big companies [00:09:00] say it is, you know? And luckily, at least there are some parameters on this in Europe. What is your take on just people’s ability to understand,

[00:09:08] you know, how to protect their privacy in the current time? People often say that people don’t care about privacy because they’re posting stuff online. What’s your reaction to those types of statements?

[00:09:20] Pat: You get those who say that people don’t care, but I can point to research after research after research, going all the way back to the mid-nineties, if you want,

[00:09:26] and before that, on how people do care about their privacy, particularly when you have a discussion with them about the implications of a certain thing online. And then they say, oh, well, I didn’t realize that was happening. Yes, I do care about that. Yes, they should ask me. And yes, they shouldn’t do it unless I say, yes, you can do that thing.

[00:09:43] And I get asked. Even last week, I was asked, Pat, why is this particular social network showing me adverts for these things that I discussed with someone else, and yet I’ve never used that social network in the browser? You get asked questions. So people generally are [00:10:00] concerned; they generally do care. But I think it’s beyond the comprehension of the average person. If I think of my friends and family and others, they just want to go online.

[00:10:09] They don’t feel that they should have to be a computer scientist or have a law degree just to search for information or to shop online. And why should they? I think there’s a responsibility on business and a responsibility on legislators to safeguard individuals when they go online, because we see time and time again that some big players simply don’t care.

[00:10:31] And some of those big players only take action. When under the scrutiny or the threat of regulatory action.

[00:10:38] Luke: Yeah, I was going to ask about that, because I know, just from working with you at Brave, there are oftentimes where you have to engage with partners or other entities that we’re working with.

[00:10:48] And it seems like the big tech companies can litigate themselves to death; they can pay for court cases forever. How has your experience been working with some of the smaller companies around these issues when they’ve [00:11:00] come up? Have you found that they’ve been amenable to requests around these things, or that they’re thinking about the questions that you’re posing to them? What’s your take on, not the big tech guys, but maybe the smaller entities?

[00:11:10] Pat: Well, a number of the smaller guys, particularly in the Web3 space, that I might encounter through what I do at Brave, are very amenable to learning. They’ll have a solution, and then there’s someone like me that puts their brain on it and looks for the awkward bits. When I have a conversation with them, they’ve been very amenable to making change, both technical and at a policy level, if you like, and at a notice level for consumers.

[00:11:36] Yeah

[00:11:36] Luke: You’re seeing things like Privacy Sandbox, lots of, to put it lightly, lipstick-on-a-pig kind of initiatives from some of these guys. But at the same time you’re seeing AI, you know, hitting; there’s a big hype around this. And you just mentioned Web3, too. There are all these emerging techs that are going to bring a whole bunch of different angles on privacy problems.

[00:11:57] What’s your take on how big tech is [00:12:00] acting in the space right now? And is there any hope on that front?

[00:12:03] Pat: Well, you can see on the AI front that there are a couple of big players who are trying to be dominant in the AI space. There is a concern there, and there are a number of commentators on Twitter who are very good at highlighting that and the issues.

[00:12:17] Is big tech good at changing to provide a more respectful and protective privacy experience? Was that it, Luke, or?

[00:12:27] Luke: Do you think there’s any substance to some of these initiatives? When I see Google say something like “privacy is paramount to us in everything we do” in a blog post, I laugh. But I imagine I’m in the minority, just because I know what happens over there.

[00:12:41] Some people might take those things seriously, right? I see Apple’s marketing on privacy, right? Well, they’ve done some good things with the tech on privacy, but let’s be honest. I mean, I think it’s important; that’s one of the reasons why I wanted to talk to you about this. They might do one or two things that help address problems with [00:13:00] cookies, but at the same time, you know, these entities that are talking about privacy are also capturing all of your purchase data and all these other things.

[00:13:06] I mean, as somebody that works in this data protection space, when you see these initiatives from these big companies, what’s the reflex?

[00:13:13] Pat: There’s a lot of marketing. I do think people should think of Apple as a data company, and it wants a slice of you. You know, I don’t think Apple views you any differently to others. You’re a pizza and everybody wants a slice of you.

[00:13:26] How are you going to slice it? How are you going to dice it? Apple is much the same as everyone else. I’ve had engagements with Apple as me, as an individual, to the point now where I have to refer some things to the regulator, because that’s what Apple are doing: they’re saying go away when you point out the issues, the concerns. Some of that might be, for example, if you detect and turn off personalized ads, you still get ads. On what basis? And if I don’t want these ads, you can’t stop them, right?

[00:13:54] You can’t stop Apple digging into the data, either. But, you know, they come up with the argument that it’s all privacy [00:14:00] preserving. Well, yeah, but that also comes back to respect for the individual as well. So shouldn’t the individual have a choice? At least on Brave, for example, you opt in to ads. If you don’t like them, you can turn them off.

[00:14:10] So sometimes you just have to go beyond the law and ask, well, what’s right? What’s the right thing to do by consumers, particularly where consumers are spending thousands and thousands of dollars on a device? They don’t spend thousands and thousands of dollars on a device to be a source of further revenue

[00:14:29] by mining data about them. I think there’s a lot there. And I saw the other day where Apple, for example, had done something that removed a couple of apps for the Chinese government, but they still use the strapline that privacy is a fundamental human right. Maybe they should add some subtext: except when you live in China or other regimes where we will do what the government wants us to do. So I do think people should look carefully at what big tech companies say, or any company that makes a privacy promise.

[00:14:59] Luke: How hard is [00:15:00] it to navigate when you’ve got, gosh, I think at one point I was looking, there’s like a hundred different data bills out there across the globe, right?

[00:15:08] Or maybe I’m overstating a number.

[00:15:09] Pat: There are 162.

[00:15:12] Luke: There you go. Okay. That’s a lot to manage, right? How do you balance all of that when we’ve got users… I mean, we’re a global company with a global user base, right? How do you manage all of these

[00:15:26] data points, and know, from a first-principles approach, I guess, how to address these things and make sure we’re doing what we need to do?

[00:15:33] Pat: Well, I guess we’re very fortunate that of those 162 laws, there’s a common set of principles that apply in each of them.

[00:15:41] The US is the outlier, of course, because it doesn’t have a federal law at all. It has a number of state privacy laws now, which are coming along. And the US government is talking about a federal law, but then the states are worried; they don’t want the federal law to preempt state law, because I think state law can be better than federal law.

[00:15:58] So we’ve got that to deal with at the moment. But [00:16:00] what we do is map to the high standard, and then we look for any nuances that might come about, for example, through the CCPA. There are subtle differences sometimes. But that’s what we’re doing. If you look at, like, the right to be forgotten and search: I don’t deny someone the right to be forgotten because they live in a country without a law that says you have a right to be forgotten.

[00:16:22] I have to balance up that request and weigh a lot of things to determine: is there a public interest in the information being made public, for example? What are the implications for the individual? What are the negative implications for the individual? And where there isn’t a law, I apply the same rules, because, you know, I get requests

[00:16:41] related to people being doxxed, or young people that put information online when they were young but have since reflected and regret it. And you can see the harm that comes from that information still being available. So you act. And that’s what I said earlier about data privacy: data doesn’t have privacy, but people [00:17:00] do. Keep people at the center of what you’re doing.

[00:17:03] And I’m glad to say that at Brave there’s also a huge commitment to privacy and beyond. You know, a couple of colleagues, we’ve had great conversations when we’ve met, about privacy being but one human right that data can impact on. These colleagues have wanted to learn more about it. So we discussed how data, something as innocuous as an IP address, could lead to someone’s arrest, for example, in certain contexts, and what that might mean depending on the country that’s requesting the data. And they’ve been so fascinated by it that they wanted to learn, well, how can I learn more about

[00:17:36] data protection and human rights in the design space? So I pointed them to a course. They both went online and did a 12-hour course, and they’ve come away from that really refreshed, thinking differently about how you design and how you build. So it’s not even pushing at an open door; the door was open in the first place. Really quite refreshing for me to have that. And I think,

[00:17:53] The door was opened in the first place really quite refreshing for me to have that and I think You [00:18:00] know, when I look at the engineering, the security engineering team, the privacy engineering team, and I look at the researchers at Brave and all those work in DevOps and elsewhere, you’ve actually got really world class people that are employed, and that’s one thing that I’ve been deeply impressed about working at Brave, to be honest.

[00:18:21] Luke: No, it’s awesome. Not to toot our own horn too much, but it’s not just us either, right? People say, privacy, you know, do people care? I think time is a factor in all of this, right? You know, when I joined Brave back in 2016, we had a couple thousand users, right?

[00:18:37] It was very much in the theory realm: okay, do people care? You can’t argue with the fact that 73 million people are using a privacy product with Brave, or 100 million people are using Proton’s products, right? Or, you know, VPNs are such a huge segment with a huge number of users, regardless of the specific reason.

[00:18:57] I mean, I think there are benefits that you [00:19:00] get from these things that are kind of becoming fact as more and more people adopt them. But I also think about the other end of the spectrum, too, because you mentioned the states versus federal. It’s one thing I noticed when all of the Supreme Court stuff happened, for example, whether it’s, like, COVID or the Supreme Court stuff around Roe v.

[00:19:17] Wade, et cetera, where all of a sudden that geolocation means something to somebody that they didn’t think about before, right? It’s kind of a forcing function. And one of Brave’s principles is basically, and you can run with this too, please, to collect or process as little information on the user as possible, right?

[00:19:38] Are you seeing other companies starting to adopt these practices, or do you wish that they would? You know, what’s your take, say, three to five years out? Do you see more companies trying to take this approach from a policy perspective?

[00:19:51] Pat: I see more and more startups, more and more smaller companies, that are truly privacy-focused.

[00:19:56] You mentioned Proton. I use Proton services, for example. Tutanota is [00:20:00] another one that I use. And there are these companies, but you’ve also got to find information on whether you can trust them, because anybody can say, hey, we’re privacy-friendly, we’re okay, right? When they may not be, which is what I’ve found on some services when I dig. One thing that I think, because we talked about law, is that technology

[00:20:19] and tech companies are more powerful than the law. They have the ability to either protect or to erode privacy. And I think we’ve got to the point now where individuals are more aware than they ever were, because of cases like you discussed, you know, whether it’s the fact that someone might know that you’ve been to an abortion clinic, or someone might know that you’ve used this particular fertility app,

[00:20:42] or someone knows that you’ve been visiting a mosque or a church, for example. I think this is more and more in the public consciousness, and what people are wanting is: how can I prevent these privacy invasions and these risks to myself? And they are looking for those tools and those services. You [00:21:00] know, I get tweeted all the time:

[00:21:01] why don’t you try this operating system? When I was on Android, because I use Android for testing purposes, I tried some of those operating systems. I do have a look at these things. But I said, well, there’s no way on earth that my family and my friends are going to use this, because you have to be a geek

[00:21:14] to do it, and they’re not going to do it. All they want is something that’s simple and easy to use. And I get asked all the time: we’ve had law for decades, so why are we still having to do these things? So where will it be in 10 years’ time? I think a lot better than we are now, because we see regulators, whether that’s the FTC or the European Commission at the moment, outlawing deceptive design, for example, and requiring big tech to provide better choices, better competitive choices too.

[00:21:45] And, you know, I’d go back to that question about whether privacy is dead. I agree with Gus Hosein, who’s a friend and the head of Privacy International. He said, I think it was about 2017, that the [00:22:00] defense of privacy will be the savior of the future, essentially, and I agree with him entirely. Because, you know, this concept of privacy has developed over centuries.

[00:22:10] The legal and technical approaches have developed over time. I think you may recall, when we were in Lisbon together, I discussed the case of Entick v Carrington in 1765, a UK court case. Entick was a writer who was thought to have produced seditious material, and bailiffs from the King broke into his house and searched his home and his private papers, and he sued them in court.

[00:22:32] This is 1765. And the court found in favor of Entick, and said, by the laws of England, every invasion of private property, be it ever so minute, is a trespass. So we see this concept of the right to privacy as the right to expect privacy for a person’s private life, family life, home life, correspondence.

[00:22:50] And, interestingly, that 1765 UK case was referenced in Boyd versus United States in 1886. That case [00:23:00] established the principle that governments must obtain a warrant or consent before searching and seizing private papers. Now link that to today: you can leap from 1765 to 1886 to today, or just a few years ago.

[00:23:12] Think of that concept: they were searching for and seizing his private papers and private information, and the same in Boyd versus United States. And I think of today, how much information is private on your mobile device, for example. And there was a case in the Court of Justice of the EU just a few years ago, called the Planet49 case.

[00:23:33] It was about cookies and consent, interestingly enough, and the court ruled again and emphasized that the information that’s stored on your device is part of the private sphere. It’s part of the private sphere of your life. It requires protection under your right to privacy under the European Convention on Human Rights.

[00:23:49] So you’ve got all the way from 1765 to today, and it shows that though the concept of privacy changes, and the nature of technology changes, the [00:24:00] need for privacy, the need for protection, stays the same.

[00:24:04] Luke: Absolutely. People lose sight of these things. This is where I get concerned too: you’ve got a generation of people that have basically been told, oh, the internet’s free because of all this ad stuff, and this is just the cost of doing things. And then, even yesterday, you know, there was a bill that passed that infringes on people’s privacy by enabling the collection of information about them, and people use this veil of law enforcement and, you know, catching lawbreakers: oh, we just want to make sure that law enforcement can get this information if they need to, or whatever.

[00:24:31] But I think people lose sight of the fact that in order for them to know what information they need, they have to collect everything, and that’s a high cost of doing any of that. What’s your take when people say those types of things, around, well, we just want to use it to get the bad guys, or whatever?

[00:24:44] Pat: Well, who decides who the bad guy is, and what bad is?

[00:24:48] And, you know, it reminds me of librarians, actually, particularly US librarians, but others also, that stood up to defend privacy against, you know, demands for reading lists, the [00:25:00] books that you’ve signed out, et cetera, because all of this has significant implications for your life. I mean, it can literally be a matter of life and death,

[00:25:07] access to this data. And I have no objection to technology companies designing to not have data in the first place. If you’ve got it, they’re going to come for it. If you haven’t got it, they can’t come for it. And I understand and empathize with what law enforcement would say: we need to catch the bad guys. But investigative

[00:25:25] You do investigate techniques I’ve always evolved. and it’s not as if the bad guys don’t make mistakes. It’s not as if even in sort of encrypted services that there isn’t data, some metadata there, or some, some other signals there to help. And I think often when I think of law enforcement, the argument I see used is that.

[00:25:47] they don’t want encryption, they don’t want other services. And a lot of that is because of the lack of resources that enforcement have: the ability to investigate, the ability to apprehend. So they want technology to do more. I do worry, yes, [00:26:00] particularly in the UK, that we’re becoming more and more of a surveillance society.

[00:26:04] I think the UK is definitely edging into that space.

[00:26:08] Luke: Yeah, I think the point you made about the power of the technology is really important, and it goes both ways, too. Yes, the technology can be used against you, but we’re also able to do things like at Brave, where we can drive this point that we don’t need the data to begin with.

[00:26:24] And we can get adoption with that. You can start to see things change in the positive direction, especially if we can keep proving that stuff out. So I’m glad we’ve got folks like you here to help us ask those challenging questions internally, and communicate that more broadly too. You know, just really quick:

[00:26:38] I think you mentioned a good point earlier about companies saying that they’re using this tool as a privacy tool when it might not be. For the audience here, are there any things you’d recommend when they’re looking at privacy products? What things to look out for, red flags, things like that, or resources that might help them learn, you know, how to make responsible, good [00:27:00] choices around privacy products that they’re evaluating?

[00:27:03] Pat: Yeah, it’s tough, really. You’ve got Arthur’s privacy list of browsers, of course, Arthur Edelstein’s. One thing that I do, first of all, is when I visit a website, I look at what tracking is taking place. And if there are people that say, oh yeah, we’re really into your privacy, and there’s a whole bunch of tracking taking place on the website,

[00:27:20] that’s a bit of a flag for me. And then, I mean, I’m a geek, right? So I read the privacy policies. I’m one of the few that does. And if I find things in there that concern me, and I write to them and they don’t come back to me, then they’re off my list. But it’s hard. I mean, I got a message earlier today from a friend: what would you recommend for this? And that’s what they do, they turn to me, they ask, because they know that I do the research. What I’ll do is I’ll drop some of the tools in my Twitter handle, and some of the resources they can read, because there’s quite a lot in terms of people and books and resources. There are so many fantastic, good people. And actually, [00:28:00] for someone like me, who’s been at this a long time now, I think it’s really fantastic to know that there are fantastic people coming up who are going to be there in 20 years, 30 years, still defending and championing privacy. But I’ll drop some of those names, some of the resources, in my Twitter handle.
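[Editor’s note: a rough sketch of the kind of first check Pat describes, listing the third-party hosts a page talks to. It assumes Puppeteer, which the episode doesn’t name, and uses a deliberately naive “different hostname” heuristic; real tracker detection, as in browser shields or privacytests.org, is far more involved.]

```typescript
import puppeteer from "puppeteer";

// Load a page and print every third-party host it contacts.
async function listThirdPartyHosts(url: string): Promise<void> {
  const firstPartyHost = new URL(url).hostname;
  const thirdParties = new Set<string>();

  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Record the hostname of every network request the page makes.
  page.on("request", (request) => {
    const host = new URL(request.url()).hostname;
    // Naive third-party test: the host isn't the page's own hostname
    // or one of its subdomains.
    if (host && host !== firstPartyHost && !host.endsWith("." + firstPartyHost)) {
      thirdParties.add(host);
    }
  });

  await page.goto(url, { waitUntil: "networkidle2" });
  await browser.close();

  console.log([...thirdParties].sort().join("\n"));
}

listThirdPartyHosts("https://example.com").catch(console.error);
```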

[00:28:18] That’s a really tricky one to answer, because even for someone like me, it can be pretty tricky getting to know and having to look. If you’re on Android, for example, there’s an app called TC Slim, developed by an assistant professor, I think, over in the Netherlands, Konrad Kollnig.

[00:28:39] And it allows you to understand what trackers are in an Android app. I would recommend people have a look at that; they might be quite surprised. And then I would suggest to them: go and ask Google, go and ask the app developer. Why are these trackers there? What are they doing? Where have you got my consent for these things? And then,

[00:28:58] if the response is negative, then go off [00:29:00] to the regulators and give them the information, and they’ll press for change. It takes a lot, because even when someone like me complains to the regulators, the regulators don’t always look at the complaint, or they refuse to look at the complaint, as is the case at the moment in the UK.

[00:29:16] But yeah, that sucks.

[00:29:19] Luke: It’s good that people know that too, I think, you know, because these things all just look like big monoliths to people, right? Like that aren’t around, like it’s kind of out of off the radar, you know? And the only way that people understand is if they know and you know, I think you’ve shown a light here that’s important for people to understand.

[00:29:34] You know, one of the takeaways I have from this conversation is that there is no silver bullet, right? It’s a mix of things, you know: using technology that will help protect your privacy, being active in seeking out solutions that care.

[00:29:45] Pat: It is. So one of the things, for example: rather than having the Twitter app on my phone, and I still call it Twitter and will always call it Twitter, I access Twitter via the Brave browser.

[00:29:55] And I don’t have any issues; I don’t see any ads. When I want to watch a [00:30:00] YouTube video, I use the Brave browser, and I don’t see any ads. And yes, I use Facebook, and there are some out there who are going to go, you care about privacy, how the hell can you use Facebook? Because I’m a pragmatist, and many people I know use Facebook. But I use one of the other versions of Brave, Brave Nightly, just for Facebook alone,

[00:30:14] So I use brave nightly just for facebook alone So I do segregate facebook out there.

[00:30:19] Luke: Yeah,

[00:30:19] Pat: There are things that you can do. Split between browsers, and always use a browser that does strip out this advertising. I think we’re going to see more and more attention on the role of the browser, because it is literally your gateway to the web.

[00:30:36] I see the emergence of some browsers that cause me a little bit of concern, but I think, yeah, choose a browser that strips out the tracking, and you can verify that. PrivacyTests.org, I think, is Arthur’s. I’ll put that in my Twitter handle. Yeah, that’s a great resource.

[00:30:53] Go and have a look at that and the privacy features that different browsers have. And speak to people who can help you, I think. [00:31:00] Yeah.

[00:31:00] Luke: Yeah, no, that’s great. And you’ve been really gracious with your time today. You know, I’d also love to have you come back at some point and check in, maybe when there’s some interesting stuff happening.

[00:31:10] Anytime you want to come on and talk about it, we’d always welcome that. And when you’ve got that federal privacy law… oh my gosh, exactly. It’s been really great, and I’m really glad that you came on, because it gives people a human face and a voice for a really important role at Brave. I mean, if we’re not caring about this with our own stuff, aside from, you know, building the tech…

[00:31:30] Actually, one more thing before we cut off. I think this is important, and it might not be very well known: when we ship new things, we have a process that we go through, a security and privacy review. Maybe you can walk the audience through that process a little bit, just so they can understand what that means from your point of view.

[00:31:48] Pat: Yeah, that’s right. There’s a fantastic security team and a fantastic privacy team, and the research team, of course. And when a product is proposed, before it’s [00:32:00] built, it will go through a review process. So security folks, privacy folks, and also me, I sit between the two, I guess, would look at that and see what data is being used,

[00:32:13] see whether that data is strictly necessary, for example, make sure we apply principles of data minimization, make sure it’s secure, and look at how it will be manifested to the user as well. If that service requires the user to take an action based on a small notice or whatever else, can the user, can the individual, understand what’s being asked of them and make an informed decision?

[00:32:35] It’s a good process, to be honest, and I’m glad it’s there. But more often than not, I have to say, Luke, the guys are so good that they’ve thought about most things before it gets to the review process, because they’re so used to it now.
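[Editor’s note: to make the data-minimization principle Pat mentions concrete, here’s a hedged illustration, not Brave’s actual review tooling. The event shape and field names are invented for the example; the idea is simply to keep only what a feature strictly needs and coarsen the rest.]

```typescript
// What a naive implementation might collect (hypothetical fields).
interface RawEvent {
  userId: string;
  ipAddress: string;
  userAgent: string;
  page: string;
  timestamp: number; // milliseconds since epoch
}

// What the feature actually needs: a page name and a day-level date.
interface MinimalEvent {
  page: string;
  day: string; // "YYYY-MM-DD", coarsened from the exact timestamp
}

function minimize(event: RawEvent): MinimalEvent {
  return {
    page: event.page,
    day: new Date(event.timestamp).toISOString().slice(0, 10),
    // userId, ipAddress, and userAgent are deliberately dropped:
    // if you never collect it, nobody can come for it later.
  };
}
```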

[00:32:49] Luke: Yeah.

[00:32:49] Pat: They think, what would these guys say? I think that’s what they do.

[00:32:53] And they try and preempt most of the questions we might ask, and that makes it a lot easier, because we do have a lot of reviews [00:33:00] that we have to get through. And even with third parties, even when Brave is thinking of engaging with a third party, I would be asked at that level to do some due diligence on that third party, which I do.

[00:33:12] That’s before you even get into a product space: are these the kind of guys that we might want to do business with? How are they approaching privacy? Sometimes I get asked those questions as well, which is good.

[00:33:24] Luke: Thanks for walking through that, because this is one of those things where, you know, we have a lot of developers,

[00:33:29] not just early adopters, that listen to this, but teams. And a lot of times it’s like, okay, well, what am I going to do, right? But approaching it from that level, where you’re putting a process in place and people are thinking about these things before we’re integrating them, taking a proactive approach, is something that I wanted to make sure people heard more about, because I think it’s one of those things that’s really great.

[00:33:48] Having worked in ad tech before I came to Brave, I can guarantee you nobody talks about stuff like that, even after it’s already shipped in production. And so for folks that are listening, maybe developing [00:34:00] applications that people use, I think, you know, starting early and creating a process that involves different stakeholders, like what Pat just described, is probably a pretty smart approach.

[00:34:08] Do we really need to make this data collection happen, or is there a lighter touch that we can apply, things like that? So I appreciate you shining a light on that. And, uh, yeah, thanks again for your time. Where can people find you on Twitter? Yes, @PrivacyMatters, really simple.

[00:34:24] @PrivacyMatters, easy enough. Well, thank you so much, Pat. Really appreciate your time today, and we’ll be sure to have you back soon. All right. Thanks, Luke. Take care. Cheers. Bye. Thanks, man. Thanks for listening to the Brave Technologist podcast. To never miss an episode, make sure you hit follow in your podcast app.

[00:34:41] If you haven’t already made the switch to the Brave browser, you can download it for free today at brave.com and start using Brave Search, which enables you to search the web privately. Brave also shields you from the ads, trackers, and other creepy stuff following you across the web.

Show Notes

In this episode of The Brave Technologist Podcast, we discuss:

  • The current state of regulation and global data protection laws, and how well those laws are working
  • The heightened public awareness of issues like geolocation tracking and data minimization
  • Why tech companies have more influence and power than the law itself

Guest List

The amazing cast and crew:

  • Pat Walshe - Data Protection Officer at Brave

    Pat Walshe, Data Protection Officer at Brave, has been working in the field of data protection for over 28 years. Pat prefers the term ‘data protection’ over just privacy because data about people (and their digital lives) can impact a range of human rights and freedoms (privacy being but one such right).

About the Show

Shedding light on the opportunities and challenges of emerging tech. To make it digestible, less scary, and more approachable for all!
Join us as we embark on a mission to demystify artificial intelligence, challenge the status quo, and empower everyday people to embrace the digital revolution. Whether you’re a tech enthusiast, a curious mind, or an industry professional, this podcast invites you to join the conversation and explore the future of AI together.