Episode 31

Data Regulation & GDPR During COVID-19

Christian D’Cunha, Head of the Cyber Coordination Task Force at the European Commission, discusses the critical role of GDPR during the pandemic, and explores how Europe balanced the necessity of contact tracing and mobility data with the imperative to protect personal privacy. He also shares the challenges regulators face in enforcing GDPR against tech giants, and the strategies these companies use to delay compliance.

Transcript

[00:00:00] Luke: From privacy concerns to limitless potential, AI is rapidly impacting our evolving society. In this new season of the Brave Technologist podcast, we're demystifying artificial intelligence, challenging the status quo, and empowering everyday people to embrace the digital revolution. I'm your host, Luke Mulks, VP of Business Operations at Brave Software, makers of the privacy-respecting Brave browser and search engine, now powering AI with the Brave Search API.

[00:00:28] You're listening to a new episode of The Brave Technologist, and this one features Christian D'Cunha, who is the Head of the Cyber Coordination Task Force in the Directorate-General for Communications, Content and Technology at the European Commission. In recent years, he oversaw a study into the future of digital advertising and its impact on privacy, drafted the Data Act proposal and the EU Cybersecurity Strategy, and led the rollout of interoperable contact tracing apps during the COVID pandemic.

[00:00:53] In this episode, we discussed how GDPR is upheld in an evolving technological landscape and what its [00:01:00] future effectiveness depends on; the deployment of contact tracing apps during COVID and the nuances of using mobility data to stop the spread of a pandemic while maintaining privacy; and potential ways that regulatory enforcement and policy can evolve in this changing landscape.

[00:01:13] Now, for this week's episode of The Brave Technologist.

[00:01:17] Luke: Christian, welcome to The Brave Technologist. How are you doing today?

Christian: I'm doing well. It's great to be here.

[00:01:28] Luke: Excellent. Why don't we just start and set the table a bit? Why don't you give our audience a little bit of a sense of how you ended up in the position that you're in now, and what you're up to these days?

[00:01:36] Christian: I work in a part of the European Commission that deals with cyber security and digital policy. I’ve been in this space for a few years now.

[00:01:42] I moved to Belgium almost exactly 16 years ago from London, as you can probably tell from my accent. How did I get here? I really don't know; I didn't have a plan. I started off my career working for politicians. I spent time working for senior judges. After I moved to Brussels, I was working in the area of internal security.

[00:01:59] I had to [00:02:00] defend something called the Data Retention Directive. And I got out just in time before the court annulled it. Then I worked for the European Data Protection Supervisor, and my main task there initially was to build a case for synergies between competition or antitrust enforcement and privacy law.

[00:02:16] And then Giovanni Buttarelli, the then Supervisor, asked me to be his chief of staff, and we worked together on his big ideas around ethics and digitization and market power. When I moved back to the Commission in 2020, it was just before lockdown. And so I got caught up in a lot of the activity around responding to the pandemic, trying to find digital tools to help cope with it.

[00:02:41] The last two or three years have really been about cyber security, and developing the operational aspects of how you cooperate on cyber threats.

[00:02:49] Luke: What an interesting time to get thrown into the mix, right? You've got GDPR starting to take effect, or a couple of years after that, and then all of a sudden you have this global pandemic. How do you wrangle these [00:03:00] two things when they inevitably cross paths in a pretty significant way?

[00:03:04] For folks who aren't familiar, GDPR was this really big regulation around privacy, enacted in 2018, where Europe actually defined what personal data is and put some pretty sharp teeth on people who broke those rules. How was it balancing that with

[00:03:21] this new concept of contact tracing, with COVID, and all of these things that seem antithetical to good privacy for users?

[00:03:31] Christian: Well, the GDPR, I think, probably became the most famous EU legislation in the world. Like you say, it came into force in 2018, and suddenly people I'd never expect to talk to about my job were wanting to talk about it, because it basically affected everyone.

[00:03:51] The way it collided with the global pandemic was interesting, because there had been so much hype in the last 10, 15 years about [00:04:00] how data was going to save the world. This was the moment to show that it wasn't just slogans, now that we were in this extraordinary situation.

[00:04:11] And there was a lot of discussion, which I remember being part of, both formally in the workplace and with friends in the wider community, about what we could do with data now that we were in a genuine global public emergency. There was tension between those who thought, now's the time to get data out of the stranglehold of a handful of monopolies and start making it work for society.

[00:04:35] On the other hand, there were those who were more cautious, thinking: data protection is a fundamental right, privacy is a fundamental right, and we have to take care that we do this according to the rules. In the end, from where I was standing, it crystallized around a couple of initiatives.

[00:04:51] One of them was mobility data. We're all walking around with our cell phones, so to what extent can we use [00:05:00] mobility data to track the spread of the pandemic? That was one thing. And then anonymized mobility data as well, which got into the question of: when do you know if something's actually anonymized?
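
To make that anonymization question concrete: one common, and much-debated, test is k-anonymity, under which a dataset only counts as anonymous if every combination of quasi-identifiers is shared by at least k people. Below is a minimal sketch in Python; the field names and threshold are illustrative assumptions, not any official EU methodology, and real mobility traces carry far richer quasi-identifiers, which is exactly why they are so hard to anonymize.

    # Minimal k-anonymity check for mobility-style records (illustrative only).
    from collections import Counter

    def is_k_anonymous(records, k):
        """records: one (home_cell_id, travel_hour) tuple per person.
        True only if every quasi-identifier combination covers >= k people."""
        counts = Counter(records)
        return all(count >= k for count in counts.values())

    records = [("cell_42", 8), ("cell_42", 8), ("cell_42", 8), ("cell_7", 23)]
    print(is_k_anonymous(records, k=2))  # False: the ("cell_7", 23) person is unique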

[00:05:11] And then the other question was, as you say, these contact tracing apps. They generated a lot of excitement because they were deployed in, I think it was Singapore and Taiwan initially. So around that period you had a lot of academics developing protocols and putting them on GitHub, showing how an app could be privacy-preserving but also enable individuals to be pinged if they had been in proximity with someone who was positive with the virus.

[00:05:42] I can go into more detail about what happened there, but from a privacy angle it was extremely interesting. Although at the same time, I felt it was almost parochial, because this was just an app at the end of the day. It wasn't as if data was saving the world, as we'd been told to believe. [00:06:00]
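
For the curious: the decentralized designs Christian alludes to, protocols in the spirit of DP-3T, work roughly like this. Phones broadcast short-lived random IDs over Bluetooth and remember the IDs they hear; if you test positive, you publish only your daily secret keys, and everyone else's phone re-derives the IDs locally to check for a match. Here is a minimal sketch, with illustrative key sizes and intervals rather than the parameters of any deployed protocol.

    # Decentralized exposure-notification sketch (illustrative, not the real protocol).
    import os
    import hashlib

    def daily_key():
        """Each phone generates a fresh random secret every day."""
        return os.urandom(32)

    def ephemeral_id(key, interval):
        """Derive a short-lived broadcast ID from the daily key."""
        return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

    # Phone A broadcasts; phone B records the IDs it hears nearby.
    key_a = daily_key()
    heard_by_b = {ephemeral_id(key_a, t) for t in range(96)}  # 96 x 15-minute slots

    # A tests positive and uploads only its daily keys; B matches locally,
    # so no central server ever learns who met whom.
    published_keys = [key_a]
    exposed = any(ephemeral_id(k, t) in heard_by_b
                  for k in published_keys for t in range(96))
    print("Ping the user:", exposed)  # True

The key design point is that the match happens on the phone: the server only ever sees random keys from people who chose to report a positive test.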

[00:06:00] Luke: Looking back, nobody can change the past, but how well do you think the response to that pandemic prepared us for the next event like it? Do you feel like we're in a better place now on these issues than we were a few years ago?

[00:06:17] Christian: A technical way to answer that question: one of the things we did in response, and I was involved in this, was to develop something called the Data Act.

[00:06:26] It entered into force, I think, last year, and it does various things. It's mainly about access to data on your Internet of Things devices, and cloud switching, but there is a piece in there on government access to privately held data. There was an expert group, in the context of the pandemic, which did some work on what sort of framework could enable society more generally to get access to valuable data that would help it tackle societal problems, like environmental problems or public emergencies.

[00:06:59] In the end, [00:07:00] the Data Act does open the door to that, but in a very controlled way, so time will tell whether it makes a difference. More generally, beyond the role of data, I think what struck me most from the experience of the contact tracing apps was that whilst there was this buzz of activity around protocols: which one is the best?

[00:07:20] Which one is the best for privacy? Which one's the most effective? Suddenly, like a deus ex machina, there were Google and Apple, who decided together that they were going to favor one particular protocol, and they basically imposed it on the world. They were able to do that because they're in charge of the operating systems that almost all mobile phones use.

[00:07:41] In reality, the protocol they chose was probably good for privacy. On the other hand, it demonstrated the degree to which these big companies can basically take the lead on public regulation, which in the past would have fallen to governments. [00:08:00]

[00:08:00] Luke: Almost switching gears, but not necessarily, on the back of that: your predecessor, Giovanni Buttarelli, mentioned a digital underclass in his manifesto. Could you explain this concept a bit further?

[00:08:10] I found it really interesting. For our audience, could you say a little bit more about this whole idea of the digital underclass, and potential solutions or ways to protect them, from your point of view?

[00:08:21] Christian: Yeah, this is the manifesto. He was my boss; he was the Data Protection Supervisor for five years.

[00:08:28] He very sadly died too young, but just before he died, we were working together on what he wanted to be his manifesto for what he hoped would be a second mandate as Data Protection Supervisor. He died before he could finish it, and so I basically took his notes and my recollections of our conversations and wrote it up.

[00:08:46] That was published at the end of 2019. He was very conscious of what we call the digital dividends. Digitization was happening rapidly, [00:09:00] but it wasn't being experienced equally everywhere. There were a few people who were benefiting enormously from it, making an enormous amount of money or enjoying the conveniences of modern life, but there were also others who weren't really enjoying the benefits, but were having to pay the costs.

[00:09:19] Mainly in the form of being farmed for their data. He had in mind low-wage workers in warehouses being tracked constantly; children, heavily surveilled, whether through their own phones or in schools; migrants and refugees. So basically, the idea was that there are certain people in control of the algorithms, making an enormous amount of money and becoming more powerful.

[00:09:50] There was a group of people who were enjoying the benefits of digitalization, but then there were others who were basically having their lives dictated to them by algorithms, [00:10:00] and who weren't able to question the decisions made about them. And it's a global issue, because in the global south in particular, we know that companies have been moving in to collect data, often where there are no controls over that process.

[00:10:16] If you look at the work of the United Nations Conference on Trade and Development, they do digital economy reports, I think, every two years, and those really do expose the inequalities on a global scale in terms of rights over data and benefits from data processing. The other thing that was really inspiring for us was the work of AI Now, particularly Kate Crawford, where they draw attention to the physical resources, in terms of human labor and natural resources, and the waste that's generated in order to develop AI systems.

[00:10:45] I think he really wanted to engage with that fact: that digitalization can't be treated as an abstract thing, that it's woven into the fabric of social justice and equality, or the lack of it, within our societies and around the world.

[00:10:59] Luke: Yeah, no, it [00:11:00] seems like, even now too, all of a sudden you have this kind of idea of certain parts of the world, not having access to some of these tooling and how it could, you know, with the rate in which this is getting adopted, I mean, some of it’s kind of hype, right.

[00:11:12] But, but still at the same time, like how you could really kind of fall behind too, if these things aren’t accessible, globally. Looking at privacy regulation, like GDPR, how do we ensure that that regulation is effective in the face of kind of this rapidly advancing landscape?

[00:11:30] Christian: I think the first thing to say is that the GDPR is a baseline.

[00:11:33] It's a horizontal rule book. It basically applies to everyone who's processing personal data, unless it's purely personal, household use. They tried to write it so that it was technology-neutral. I can tell you that Giovanni believed, and he said this publicly, that it was too bureaucratic and prescriptive.

[00:11:52] Prescriptive in terms of what controllers had to do: [00:12:00] all the detail about data protection officers, for example. He argued that that was perhaps a bit excessive, as was the detail on what data protection authorities should do to do their jobs. His watchword was accountability.

[00:12:14] You don't have to treat them like children; make them take responsibility for what they're doing and defend it. I would also say that it's unfair that people have put all of the blame for the injustices and imperfections of digital society on the GDPR and on data protection authorities.

[00:12:33] It was never designed to fix everything, especially when you've got the most powerful companies in the world. Their business practices do fall under the GDPR, but it's not only about the practices; it's also about their size, which isn't something the GDPR can necessarily do anything about.

[00:12:51] I've just had an article published, which I wrote with someone called Anna Kollaps; we both worked for Giovanni in his office. It's in the [00:13:00] EDPS 20th anniversary book, which was published last week, and it's on the notion of imbalance, particularly in the sense of consent. Within the GDPR, there is this idea that you can't freely consent to something if you're in a position of extreme power imbalance. What we've seen in the last five years is a new generation of laws, like the Digital Markets Act, the Digital Services Act, and also the Data Act, which calibrate responsibilities according to the level of risk and the power of the entities in scope of the obligations, which is not something you really see in the GDPR.

[00:13:43] So I think the GDPR needs to be seen within that wider context, to see how they're all going to interact. It's not altogether clear yet. Also, going back to what Giovanni used to say: he actually predicted that the notion of personal data would [00:14:00] disappear in the future, that you would just have these masses of data that machines can somehow use to single out an individual, or to make a decision which affects the individual or the collective.

[00:14:12] We weren't thinking about generative AI; it was not a thing back then, while he was alive. It kind of makes sense now when you consider just how much data has been scraped; it was already reported last year that the big AI companies have already scraped all of the data on the internet.

[00:14:31] So there's nothing left. I think in the future, and this is also in the manifesto, those who care about human rights and data justice need to build alliances with the green agenda, with questions of sustainability, but also with democracy and the rule of law.

[00:14:52] There are certain practices that need to be somehow controlled by democracy, which is hard to do. China does [00:15:00] it through authoritarian means, but the future of data protection and the GDPR, in my opinion, is going to hinge on the ability of democracies to control extremely powerful players.

[00:15:10] Players who are doing things with data that are very hard to contest because of their size and power in a general sense, not just in terms of their compute.

[00:15:19] Luke: Even from kind of a technical perspective, right? I remember a year or two ago, having a conversation with Johnny Ryan about this, where he was, you know, he was very optimistic about GPR when it was initially coming out, but then, you know, after.

[00:15:33] The rollout was speaking to the fact that there’s so much of a technical barrier for a lot of people, even people that are responsible for enforcing the law or for the, just the regulators in general to kind of understand, unless they can really put time into learning it, like the adversarial game. At play because these companies are so large that they can kind of spend all this time litigating, but at the same time, like [00:16:00] the way that the technology works is it’s not terribly easy for people to understand what’s your take on that?

[00:16:07] Do you think there is enough operational knowledge on the part of the folks who are in charge of regulating these companies, or holding them to account, given the power they have?

[00:16:17] Christian: I hear people joke about the fact that a lot of these really big companies say, we care about privacy; but if you interrogate it, it's more that they care about their own privacy, their own ability to keep their own secrets.

[00:16:29] And they're incredibly secretive in their practices. We've come a long way, but we still don't know how their algorithms work. As an individual, I don't know exactly what data they have on me. I could ask the question and I might get a curated answer, but is that really the full picture?

[00:16:47] I also think there's a book to be written about the tactics that are used to defer enforcement. These companies have armies of lawyers working for them, [00:17:00] and they will use every single trick in the book to contest every minute detail during the process. The longer a case can be kept from resolution, the more revenue can be generated with those practices, which may or may not be contrary to the GDPR.

[00:17:15] And this isn't just in data protection; it's in other areas of law as well. It's a question of size and power.

[00:17:32] Luke: Speaking on that too, are there practical strategies and policies that can be put in place to empower users, promote competition, and create a more balanced data ecosystem?

[00:17:43] Christian: In the EU, there's obviously been a lot of legislation with new kinds of rights and tools available to individuals to exercise their rights. But it takes a lot of effort and money to get to court and have your rights enforced. Max Schrems is [00:18:00] almost a household name because he's managed to get access to funds that enable him to pursue Meta and others through the court system.

[00:18:09] I don't know how much he's spent, but high-profile cases cost hundreds of thousands of euros. Most people don't have access to that sort of resource. I think there are things that could be done within the spirit of the existing rules. In a report on digital advertising which we commissioned, published last year, they went through the various issues with the ecosystem.

[00:18:31] In terms of solutions, one of the things they suggested was that you should be able to choose your own digital avatar. The problem is not that you're targeted with advertising; that's not necessarily a problem. The problem is that your profile is basically hidden from you. You have no control over it.

[00:18:48] And again, the data which is collected on you is generally unknown to you. But what if you were able to choose your own profile and then [00:19:00] give it to these advertisers: here, this is the profile that I choose, send me what you want for it. That would really give people individual empowerment over their own digital identity.
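
As a thought experiment, the "choose your own avatar" idea is easy to express in code: the user declares the only profile advertisers may match against, instead of being matched against a hidden, inferred one. A minimal sketch follows; the fields and categories are purely illustrative and are not drawn from the Commission's report.

    # User-chosen ad profile sketch (illustrative only).
    from dataclasses import dataclass, field

    @dataclass
    class ChosenProfile:
        """The profile the user elects to expose; nothing else is shared."""
        interests: list = field(default_factory=list)
        language: str = "en"
        region: str = "EU"  # deliberately coarse: no precise location

    def select_ads(inventory, profile):
        """Advertisers match only against the user-declared profile."""
        return [ad for ad in inventory if ad["topic"] in profile.interests]

    me = ChosenProfile(interests=["cycling", "privacy-tech"])
    ads = [{"topic": "cycling", "id": 1}, {"topic": "gambling", "id": 2}]
    print(select_ads(ads, me))  # only the cycling ad reaches this user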

[00:19:11] I think in the antitrust world, there's a lot of discussion around treating some of these really big companies, maybe monopolies, as public utilities or common carriers, where they shouldn't be allowed to discriminate on their own platform, in the same way as we have rules for telcos or public-interest broadcasters in the UK,

[00:19:32] like the BBC, for example. Something else that comes to mind: in the EU in particular, there's a real lack of diversity in leadership. On gender there have been big advances, but there are no people of color whatsoever in positions of leadership anywhere, and there never have been in the EU, and I think that's pretty disgraceful.

[00:19:56] It's in our interests to have that, because [00:20:00] it tends to be people of color who are disproportionately disenfranchised from exercising their rights, digital and otherwise, particularly if they're coming from outside the EU. For them to have fair representation, I think it's reasonable to expect to see some people who look like them, or can relate to them, in positions of authority.

[00:20:21] So those are a few ideas.

[00:20:25] Luke: You brought up an interesting point with the notion that these big tech companies could potentially be treated like a utility, like a telco, or like the BBC, the other example you mentioned. Is that something you could see practically happening, where they're just at a size now where they're becoming almost ungovernable unless we treat them that way? Or is that more of a longer hope for the future?

[00:20:51] Christian: Now's the time to think of big ideas. Working in cyber security, I'm more conscious than ever before of how [00:21:00] precarious the geopolitical situation is. Democracy is a very fragile thing. Democracy doesn't only mean resisting authoritarianism; it also means having fairness and justice within society,

[00:21:13] so that people are able to get access to the public sphere, the digital public sphere, in an equitable way. Now's a good time to think these big thoughts. In China, they do not tolerate any private entity becoming a rival to the state, and they use all sorts of authoritarian techniques to make sure of that.

[00:21:36] They make the CEO disappear for months, for example. Democracies need to find a democratic way of ensuring accountability, whether that means these companies have to appear before parliaments, or have to be transparent about the impacts they're having.

[00:21:52] What Giovanni was really passionate about was getting different regulators to talk to each other and work together on [00:22:00] solutions which fulfill common objectives. For example, in antitrust it's to prevent the abuse of dominance; in data protection it's to prevent unfair processing of personal data.

[00:22:11] There's a lot of work being done, including at the EDPS, looking at ways to tackle what they consider to be unaccountable, harmful data practices at scale, in a way which harnesses both the available antitrust and data protection tools.

[00:22:31] Luke: Looking at the next five, ten years, are you optimistic about where we can end up, based on what you're seeing and some of the things we've been talking about?

[00:22:40] Christian: Yeah, we'll see. I think this is going to be a very big year in so many ways. We just had the European Parliament elections; everyone's calling it the year of democracy. You'll be voting at the end of the year. One of the positive things I've seen is a kind of coalescing among the G7 as a [00:23:00] bloc of democracies,

[00:23:01] using their soft power to try to really show that they're an alternative to the authoritarian approaches: closing down the internet, censorship, abuse of data. I think a lot will ride on whether democracies are able to cooperate with each other. It's important to situate data protection and other rights within the wider framework of democracy and the rule of law, because if you don't have that structure, then it's very hard to exercise individual rights.

[00:23:32] The Russian invasion of Ukraine was a real game changer. I think we started to realize then that everything is connected, whether it's our human rights framework, but also our economic posture, our trade posture, et cetera. It's all interlinked, and I would hope to see in the coming years a real convergence between security interests, the economy and competitiveness, but also the safeguarding of fundamental rights.

[00:24:00] Luke: No, it's fantastic. Well, you've been extremely gracious with your time, Christian, and we really appreciate it.

[00:24:05] For folks who might be interested in following your work and the latest on what you're doing, where would you suggest they go to learn more?

[00:24:13] Christian: They're welcome to follow me on Twitter. They're welcome to check out my piece in the EDPS 20th anniversary book, which is available for free online as a PDF.

[00:24:24] They're welcome to get in touch anytime.

[00:24:27] Luke: Excellent. Well, Christian, we really appreciate having you on. We'd love to have you back as things go forward, to check back in when time allows. Thanks so much for joining us today.

[00:24:36] Christian: Thanks, Luke. It's been a pleasure.

[00:24:38] Luke: All right. Thanks for listening to the Brave Technologist podcast.

[00:24:42] To never miss an episode, make sure you hit follow in your podcast app. If you haven't already made the switch to the Brave browser, you can download it for free today at brave.com and start using Brave Search, which enables you to search the web privately. Brave also shields you from the ads, trackers, and other creepy stuff following you across the [00:25:00] web.

Show Notes

In this episode of The Brave Technologist Podcast, we discuss:

  • The use of anonymized data during COVID-19, and the vital lessons learned from this unprecedented period
  • The evolving landscape of data protection laws—including the Digital Markets Act and the Digital Services Act—and their interaction with GDPR
  • Critiques of the prescriptive nature of GDPR and its expectations
  • How to ensure GDPR is upheld in an evolving technological landscape, and continues to be effective in the future

Guest List

The amazing cast and crew:

  • Christian D'Cunha - Head of the Cyber Coordination Task Force

    Christian D’Cunha is Head of the Cyber Coordination Task Force in the Directorate-General for Communications, Content, and Technology in the European Commission. In recent years he oversaw a study into the future of digital advertising and its impact on privacy; drafted the Data Act proposal and EU cybersecurity strategy; and led the rollout of interoperable contact tracing apps during the COVID pandemic. He was head of the Private Office of the European Data Protection Supervisor from 2015 to 2020, advising on privacy-related legal and policy developments in the EU, including online manipulation, digital monopolies, digital ethics, and scientific research. He also served for several years as private secretary to the Chairman of the UK Labour Party.

About the Show

Shedding light on the opportunities and challenges of emerging tech. To make it digestible, less scary, and more approachable for all!
Join us as we embark on a mission to demystify artificial intelligence, challenge the status quo, and empower everyday people to embrace the digital revolution. Whether you’re a tech enthusiast, a curious mind, or an industry professional, this podcast invites you to join the conversation and explore the future of AI together.