
Episode 21

Reducing Your Data Footprint and Confronting The Illusion of Privacy

Naomi Brockwell, Founder of NBTV Media, discusses the true cost of “free tools,” and why we need a major mindset shift when it comes to trading value for value. She also discusses the illusion of privacy versus the reality that data brokers are working in, and how AI may make the data broker’s job obsolete.

Transcript

[00:00:00] Luke: From privacy concerns to limitless potential, AI is rapidly impacting our evolving society. In this new season of the Brave Technologist podcast, we’re demystifying artificial intelligence, challenging the status quo, and empowering everyday people to embrace the digital revolution. I’m your host, Luke Mulks, VP of Business Operations at Brave Software, makers of the privacy-respecting Brave browser, now powering AI with the Brave Search API.

[00:00:28] You’re listening to a new episode of the Brave Technologist, and this one features Naomi Brockwell, who’s a tech journalist, creator of NBTV Media, and author of Beginner’s Introduction to Privacy. NBTV teaches people how to reclaim control of their lives in the digital age, and they have over 750,000 subscribers across platforms.

[00:00:46] In this episode with Naomi, we discussed why we need a mindset shift when it comes to paying for products, how privacy and security come with a cost, and what we need to understand about the true cost of those free tools. We also discussed the illusion of privacy versus the [00:01:00] reality that data brokers are working in and clever ways to make their jobs obsolete in the future through AI.

[00:01:05] She also shared practical tips for making your data footprint smaller. And now for this week’s episode of the Brave Technologist. Naomi, welcome to the Brave Technologist podcast. How are you doing today?

[00:01:19] Naomi: I’m doing great. How are you?

[00:01:21] Luke: Doing well. Thanks for coming on. I’ve been really excited to have this interview.

[00:01:24] Why don’t we give the audience a bit of how you kind of found your way into this privacy world. A little bit of your background, if you don’t mind.

[00:01:31] Naomi: I mean, my background is not in tech at all. My background is in opera and performing arts and all that kind of thing. I actually fell into privacy because I started to get really worried about how little I understood about surveillance technology, how pervasive technology is in our lives, and I just realized there was a massive imbalance there.

[00:01:55] I like to be very well informed about things that I’m doing. You know, I’m careful with what I eat. I [00:02:00] do a lot of research to make sure that, you know, I’m not doing anything stupid and just putting anything in my body, and it’s the same thing with my technology. And I started to realize that I was at a huge disadvantage, just not having a computer science degree, not being part of the creation of the internet’s infrastructure, like a normal person.

[00:02:19] Right. And I realized that this information asymmetry was working to my disadvantage. I started to learn about how different companies and governments can use this lack of understanding about how technology works in order to do things that, if people knew about them, they wouldn’t be okay with. So I kind of made it my mission to change that, to start to dive into how the tech works that I use, to make sure that I’m making informed decisions as a consumer.

[00:02:47] And then as I learned stuff about this, it became my mission to share that knowledge with others, to make sure that everyone can understand how the technology they use works, because there’s a lot going on behind the scenes that people aren’t aware of. And if they were [00:03:00] aware, they would absolutely not be okay with what’s going on.

[00:03:03] So I just, I’m all about informed consent. And if people want to make decisions, go ahead, but just make sure that you know what you’re signing up for. That’s important to me.

[00:03:13] Luke: Yeah, and I think you do a really great job of kind of boiling it down in ways that people understand. When you have a lot of engineering types that are trying to kind of break down what this is, it gets into this very abstract word salad of things that people are just like, eh.

[00:03:26] Naomi: I found that too, right. I found that, like, in my journey, because I don’t have that technical background, but I do have a very high capacity for understanding densely technical subjects, you know, my background’s in communication. So I was like, oh, maybe this could be helpful: if I could understand this stuff and translate it to an ordinary person, that could be helpful, because in my journey I found that the

[00:03:48] only resources were highly technical and dense. And I just realized, like, there’s no way that most people are going to be able to do this. And especially if you have a normal job, right? Your job is not to [00:04:00] dive into this and understand all these technical concepts. But I figured if I could make this my job, I could dive into all of this and then I could try to package it in a way that is easily consumable by other people, so that they don’t have to spend their entire life doing this.

[00:04:13] They can get on with their ordinary stuff and I could just give them easy takeaways that could dramatically improve their privacy.

[00:04:19] Luke: Oh, that’s awesome. What do you think people tend to kind of misunderstand or underestimate the most, when you’re hearing feedback from people that are viewing your content, that are learning something new?

[00:04:27] Naomi: I think that people really underestimate how much data is being collected about them. Because I get comments all the time from people saying, like, I’m not important, no one cares about me, no one’s collecting my data. And they don’t realize that it’s not necessarily targeted, it’s just dragnet. Like, yes, you are being caught up in this, and yes, it does make you more vulnerable as an individual.

[00:04:48] So you don’t have to be special. You don’t have to feel like you’re out there doing all of this dangerous work and that’s why you’re in a dangerous position. No, this applies to all of us, and it makes all of us [00:05:00] more vulnerable, and we need to understand the ways that it makes us more vulnerable and not stick our heads in the sand and think it doesn’t apply.

[00:05:06] It’s incredibly naive today to think that, you know, your data is not being collected and that it is not weakening your security position. I think we really need to understand that fundamentally we are putting ourselves in dangerous positions by being reckless with the amount of data we allow to be collected.

[00:05:22] And I don’t think it’s intentional; like, I’m not going to blame people for being reckless with it. The fact is they just don’t know what’s going on. So I think that the response is, when you find out what’s going on, don’t then brush it aside and say, well, it probably doesn’t apply to me. Listen to the people who are telling you what’s going on, because this is important.

[00:05:39] And I think that as time goes on, we will start to see more of the repercussions of this data being out there. We already are, you know; you hear about the targeted attacks that make the news, but I think this is just going to apply on a wider scale as we go along.

[00:05:54] Luke: And what do you think about, you know, the role that tech journalism plays in this discussion [00:06:00] around privacy?

[00:06:00] I mean, I think it’s really interesting that you mentioned that you’re not blaming the users, because obviously users don’t know, right? But these tech companies are the ones doing the data collection, right? As a tech journalist, you’re informing the people, but what other roles do you have in kind of shaping the discussion around the tech itself too?

[00:06:17] Naomi: Well, one of the phenomena that you’ll notice is that so much of what’s going on is behind closed doors. It’s secretive, it’s veiled, and that’s by design. I just read a fantastic book by Byron Tau called Means of Control, and he basically looks at the evolution of data brokers across four eras, as he defines them.

[00:06:37] And let’s just talk about the most recent era, right? So the most recent era is kind of about apps: you know, we’re downloading hundreds of apps onto our devices. We have no idea what the code is doing, but it’s even more insidious than that. The fact is that if you’re an app developer and you’re not making any money from your app, suddenly someone will reach out to you and say, hey, can I put this SDK in your app? Can I just add a [00:07:00] few lines of code? And you’re not

[00:07:01] really going to know what that code does. And they might say, well, we’ll pay you a few thousand dollars a month. Suddenly you go from a developer who was not creating a sustainable product to one who is, and they’ll say, it’s just for analytics. And we don’t realize how common this is. It’s happening in basically every app that we use, that someone has inserted code.

[00:07:20] The app developer itself has no idea what this code does. The user definitely doesn’t have any idea. And what that code is doing is just extracting data. It’s extracting data about your exact location. It’s extracting data about what other devices it can see on your network. It’s taking so much information.
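As an aside for readers who want a concrete picture: the SDK pattern Naomi describes can be sketched in a few lines of code. This is a hypothetical illustration, not any real SDK’s API; every function name and field here is invented, to show how an innocuous-looking analytics call can bundle far more than the one event the developer meant to log.

```python
def collect_telemetry(app_event: dict) -> dict:
    """Hypothetical analytics SDK entry point. The developer calls it to
    log a single app event; the payload it builds quietly bundles in data
    the developer never asked it to gather."""
    return {
        "event": app_event,                       # the only thing the developer intended to send
        "precise_location": (40.7128, -74.0060),  # exact GPS fix, not a coarse region
        "nearby_devices": ["router-a1", "tv-9f"], # other devices visible on the local network
        "contacts_hash": "3f2a9c",                # fingerprint derived from the address book
    }

# The developer thinks they are logging a button tap:
sent = collect_telemetry({"name": "button_tap"})

# ...but everything except "event" leaves the device as a side effect.
extras = sorted(set(sent) - {"event"})
print(extras)  # ['contacts_hash', 'nearby_devices', 'precise_location']
```

The point of the sketch is the asymmetry: the developer sees one call, while the payload it ships is opaque both to them and to the user.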

[00:07:36] And then where does that information go? Again, we have no idea, but it’s data brokers all the way down: this data broker is selling it to this one, and they can’t guarantee what happens to it, and then they’re selling it to this one, and they can’t guarantee what happens to it. And then we find out that the government has all kinds of shell corporations that are literally government contractors that have

[00:07:52] half of their business as, oh, we’re just, you know, consumer facing, and then the other half of the business is government contracting. So they can, you know, take the data from just this [00:08:00] consumer-facing one and then just give it straight to the government. And this isn’t just the U.S. government, it’s governments all over the world.

[00:08:05] So we get to a point where suddenly you start to peel back the veil of what’s going on. And you realize that we’re being deliberately kept in the dark about all of this. Because this kind of a system only works when people don’t know about it. If people knew what kind of information was being extracted from their apps, they just wouldn’t be installing them.

[00:08:26] They wouldn’t be walking around with their mobile devices on them at all times. And Byron brings up a great point in his book, which I think is a masterpiece of investigative journalism. He even went through a lawsuit against the government to try to get, you know, freedom of information requests answered, and that’s ongoing; he still can’t get the information he’s requesting.

[00:08:45] It’s all redacted. But he’s talking about how some of these contractors will say: listen, we’ll enter into this arrangement with you, the government, but you can’t ever talk about us. And you can’t ever bring us up in a court case. And if a question [00:09:00] ever arises about how you got this location data, you can’t mention us.

[00:09:03] And so you find out that we’re literally building complete secrecy into the foundation of all this data collection. No one knows anything about these companies. There’s no Wikipedia page for them. They’re all shell corporations owned by shell corporations owned by shell corporations. So we’ve got this entirely shady industry that is making us all vulnerable.

[00:09:23] That is just extracting our data from us without us realizing. And I think that journalism plays a critical role in that. You know, to kind of loop back to your original question: without people like Byron shining a torch and letting people know what’s going on behind the scenes, and actually mentioning some of these companies and what they’re actually doing, I don’t think we can make more informed decisions as individuals.

[00:09:46] So I really applaud people like him. There are people like Joseph Cox, or, you know, there are so many amazing journalists out there who we owe a lot to, because without them we’d have even less information about what’s going on. But I think we need more of that. I think we need more [00:10:00] people actually asking questions and digging in, because I think we’re too complacent and, yeah, none of us are really asking questions about this stuff.

[00:10:08] Luke: And it’s wild too, because that’s a great picture you gave of the landscape of, you know, the data brokers and how this all kind of interconnects. I mean, the wild thing to think about is that all of this is happening under this secrecy, with this thing that’s basically used as the whole excuse for why you have a free internet, right?

[00:10:23] Where it’s like, okay, yeah, you know, you like that website? Well, the ads are making it possible. You know, it’s pretty wild how parasitic this kind of system has become, yet it’s kind of the backbone of monetization for the web. Just to kind of turn on that, because you cover a lot of privacy tools and a lot of things like that that people can use.

[00:10:42] I know back in the day the tools used to be kind of difficult for people to use and would break a lot of things, etc. Have you been seeing improvements or good signs with the usability of these privacy products over the past several years?

[00:10:53] Naomi: Absolutely. And I think that what you’re talking about, paying for all of this, is a really important subject to [00:11:00] just kind of hone in on, because right now I get that comment from people a lot, where I’ll mention products and I’ll get retorts like, well, what’s the free version?

[00:11:09] I don’t want to have to pay. And they don’t quite understand that they’re already paying, and they’re paying a much dearer price, because your safety and your security are worth a lot more than what these data brokers are getting from selling your data. Maybe they’ll be getting like a few dollars per person, but the damage that they’re doing to you as an individual, that is worth a lot to you.

[00:11:27] And yet people are very hesitant about paying 99 cents for an app. So we need to really come to terms with the fact that we’re consciously making decisions when we use free products, and we’re consciously making trade-offs, and we need to be aware of those trade-offs. And, you know, we may not be paying with money, but we’re paying with something.

[00:11:45] We’re paying with data. We’re paying with our security. I think that we need a mindset shift when it comes to paying for products. I think that we need to not presume that developers are going to be making free things for us. And that’s okay. Like, even when I use free tools, [00:12:00] I try to encourage people: hey, donate to them, make it sustainable, because they’re the ones who are protecting us.

[00:12:06] And so, yeah, we need to start to realize that security and privacy and all of this does come at a cost, and we need to be okay with that. Because either an app developer is monetizing because you’re paying 99 cents, which, come on, you pay like 7 bucks for a coffee, several times a day even, so you can pay 99 cents, or that app developer is going to be approached by some shady shell company that says we’ll pay you thousands of dollars to put in this line of code.

[00:12:38] Like I would much rather use apps that are awesome and doing cool things and feel like I can trust them than be afraid of downloading new things because I have no idea what code is in there and have no idea what compromises the app developer has made in order to make this sustainable. So we need to be aware of that trade off.

[00:12:54] But in answer to your question about whether things have gotten better: absolutely. I am delighted [00:13:00] to see some of the tools that are out now. And my core focus of what I do is trying that stuff out. Like, there’s a barrier to entry for people to try new products and to make switches, like you’re going from one ecosystem to another.

[00:13:11] It’s a lot of work to figure out this new ecosystem. So I try to make it easier for people by doing that: trying it out myself, letting them know what they can expect, teaching them how to use it. What are the pitfalls? What are the pros and cons? And hopefully that can show people that actually the switching cost is not so high for a lot of these things.

[00:13:29] These tools are getting better and better. And if you’re keeping up with it, I think that you can have a really private life online just by choosing better products.

[00:13:38] Luke: To touch on another point too, I mean, we were talking about developers earlier, but, you know, I saw you tweeting more recently about the whole thing with Facebook, with the VPN and the root certificates, and everything about how, you know, they were basically collecting all this other data that people didn’t anticipate.

[00:13:52] Right? And it’s like, when it’s these big brands, you know, that you would think, I mean, publicly traded companies, it’s not even just developers that are having to do this stuff at the [00:14:00] early stage of their product. It’s these big companies that are doing things with data, and it’s all in secrecy. People have no idea.

[00:14:05] And then these things come out, like through court cases, where it’s just huge. I mean, how much has that influenced your thinking about what big tech companies say versus what they do, as you’ve spent more time in this space?

[00:14:17] Naomi: I actually wouldn’t even like, I know that we use big tech as this pejorative.

[00:14:22] Like, I’m hoping that the privacy-focused companies will become big companies. So I want us to take a step away from the big tech label for a second, because there’s an interesting phenomenon going on with this data collection, in that companies like Google and Facebook, because they are these public-facing companies, do have reputational costs to what they’re doing.

[00:14:45] You know, that’s somewhat of a limit. And if you look at how egregious their behavior is, even with that limit, it’s kind of shocking. So then zoom out a little bit and think about all those shady companies that have no consumer facing product. No one knows who they are. They can just change [00:15:00] their company name at the drop of a hat and be part of a different shell organization.

[00:15:03] The things that they’re doing with our data, the SDKs that they’re putting into apps, that is something we should all be worried about. We’re focused on those companies that are public because they’re the only ones we know about, but there’s so much going on that we don’t know about, and it’s actually way worse.

[00:15:18] So, yeah, this stuff that’s going on with these companies like Facebook and Google, I think, is gross and terrible and egregious. And it kind of is a hint; it’s like the shadows on the cave wall, right, of what’s going on with some of this other stuff. I think that we need to do a complete revamp of how we look at data collection and how we interact with technology, because it’s quite scary what’s happening with some of this.

[00:15:42] I mean, for example, I’ll give you an example from Byron’s book. One of the apps that he talks about, which is particularly egregious, is targeting specific religions. So anyone who’s downloaded this app, and it’s specifically for Muslims, it’s extracting your exact location [00:16:00] data, and it’s extracting all of your contacts, and it’s getting access to your WhatsApp download folder, and it just has all of these invasive permissions you had no idea about.

[00:16:08] Things like that, people aren’t aware of. There are governments all over the world that are putting this code into apps to target specific demographics. Perhaps it’s demographics that they don’t like, perhaps it’s demographics that they’re worried about, concerned about, want to keep tabs on, but all of this is going on, and we’re all under pervasive surveillance.

[00:16:26] You have no idea whether you’re part of a demographic that some, you know, hostile foreign country, or even your own country, is targeting. You have no idea whether you are someone who’s part of one of those targeted groups. So I think we just need to really start to think more about: do we need that app on our phone?

[00:16:45] Do we need to take our phone with us everywhere? We just need to take a step back because until things change with the privacy conversation and the way that governments are able to exfiltrate data from tech companies, I think there’s a lot to lose in terms of freedoms and I [00:17:00] want people to be aware of what’s at stake.

[00:17:02] Luke: It also seems like there’s kind of this constant push deeper and deeper into the user’s life. I mean, we see this a lot where, okay, as a browser, we’re trying to protect what we can on the web for the user, right? But then all of a sudden you’re logging into your OS, and, you know, your operating system is starting to have advertising too, and it’s starting to now have AI integrated, and all these things.

[00:17:24] And every time you update, certain things you had made setting changes on are starting to shift around, and all this stuff. It’s just quite a pervasive thing. And I mean, I was in ad tech before I came to Brave. And one of the interesting things, like when GDPR started to come to light and people were saying, oh, we’re going to have to list all of these different data companies that are engaging with you on the site.

[00:17:42] Part of me was thinking, well, yeah, but people don’t realize, you know, people copy and make different versions of IDs and sync all these things. Like, I couldn’t tell you where it ends up, you know, and I was working in the space. It’s just wild. It’s like, there needs to be a whole reboot on how we think about monetization

[00:17:59] for the web, but [00:18:00] it kind of touches back on the paying-for-apps question from earlier. And I mean, are you optimistic for where we’re going with software, with privacy, and the ability for companies to kind of challenge the status quo on this front?

[00:18:12] Naomi: I mean, I’m loving that. I’m loving seeing what companies are coming out with, and I’m loving that there are some really robust and reputable companies out there that I feel very comfortable recommending to people.

[00:18:23] So I don’t do sponsors on my show, I monetize entirely through donations. And so that frees me up to be able to actually look judiciously at different products and not feel biased and swayed and say like, well, they gave me a lot of money, so I’m going to promote them. I am really grateful for the business model that I have, because it really does allow me to, I think, have less bias in what I recommend to people.

[00:18:50] And I honestly am delighted with some of the tools that I’m able to recommend. And again, sometimes I’ll recommend tools where people will be like, ah, you have to pay for that [00:19:00] product, she’s a shill for that, or whatever, you know. But it’s like, no, I feel really proud of and grateful to them for providing this amazing privacy tool.

[00:19:08] And yes, you should pay for it. You should give them your gratitude; they’re providing immense value in the world. You should absolutely give something back there. And even with the free products that I recommend, like LibreOffice or CryptPad and things like that, you know, you can donate to support the people who are putting a lot of effort into providing these tools.

[00:19:29] So, yeah, I do think that we need to have a mindset shift in terms of value for value. I have no hesitation in recommending paid products or free products, as long as I understand their funding model. And I’m just grateful that, like, every day I’m discovering more things, and more people are jumping on the privacy bandwagon and providing awesome tools.

[00:19:49] Just on the topic of free versus paid, a lot of people will say, like, oh, well, you know, if it’s free, you are the product. And that’s kind of an oversimplification of the [00:20:00] phrase. It’s not quite “if it’s free, you are the product”; it’s “if you don’t know where their funding source is, you should be skeptical,” because there are a lot of open source tools out there that are free that I actually think are great.

[00:20:13] But I also know that they’re really under resourced, really underfunded, and they’re donation based. So I do understand where their income comes from. If there’s an app out there that seems very well resourced, and it’s free, that’s where you have to start asking questions. How are they actually making their money?

[00:20:31] Because I think it’s an important thing for people to be mindful of. Like, you know, why is your Gmail free? Why is Google okay with providing you an incredibly useful communication tool, where you can email people and they’ll give you a certain amount of free storage? Why are they doing that?

[00:20:47] Because they’re an advertising company and their entire business model is collecting information about people. You are their product, and their clients are advertisers. So they’re collecting as much information about you as they can. And in [00:21:00] order to collect as much information, they’re providing you communication tools that people kind of presume are private, right?

[00:21:06] People think their email is private. It’s absolutely not private, unless you’re using a quality service that is going out of its way to take data out of its own reach. So yes, free products like Gmail, absolutely stay away from; they’re just a data collection tool.

[00:21:20] Luke: Yeah, we’ve seen kind of AI really proliferate more and more, and become more and more integrated.

[00:21:26] Like, how much are you concerned about that from a privacy perspective?

[00:21:30] Naomi: I have a weird, nuanced take on that that is a little offbeat. Yeah, I think that AI is actually going to be great for privacy, because right now we have this illusion of privacy. People think that they have privacy.

[00:21:46] And the reality is that data brokers know everything about them, and they’re collecting every digital breadcrumb we’re leaving on the internet. They know so much about you that they can predict your habits and your wants, and they can [00:22:00] target you with all kinds of things. They know exactly what to say to you to persuade you to buy products or to hold certain political beliefs.

[00:22:06] People are being targeted all the time with political content and propaganda. So they know intimately your routines and your beliefs and your habits, to an extent that even you probably don’t understand. And so people think that, oh, well, AI is coming along, and so people are going to get an insight into our lives.

[00:22:24] It’s like, no, we’re already there, but AI is going to democratize the field. AI is going to put those tools in your hands as well. And I think that one of the best chances we have of making data brokers obsolete is just filling the world with so much noise, and really believable noise, about ourselves that data brokers can no longer reliably say to someone, this is information about Naomi.

[00:22:46] They will have no clue, because Naomi is out there generating so much, you know, believable data through AI that she’s able to somewhat obfuscate the truth of her actual habits, et cetera. So I think that AI is going to be an incredibly powerful [00:23:00] tool in the hands of the ordinary person. Right now, major companies already have, you know, machine learning capabilities that are doing all kinds of stuff.

[00:23:07] They’ve been doing this for a really long time. They’ve been sifting through our data, collecting our data, scraping our data, and using our data against us for a long time. And this is the first time that individuals are going to have similar tools in their own hands. So I’m excited about that world.
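The obfuscation idea Naomi sketches here is essentially data poisoning: drown a narrow, genuine activity trail in plausible machine-generated decoys, so a profiler can no longer isolate the real signal. A toy sketch of the mechanism, with every category and number invented for illustration:

```python
import random

random.seed(7)  # deterministic, just so the example is repeatable

CATEGORIES = ["news", "cooking", "travel", "finance", "gardening", "sports"]

def real_visits(n: int) -> list[str]:
    # A genuine interest profile is narrow: this person only reads finance.
    return ["finance"] * n

def decoy_visits(n: int) -> list[str]:
    # Generated noise: plausible visits spread evenly across every topic.
    return [random.choice(CATEGORIES) for _ in range(n)]

# 20 real visits hidden among 200 decoys.
stream = real_visits(20) + decoy_visits(200)
random.shuffle(stream)

# A broker profiling this stream now sees "finance" diluted toward the
# background rate of any other topic, instead of a clean 100% signal.
share = stream.count("finance") / len(stream)
print(f"apparent finance interest: {share:.0%}")
```

With 20 real visits among 200 uniformly spread decoys, the person’s one genuine interest falls from 100% of the stream to a modest share not far above the background rate of any other topic, which is the sense in which believable noise blinds the profiler.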

[00:23:21] Luke: No, it’s awesome. Yeah, and I think it’s interesting, that “long time”: I think people forget that, you know, Edward Snowden’s revelations were, what, over 10 years ago. And that’s over a decade of data collection that’s just been getting better and better. So, yeah, I mean, the more this new stuff gets democratized, and, you know, even looking at privacy tools, you’ve got an example like Proton, right,

[00:23:41] where they’ve got tens of millions, or a hundred million, however big they are now. It’s just like, you know, these things snowball. And it used to be such a thing that, oh, people don’t care about their privacy, but you can’t really argue that anymore when millions of people are using these products.

[00:23:54] So I think it’s a great point you bring up. If someone’s listening to this and you can recommend like one or two [00:24:00] things they could do right away to make their online footprint smaller, what would those things be?

[00:24:05] Naomi: I think, if we’re starting with low-hanging fruit, the thing I would immediately go for is the browser. You know, it’s an easy thing to switch out.

[00:24:14] And if you’re using Chrome, I think you’re crazy, because they collect so much information about people. It is one of the worst browsers that you can use for your privacy. So my preferred one is Brave, and I’m not saying that just because I’m on this podcast. It’s actually something that I’ve used for years, and I’ve never received, you know, any sponsorship money.

[00:24:33] So, just to make that clear. But then I would say the next easiest thing is your search engine. And again, I use the default in my Brave browser, which is Brave Search. And I was really excited when you guys came out with search and were no longer just a browser. But there are lots of different search engines out there.

[00:24:49] Like, I also like Startpage, which is a private front end for Google, so if you still want Google results, you can get them, but in a more anonymized way. I mean, there are lots of different tools out there that you can switch to. [00:25:00] And I think browser and search engine are probably the ones where you’re going to see the least friction.

[00:25:04] It’s super easy. Like, you can just import all of your bookmarks, you can import whatever prefills you want, and just go right ahead and use the new product. So that’d be the first thing. I think the next tier would be something like your email. And again, because of GDPR actually, which is kind of unwieldy and hard to enforce and arbitrarily enforced and all of that, one of the good things to come out of it is that Google has been forced to make certain APIs public facing. And the result of that is that companies like Proton Mail literally can now provide you with a single button that you click

[00:25:39] that says import all of your emails. So changing email now, like if you’ve set up your whole life on Gmail and you’ve got all your contacts there and you’ve got your calendar there and all of that, you literally can just press a button and it imports everything to something that is far more privacy preserving, like Proton.

[00:25:54] Or you could use something like Tuta, or, you know, there are a lot of great tools out there as well. So I would say email: [00:26:00] it seems daunting, but it’s one of those deceptive low-hanging fruits. The switching cost has far less friction than people imagine. So I would say, just go ahead and do that one weekend; click that button and you will be happier.

[00:26:12] The next one I would say is probably your messenger. If you're using SMS, just stop. Just get your family and friends on something like Signal. It's one of the best ways to start protecting your privacy, because SMS is not at all secure and not at all private. And if you're using Facebook DMs for all your private conversations, it's good to switch. Stop, just stop doing it.

[00:26:34] The number of people who reach out to me on Twitter, you know. I like having that avenue. I like having people be able to engage with me on different platforms. But the first thing that I do is say, hey, let's move this to Signal. And if they don't have Signal, well, then they know they have to get Signal in order to contact me, because I'm not going to engage with people on public platforms.

[00:26:52] And I actually had a friend who sends me so many things on Twitter, and we're connected on Signal. And I had to have a conversation. I was like, you've got to stop sending me [00:27:00] things. And he was like, well, you know, I keep the high-stakes things for Signal and the rest of it. And I was like, nope, I'm not interested in a company being able to just take this data.

[00:27:10] And I'm making a conscious decision to move my contacts out of their reach. He was like, well, you know, there's a cost. There's a friction cost to me having to switch applications if I want to send you a Twitter link. I was like, right, and there is a friction cost to our conversation, because I'm never going to read your Twitter messages.

[00:27:25] So just letting you know, if you want me to see these things, Signal is the way to go. And, you know, it probably makes me sound like, ah, she's crazy, like, who would write to her, and all of this. But I just draw these boundaries for myself because I understand what's important to me, and there are certain things that I just don't want to compromise.

[00:27:42] Not when there are easy solutions out there that are going out of their way to make the world a better place. I want to support them. I want people to use them. I want to migrate people over to them. So I do draw these boundaries for myself, and if people do want to contact me, then they know how to do it.

[00:27:59] And it's [00:28:00] just a way for me to contribute to these privacy ecosystems.

[00:28:02] Luke: Yeah, it makes sense. One other thing, too. I think we were kind of touching on it earlier with free versus paid, but a lot of times people think, okay, I'm just going to use a VPN or something like that. And there are so many VPNs, right?

[00:28:14] Like, and there are free and paid VPNs. I think you see where I'm going with this. Would you suggest that people not use free VPNs? Like, from what you've found, how have you seen free VPNs use people's data in the past?

[00:28:29] Naomi: Yeah, I've got a couple of anecdotes here. So one of my friends, who's like a big security guy, taught his children how to use VPNs to bypass school Wi-Fi restrictions. And you'll be like, what? Why? Why would you be telling your kids this? And it's because they're going to do it anyway. But the problem is that when they're on their iPad at school, they're not going to know which VPN to download.

[00:28:52] They're going to download whatever one in the app store is free and looks cool, has a cool image. And that VPN is going to [00:29:00] be collecting all of their personal data. It could even be malicious, collecting passwords, things like that. And so with your children, we have to be realistic.

[00:29:09] If they're going to want to bypass school restrictions and things, at least teach them how to do it safely. So get them to download a reputable VPN. The unfortunate reality is that most VPNs in the app store are either shell companies that are just there to collect your data, or something worse.

[00:29:25] They're actually malicious. So I put out a piece where I interviewed Jonathan Tomac, and he did a great deep dive into malicious code that he reverse engineered from certain VPNs, where they're literally collecting your keyboard entries and your mouse movements. They're just watching everything you're doing.

[00:29:44] And then you look at all these VPNs that are owned by tech review journals, where literally the only reason these review sites exist is to sell their VPN product. Yeah. And then you look at the referral fees that [00:30:00] influencers get, where they're like, sign up for such-and-such VPN.

[00:30:02] And if you look at what they're getting paid, sometimes it's like 50 percent of a yearly subscription, which could be like $100. So they're getting 50 bucks a pop for every person they can convince to move to a VPN. And then you look and go, okay, so all these VPN companies are owned by this one conglomerate.

[00:30:18] And wait a second, half of this conglomerate is just an advertising firm. You've got to start connecting the dots, because there's no accountability and there's no transparency. So you've got to use your rational faculties and think, okay, chances are that all of this data is being aggregated.

[00:30:35] No one's looking into it. There's no transparency; no one's able to look into it. So you should be very judicious about which ones you're using. There are VPNs out there which you can use for free, but they're not completely free. They're freemium models, so you can get faster connections or more granular controls if you upgrade.

[00:30:54] I think that some of those are also fine. I think that VPNs are definitely something you need to be careful of, though, [00:31:00] because there are so many bad ones out there.

[00:31:02] Luke: Yeah, that's super helpful. Is there anything that we didn't cover that you want our audience to know, or food for thought, or resources they could look at that you'd recommend?

[00:31:10] Naomi: Yeah, I would say, if your audience is looking for homework, there are some books that I would recommend. I just finished reading Byron Tau's book, so I'll first of all recommend that, because I think it was a fantastic look into how the shadowy world of data brokers actually works. So if people want insight into an area of their life that's been deliberately shrouded in mystery, it's great.

[00:31:33] I think that if you're looking for a deep dive into privacy, check out Michael Bazzell's work. He has lots of books out there. He has an Extreme Privacy series where he teaches you how to completely revamp your life. Like, if you want to go down the rabbit hole and really learn how to be super private, you can check that out. And then just start paying attention.

[00:31:50] Like, start asking questions. One thing I've noticed is that no one questions privacy policies. No one even reads them. I recently did an experiment: [00:32:00] I did some apartment hunting for a new video that I'm putting out, and I took the opportunity to print out every apartment's privacy policy.

[00:32:09] And go through and highlight all of the crazy, egregious things in them. And it was pretty shocking. Like, some apartments are collecting olfactory data. It's like, what? They collected my sense of smell? What? You know, if you say other people's names, you're speaking on their behalf.

[00:32:28] And so you're giving the company permission to then go and look up information about those people. Like, there's some crazy stuff. But anyway, I started just asking about this, saying, hey, I just wanted to ask: what does this mean? Are you guys selling data? And every single person I spoke to said, I have no idea.

[00:32:47] I have never read the privacy policy, and you're the first person to ever ask about this. So I do think we need to start normalizing asking questions. When you buy a car, ask them about the privacy policy. When you go to a restaurant and they ask for your [00:33:00] phone number, ask them about the privacy policy.

[00:33:02] You know, when you're doing any transaction and people demand information that is personal and sensitive, ask them about their privacy policy, because no one is. So there's no pushback, and that means there's no incentive for companies to change their behavior.

[00:33:17] Luke: No, that's great. That's great advice. If people want to find or follow you or your channel on social media, where can they look?

[00:33:24] Naomi: So we're on a bunch of different platforms. If you're someone who uses YouTube, you can find us there. If you're someone who prefers more private ways of watching videos, we're on the P2P LBRY network; you can download the client and watch us there. We kind of run the gamut of extremes, wherever you want to find our content.

[00:33:42] But we put out long-form tutorial explainers. We put out short-form clips, just little, you know, fun snippets about things like, yes, Gmail is collecting your data. And we also have a book, Beginner's Introduction to Privacy, if you are just starting out and you just want explainers about the low-hanging fruit and what you can do. So you can [00:34:00] check that out. Just look for NBTV or Naomi Brockwell TV, or you can go to NBTV.media to see all the places where you can find us.

[00:34:07] Luke: Awesome. We'll be sure to link to that down below, too. Well, Naomi, I really appreciate you coming on today. It was a great conversation. I'm sure the audience learned a lot, and I'd love to have you come back sometime to revisit some of these things.

[00:34:17] Naomi: Thank you so much for having me. It's been fun.

[00:34:21] Luke: Awesome. Thank you. Have a good one. Thanks for listening to the Brave Technologist podcast. To never miss an episode, make sure you hit follow in your podcast app. If you haven't already made the switch to the Brave browser, you can download it for free today at brave.com and start using Brave Search, which enables you to search the web privately.

[00:34:38] Brave also shields you from the ads, trackers, and other creepy stuff following you across the web.

Show Notes

In this episode of The Brave Technologist Podcast, we discuss:

  • The easiest ways to reduce your digital footprint, and unpacking the switching costs of transitioning to privacy respecting alternatives.
  • How AI can democratize the playing field between data brokers and everyday users.
  • What to look for when choosing a reputable VPN.

Guest List

The amazing cast and crew:

  • Naomi Brockwell - Founder of NBTV Media

    Naomi Brockwell is a tech journalist, creator of NBTV.media, and author of “Beginner’s Introduction to Privacy.” NBTV—which teaches people how to retake control of their lives in the digital age—boasts more than 750k subscribers across platforms.

About the Show

Shedding light on the opportunities and challenges of emerging tech. To make it digestible, less scary, and more approachable for all!
Join us as we embark on a mission to demystify artificial intelligence, challenge the status quo, and empower everyday people to embrace the digital revolution. Whether you’re a tech enthusiast, a curious mind, or an industry professional, this podcast invites you to join the conversation and explore the future of AI together.