Why Cyberattacks Are Now a Matter of Life and Death
Luke: [00:00:00] You are listening to a new episode of The Brave Technologist. This one features Ed Gude, the CEO and founder of sentt, a third-party risk management platform built for healthcare providers to manage threats to patient care. He's a seasoned software executive with over 25 years of experience driving product innovation, marketing strategy, and sales growth across startups and public companies.
In this episode, we discussed what a modern resilience strategy looks like for organizations today; how recovery responses have evolved; ways to make proactive cultural and operational changes while overcoming resistance from your team; security habits every company should adopt immediately; and how AI will improve cybersecurity, with applications in fields like healthcare used as examples.
And now for this week’s episode of the Brave Technologist.
Ed, welcome to the Brave Technologist. How are you doing today?
Ed: I’m good, Luke. Thanks for having me on.
Luke: Yeah, thanks for joining. I've [00:01:00] been looking forward to this one. You've spent decades working at the intersection of security, healthcare, and enterprise technology. From your perspective, how has the risk landscape fundamentally changed over the past decade?
Ed: Well, you know, 10 years ago, and I'll talk about it from a healthcare context, although I've spent decades in other areas as well. This is my 11th company, all software, so I have a lot of opinions, obviously, and experience. But in healthcare, the risk landscape has changed dramatically over the last decade.
Insofar as, you know, 10 years ago we didn't really have ransomware, any type of security incident or threat was related to data loss. And so it was patient data, right? Obviously that's a big problem, and there are regulations that organizations have to comply with, or they end up getting fined, or they have to manage a corrective action plan for one or two years or even longer.
And data [00:02:00] loss was really the focus of the industry from a risk perspective, and certainly from a cyber perspective. Now, obviously there's always the availability of the systems that comes into play, and how risks can disrupt operations. There are risks outside of cyber too: environmental risks, geopolitical risks, et cetera.
But from a cybersecurity perspective, the biggest change is the introduction of ransomware.
Luke: Yeah. One of the reasons I was excited to have you on is because it seems like in a lot of ways healthcare and medical have had a bit of a head start around some of the risk evaluation and assessment. Because of the compliance, you've all had to think about things a little more holistically over the past 10 years or whatever.
When you look at where we're at now with AI hitting the landscape, in what ways do you see organizations underestimating how systemic the risks have become? And are there any risks that you think are just getting completely [00:03:00] overlooked?
Ed: Really good question. The first thing is, from a targeting perspective and a threat perspective, things are evolving faster than ever before, and they've become exponential in terms of the surface area, because now we have AI that's involved, right?
And again, I talked about the risk of data loss versus ransomware attacks. Ransomware attacks actually cause patient safety issues, so people can lose their lives when a ransomware attack hits, because what you're doing is, you may be losing data, but you're also disrupting care operations, and care facilities and operations shut down.
And not only that, it could affect things like ambulances. If you're in an ambulance, Luke, and you're headed to the hospital and it's in the middle of a ransomware attack, they have to divert you to the next nearest hospital. Now, if you're in a city, that may not be a big deal, but in a rural setting, that could be an hour away.
And if you're having a heart attack, that's obviously going to affect your outcome. So the [00:04:00] stakes are much higher as it relates to the attack surface, the threats, and the potential risks to life than they were, like I said, a decade ago. As it relates to systemic risk, what we're learning is that the ecosystem is connected. It's not just one vendor, one product, or 10 vendors and 10 products. It's an ecosystem of interconnected vendors and products, and those have to be assessed across all business processes. A decade ago, not every business process was managed with some type of technology, but today that's changed. If you look at a healthcare setting, I would say 99% of the business processes are supported by some type of technology or electronic service. So there's always going to be the threat of data loss, or of disruption to those different electronic services.
Luke: The Brave Technologist is brought to you by the Brave Search API. Access billions of indexed web results from a simple API call. Join the leading names in AI and tech [00:05:00] using the Brave Search API to power agentic search, keep LLMs current with real-time data, train foundational models, and bring the best of the web directly to the leading edge. Get started today at brave.com/a.
Ed: Systemically, there are some critical functions, or business processes, that are critical to the continuing operation of a health system, and those are slightly different from every other process, insofar as, if they get affected, you can shut down your operations, right? One example of that is Change Healthcare. You may be familiar with the Change Healthcare incident. It was a clearinghouse for revenue processing, and when it went down, we realized that 70-plus percent of health systems were relying on it to actually collect revenue to fund operations.
Now, that doesn't sound like a big deal, but if you only have so much cash on hand and your ability to collect cash goes to zero, [00:06:00] then all of a sudden the clock starts ticking for you to pay people and to continue operations, right? So it's a real problem. Now, we dealt with that quickly, and the organization that owned Change Healthcare did the right thing in terms of helping organizations sustain themselves based on cash flow
Until we got through that issue. But we had another one the other day with Stryker. Stryker was affected, and the attack shut down certain operations. And so these systemic scenarios have to be understood by the hospital in a way that lets them prioritize the risk management of those products, services, and vendors in the context of those critical functions.
Luke: I have a feeling that most of our audience is probably aware of what these ransomware attacks are like, but for the folks that aren't, would you mind walking us through what a ransomware attack looks like?
Ed: Sure. So usually it [00:07:00] comes in through a breached identity: somebody's identity gets compromised. They come into your organization and move laterally. They'll go from controlling one process based on that identity, to assuming another identity, to eventually getting the keys to the servers that are storing and running the processes and data that you rely on. What they'll do is take those and encrypt them in a way that you can no longer get access to them. So the processes that, again, are running against those data sets can't access that data until you get the key from the hacker, or quite frankly the organization, because now it's organized, that is holding the key for ransom so that you can decrypt the data and continue on with your operations.
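The mechanics Ed walks through can be shown in a few lines. This is a toy illustration only: the XOR "cipher" below is not real cryptography, and the record contents are invented. The point is that once the data is encrypted under a key only the attacker holds, the systems that depend on that data stop working.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' -- illustration only, NOT real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# A record store standing in for the servers Ed describes.
records = {"patient_42": b"blood type: O-, allergy: penicillin"}

# The attacker encrypts everything and keeps the only copy of the key.
attacker_key = secrets.token_bytes(16)
records = {k: xor_bytes(v, attacker_key) for k, v in records.items()}

# Care systems can no longer read their own data...
assert b"blood type" not in records["patient_42"]

# ...until the key is handed over (in reality, after a ransom payment).
restored = xor_bytes(records["patient_42"], attacker_key)
assert restored == b"blood type: O-, allergy: penicillin"
```

Real ransomware uses strong hybrid encryption rather than XOR, but the operational effect is the same: the data may still be sitting on your servers, yet nothing can read it.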
Luke: Essentially locking the business owner out of their business operation, right?
Ed: Yeah, locking systems, locking the data that these systems need to [00:08:00] access to actually process those functions.
Luke: Right. Okay. Got it. That's helpful. You mentioned them getting more organized. I remember hearing about these; it even happened with my doctor too, not that long ago, where somebody in the office next to theirs had that happen. But it sounded more like a one-off person. You're saying these are getting much more organized now.
Ed: Yeah, sure. In the past, these attacks used to be fairly individualized. Then the bad guys took a page out of the organized crime book. They began to create, if you will, microservices. So they took a look at their overall process for hacking and then extorting money from people, and they said, you know what? We can actually break this down into services, and we can have some people do this and some people do that, and then we can be much more effective. And we can also scale ourselves significantly, because we've segmented those services out across a number of different groups, and everyone gets paid appropriately based on [00:09:00] the attack service that they provide to the organization.
Luke: Like a Fiverr for bad guys or something like that, right?
Ed: Yeah, yeah. And that organization is what enables them to go at scale, because one person is actually collecting the ransom, one person is creating the actual attack, one person is managing the encryption. You know, 10 years ago we didn't see that, right? These attacks were very individualized. Now with that scale, and now with AI at scale, I mean, the bar has been raised; it's going to get significantly more difficult to manage these risks going forward.
Luke: That certainly sounds like it. I think it's a mix of these types of organized attacks along with, as you were mentioning, these dependencies, these third-party vendor, supply-chain attacks too, that don't necessarily happen within or originate within the organization itself, but come from something the software is dependent upon, right? Through a package or whatever. We just saw that around April Fools' Day, right? There was a [00:10:00] big package exploit or something like that. When companies are evaluating things like vendor security and these types of attacks, what are you seeing as the biggest blind spots they're overlooking?
Ed: Yeah. So there are a couple of things. If you think about it from an AI perspective, most organizations are looking at AI adoption through the front door. So they're saying, okay, as new products or services come in that are AI-enabled, or AI-first or whatever, we'll run them through this new AI governance committee we have. And these committees are much broader than just cybersecurity, because they also carry non-cyber, or not directly cyber-related, risks, like bias or ethical impact. So that level of governance has got to be considered across not just the technical areas, but also the clinical areas in a hospital, the operational areas, the legal areas. So they're looking at these coming in through the front door, but actually [00:11:00] AI comes in through the bathroom window, through the floorboards, through the attic, through the basement. Right? It's coming in everywhere. And what I mean by that is there are existing products and services that are taking on AI through updates or patches, and now all of a sudden what you thought was a non-AI product or service is AI-enabled, and you've been using it, and you didn't run it through your governance committee. And so now you're at risk. And so I think that is the biggest issue we see within healthcare as it relates to getting folks' arms around the risks, and the cyber risks, of AI adoption.
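The "through the bathroom window" problem Ed describes is, at bottom, a diffing exercise: compare what each vendor ships now against what was last assessed. A minimal sketch, where the manifest format, vendor names, and feature labels are all invented for illustration:

```python
# Diff each vendor's current feature manifest against the last-assessed
# snapshot, and flag any vendor whose *new* features include AI.
# All names and the manifest shape are hypothetical.
last_assessed = {
    "dictation-tool": {"speech_to_text"},
    "scheduling":     {"calendar_sync"},
}
current = {
    "dictation-tool": {"speech_to_text", "llm_summarization"},  # added via patch
    "scheduling":     {"calendar_sync"},
}

AI_FEATURES = {"llm_summarization", "ml_triage", "agentic_workflow"}

needs_review = [
    vendor for vendor, feats in current.items()
    if (feats - last_assessed.get(vendor, set())) & AI_FEATURES
]
print(needs_review)  # ['dictation-tool']
```

In practice the "manifest" would come from vendor release notes or a third-party risk platform, but the governance trigger is the same: a feature delta, not a new contract, is what should re-open the assessment.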
Luke: Yeah, that's true. It's one of those things where you blink and all of a sudden every tool you've become accustomed to using as part of your everyday workflow has this little symbol in it and it's AI-enabled. Just tracking all of that adoption, and sometimes it just kind of gets turned on. What does that mean [00:12:00] for my data? For my patients or whatever, right?

Ed: That's right. Yeah. Am I now sending data outside the four walls that is protected health information, which is governed by HIPAA? That, again, I didn't think we were, because we had assessed this product and service a year ago and it didn't have AI, and now it does. So what does that mean to the risk dynamic?
Luke: Yeah. And so, thinking about a modern resilience strategy for this, what does it actually look like in practical application?
Ed: Yeah, great question. Well, it's much more holistic. It can't be silo-based; cyber working in a silo just doesn't work anymore. It's got to be holistic across the organization. Every business process needs to be inventoried. All vendors and products need to be inventoried against those business processes. Critical functions, again, the ones that keep the business running, need to be identified. And then risk profiles, appetites, and priorities need to be established and then [00:13:00] managed accordingly. Because you don't have unlimited resources, you can't assess everybody, and you shouldn't assess everybody, but you should definitely assess the critical functions and the high-risk vendors and products against them.
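Ed's inventory-then-prioritize loop can be sketched in miniature. The vendor names, fields, and ordering rule below are invented; the point is only that, with limited resources, vendors supporting critical functions get assessed first:

```python
# Miniature sketch of "inventory, then prioritize". All names, fields,
# and the scoring rule are hypothetical.
vendors = [
    {"name": "ClearingHouseCo", "supports_critical_function": True,  "risk_score": 8},
    {"name": "CafeteriaPOS",    "supports_critical_function": False, "risk_score": 6},
    {"name": "ImagingVendor",   "supports_critical_function": True,  "risk_score": 5},
    {"name": "NewsletterTool",  "supports_critical_function": False, "risk_score": 2},
]

def assessment_priority(v):
    # Critical-function vendors jump the queue regardless of raw score;
    # within each tier, higher risk scores come first.
    return (not v["supports_critical_function"], -v["risk_score"])

queue = sorted(vendors, key=assessment_priority)

# With limited resources, assess only the top of the queue this cycle.
to_assess = [v["name"] for v in queue[:2]]
print(to_assess)  # ['ClearingHouseCo', 'ImagingVendor']
```

Note that a moderately risky non-critical vendor (the point-of-sale system) still waits behind a lower-scoring vendor that supports a critical function, which is exactly the trade-off Ed describes.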
I think also, as it relates to resiliency, again, 10 years ago people weren't talking about resiliency, and now they are, which is good. And in the context of the NIST CSF, the Cybersecurity Framework by NIST, another dimension has been added to it. If you identify the problems, detect the problems, and then protect against those problems, that was pretty much the focus for the last 30 years or so. What's changed is the Respond and Recover functions. Within the six functions now, the sixth one is Govern, which has just recently been added. Those are much more important, because everyone understands the adage: it's not a matter of if, it's a matter of when. And so if it's a matter of when, you think about it differently and you apply your resources differently, again, if you [00:14:00] have a limited amount of people and money.
So you really start to think, okay, we're going to get hit eventually. We need to make sure that we've balanced our ability to respond and recover; recovery specifically is much more important than it used to be. We need to understand, based on the critical functions, what the recovery times need to be in order to continue business operations, right? And so, as a rule of thumb, for critical functions you want to look at five days or less of recovery. Now, you could recover instantly, but no one can afford that level of recovery, right? Because you're effectively replicating the investment you currently have in those systems.
Now, in some cases you might do that, but like I said, most people don't, just because it's very expensive to manage. So recovery becomes much more of an issue and a focus, and it raises foundational questions around structure. In the past, business continuity and [00:15:00] disaster recovery, which are sort of the core functions of most companies as it relates to continuity and resiliency, sat in a different group, a different function. And I would say those functions need to be considered holistically alongside the cyber function. Again, these can't exist in silos anymore. They need to be integrated in a way that gives you a full transaction, or lifecycle, view of the critical functions, the business processes, the vendors and products, and the qualities of service, if you will, around some type of SLA, some type of response and recovery objective. And then you're testing those things out, verifying that the systems, the processes and procedures, and the technologies you have in place actually meet those SLAs. Because, again, it's not a matter of if, it's a matter of when, and when you do get hit, you need to be able to respond accordingly.
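The "verify your SLAs" step Ed describes amounts to checking measured recovery times from a drill against the recovery objective. A hedged sketch, using his five-day rule of thumb for critical functions; the function names and drill times are invented:

```python
from datetime import timedelta

# Recovery-time objective for critical functions (Ed's rule of thumb).
RTO_CRITICAL = timedelta(days=5)

# Hypothetical measured recovery times from the last tabletop/drill.
last_drill_results = {
    "claims_clearinghouse": timedelta(days=3),
    "ehr_access":           timedelta(days=7),
    "imaging_pacs":         timedelta(days=4, hours=12),
}

# Flag every critical function whose measured recovery exceeds the RTO.
breaches = {fn: t for fn, t in last_drill_results.items() if t > RTO_CRITICAL}
for fn, t in sorted(breaches.items()):
    print(f"{fn}: recovered in {t.days} days, exceeds {RTO_CRITICAL.days}-day RTO")
# ehr_access: recovered in 7 days, exceeds 5-day RTO
```

The value is in running this against drill measurements rather than vendor promises: an SLA on paper that has never been exercised is exactly the silo problem Ed is warning about.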
Luke: Yeah, [00:16:00] exactly. And I feel like recovery means a lot more now than it may have in the past, where it related to a specific function or business service. Now this stuff is so integrated into everybody's lives. How is your response, not only to the situation, but also to informing customers, right? Or users or patients, whoever the end user is: what's happening, what they need to do to protect themselves, what they should be aware of, at a minimum, right? And then what the company's doing, like a postmortem, as a response to the incident. It's even more important now than ever, because of just how many interdependencies there are across all these different things.
Ed: Yeah, like notifying the ambulance service that you're under a ransomware attack, right? They've got to divert things now. And then notifying the hospital that's going to get the diversion to be ready for it, right? Because they're probably not thinking about that; they're thinking about their typical capacity model, and [00:17:00] they've staffed accordingly. Now all of a sudden you're going to get the capacity diverted from another hospital, and you have to prepare for it. So these things, again, 10 years ago we didn't think about, because data was lost, and when data gets lost, respond and recover is about: how do I notify my patients that we've lost their data? How do I recover from that? It was a very different, and quite frankly much easier, problem than it is today with ransomware, because you may never lose data, but you're going to lose the ability to operate, because your data has now been taken hostage.
Luke: pay that ransom, I mean, users have so much more access, right? Yeah. Like to to that, the information in that system where even things like authentication, they might be using the same thing across everything and then all of a sudden, oh hey, the attacker’s got something new, right?
To play with. It seems like much more, , I don’t know, I just keep going back to integrated, but it does seem though that like this strategy, a good amount of, it’s just stuff that probably should have been good hygiene anyway, right? Like for [00:18:00] companies to like make sure that are we actually using all these things?
How well integrated are they? What are the risks and all that. It seems like in a time when people are kind of concerned about. AI and things like that, you’re almost relying on kind of the brain trust, and their experience to kind of influence, hey, like what should we be looking out for here with these functions or that type of thing.
What kind of cultural changes , or operational changes, do organizations like, should they start thinking about with risk across this life cycle of technology and partnerships?
Ed: Well, again, it goes back to: the old approach doesn't hunt anymore. You really have to think unconstrained about what needs to change, what you need to transform across people, process, and technology to deal with these things, to respond and recover in five days or less, right? And the word "change" isn't [00:19:00] well received by pretty much anyone. People hate change, and transformation is a larger contextual change, and that freaks people out. What ends up happening is, without leadership, you can't transform. Leadership is required for transformation, and transformation is required to deal with these issues, because, to your point earlier, the bar has been raised with AI to a level that most people don't fully understand, because it's still evolving. So let's look at everything prior to AI. Let's look at ransomware, which is the de facto worst risk in healthcare.
Luke: Mm-hmm.
Ed: It usually had to come through some type of identity breach. This notion of someone hacking into your system without having an identity is half-baked, right? Usually they come in through an identity. Well, how do they get the identity? Through a phishing attack. And phishing [00:20:00] attacks are probably one of the largest pain points; the weakest link is humans, as we all know. "Hey, this is so-and-so from Hanover Healthcare. Hey Luke, I'm working on the CEO's laptop and I just need to get into his system. Can you just confirm his password for me?" Or something of that nature, right?
Luke: of those 50, Google Voice, you know what? Caller Id not calls. I get a day,
Ed: Yeah, yeah. Or "can you reset his password," whatever it is, right? And these happen all the time. I got a call from my bank the other day, and I just said, I'm not giving you any information to confirm who I am. I'm not going to do it. I'll call you back.
Luke: Mm-hmm.
Ed: And the woman got all upset. When I got back to her, she was still upset. So I was like, listen, how could you be upset at this? You're asking me for my social security number to verify me. No, you called me; I'm not giving it to you. So anyway, I think people need to be much more vigilant and much more paranoid about these things than ever before. But [00:21:00] in some ways, that's a pretty linear attack.
Luke: Mm-hmm.
Ed: You now add AI and you add agents, and agents require identities, and the problem is the whole identity management stack hasn't really been worked out as it relates to agentic technology. So now you've introduced, exponentially, this thing that can go out and manage tasks and do things on your behalf, and it has no notion of an identity and access management system. And that's a problem right now.
Luke: Yeah.
Ed: And people are just rushing out to deploy this stuff, and I don't think they realize the Trojan horse they're putting into their environments right now, because they're enabling an unfettered access vehicle to all of their things, called an AI agent. It's wild, right?
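One way to picture the missing piece Ed points at is a deny-by-default identity registry for agents, where each agent gets an explicitly scoped identity instead of unfettered access. A hypothetical sketch; the agent names and permission strings are invented:

```python
# Deny-by-default scoping for agent identities. Every agent gets a
# named identity with an explicit permission set; everything else fails.
# All agent names and permission strings here are hypothetical.
AGENT_SCOPES = {
    "scheduling-agent": {"calendar:read", "calendar:write"},
    "billing-agent":    {"claims:read"},
}

def authorize(agent_id: str, permission: str) -> bool:
    """Unknown agents and out-of-scope requests are both denied."""
    return permission in AGENT_SCOPES.get(agent_id, set())

assert authorize("scheduling-agent", "calendar:write")
assert not authorize("billing-agent", "ehr:read")      # out of scope
assert not authorize("rogue-agent", "calendar:read")   # unregistered
```

This is the opposite of the "Trojan horse" pattern: instead of one broadly privileged agent, each one is a first-class principal in the identity and access management system, auditable like any human account.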
Luke: yeah.
Ed: So that's what's changed. And I think [00:22:00] what matters is the ability to understand it and say, hey, you know what? Stop. I know we need to do this, but we're not doing it until we understand these things.
Luke: right.
Ed: So.
Luke: Yeah, you mentioned a really good point around leadership, which I think is something where certain things can get automated away, but good leadership is going to help steward the automation, right? And know that these things over here are probably not the things we want to mess with too much. Or, where do you put the human in the mix? And then there's the accountability piece: if there's no accountability for these things, what's your take on that? It's really interesting talking to somebody in healthcare, because I would imagine, with compliance and things like that, there are penalties and enforcement. Do you see that translating over into other areas in tech, through regulation or other means, to hold companies accountable when things happen?
Ed: So there are regulations in [00:23:00] healthcare with teeth that really drive accountability across the board. Even a regulation as old as HIPAA is still relevant, because HIPAA's forebears, if you will, had insight and made it about more than just confidentiality. They made it about integrity of data, which is a risk with AI drift and hallucinations and other issues with models, and then availability. So CIA, confidentiality, integrity, availability, is sort of the governing principle around HIPAA, and the A is availability: the availability of the systems we just talked about regarding ransomware, right? So from that perspective, there's a vehicle with teeth on it that can be enforced to hold people accountable when these things do go off the rails.
Luke: Does it work pretty well in healthcare, from your perspective?
Ed: Yeah, I think it works pretty well. There's a website that HHS puts out by the Office of the National Coordinator, and the OCR, the [00:24:00] Office for Civil Rights, is the enforcement vehicle for HIPAA. So when there's a data breach or ransomware attack or whatever, OCR goes in, they assess the situation, and they assess a fine or some type of penalty or a corrective action, or both, and then the organization's put on notice.
Ed: If they're doing all the right things and yet somehow this still happened, then they were good stewards of the data and good stewards of the operations, and sometimes things still happen. There's no such thing as a hundred percent security; otherwise you wouldn't be able to access anything. I can make my house that secure, but then nobody can come and go in my house, right? So that's always going to be the risk. The question is: as a steward, did you do what was available? Did you assess the risk? Did you put the controls in place? Did you manage and validate those controls? Did you do the things that the [00:25:00] statute says you need to do? And if you did, and you're able to prove it, then quite frankly they can't find you negligent. They might ask you to do a couple of things, but they're not going to fine you, or if they do fine you, it's going to be small. It won't be multimillion dollars.
It's when they go in and say, holy cow, you have none of these processes and procedures in place, none of your people are trained, it's the wild, wild west. Someone clearly said, we're not going to invest in cybersecurity, and we're not going to do these things in support of this regulation we're governed by. Then someone's held accountable for that. And at the board level, the board has to govern who takes personal accountability for it, what that means from a director and officer perspective, and who at the end of the day is going to carry the weight of the action. And sometimes you get the CISO, the Chief Information Security Officer, who quite frankly is the scapegoat and gets fired, and maybe [00:26:00] shouldn't have been fired, because maybe she went in front of the board and said, we need these resources, and the board said no and didn't give her the resources. It's not always black and white. But make no mistake: in large events like that, someone does bear the accountability and responsibility, and it's usually one person.
Luke: That's a really interesting perspective and insight, because I feel like in the broader tech space there tends to be this binary dilemma around regulation: it's either toothless and doesn't work, or there's a desire to overreach too much. So having somebody who's seen practical examples of this working matters, because that's another thing: we've seen broad, sweeping things, even privacy regulation, get kind of selectively enforced, right? Having examples that actually work, or knowing that there are some, I think is important for people. But to get back to the strategy: in your mind, are there indicators or red flags that separate [00:27:00] organizations that handle cyber incidents well from those that don't, that people could keep a lookout for?
Ed: Yeah. I think one thing to look out for is what level of program, educational and otherwise, exists at the employee level. How does the culture respond to and reflect a cybersecurity program? Are there posters in the hallways or in the elevators? Are people regularly asked to take security tests? And is there a wall of shame for those that don't? Is it managed in a way that's part of the culture? That's always what I look for: is security, is cyber hygiene, as important as washing one's hands in a healthcare setting? We saw the posters for years about hand sanitization and the ability to eradicate viruses and other bacterial infections just by [00:28:00] washing your hands, right? Well, the same approach could be, and is, applied to cyber hygiene. Is that part of the culture? So that's the first thing I look for; that's a good signal. And you can just look around and see. If you've gone up and down the elevators and been through a couple of the hallways and you haven't seen any posters on cyber hygiene, or if you walk by a terminal and it's left open and you can see a patient record on it, that tells you something.
Luke: Oh,
Ed: Oh yeah. And you don't even have to work there. You could be in the hospital yourself and see how people operate; I've seen that often. Are they logging out of their terminal, or are they leaving it open and exposed? Are they talking about a specific patient in an elevator with other people in it? These are things you'll be able to recognize quickly, right?
Luke: Mm-hmm.
Luke: That's helpful.
Ed: The other stuff is: do they have two-factor? What are they doing for multifactor? Is everything multifactor? If it is, great. If it isn't, that's a problem. How are mobile phones being [00:29:00] managed?
Luke: Yeah.
Ed: So there are a number of things you can look at that provide signal. Do they have an AI governance committee? How well represented is it? Do they have a full inventory of all their products and services? They don't have to be advanced, but...
Luke: Right.
Ed: It's amazing when I go into health systems and they don't have, or they're not sure how to get, the full inventory of products or services under contract. It's in multiple systems, or "we have to go to HR," because HR, or rather the HR system, is managing identities and privileged access, and that's the source of truth, right? With these things, within an hour I could tell whether or not a hospital has issues.
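The walk-through signals Ed lists lend themselves to a simple weighted checklist. A toy sketch only: the signal names and weights below are invented for illustration, not an established scoring standard:

```python
# Toy weighted checklist over the kinds of signals Ed describes.
# Signal names and weights are hypothetical.
SIGNALS = {
    "hygiene_posters_visible":  1,
    "terminals_auto_lock":      2,
    "mfa_everywhere":           3,
    "ai_governance_committee":  2,
    "full_vendor_inventory":    2,
}

def posture_score(observed: set[str]) -> float:
    """Fraction of weighted signals present during a walk-through."""
    total = sum(SIGNALS.values())
    return sum(w for s, w in SIGNALS.items() if s in observed) / total

# A hypothetical hospital showing only two of the five signals.
hospital = {"hygiene_posters_visible", "mfa_everywhere"}
print(f"{posture_score(hospital):.0%}")  # 40%
```

The exact weights matter less than the habit: turning "I could tell within an hour" into a repeatable list means two different assessors will walk away with comparable readings.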
Luke: Yeah, and I would imagine surprises come along the way with that one. Given all of this, as we get towards the end, I was curious: what's your outlook on the future? Are you optimistic about [00:30:00] where we're going over the next five to 10 years, with the landscape the way it is?
Ed: Yeah, I think so. Listen, I've been in technology all my career and I've never seen anything like this. I've seen a lot of changes, revolutions if you will, in technology, but compared to this, those were much more evolutions. AI is fundamentally changing everything, and we embraced it day one. We took an architectural approach to it within our risk management product. We built it in from the ground up, but we also made it secure by default, meaning we didn't turn it on out of the box. We gave customers the ability to turn it on, mapped to their adoption maturity. So if they didn't want to use any of it, they didn't have to; they could still get the benefits of our product. But the minute they were ready, they could turn on a subset of features, or all features at once. So we gave them the flexibility to adopt based on where they were and where they wanted to be within that adoption spectrum. And I think AI has [00:31:00] so many opportunities in so many industries that it'll be exciting. And yet it's scary, in some ways, to see
Where society is gonna end up, right? What, what jobs will actually be replaced. We know jobs are being replaced right now. Right? And part of the issue, I was having dinner the other night and I was talking to someone, at one of these large AI companies. And the issue in my mind is in every other evolution as it were, where everybody was afraid about losing their jobs and everything else, like it’s happening here, but the difference is.
The gap between when it happened to, when really jobs were getting lost, there was soak time, there was time to absorb the losses. There was time for people to mm-hmm. Change jobs. So the overall economy didn’t really feel the effect of it at scale. This has the risk of it really affecting the economy because.
The [00:32:00] rate of the rate of job loss, and then where are folks going to go to absorb that is really problematic, I think. Mm-hmm. And I don’t think anyone’s really talking about that. Mm-hmm. And because we should be thinking about, okay, where’s the, where’s the safety net? Where do people go? They do lose their job.
Where should they be going? Right. , And at what scale and when does the government step in and say, okay, enough or put in some breakers along the way to govern? Mm-hmm. The loss. . No one’s talking about that right now. And that’s the thing that worries me the most because I think we’re, humans are resilient, humans are adaptable, we’ll adapt, but at what loss?
What’s the impact because of that, that, that time concentration now we don’t have that time as we did when the internet came along or when the horse and buggy and then the car came along. Right, right. We had time to to absorb those issues as a society. Yeah, of course it [00:33:00] affected people, but it didn’t affect holistically.
Society at scale in a short period of time, and that’s what I worry about. Right? Does that make sense?
Luke: Yeah. Oh yeah. It seems like a mix of the two: the broad applicability, while at the same time a lack of market fit. The power’s there and it could potentially be impacting almost anything, but the market fit isn’t necessarily there yet. You feel that’s coming pretty soon, though, right? It’s all kind of this overhanging cloud, but it’s a fast-moving storm. I think your point on the time, the duration of things, and how fast these things are being considered, or not considered, is a big problem.
So I think that makes a lot of sense. I have a couple of rapid-fire questions, if you don’t mind.
Ed: Go.
Luke: What’s one cybersecurity myth you wish more leaders would stop believing?
Ed: Oh, wow. That technology alone can solve the problem.
Luke: Awesome. What’s one security habit [00:34:00] every company should adopt tomorrow?
Ed: Phishing training.
Luke: And one technology that will most improve cybersecurity in the next five years?
Ed: AI.
Luke: Awesome. Most importantly too, if people wanna follow along with your work, or follow your company or your personal socials, where can they go to check that out?
Ed: Yeah, you can find me on LinkedIn. I also have a podcast; I’d love for you to follow it, or at least listen to a couple of episodes. It’s called Risk Never Sleeps.
Luke: Perfect.
Ed: It’s on Spotify, it’s on Apple, you’ll be able to find it. Or you can go to censinet.com, which is my company’s site, and you can get a link to the podcast there.
I’ve also written for Forbes; I have a couple of articles there as well.
Luke: Excellent. Well, Ed, I really appreciate the perspective and the conversation. I think it’s really helpful in shaping a lot of what’s going on with our audience, and I’d love to have you back sometime to check back in on things. Really appreciate the time.
Ed: Thank you, Luke. I’d love that. [00:35:00]
Luke: Thanks for listening to the Brave Technologist Podcast. To never miss an episode, make sure you hit follow in your podcast app. If you haven’t already made the switch to the Brave Browser, you can download it for free today at brave.com and start using Brave Search, which enables you to search the web privately.
Brave also shields you from the ads, trackers, and other creepy stuff following you across the web.

