
Episode 35

Creating Inclusive and Responsible AI for the World of Tomorrow

What if the future of work could be reshaped by artificial intelligence in ways that empower rather than displace? In this episode Didem Un Ates, founder & CEO of LotusAI Ltd, discusses strategies for organizations to integrate AI in a way that maximizes benefits while minimizing job displacement. We also explore the complex landscape of AI’s societal impact and the need for diverse perspectives in technology.

Transcript

[00:00:00] Luke: From privacy concerns to limitless potential, AI is rapidly impacting our evolving society. In this new season of the Brave Technologist podcast, we're demystifying artificial intelligence, challenging the status quo, and empowering everyday people to embrace the digital revolution. I'm your host, Luke Mulks, VP of Business Operations at Brave Software, makers of the privacy-respecting Brave browser and search engine, now powering AI with the Brave Search API.

[00:00:29] You're listening to a new episode of the Brave Technologist. And this one features Didem Un Ates, who is the founder and CEO of LotusAI Ltd. She has over 26 years of experience in AI and generative AI, including senior roles at Schneider Electric, Microsoft, and Accenture. As a former VP of AI Strategy and Innovation at Schneider Electric, Didem led AI strategies, innovation roadmaps, and responsible AI practices.

[00:00:56] She's received multiple awards for her contributions to AI and [00:01:00] serves as a senior advisor for organizations like the Goldman Sachs Value Accelerator. In this episode, we discuss the challenges companies face when trying to implement AI solutions, potential job displacement due to AI and how workers can prepare,

[00:01:15] the need for regulation in this space, open source projects and innovation in the space, along with ways to get more women working in AI and tactics that have been successful so far. And now for this week's episode of the Brave Technologist. Didem, welcome to the Brave Technologist podcast. How are you doing today?

[00:01:35] Didem: Thank you, Luke. I’m doing great. Greetings to the West Coast.

[00:01:39] Luke: Excellent. Excellent. I’ve been looking forward to this interview. So I think just to kind of set the table for the audience a bit, can you tell us a bit of how you ended up going into doing what you’re doing?

[00:01:49] Didem: Absolutely. So I’ll give a little bit of personal background as well.

[00:01:53] Luke, if that's okay. I'm Turkish, and I was fortunate to get a scholarship after high school. So I [00:02:00] went to the States to study both electrical engineering and management. And I also want to share a bit of personal context there. The driver for me to leave Turkey was, actually, I was coming from a family full of engineers and entrepreneurs and mathematicians.

[00:02:16] So my passion for technology is definitely genetic, I would say, but there was also another, kind of sensitive, area. There was domestic violence in my family. So as much as I love my parents, I decided that I needed to get out, get a great education, and be independent, both financially and psychologically.

[00:02:36] So that was a huge driver. And I couldn't be more grateful to the University of Pennsylvania for helping me with my education, and especially that degree, the Management and Technology dual degree program, that's half technical, half business. That education really set the scene for all of my 26-year career in [00:03:00] technology and also AI, of course. So I'm so grateful, because it's a bit like a mixture of the left-hand side and the right-hand side of the brain.

[00:03:07] If I was in R&D or engineering, I would be closest to sales. If I'm in sales or business, I would be closest to engineering, and I cannot emphasize that highly enough. It's been a treasure. So to answer your question more broadly, I spent about a third of those 25, 26 years in management consulting, Capgemini, EY, and most recently Accenture, where I set up their Microsoft Data and AI practice.

[00:03:32] And then two thirds in the industry: about 10 years with Motorola, a couple of years with British Telecom, and almost 12 years with Microsoft, I would say. And then I also had a brief period at Schneider Electric, leading their generative AI strategy and innovation across the company, with 20 internal functions.

[00:03:52] I would say my AI journey started with Microsoft. Interestingly, first at Microsoft Ventures and [00:04:00] Accelerators, which was the startup arm before M12 or Microsoft for Startups, Unicorns, as you may be familiar with them. It was very clear when I looked around our startup portfolio around the globe, and this is at least 10 years ago, if not more, that the AI startups were the highest-potential startups, maybe 90 percent of the portfolio.

[00:04:22] This is traditional AI, of course, but still, this is where I said, oh wow, something big is happening, and this is where I want to go next. And I think somebody heard me. I'm not sure if it was the universe or something else, but one of our corporate VPs, a wonderful gentleman who became my mentor later on, sent an email around Microsoft that same week saying that, for the first time in the company,

[00:04:47] we are setting up a business AI team within Microsoft Research to incubate the first algorithms for enterprise. I immediately emailed them back saying, I know you're not going to hire [00:05:00] outside the U.S. That was the pre-pandemic Microsoft world, at least, where you had to be in Redmond, in the corridors.

[00:05:07] And I said, look, I can help with volunteer projects. I know the AI startup ecosystem; I can do it, because we'll maybe acquire some or JV with some. I can do that for you, and/or I can also do a competitive benchmark analysis, right? What's Google doing in this space, or what's AWS doing in this space?

[00:05:27] Within about 30 seconds, he wrote back and said yes. And after a very busy summer (you know, weekends are overrated), I ended up joining his team, and it's been the most privileged, most wonderful journey since then, because I got to be in the kitchen, or at least one of the key kitchens, if I may say so, that cooks these solutions.

[00:05:48] And when I joined that team, we basically put together some managed solutions. We later on packaged them as a SaaS AI platform, which became the Power Platform.

[00:05:58] Luke: Oh, okay.

[00:05:59] Didem: I'm not sure if you're [00:06:00] familiar, but it is one of the most, I would say, popular, favorite SaaS solutions out there. And I also see it almost as a grandfather or grandmother of the Copilot suite right now.

[00:06:12] Right. So it's the same thing. I ended up in the AI engineering team afterwards, and then in the sales organization, reporting to the chief data officer, focusing on their largest data and AI engagements, usually with Fortune 500 companies, co-creating these joint roadmaps that Microsoft had on data and AI with the largest companies in the world.

[00:06:34] I also got to lead what we call the operationalization of responsible AI at Microsoft, at Accenture and Schneider Electric, and now I'm actually helping Goldman Sachs and other clients of mine in their ecosystems. What else? I think it's also worth mentioning I got a chance to help with the OpenAI partnership after the first billion-dollar investment.

[00:06:55] That was another highlight of my journey. I think that’s it. Sorry for the [00:07:00] long winded answer, but it’s been a fascinating journey seeing it from inside, I have to say.

[00:07:05] Luke: I love it. I mean, I think there’s a couple of things that I found super interesting about that. Like one is people often think that AI is just something that came out about two years ago, right?

[00:07:14] But you mentioned a good timescale. I mean, you know, the high-level concept has been out for a long time, but yeah, you were actively working on this stuff, you know, 10-plus years ago, right? And not just anywhere, right? Like at Microsoft, these are very active areas. And also I think it's really neat that you've got both that business and the engineering background too, because that's kind of where I live too, at Brave and elsewhere, where so much of this gets hyped out on the marketing side.

[00:07:38] And right now, it seems like a really key time, especially with generative AI, but more broadly too, of like, okay, all these neat little hand-wavy use cases, where is the rubber actually going to meet the road, right? Like, how are these actually going to be businesses? And then how can we make these responsible business decisions around these things?

[00:07:55] It’s really important to have people like yourself with balanced backgrounds that are thinking about these [00:08:00] things from different angles like that. Why is what you do important to you? And how does the work that you do affect the world around you?

[00:08:09] Didem: Thank you. No, these are excellent, excellent questions. Every single one of us needs to be able to answer that question, I think, authentically for ourselves going forward, especially with these technologies taking off.

[00:08:23] So thanks for asking why it's important. Well, if I may start with myself and then hopefully cover the broader, let's say, picture: I shared some very personal information with you, Luke, where I come from, what kind of an emerging market, what kind of a family, what kind of an educational background.

[00:08:43] And that was intentional, because I do think that this taking off, this incredible acceleration and ignition of these AI and generative AI technologies, is coupled with diversity and inclusion, responsible AI, [00:09:00] sustainable AI, and the importance of education. In other words, I am where I am because of all those scholarships and all those excellent educations I was lucky to get.

[00:09:11] It’s basically the combination of everything in my history. Stars aligning, the magic happening, being at the right time, right place, but also being very opportunistic and working hard at those opportunities, making the most of those opportunities is what makes it very exciting and meaningful for me. So it’s, to me, it’s my calling.

[00:09:31] That's very clear to me. And for me, technology is not just a cool thing, an exciting thing that we keep learning and learning. It actually is a huge superpower. It's a magic wand. And I remember the words of one social impact entrepreneur from Chile. I met her at the Grace Hopper conference before the pandemic.

[00:09:54] I think it was 2018 or something. And she said, if [00:10:00] women knew how much they could change the world for the better with technology, we would have so many more females in technology. I couldn't agree more. So to me, that's why I'm on board. But then if I look more extrinsically, I do think this is way more disruptive and

[00:10:18] important than anything we have experienced. And trust me, at our age, we have experienced a lot, right? I mean, we've seen mobile, we've seen cloud, we've seen internet, we've seen online, countless examples. I don't think any of us have seen anything like this, so this is huge. And I am shocked when people are still asking, oh, is this a hype?

[00:10:41] Which planet are you on? I mean, to me, it's a bit like still asking whether the world is flat or something, seriously. But then I am also very, very cognizant that I'm coming from a well-oiled machine where things happen. There are no [00:11:00] constraints on any technology I can try, experiment with, play around with, and therefore things happen.

[00:11:06] I realized only very recently that that's just an exception for the rest of the world. And these are very powerful, rich, et cetera, companies or ecosystems as well. They just see and experience a different world. So it's very understandable that these questions are still being asked. But for us, for me, I mean, this was a question from 12 years ago, and even then I wasn't questioning it.

[00:11:33] You know what I mean? Oh, totally. There's no doubt about the scale of the disruption, from my perspective.

[00:11:41] Luke: And I think you see different things with what's happening with AI, kind of in how people talk about it too. I mean, just to step into the inclusion bit:

[00:11:50] You know, I've spoken with a lot of different people in the AI space, and when I ask a lot of them, like, what is your biggest concern or biggest challenge right now, it's the one field where I hear a lot of [00:12:00] people saying, look, there's a real possibility for entire countries, for the people, to just completely fall behind if they don't start playing with this now or have access to it, right?

[00:12:10] From your experience, is that one of the biggest challenges? What are the biggest challenges when you're looking at how transformative this is, whether it's with inclusion or other areas too?

[00:12:20] Didem: Yeah. So I would like to answer that question in two parts. One is more like, how do we implement this and what are the typical challenges?

[00:12:28] But more broadly, what are the challenges, let's say, as a society? And it's very interesting where to start. When I joined that team at Microsoft Research ages ago, when I looked around, and these are rough percentages of course, but it's the reality, I would say maybe 90 percent around me were Chinese men, then a couple of Canadian men, a couple of French men, and then one Turkish [00:13:00] female weirdo,

[00:13:01] sitting outside the U.S. And I was used to a lack of diversity in technology for ages before then, for decades before that. It's not like this was new to me, but the scale of the lack of diversity in AI was mind-blowing. I mean, genuinely. And this is the technology which needs diversity of perspective the most, maybe, to be more successful.

[00:13:27] I mean, it's not just a nice-to-have; for the products and solutions to succeed, you have to look at tons of different angles. So I went home that night, after we had our first face-to-face meeting, and I said, this is not acceptable. What can I do? And I basically put together a team of 80 volunteers.

[00:13:50] We hadn't even met yet, this global team of volunteers. And we said, let's host Girls in AI hackathons, let's try to tackle this, and start with [00:14:00] gender. I mean, it's just one of the dimensions of inclusion, diversity, et cetera, but it's a very obvious one. So we said, let's do these with high school girls, not to convince them to pick computer science or something in university, but so that whatever they become, artists,

[00:14:15] singers, whatever, they at least are not intimidated by technology but embrace it to become a better artist. That was our mission, a volunteer project again. And I love volunteer projects; I highly recommend them in every industry, not just big tech. We hosted these pilots in Europe and then San Francisco, New York, Seattle, et cetera, and they became so successful.

[00:14:38] Our senior leaders also saw the impact; the clients, the partners like KPMG, et cetera, saw the impact. The girls themselves started hosting these hackathons at their schools, and it became a global program with partners, with everything. So why I share this is that while this marginalization is very real, and it was real even years [00:15:00] back, not just today, each one of us can do something.

[00:15:05] It may be a drop in the ocean. Like, we had a Facebook group of 8,000 girls basically after a couple of years' work, even with the pandemic. So, for me, it's one of the proudest moments in my career. I might have brought in billions of dollars of revenue to many companies by now, but for me, doing that was one of the most meaningful and impactful milestones.

[00:15:26] Now, looking at the picture since then, and more broadly at AI with generative AI, we know that everything only accelerates. Whoever is marginalized will only be more marginalized, because everything is moving so much faster, and those with resources, those with talent, those with innovative skills are running, flying, faster than ever.

[00:15:52] I'm now recently recognizing, look, an even more interesting trend. It's not just about minorities, [00:16:00] et cetera, anymore, like emerging countries or this or that infrastructure problem, access to a PC, internet problems. I'm also experiencing firsthand that if, as a large enterprise or a midsize company, you haven't already started your journey,

[00:16:22] at least looking at your data estate, seriously, good luck to you. Because as you mentioned earlier, whoever starts the journey earlier, starts failing, starts hitting, you know, where the rubber hits the road, as you would say, they learn faster, and it's a journey for everyone. I'm not saying anything against those who don't think they need AI for anything.

[00:16:46] I respect that. But I do think there will be very big missed opportunities and, more importantly, a lot of disrupted, wiped-out companies out there, even industries, probably even functions, [00:17:00] right? So that's why this inclusivity factor is super, super important: bringing everyone along, upskilling and reskilling those around us, wherever, professionally, personally.

[00:17:12] I mean, I feel so responsible for this dimension that I went ahead and got certified in executive coaching, because coaching is the only way; you cannot tell even a nursery child what to do, let alone a C-level executive, oh, do AI or else. It just doesn't work. So I'm trying to improve myself with, you know, learning to ask the right questions, the best questions to inspire.

[00:17:39] So more people can start the journey or continue their journey more effectively.

[00:17:44] Luke: Well, and one of the reasons why I was really excited to have you on, and I think, you know, you just kind of explained it: I think a lot of people might be wondering, well, like, how do I get started? And you're sharing that bit about, oh, we started with high-school-age people, and I think it's just such a key thing. People don't necessarily know [00:18:00] where to start or where to begin.

[00:18:01] But I mean, like, I think, you know, you did a really great job of just breaking it down to, like, we're going to go do these hackathons at high schools. And when you reach somebody at that age, they can steer the ship to get where they want to go with college and all of that, and just exposing them to that.

[00:18:15] Like it’s, it’s really helpful, I think, for people who are listening, who might be wanting to do something like this. And of course, you don’t have to have buy in from the company, but where do you start, right? Like I’ve seen this work really well with high schools too in the past. So it’s really cool to hear you share that with the audience.

[00:18:30] Didem: That was years ago, Luke. So I think right now the age is probably primary school. And for those joining now, it's not about AI or Gen AI; it's about doing cool stuff. So it's much more natural, but there is an intimidation among minorities, especially female students and so on, at least in most education systems around the world.

[00:18:51] And I realized I forgot to answer maybe the second part, which is more around companies, more around organizations. It [00:19:00] doesn't even have to be commercial institutions, basically. The challenge is there. I used to say, oh, make sure of your data; maybe five years ago that was my talking point, but I corrected myself.

[00:19:13] It actually starts with culture, people, and culture more so. In other words, is the environment in which we are trying to implement AI open-minded to looking at this technology end to end, not just POCs and experiments here and there in a lab environment, but actually, how do I use this end to end across all my value chain, internally and externally? Doing that very honest, let's say, self-confrontation, looking at each process and saying, okay, this is where I can use this off-the-shelf tool.

[00:19:49] This is where I can do a use case because they’re all there. I mean, if we do, if we actually step back and carve out the space for our teams, that’s the key. And there are, of [00:20:00] course, other challenges, more typical challenges of responsible AI and so on. But once people get going, actually it does come together.

[00:20:08] So isolated teams or isolated use cases, I would humbly say, are what not to do, because they won't scale, because they won't be embraced by the team next door, because you are making them look bad. Why should they embrace your POC or your very cool experiment, when somebody looking at it from that side will say, why didn't you do this?

[00:20:34] Why didn’t you see? You know, it’s just very natural human nature. So I could talk a lot about this. But anyways,

[00:20:41] Luke: No, no, no. That's why you're here. This is fantastic. I mean, because I think it is one of those things where, again, having your kind of balanced background, you can look at this and say, like, oh yeah, these are the value chains.

[00:20:52] You're not stuck in a silo of, oh, we have to do what these guys over here are doing, or what this company over there is doing. It's more like, I mean, that's how [00:21:00] you hear about so many startups becoming startups, right? Like with Twitter, right? Like the internal comms tool or something like that, that was used for something and has now become, you know, what it is. It's wild.

[00:21:10] To kind of switch gears a little bit: people a lot of times are kind of concerned with things like job displacement with AI. From your point of view, like, where do you see the biggest disruption around displacement with AI? And how can workers prepare for it? And what are the areas where you have the most kind of near-term concerns around this?

[00:21:29] Didem: So I think I would call it landscape changes and job or task changes, for sure. It's definitely going to happen, and it's already happening. And look, I mean, even I, as one of the cooks in the kitchen, experienced it around, I think, March 2023. You know, this was a couple of months after the ChatGPT announcement. We at Microsoft were literally popping out features for the Copilot suite every few days.

[00:21:58] Even I couldn't catch [00:22:00] up with the announcements. And looking at Sales Copilot and this and that, I remember one morning just sitting back. It's almost traumatizing, right? You wake up one day and your job is augmented at such a scale, so fast, that you ask, okay, where do I add? I know this is where it's going.

[00:22:22] So how do I add value? It's very, very real. And I wouldn't even separate out any industry or any function. It's going to be across the board. It is happening across the board. Let us look at software engineers, right? There is now, according to some, already an algorithm that's the best software engineer in the world.

[00:22:44] And guess what? That company is using that algorithm to build itself. Who would have thought it? We're already using an algorithm to build itself as a software engineer. So [00:23:00] ironically, I mean, we are experiencing it firsthand with software engineers and technical people, and it will only cascade. It's a domino thing, right?

[00:23:10] So everyone, I think, I would highly suggest, recommend, that each of us looks at their daily life, business as usual at work, and says, what can I use to minimize any non-value-add or low-value-add task that I'm doing, and how can I make sure I'm focusing on the interesting stuff? Because anything else, whether we like it or not, will be done.

[00:23:39] I think it's better if we shape it, with the tools that we trust and prefer. So that would be my suggestion, rather than saying, oh, it will be first marketing people and then HR people. I mean, there are tons of studies out there; everyone can read them, but I don't think that's the point. I don't think it's about replacing [00:24:00] humans.

[00:24:00] It's making us genuinely more accountable for what we stand for. What value do we want to create in our lives and in our companies? It's tough. You know, the system we created as humanity actually turns us into robots. Ironic, but, you know, I mean, we created these corporations or this work-life, whatever,

[00:24:24] setup where people, most people, billions of people, billions of us, wake up at X hour, go to work at Y hour, and then come back at Z hour. And then we do the same thing over and over again, with maybe minor changes, and go get our salaries. That's actually quite robotic. So when all of that is disrupted, and it is being disrupted, I wholeheartedly believe, then what are we going to do?

[00:24:51] Luke: Yeah, no, well put, well put. It's a big question too. And I think it's one that a lot of people are facing right now. Like you said, I [00:25:00] mean, you've got the algorithm building the software. It's just wild. I mean, and how quickly it's in the operating system, it's in the document software, right?

[00:25:09] It’s everywhere. And at the same time, everyone’s saying, how do we monetize this thing? Right? Like, it’s just super interesting. Where do you think people tend to misunderstand or, you know, underestimate the issues around AI or generative AI from, from your point of view?

[00:25:24] Didem: So, it depends on who we are talking about, Luke, but in general, in my speaking engagements, from high schools and universities to corporate or very senior, let's say, audiences, there is, I think, a very prevalent misunderstanding or disconnect around outsourcing the responsibility, or blaming the technology.

[00:25:47] I always say, look, this is just a brutally honest mirror. It's built by humans, for humans. And when you say, oh, the technology is biased, or this is biased, blah, blah, blah: this is [00:26:00] the reality of humanity. This is the reality of our society. It's the data. I mean, what you feed the technology, it spits out.

[00:26:08] So one of the biggest misunderstandings, I think, is around what the technology is and what it is not. The other thing, I think there is a huge confusion around hype. The word hype really ruffles my feathers. I guess certain stock prices moving south or north have nothing to do with whether the technology is having an impact or not in our lives.

[00:26:35] Can we even think of a life without it? Apps like Uber, I remember that life, but I, I don’t even want to remember it.

[00:26:43] No, I mean, I used to be in Philadelphia and all these super cold places in winter, where you couldn't find a cab for half an hour. So there's that confusion, maybe not a misunderstanding. And I understand, of [00:27:00] course, valuations and stock prices are tied to the returns on investment, but

[00:27:04] I think people who see the technology at work, who are able to experience it in their hands, able to get things done, there is no way of going back. It's an eye-opener. Whereas there are, I think, misunderstandings. There's the hype misunderstanding, for sure. And I think that fear in general, intimidation, am I going to lose my job, is of course not a misunderstanding, but it's a big area of contention, obviously.

[00:27:37] Again, in all my engagements, exchanges, I try to say, like, how can you not see the opportunity for you? Yes, change is scary, exactly, but at the same time, it's our generation, and it's up to us, up to good people trying to do good things, to shape this. And hiding under [00:28:00] the rug doesn't make it go away.

[00:28:02] The only way is, you know, getting up and doing something about it, something about our careers, our lives, and so on. So I think those three areas seem to come up a lot in my communication.

[00:28:16] Luke: That makes sense. I mean, I think a lot of it, too, is you have people who are kind of like yourself, which is what I'm gathering from this conversation, where, you know, you see the opportunities and you go seize them, right?

[00:28:26] And then you have a lot of people that are just not quite sure what the opportunities are yet. And I think we're at that really interesting time where people like yourself can say, well, actually, here's where you can look and kind of create these opportunities, and what you're going to find on your own is going to be amazing.

[00:28:42] Another thing that comes up a lot is regulation in the space. I mean, given how disruptive it is, but also kind of like what we're talking about too, where it's obvious to some, right, but others, especially maybe politicians or, you know, regulators, may not see it as clearly, or from [00:29:00] the certain, you know, vantage point that you would have,

[00:29:02] right, having worked around these companies building these technologies, et cetera. How much is what you're seeing now in the regulatory space impacting your work? Is it something you're really concerned about, or are you kind of less concerned about it? I mean, what would be your general take on how regulation is going in the space?

[00:29:19] Didem: So thanks for asking. And I have always been a very passionate responsible AI champion wherever I am. And responsible AI does require some boundaries, which are usually set by regulations. I mean, it could be by kind intentions, but, you know, we couldn't rely on kind intentions for murders or whatever, you know, everything going wrong in our society.

[00:29:45] So I do believe we need more regulations. I have spoken at EU Commission panels and so on, where I was saying, we shouldn't still be talking about EU AI (this was years ago, before the pandemic); we should be talking about the responsible [00:30:00] metaverse and beyond. Catch up, please. Let's get up to speed. And those were my sincere feelings and thoughts at the time.

[00:30:06] There's always this catch-up of at least five to ten years in terms of regulations. Having said that, regulations need to be reasonable and implementable, right? I mean, the water will flow only one way. There's no way we are going back, or let's pause, or whatever; those things do not sound very feasible to me at all, and never did.

[00:30:26] I do think there is very positive progress by regulators to try to genuinely catch up, understand the technology, and get help from industry and academia. So there is all goodness. But then, look, I mean, once I left big tech (I didn't really leave, but formally, let's say, when I stopped working for the likes of Microsoft), I realized, even a step away from regulatory topics, how hard, simply hard, it is for employees, regular [00:31:00] people, to play around with, experience, and use this technology.

[00:31:04] I mean, for God's sake, people can't even get on the website of an AI app like otter.ai or whatever. You know, I'm not trying to promote any app, any AI solution here, I'm just giving an example. How can you decide whether you're going to build a use case or invest in a company or not, if you cannot even access the thing? And this made me realize a much broader, large-scale problem,

[00:31:36] where AI initiatives are often bundled under the CIO or CTO umbrella, and, with all due respect, fantastic teams and leaders there, of course, but it becomes priority number 100 amid the cost-cutting, the cost-consciousness, or the keeping-the-lights-on type of [00:32:00] worries, rather than looking at it holistically and saying, I need a sandbox.

[00:32:05] These employees have to use this. Maybe not 100,000 people, but these 1,000 people will have to have the freedom to try and access whatever they need, so that they don't have to try things on their mobile or go home and try it on their personal PC, which is even more dangerous with company data.

[00:32:26] So I'm tying those two together, because I think regulations can only put up guardrails so much. If we have these artificial guardrails, because people are so scared that if something goes wrong their head will roll, then you're already dying. It's just a slow death, I mean, because your talent cannot get the skills to understand the technology and keep up with it.

[00:32:55] So this is a much bigger problem that I think could be handled [00:33:00] with more training, more awareness, and evangelization, you know, in parallel to regulations, if that helps.

[00:33:07] Luke: No, no, it does. To kind of branch off of that a little bit, one thing that’s really jumped out has been how much open source adoption there’s been around AI, like, what’s your take on how you’ve seen open source projects developing in the AI space compared to like what you’re really familiar with, with the big tech side of things?

[00:33:26] Didem: So, on the big tech side of things, I'm a super transparent person, to my own detriment, I guess, but I prefer to be so. Anyways, I think it all boils down to the financials, right? I mean, the bottom line. If you have other incomes, let's say like Meta or something, then of course you can go open source, because you're winning your bread from other things.

[00:33:52] If it's your bread and butter, of course you don't want open source. So the big takeaway on decision-making there is, I would think, that it's a [00:34:00] little more ROI-driven. Having said that, there are millions and millions of wonderful people who are trying to do great things with technology and share it with the world.

[00:34:10] So that's a noble, let's say, motive. I do think the two will have to work together, and the key thing is genuinely awareness of responsible AI, training in responsible AI, and standing up for it as individuals. No one, no one, not governments, not corporations, not universities or academia, can take care of such high risks.

[00:34:39] I mean, I love these technologies, don't get me wrong, but of course they come with their huge risks. They can't do it alone. We need to uplift ourselves globally and holistically. While that may sound very touchy-feely and idealistic, I do think it's possible, right? I come from a country that struggles with democracy.

[00:34:57] It has been even more so recently. And [00:35:00] that's one of the biggest regrets in my personal life, right? I don't want to see the same thing repeated with technology. So that's why I really shout about and evangelize responsible AI. If we get those principles, that awareness of what might really go wrong and right, the balance will come naturally.

[00:35:20] How much open source, how much closed, whatever. I think that's as much as I would be able to say. I'm also watching this space. Look, to be honest with you, it's a very interesting debate these days. I would add that I'm on the board of the tinyML Foundation; I think that's also an interesting twist.

[00:35:38] I loved it. I mean, they reached out to me, I think it was January or February, to join them, and I did not think for a second, because the moment we have these AI tools in our handsets, in our glasses, and everything, and I'm a big fan of those products, with, for example, 8 billion people basically using them, it [00:36:00] automatically gets democratized, and the accountability will also be distributed, decentralized.

[00:36:06] So it's a way of democratizing. Edge AI, is what I'm trying to say, is another way of looking at the same topic, I think. And that's the only way to go right now. The cloud and these closed systems, it's a very elitist, shall we say, very small circle, right? You know, I don't need to give numbers, but it's a super small, even geographically, it's a super small space.

[00:36:33] But that will change big time in a matter of months, not even years in my humble opinion. So that’s going to be also interesting to watch.

[00:36:43] Luke: That’s fantastic. Anything that we didn’t cover that you might want our audience to know about?

[00:36:48] Didem: Oh, thanks for asking. Something that I think we touched on, but I would genuinely like to emphasize and amplify is talent transformation.

[00:36:57] So upskilling and [00:37:00] reskilling is the only way to go in this space. You asked about challenges, or where to start; where to start is really people and talent and culture, culture most of all. I wrote a blog about this recently for the Forbes Technology Council. If our organizational cultures are still home to, housing, keeping, let's say, bullies, toxic behaviors, et cetera, all these behaviors will only get amplified and accelerated

[00:37:31] with these technologies. Whatever AI use cases or tools we implement, they will be used for these types of behaviors as well, and in much worse ways. So we need to look very honestly at where we choose to live, where we choose to work, and clean up that environment if it is housing some of these behaviors, and say, okay, now, end to end, how can I uplift and make the most of [00:38:00] this opportunity internally and externally, for my top line, bottom line, everything, you know, revenue impact and cost-cutting impact and so on.

[00:38:07] And that's only possible with talent transformation. So I help companies on that front, because many people don't know how to start. For instance, Gartner has this very simple framework, a talent transformation matrix for AI. I'm a big fan, and it helps; that simple matrix helps you look at your organization.

[00:38:30] Let's say 100,000 people. First of all, where are my people from an AI and gen AI skill set, or a responsible AI skill set? If 80 percent of the population is absolutely clueless, that gives you an idea of where you are starting. Because when we do a gen AI scan across a company, the savings, the cost and productivity gains, let's say, are somewhere between 10 and 40 percent.

[00:38:54] These are huge numbers, huge numbers. CFOs and boards immediately get super [00:39:00] excited, but it doesn't mean we lay off 30 or 40 percent of the workforce. It just doesn't. Because even if we did that, there is not enough skilled talent to do the good things we want to do. The only way, therefore, is upskilling and reskilling.

[00:39:14] So coming back to this framework, we look matrix by matrix, team by team, business unit by business unit or country by country, to say, okay, I am in this box, where I have some skills in, let's say, some tools, but I'm absolutely clueless in others. How do I go to this box that has more skills in the area that I care about? It could be marketing, it could be HR, it could be whatever, you know, supply chain.

[00:39:35] It could be HR. It could be whatever, you know, supply chain. So putting down. Sitting with HR, chief people officers, training officers, and laying that training plans team by team is so important, I cannot emphasize enough. That’s something, if any of your audience would be, I would be delighted to help with because [00:40:00] most people are stuck there because HR leaders very understandably are not too much in AI.

[00:40:07] I mean, they may be using some, but they're not going to be experts, understandably. But that's actually where everything starts and ends. And anything in between that doesn't care about the people, the talent, will not succeed anyway; it will be a short journey. So I really wanted to emphasize how important it is to do this proactively and compassionately, because it's also true that if we don't think about our talent and teams compassionately, there's also no success

[00:40:38] down the path, right? Because people resist, right? Very understandably; nobody's stupid. So that's another angle that I think we need to take very seriously and look into. And unfortunately, I see very few organizations actually doing something about it.

[00:40:57] Luke: Where would you recommend they follow you to learn more [00:41:00] about some of these things or to contact you or, or just to keep up on what you’re putting out?

[00:41:04] Didem: Thank you for asking. Yeah, I mean, I think LinkedIn would be easiest; I'm quite responsive there. Or I'm happy to provide my company website, lotusai.co.uk, or didem@lotusai.co.uk. I'm very, very approachable. I may not be able to respond immediately, but I promise I will get back to them.

[00:41:25] Luke: No, that's fantastic.

[00:41:25] We'll be sure to add your LinkedIn to the show notes. And yeah, I really appreciate your time and all the insight. This has been a really fantastic conversation; we'd love to have you back sometime in the future.

[00:41:38] Didem: Thank you. Likewise, Luke. Wonderful to talk with you.

[00:41:41] Luke: Yeah. Thank you very much.

[00:41:43] Thanks for listening to the Brave Technologist podcast. To never miss an episode, make sure you hit follow in your podcast app. If you haven't already made the switch to the Brave browser, you can download it for free today at brave.com and start using Brave Search, which enables you to search the web privately.

[00:41:59] Brave also shields [00:42:00] you from the ads, trackers, and other creepy stuff following you across the web.

Show Notes

In this episode of The Brave Technologist Podcast, we discuss:

  • How individuals can pivot towards high-value tasks in an evolving job market redefined by AI.
  • The importance of responsible AI regulations and the role of open-source innovation.
  • Strategies to foster a more inclusive and innovative tech landscape, including initiatives such as Girls in AI hackathons that are designed to inspire young women to embrace tech careers.
  • The need for inclusive AI to prevent widening the resource gap.

Guest List

The amazing cast and crew:

  • Didem Un Ates - Founder & CEO of LotusAI Ltd

    Didem Un Ates is the founder & CEO of LotusAI Ltd. She has more than 26 years of experience in AI and generative AI, including senior roles at Schneider Electric, Microsoft, and Accenture. As a former VP of AI Strategy & Innovation at Schneider Electric, Didem led AI strategies, innovation roadmaps, and responsible AI practices. She has received multiple awards for her contributions to AI, and serves as a senior advisor for organizations like the Goldman Sachs Value Accelerator.

About the Show

Shedding light on the opportunities and challenges of emerging tech. To make it digestible, less scary, and more approachable for all!
Join us as we embark on a mission to demystify artificial intelligence, challenge the status quo, and empower everyday people to embrace the digital revolution. Whether you’re a tech enthusiast, a curious mind, or an industry professional, this podcast invites you to join the conversation and explore the future of AI together.