"The Data Diva" Talks Privacy Podcast
The Debbie Reynolds "The Data Diva" Talks Privacy podcast features thought-provoking discussions with global leaders on data privacy challenges affecting businesses. This podcast delves into emerging technologies, international laws and regulations, data ethics, individual privacy rights, and future trends. With listeners in over 100 countries, we offer valuable insights for anyone interested in navigating the evolving data privacy landscape.
Did you know that "The Data Diva" Talks Privacy podcast has over 480,000 downloads, listeners in 121 countries and 2407 cities, and is ranked globally in the top 2% of podcasts? Here are some of our podcast awards and statistics:
- #1 Data Privacy Podcast Worldwide 2024 (Privacy Plan)
- The 10 Best Data Privacy Podcasts In The Digital Space 2024 (bCast)
- Best Data Privacy Podcasts 2024 (Player FM)
- Best Data Privacy Podcasts Top Shows of 2024 (Goodpods)
- Best Privacy and Data Protection Podcasts of 2024 (Termageddon)
- Top 40 Data Security Podcasts You Must Follow 2024 (Feedspot)
- 12 Best Privacy Podcasts for 2023 (RadarFirst)
- 14 Best Privacy Podcasts To Listen To In This Digital Age 2023 (bCast)
- Top 10 Data Privacy Podcasts 2022 (DataTechvibe)
- 20 Best Data Rights Podcasts of 2021 (Threat Technology Magazine)
- 20 Best European Law Podcasts of 2021 (Welp Magazine)
- 20 Best Data Privacy Rights & Data Protection Podcast of 2021 (Welp Magazine)
- 20 Best Data Breach Podcasts of 2021 (Threat Technology Magazine)
- Top 5 Best Privacy Podcasts 2021 (Podchaser)
Business Audience Demographics
- 34% Data Privacy decision-makers (CXO)
- 24% Cybersecurity decision-makers (CXO)
- 19% Privacy Tech / Emerging Tech companies
- 17% Investor Groups (Private Equity, Venture Capital, etc.)
- 6% Media / Press / Regulators / Academics
Reach Statistics
- Podcast listeners in 121+ countries and 2641+ cities around the world
- Over 468,000 downloads globally
- Top 5% of 3+ million globally ranked podcasts of 2024 (ListenNotes)
- Top 50 Peak in Business and Management 2024 (Apple Podcasts)
- Top 5% in weekly podcast downloads 2024 (The Podcast Host)
- 3,038 average 30-day podcast downloads per episode
- 5,000 to 11,500 average monthly LinkedIn podcast post impressions
- 13,800+ monthly Data Privacy Advantage Newsletter subscribers
Debbie Reynolds, "The Data Diva," has made a name for herself as a leading voice in the world of Data Privacy and Emerging Technology with a focus on industries such as AdTech, FinTech, EdTech, Biometrics, Internet of Things (IoT), Artificial Intelligence (AI), Smart Manufacturing, Smart Cities, Privacy Tech, Smartphones, and Mobile App development. With over 20 years of experience in Emerging Technologies, Debbie has established herself as a trusted advisor and thought leader, helping organizations navigate the complex landscape of Data Privacy and Data Protection. As the CEO and Chief Data Privacy Officer of Debbie Reynolds Consulting LLC, Debbie brings a unique combination of technical expertise, business acumen, and passionate advocacy to her work.
Visit our website to learn more: https://www.debbiereynoldsconsulting.com/
"The Data Diva" Talks Privacy Podcast
The Data Diva E217 - Flo Nicolas and Debbie Reynolds
Debbie Reynolds, "The Data Diva," talks to Flo Nicolas, J.D., Chief Impact and Community Engagement Officer, ARMI - ReGen Valley Tech Hub. Flo shares insights about her career transition from law to technology, discusses the challenges she faced navigating government contracts, and emphasizes the importance of mentorship in her professional growth. The conversation touches on the mental health impacts of corporate life and the necessity of taking risks and learning from failures, highlighting the value of community engagement and personal branding in fostering professional development.
The discussion also addresses pressing data privacy issues and the implications of emerging technologies, particularly for children. Flo expresses her concerns about the dangers of deepfake technology, sharing a cautionary tale that underscores the need for early education on online safety. We acknowledge the alarming trend of diminishing privacy rights in the face of advancing technology, with Flo noting that many individuals are desensitized to privacy notifications. We agree on the importance of simplifying privacy information to empower users, especially diverse groups like students and small businesses, to better navigate complex privacy settings.
Additionally, we examine the dual nature of artificial intelligence, recognizing its benefits while addressing significant risks such as algorithmic bias and the need for human oversight in AI decision-making. We raise concerns about the effectiveness of current regulations and the necessity for companies to comply with ethical guidelines. The conversation concludes with Flo advocating for increased investment in critical technologies, emphasizing the importance of oversight to mitigate risks like data breaches. She also mentions her new role, which involves developing a data analytics dashboard, highlights the need for a supportive team to assist in this endeavor, and shares her hope for Data Privacy in the future.
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me "The Data Diva." This is "The Data Diva" Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:25] Now, I have a very special guest on the show, Flo Nicolas, all the way from the East Coast of the US. She is the Chief Impact and Community Engagement Officer at ReGen Valley Tech Hub.
[00:38] Welcome.
[00:39] Flo Nicolas: Thank you, Debbie. It's great to be here with the Data Diva. Listen, I bow down. People can't see me, but I'm bowing down to you, because you are absolutely phenomenal and I learned a lot from you.
[00:52] Debbie Reynolds: Aren't you sweet? We met many years ago. I think we've been connected on LinkedIn for many years. And I remember you had reached out to me. I think you were in transition with different things you were working on.
[01:03] And back then, well, see, you're a superstar now, so everyone knows you. You're on social media, you're getting all these awards, and I see you and all the things that you do.
[01:13] But back then, I think you were a little bit nervous about kind of stepping out on your own in the spotlight, and you've really just taken the reins and gone so far.
[01:24] So I get so excited when I see you on social media and the things that you're involved in, and, you know, I'm proud of you.
[01:32] Flo Nicolas: Well, I appreciate that, but like I said, it was looking at people like you, and me reaching out and saying, hey, I don't know where to go, where to start. Because I admired you, and those words of wisdom really helped me get on the right path and really also helped boost my confidence as well.
[01:51] Debbie Reynolds: I don't think you need a confidence boost.
[01:55] Flo Nicolas: I felt like I did, but I guess it was hidden. You know, I spent several years in corporate, so there's something about corporate that I feel like, when I talk to other people, sometimes for some people, not all, really has an impact on your
[02:09] mental psyche.
[02:11] Debbie Reynolds: Right. Because it's almost like a subjugation, right? It's like you're in Santa's workshop. So Santa only really cares about the thing that you do to produce his toys, as opposed to
[02:22] what your real talent is. So I think that you really stepped into what your real talent is.
[02:28] Flo Nicolas: Well, I love that analogy. I think I'm going to have to use that.
[02:32] Debbie Reynolds: Well, tell me about your journey. It's fascinating. So you're a lawyer, you're into tech, you're a great public speaker. I love the things that you share. You have so much empathy when you talk about different things.
[02:46] And so just tell me, how did you get here, basically, in your career?
[02:54] Flo Nicolas: I think the best way to describe it is I woke up and realized that you have to be able to fail and take risks. I'm a self-diagnosed overachiever. One of the things I did in high school was speech and debate, and I really loved it.
[03:09] I loved going to the competitions and debating. And so with that, you know, I ended up taking the path of going to law school.
[03:18] Just after graduating law school, you know, I had these big dreams. I'm going to be this big tech baddie lawyer.
[03:25] And it's just, you know, the opportunities didn't really open like that for me.
[03:32] And so after practicing for a while, I decided to transition into something else. I was doing bankruptcy, so I ended up going corporate and just did a career pivot, ended up on the business side of the house and I was in corporate technology operations working specifically on modifications to cell towers.
[03:53] So I was involved in a lot of projects. We did 4G, and before I left, we were working on the 5G project, and it entailed quite a bit.
[04:04] It was CTO, Corporate Technology Operations. But my role encompassed vendor management, project management, and real estate management, because, you know, cell towers are real estate, and we were also co-located on buildings; you'd find us on hospitals, and sometimes we were co-located on government buildings as well.
[04:23] So we'd have to deal with the VA, or maybe a military base or an airport. I was working with the legal team, because my team that I would oversee would have to negotiate, draft, and redline our license agreements that allowed us to actually be able to go on sites and change out equipment and do other stuff.
[04:46] I was also working with the regulatory team, right? Because the telecommunications industry is heavily regulated, so everything needs to be done in accordance with regulation, whether it's local, state, or federal.
[04:59] So in that role I had to wear many hats and I'm thankful for that because I was able to leverage my legal background, but I was also able to learn additional new skills and just became just really fascinated with just how things worked.
[05:16] Right. I'm like in this company that's literally helping millions of people be able to communicate right through text, email, phone calls. I don't know if you remember back in the day when cell phones came out, we were just happy we could even make a call.
[05:33] Right now we have all these demands like, well, I need to be on a call and I need to be texting and emailing and being on social media at the same time.
[05:42] So I just became really interested in just how technology worked. But, you know, after doing that for seven and a half years and just not being happy, as you said, being in Santa's village and producing toys but not really being elevated to the next level, I left cold turkey.
[05:59] I knew there was more out there for me and I knew that I had to go explore because I wasn't happy.
[06:08] And I felt like I lost control of what really determined my purpose and my happiness, and I had to go out there and explore. So I did that. And, you know, I've had many failures and many successes.
[06:22] But fast forward: that big risk that I took, knowing the power of getting a mentor and why having a mentor is important, understanding why you've got to be out there networking, going to industry events, networking with other leaders, looking out for opportunities, being able to articulate your skills and showcase who you are and your capabilities, and not being afraid to ask for opportunities, or to propose and say, hey, I've noticed this is where you're lacking and here's where I can come in and help you. That is what has landed me my role now, which is the role that I was fighting for a couple of years ago.
[07:11] But I've kind of had to go through these three years of self-discovery, rebuilding my self-confidence, understanding my value, my worth, what I'm able to do. And now, you know, with the Biden-Harris administration designating tech hubs across the country, my state was designated as one of them.
[07:31] So I'm fortunate that I'm working with the lead company here for our tech hub in New Hampshire, called the Advanced Regenerative Manufacturing Institute, and I'm now essentially their Chief Impact and Community Engagement Officer.
[07:47] So long narrative.
[07:49] Debbie Reynolds: Yeah. Well, it's a good narrative. It's a good narrative. Let's talk a little bit about brand. I know some people, when they hear that word, it makes them nervous. You know, I think everyone has a brand, and I'm sure that you agree with this as well.
[08:02] But the one thing that you did that I think is really important, and I would love for you to opine on this, is that you are not your job. I think a lot of people feel like they are,
[08:16] but that's not the truth. And so being able to express yourself and who you are, the things you're interested in, things that you want to do, I think is very important in your career. But I want your thoughts.
[08:29] Flo Nicolas: Yeah, it's funny that you said that the other day as I'm scrolling through LinkedIn, you know, everybody is prospecting to sell a product or offer a service.
[08:38] And when I see some people, I connect them with, oh, this is the service they offer. I think for me, because I've used LinkedIn for self-discovery, what I've noticed is that sometimes what you have to offer is yourself.
[08:55] And when people see who you are, you know, as a person. Like, I talk about my advocacy in the community. You know, this is something that became really important to me because I wasn't paying attention to what was going on in my community.
[09:09] And I didn't, you know, really kind of know that, hey, you can serve on these boards, right? You're not happy with what's going on? Well, it's not just about being in the house complaining.
[09:19] You can go out there and be that agent of change, you can go on these boards, you can petition yourself, or through your networking, you can have someone essentially nominate you to be on these boards.
[09:32] So I talk about that, and when you think about the advocacy work that I do, it aligns quite well now with the current role that I have. It's: tell me what you do without telling me what you do.
[09:46] Right? It's like, you don't have to be, you know, so focused on: we offer a banana peeler, and this banana peeler is great. There's also something beyond the banana peeler that you're selling, for example.
[09:59] That's a bad example, but I didn't have anything else I could think of. But beyond that tool or that service you're offering, there is you, the human.
[10:09] And to me, sometimes when I see posts of people when, yes, you know, they talk about how they're great with, you know, data, how they're great with contract negotiations. But one of the things that I noticed that what resonates with people is the human element, right?
[10:24] When I see someone who is like, hey, I had a bad day, and can we all just agree to be kind to each other? I notice those posts, like, blow up, because that is what some of us go through on a daily basis.
[10:40] You know, just trying to be empathetic, just trying to show respect and get respect. So for me, I use my social platform to really raise awareness of things that matter to me as Flo the person, and then also once in a while kind of commingle that with the hands-on work that I'm doing in my community.
[11:03] Because for me, I feel like it's one thing to be an advocate and say, oh yeah, you know, I'm for green energy and planet Earth, but it's another thing to also show the action that you are taking to improve the Earth, for example.
[11:20] Right. And that's one of the things: when I go to meetings in my role, and they're like, hey, we should have this meeting and bring all these business leaders together, you know, and I'm like, that's great.
[11:32] But at the end of the day, what is the action item? Right? What is that tangible result
[11:39] that we're going to showcase? And that's what I use my social media for: to say, here's what I am and here's what I believe, but here's me in real time showing you the things that I stand for.
[11:52] That way, you know, this is not just for clout, this is not just for notoriety on social media, but I'm a real person who believes in certain things, and then I take action to implement those things, and then I use what I do to inspire, motivate, and encourage others to do it.
[12:11] Because if they see me doing it, I want them to know that they can do it. And that's my brand. That's my brand.
[12:17] Debbie Reynolds: I tell companies, I want your talk and your walk to match, because we can get wrapped up in talking without actually doing anything. So that action item is really key, and I think you're doing a brilliant job of communicating that through your social media and your brand.
[12:35] Flo Nicolas: Oh, thank you. I definitely appreciate that, because, you know, I wrote a post about this: sometimes you need that feedback, because change doesn't happen overnight, and sometimes it's slow.
[12:45] So when I get those text messages or emails from people. Like, I just got a text message from somebody, and they're like, I moved out of state, and I just want to take the time to thank you for, you know, what you've done to welcome me here in our state.
[12:59] And I was like, oh, I feel like I didn't even do anything. And she's like, oh no, no, don't say that you did. So I appreciate that.
[13:06] Debbie Reynolds: Well, what's your thoughts about privacy? So privacy for me, I think, is a horizontal issue.
[13:15] How the data of humans is handled, I think, is always a concern within organizations. And so I just want your personal views about where we are in privacy, or the things that concern you most right now about privacy.
[13:29] Flo Nicolas: Yeah. And I think I joked about this on your post. I can't remember what you posted about, and I think I said, do we even have any left? Right? Because that's how I feel as just an ordinary citizen.
[13:41] Never mind the fact that I love just following what goes on in technology and just, you know, trying to keep up to date with the attempt to maintain and protect our privacy, you know, with these advancements in technology that we're seeing, especially in generative AI.
[13:58] But I feel like we're at a point where we're almost numb to the fact that our privacy control seems to be slipping away.
[14:13] And why I say that is I'm just noticing all the data breaches that happen.
[14:20] I would say if you get a group of people together, you're going to have a couple of people raise their hands and say, yes, I've received some type of notification from, you know, my bank or something, saying that there's been a breach.
[14:35] And a couple of years ago, maybe over five years ago, I would freak out, right? I would start checking my credit score, I would, you know, start going into my bank account.
[14:50] You know, I would just be a nervous wreck.
[14:53] And now I feel like it's almost becoming the norm. It's like, oh, there was another breach with a health institute, and you're like, oh man, again, you know, or you get a notice and you're just like, oh geez, again.
[15:07] And I feel like there's a loss of control. With these advancements in technology, we know that the bad actors are now even more empowered and have better means of essentially invading our privacy.
[15:28] In addition to that, what I'm noticing as somebody who uses technology, especially, you know, playing around with generative AI tools, is there are some people who don't read the privacy notices, right?
[15:39] They're just so excited to try out this new technology, but there isn't really a full understanding of, well, what happens with this data? Who has visibility into this data? You know, how do I essentially put a pause on their ability to use the data that I input into their software to train the model?
[16:05] Because when I do presentations sometimes about technology and AI and the risks and the benefits, one of the questions I ask is, you know, hey, when you get those pop-up windows about cookie consent, how many people actually read those? And it'd be a room of maybe 40, 50 people.
[16:26] And I might see two people raise their hands.
[16:29] Right. And so people are just numb now, I feel like. And yes, are there fears of, you know, hey, I don't want somebody to deplete my bank account? Yes, there are.
[16:42] For people in the tech world who are true techies, you know, there's keeping up with understanding the complications of technology and how it might infringe on our privacy rights.
[16:53] But when we come down to, like what you said, the ordinary citizen: yeah, they're concerned, but I don't know if they fully understand how easily they give up some of their privacy rights by failing to read some of those notices.
[17:12] And then they end up like, wait a minute, that's my picture. Why is my picture there? You know, it's like, well, did you read the notice? You know, so, I mean, I think that's what I feel.
[17:26] I feel like we're becoming numb. We're inundated with technology, and it can be overwhelming. I get overwhelmed, you know. I can't keep up with all the different organizations and entities that are trying to figure out how to create these guidelines and frameworks for protecting our privacy.
[17:46] If, you know, the brainiacs trying to figure it out can't manage it, how can we have those expectations of ordinary citizens: that they can protect their privacy, that they can exercise more due diligence in maintaining it?
[18:01] So I think that's where we are going, unfortunately. Yeah, we're becoming numb to what privacy really is.
[18:08] Debbie Reynolds: And I don't think you should have to be a lawyer to understand what people are doing with your data. It should be more simple for a common person to understand that.
[18:17] Flo Nicolas: I agree, and I agree. And you make a great point. I mean, let's just think about some of the people who are using this technology. We have students that are using it, we have small businesses that are leveraging it to write their social media branding and marketing.
[18:36] This technology is being used by various people from various backgrounds, and that means various industries and education levels as well. So it should be simplified, right? It should be simplified for everybody to understand.
[18:53] And when you look at some of these tools, you know, you've got to click through different parts of the tool to go shut off the data, right? If you don't know where it is... Like, I saw people posting on LinkedIn where they were like, oh my God, LinkedIn is now using the data, our content, to train their AI model.
[19:16] And people were posting, just press this link and it'll bring you directly there. But one of the things I'm noticing is that, yes, it's awesome that there are people out there who are helping educate, but there are some people who wouldn't even know that this is going on unless they're, again, researching and taking the time to understand the platforms that they're using.
[19:40] But in all fairness, too, sometimes some of these platforms aren't really,
[19:47] what's the best word, transparent enough; they're not always fully transparent and don't make it easy for you to find the information. Like, you've got to go dig for it. I mean, just look at some platforms where you just try to look for the 1-800 number.
[20:00] You can't even find that number sometimes. Yeah. So it's definitely a challenge for, I think, the average person, and it shouldn't
[20:07] Debbie Reynolds: Be like that. Data breach-wise, even though, thank goodness, I don't use the service, but 23andMe, they had this huge data breach.
[20:17] I think where it stands now is that they had a settlement for like $30 million, and people are supposed to get, you know, maybe in a couple of years, a $5 check from them.
[20:26] I don't know. But one of the things that they did was they said, well, we're going to give you free credit monitoring for three years. And I thought, how does that associate with a breach of my biometrics?
[20:39] I just don't see the connection.
[20:41] Flo Nicolas: Right.
[20:42] Yeah. I mean, they're not the only one. We've seen several big companies that have had unfortunate, you know, cybersecurity breaches. I mean, T-Mobile, for example, is one, and I believe AT&T had one as well.
[20:59] And we've seen a lot of, you know, in the healthcare industry as well. We're seeing that too. I mean, it's definitely, like I said, it's scary times because one of the things that we have to understand is just like we're leveraging technology, the bad actors are leveraging technology as well.
[21:19] So you have the people who are using it for the benefits for automation and, you know, other additional benefits that, you know, we're seeing in the healthcare industry, auto industry, fintech, legal tech, you name it.
[21:35] But then we're also seeing, you know, bad actors who are like, oh, we can make a lot of money off this. But it's like, once the damage is done, there's now that kind of question for subscribers or users or customers of those companies: am I safe?
[21:54] Right. And yeah, getting that $5.
[21:58] I'm not sure how that helps. Right? Because at the end of the day, it's always hard to really track, you know, and this is where now you've got to be extra vigilant in terms of really monitoring your accounts and your credit, because sometimes you don't notice the small things, right?
[22:17] Because, you know, I've gone to cybersecurity summits where they say sometimes it's 25 cents; you're not going to notice 25 cents missing out of your bank account, right?
[22:29] And then they gradually increase the amount, and by the time you realize, wait a minute, all this money has been taken out, you haven't noticed it because each amount has been so small.
[22:45] But that's what it opens up, right? It opens up, you know, this vulnerability now.
[22:52] But I don't know what the solution is, honestly, you know, I don't know. Yes, they're going to do a settlement, but at the end of the day, does that really appease the users and the customers?
[23:07] Like your information is out there and it's almost like now for the rest of your life, you really have to be just monitoring and checking. And I have alerts set up on my cards where if I spend a certain amount of money, it will verify.
[23:27] Like, hey, you spent a hundred dollars, is this correct? Right. So there are those steps that people can take. But, you know, I feel like for these companies, especially ones that get hit not once but twice, even if they pay the fines and the settlements, at the end of the day, what people want more than that five-dollar check is to feel safe and to feel that their information is secure.
[23:55] And I think that's worth more than, you know, the $5 or $20 check that you're going to get.
[24:01] Debbie Reynolds: I agree. Someone on social media actually had a brilliant idea. They were like, instead of sending me $5, why don't you delete my data?
[24:10] Flo Nicolas: Yeah, that's even better, right? Or, either that, you know, why don't you really take the time, especially for companies that have been breached more than once, to make an investment? You know, there are some companies who do promise to invest in their cybersecurity.
[24:30] But at the end of the day, like what we talked about earlier when we started, a promise is a promise until, you know, it's actually activated into action.
[24:39] Right. So, yeah, you can promise to, you know, have a more robust cybersecurity infrastructure, but if you're actually not investing in it and you're habitually, you know, getting breached, then where do we go from there?
[24:53] Debbie Reynolds: What's your thoughts on AI and how you think that is impacting all of us now? And then also, I want to talk with you a bit about deepfakes. We chatted a bit about that as well.
[25:06] Flo Nicolas: Yeah. So AI is quite interesting. Listen, I'm not going to lie. I do love the capabilities of AI. I love using AI tools, but I am that person that likes to review the policies: like, okay, you know, where is my information going?
[25:25] Because it matters to me, because then I want to be very diligent about, you know, what I'm perhaps typing in and uploading. Like, I like using AI to assist.
[25:37] You know, I don't have AI create articles for me or anything like that, but I like using it to, you know, kind of read through my articles sometimes.
[25:48] Like, I'm a writer, but sometimes I get so excited and I'm just typing through, right? And I like using it to kind of say, does this make sense?
[25:59] Do my ideas flow? And can you comprehend what I just said? Right.
[26:05] And there are other things that AI is being used for. Like, my daughter loves music. She's 17 and she started writing music, but she found this app that will actually let you create the song, and you don't have to sing the song.
[26:27] Flo Nicolas: She created these songs where, with the AI, you pick the voice you want, you pick the sound you want. So maybe you want to sound country, maybe you want it to be a rap song.
[26:36] And when I tell you, I looked at her and I said, wait, you wrote this song and then you put it in this? And I'm like, no! And she went in there and she showed me.
[26:49] I can see why we're seeing artists, actors, and authors who are not happy right now, because she started uploading her songs to TikTok and they're actually getting quite a bit of views.
[27:10] So with AI, one of the things that I say is, yes, we've got to focus on the benefits, which we're seeing so many of. I did a TEDx in Portsmouth, New Hampshire, and it was an interesting TEDx because there were four of us on stage and we all talked.
[27:28] One talked about the impact of AI in the mental health sector, and another person talked about how AI is being used for, you know, monitoring food insecurity in agriculture.
[27:40] And then I spoke about how, you know, my previous company was an HR tech company, and how, when I went to Las Vegas for one of the biggest HR tech shows, I was seeing how HR was leveraging AI for essentially performance management and human capital management, as well as for things like diversity, equity, and inclusion, you know, using AI to monitor trends in terms of equity in compensation and equity in hiring.
[28:11] So we're seeing it being used for many benefits in various industries. However, those same benefits can be turned into risks, you know, and we've seen cases where the EEOC settled a case where AI was being used to essentially intentionally discriminate against older employees.
[28:34] And so those are some of the things I fear, especially with AI being used in HR. When you look at how some of this software is created, some people are saying, let's be careful with that data, right?
[28:52] You know, who's creating those algorithms? And is there a potential for bias?
[28:58] Right? It could be intentional sometimes, it could be unintentional, somebody not realizing, you know, for example, hey, why is it that we're only hiring males here? Like, what's going on?
[29:10] You know, we're not seeing this equity in hiring, and maybe somebody inadvertently or intentionally created this kind of algorithm that deliberately likes to target a specific demographic.
[29:27] So those are my concerns with AI, and, like what we talked about with cybersecurity, how AI is also being used essentially for phishing and malware and all that. Now it's coming at a rapid pace.
[29:42] I'm seeing an increase in emails, I'm seeing an increase in text messages, in calls, and all that can be used to speed up the process of essentially how to penetrate and, you know, steal your information through AI.
[29:58] So I think what I'm seeing and what I'm noticing, and what we're going to see for a little bit until we figure this out, is this attempt to have this balance.
[30:09] We know it's good, we know it's bad, but then how can we regulate it, right? And we saw what happened in the EU. They were like, hey, we're not playing around with this, we're going to regulate this.
[30:24] But what I'm seeing now is that call to action here in the US, like, hey, we've got to do more; it's moving at a faster pace and we need to catch up to it.
[30:34] And that's where I see a struggle. I'm seeing a lot of these groups coming together, companies, big tech companies, making commitments to responsible AI. But again, going back to what we talked about before, it's one thing to make a commitment; it's another thing to actually take action.
[30:54] Debbie Reynolds: I agree. And I think, I guess, a couple of things about AI. One is I'm concerned that people want to abdicate their human judgment to AI. And that's why, like your example around hiring, that's a problem, where they say, well, why is this happening?
[31:11] You're like, well, I don't know. I just told the system to do this. Right. So I think laws are made to regulate the behavior of people, right? So you can't go to trial or file a case against an artificial intelligence system.
[31:27] You have to file against a company or a person. So I think having that person in the driver's seat is really important. And, you know, like your example about AI helping you make a song: if you're using it in the right way, it's assisting, as you say, as opposed to being the thing that's in the driver's seat.
[31:49] Flo Nicolas: Yeah. And I think you make a great point about that regulation, because there's been an attempt, like, for example, the New York law, I think it's Local Law 144 or something like that, where, especially when it comes to hiring and using hiring software that has AI capabilities, there's this attempt to put in place this guidance, this structure of, hey, you've got to audit your software to ensure there are no biases.
[32:19] Right. And is this law working? The number of companies that actually went through with the guidelines, actually did the audits, and actually sent in the reports was not a lot.
[32:35] And now it's like, okay, you're going to implement this law and these guidelines of what companies that are using AI, especially for hiring, are supposed to do.
[32:47] But if they don't do it, right, what's the punishment?
[32:53] Because if they're not doing it and there are no repercussions, then it sends a really bad message that, you know, yeah, okay, do it if you want, you don't have to if you don't want, and we're not going to do anything.
[33:09] And I think that's the problem. I'm really curious, as I see people like yourself and others, especially privacy lawyers, talking about AI and regulation. I'm like, yeah, that's great.
[33:22] But in reality, you know, how is it really going to be regulated?
[33:28] So we'll see.
[33:30] Debbie Reynolds: So the two things that concern me about that, which is a great point, by the way: one is, are you creating a law where people can technologically actually follow through on what you're saying?
[33:43] And then, what is the punishment? If you make it too hard for someone to do technologically, or there really aren't any teeth or any repercussions, it's really not very effective.
[33:59] Flo Nicolas: Right, right. And yeah, when you've identified the companies, you know, again, that's the thing: are companies really going to self-report?
[34:10] That's the other thing, you know, we've got: if you're going to say, hey, if you're using this type of software, then you have to adhere to this law, are companies going to self-report?
[34:21] And then who's really the oversight for that? You know, that's the thing where it's going to be difficult. Like, how do you really have oversight on who's using it?
[34:34] You know, how do you track all that? You know, especially if you want them to report how they're using it and submit audit reports. Yeah, that's going to be interesting.
[34:43] I think this is going to take a little bit for us to really kind of figure out how, you know, this gets regulated.
[34:50] Debbie Reynolds: Well, I want you to tell your story about deepfakes.
[34:53] Flo Nicolas: Yeah, yeah. So, you know, one of the things that's interesting to me, I've been talking a lot about deepfakes, and, you know, I just try to talk about it, especially to my kids too.
[35:04] Being a mother of three girls, my kids are into technology. I just told you about my 17-year-old, you know, who loves using AI, and she's on social media and TikTok.
[35:14] It's hard to shield these kids. I would like to just have them be, like, in a cave, but that's not the reality, because at the end of the day they also have to be exposed to technology because they have to understand how to use it.
[35:29] And so, you know, even with my 12-year-old, you know, she's a big gamer, and, you know, with the gaming industry now, you could be talking to people and gaming with people all over the world.
[35:39] So I just really try to educate my kids on just being safe. But one interesting thing that happened is, you know, my six-year-old was watching a show; Sheriff Labrador is the name of the show.
[35:53] I love the show because it's very educational, and at the end it has that lesson learned from the show. And the lesson learned from this episode actually involved AI, you know, deepfake technology.
[36:08] And this girl was home alone and she got a video call from somebody she thought was, you know, I forget, it was like an uncle or a cousin or something like that.
[36:18] And it looked like her family member. It sounded like her family member. So she had no reason not to trust that this person was the family member that she knew.
[36:31] And the person said, I'm going to come over so we can go play.
[36:35] And when the person came, you know, she opened the door and the little character got kidnapped. So now they had to go through all this trying to find the person.
[36:46] And the message at the end of the day is to be very careful who you're talking to, because it might not be your family member. And if you're not sure, check in with your mom or your dad or your caregiver.
[37:00] So it just struck me, and why that story so vividly caught my attention was that we've gotten to this point where we don't only have to educate adults on being safe when it comes to technology, whether online, or when you get an email, a call, or a text message.
[37:20] But we now have to take it all the way into the smaller classrooms, to pre-K kids. I mean, she's in kindergarten now, but that's scary as a parent.
[37:33] And especially in kindergarten, they start being taught how to use computers.
[37:38] So part of that now is going to have to be having these complex conversations. Obviously you've got to, you know, bring it down to their level, but they've got to learn that.
[37:50] They've got to learn now, at 5, 6, how to be safe online.
[37:58] Because we have these advancements in technology where we have people who will leverage them to not only prey on the elderly or the young, but now our babies, right? Our little ones are now exposed.
[38:13] And it's very scary for me as a parent, and I'm sure other parents listening in will probably agree as well.
[38:24] Debbie Reynolds: So if it were the world according to you and we did everything you said, Flo, what would be your wish for privacy or technology anywhere in the world, whether that be human behavior, technology, or regulation?
[38:39] Flo Nicolas: Oh, that's a good question.
[38:41] Wow.
[38:42] You know, I think that we need to invest in technology.
[38:47] I think for me, that's the bottom line. Being in tech and seeing what I see in terms of the benefits, we definitely need to increase our investments in technology, especially critical technologies.
[39:01] And, you know, that's one of the premises of those tech hubs: to build enough critical technologies to be more competitive globally, meaning the US being globally competitive.
[39:14] But I think in this perfect world where we are leveraging and investing in technology, the number one thing that comes with that is we've got to figure out the oversight, right?
[39:28] We've got to figure out the oversight because we already see the evidence of issues that come with it. Copyright infringement; we've seen cases all over the country with that. We've seen how it's being used, with, you know, bad actors using it to try to penetrate other companies, steal their information, money, you name it.
[39:50] Should we be scared? I mean, yes, of course, you know, there's got to be that fear. But I think beyond the fear is also how we can be more intentional as we build these critical technologies, ensuring that, you know, we're really doing an audit and assessment to check, again, what we talked about earlier: are there biases in this technology that we're building?
[40:17] So I think if we start at the production of these technologies and start implementing the oversight as they are being built, I think it's going to make it a lot easier.
[40:31] You know, there's a saying: it's hard to teach an old dog new tricks. Well, once you have somebody who's already deployed the technology and has already invested all the money into creating the technology, and then you want to go back and try to say, well, you know, we're now regulating, we're now assessing.
[40:50] You know, it's always hard to go back, but when you're building something from the ground up, I think when you start having that oversight at the beginning, I think you're going to have more control in terms of the output.
[41:05] Right. So I think what we've got to do is invest in those critical technologies, but with the intention of having oversight at the time of creation to ensure that, you know, we're minimizing the risks.
[41:21] I don't think you can get rid of the risk completely, and I don't think that we should have that frame of mind. There's always going to be a risk, right?
[41:29] But I think that if we start implementing the oversight earlier on in the production, I think it might make a bigger difference.
[41:37] Debbie Reynolds: Well, I love that. So going back to the fundamentals is very important, especially as we're trying to build all these new things, so. Well, thank you so much. It's so exciting to see all that you're doing and keep me updated as you always do on social media on what you're up to.
[41:55] And I'm so proud of you.
[41:56] Flo Nicolas: Oh, thank you. I appreciate it. Thank you so much for your time today. It's been kind of fun just to have this casual conversation about all the phenomenal things happening with technology.
[42:06] Debbie Reynolds: I agree. I agree. Well, thank you for being on the show, and we'll talk soon.
[42:10] Flo Nicolas: Sounds good. Thank you.
[42:12] Debbie Reynolds: All right, bye.