"The Data Diva" Talks Privacy Podcast

The Data Diva E27 - Dawn Kristy and Debbie Reynolds

May 11, 2021 Debbie Reynolds Season 1 Episode 27
"The Data Diva" Talks Privacy Podcast
The Data Diva E27 - Dawn Kristy and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, "The Data Diva,” talks to Dawn Kristy, CEO of The Cyber Dawn and VP of Cyber Solutions a CyberArmada. We discuss her passion for cybersecurity, challenges in  AI systems, her work on educating people about cybersecurity, the big targets and the most vulnerable in cyber attacks, the need for privacy and cybersecurity with those outside of technology and legal fields, the impact of the documentary “Coded Bias,” the lack of diversity in AI and how it impacts results, AI mistakes that may lead to irreversible consequences, male advocates for women in tech, the need for more women and girls in STEM, thoughts on the proposed Mexican National Biometric Registry, the trend toward justification of data collection in laws and her idea for data privacy in the future.





SUMMARY KEYWORDS

people, cyber, privacy, data, person, problem, AI, world, idea, talk, Chicago, CEO, biometric, area, cyber-attacks, facial recognition, absolutely, wordings, hear, discover

SPEAKERS

Debbie Reynolds, Dawn Kristy

 

Disclaimer  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

 

Debbie Reynolds  00:11

Hello, my name is Debbie Reynolds. They call me "The Data Diva." This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues for business leaders around the world, talking about information that businesses need to know now. So today, I am very proud to have a friend of mine, Dawn Kristy, on the program. Dawn Kristy wears many hats, I'll put it like that. So she's a Vice President at Cyber Armada, correct? A trained lawyer and cyber risk expert. She's an advocate for girls and women in STEM. She's a speaker, a lecturer, an author, a writer, an editor, and a mentor in many things. Also, Dawn has her own venture that she's the CEO of, which is The Cyber Dawn, which I thought was a great play on your name. So hello, hello.

 

Dawn Kristy  01:14

Hi, Debbie. Yeah, thanks for the introduction. It's so nice to see you and speak with you today. I think we met back in 2019, and we've been sort of geeking out on things ever since. I was in Chicago at the time, and we met at the Union League Club, and we basically found synergy right away. What's interesting is that at the beginning of last year, I left Chicago in the winter and came to South Florida to join a startup, Cyber Armada, an insurance broker, and a month later came the COVID lockdown. So I really spent, you know, the last 14 months doing a lot of Zoom meetings to meet people here and also to do business. So it's all part of the pandemic way of doing things. Yeah.

 

Debbie Reynolds  02:03

I don't think a lot of people feel sorry for you being locked down in Florida.

 

Dawn Kristy  02:09

I literally got off the plane, got a car, got an apartment, started a job, and then lockdown, though. Hey, yeah. And I missed, what was it, forty-five inches of snow this winter?

 

Debbie Reynolds  02:19

Yeah, so we had a ton of snow. It was ridiculous. The good thing about it, if you didn't have to go out, like if you work from home, you weren't really traveling around a lot. You just couldn't really do anything outdoors unless you were bundling up and slapping a hat on or something. Yeah, we have spent a lot of time at home, right? Totally, totally. Well, you and I have a friend in common, Lee Neubecker. He's kind of our link as well. So yes, he's very involved with the Union League Club, and I think his office is in that building, right, or close, close next door. And he's our forensics guy. I'll have to get him on the program if I can.

 

Dawn Kristy  02:57

Yeah, we'll have to get him on, or let's do a panel sometime so.

 

Debbie Reynolds  03:01

You're fine. So, well, let's start out by talking about your passion for cyber. As a lawyer, I feel like you're a little bit different than a lot of people that I see. So you come from, obviously, you know, a trial lawyer background, but then also you got involved in kind of the advisory part on cyber. So can you just explain that?

 

Dawn Kristy  03:27

Sure. So I basically come at things like emerging tech and emerging risk. Earlier in my career, I was in another emerging risk area, which was environmental. And it's really analogous: at the time it was pollution claims and how are we going to handle them, and a whole new line of insurance was developed. GL policies had silent pollution losses, and Lloyd's took a hit, and so basically a new area of insurance was developed. And I see that as analogous to cyber, with cyberattacks, and then new insurance was developed. And my interest started, if you go back to basics, in 2005. I heard about the TJ Maxx data breach, and that was my first time ever hearing about a data breach, and digging into it and then doing research and writing on it. And then, back in 2012, I joined a large insurance and reinsurance broker in Chicago, and there I was able to not only handle my claims book but become a cyber subject matter expert. So I did a lot of self-study, and that's where I really started diving deep and lecturing, and we had practice groups and things. And part of it is I'm a wordings geek. So I like looking at the wordings and analyzing them, which is kind of strange; there's probably 5% of people that enjoy diving into wordings. But the other part is it did develop over time, and the place I'm at now is this desire to help people. I mean, during the pandemic, we've had an unbelievable increase in cyberattacks. And so individuals, friends, family, small businesses, and then obviously large enterprises, many, many people have been hit that weren't before. So it really sort of sparked this idea to, first of all, come down and join the startup and try sort of a new venture with that, but then also, in setting up The Cyber Dawn this year, the idea being to help people fight cyberattacks and learn how to not only be aware but act on the training that they may get. So the other gap that's happening is that people are not really communicating within their family or within their small business. And so it's about communication, and it's about implementing the training that you get to protect yourself.

 

Debbie Reynolds  06:01

So I think, you know, I feel the same way you do about communication, and I feel like we are in dire straits in terms of being able to really communicate with the people who truly need it, you know, to educate them about Cybersecurity. Let's talk about that a bit, because I feel like it's kind of a divide-and-conquer thing right now.

 

Dawn Kristy  06:36

And it's like any other area, you know. Many people will say you've got to target this person in a company or that person in a company, but the reality is, a lot of us are remote and will stay remote. So that dynamic has changed. But, you know, top to bottom, bottom to top, you have to communicate the concerns and the risks, you really should test them, do dry runs, and make sure that, you know, the most junior staff person and the head of IT, the CEO, and the board are all clued in on what's happening. And that's kind of where this NIST Cybersecurity Framework comes in, because it's voluntary, it's not mandatory. But the idea is getting everybody on board with that. So communication has been a gap in this area, even in the purchase of Cybersecurity, cyber risk management, and cyber insurance. If the top doesn't know what the problem is, they're not going to approve the budget for it. So communication is key, in privacy issues as well, you know.

 

Debbie Reynolds  07:41

Yeah. And then, you know, I feel like someone who is a cybercriminal, they go after the lowest-hanging fruit. So the thing that they can do the easiest is what they try to pluck off, right? So obviously, there are things like, you know, SolarWinds, which we can talk about, which are more high-level, sophisticated attacks, but the everyday, day-to-day things that people go through are things like phishing. So I hear a lot of times people say, oh, there's this new scam where people will say that someone has frozen your Social Security number or something like that. But regardless of what they say and how they say it, typically it is a communication that you receive that you didn't ask for, and they're trying to create a panic situation or trying to make you act or do something in a hurry, right?
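
A side note for readers: the pattern Debbie describes, an unsolicited message that manufactures urgency, can be sketched as a simple screening heuristic. This is a toy illustration, not a real phishing filter; the phrase list, function name, and scoring below are invented for the example.

```python
# Illustrative only: a toy screen for the two red flags discussed above,
# (1) you didn't ask for the message and (2) it pressures you to act fast.
URGENCY_PHRASES = [
    "act now", "immediately", "within 24 hours", "account frozen",
    "social security number", "final notice", "verify now",
]

def phishing_red_flags(message: str, sender_known: bool, was_requested: bool) -> list[str]:
    """Return a list of red flags found in an incoming message."""
    flags = []
    if not was_requested:
        flags.append("unsolicited communication")
    if not sender_known:
        flags.append("unknown or unverified sender")
    text = message.lower()
    hits = [p for p in URGENCY_PHRASES if p in text]
    if hits:
        flags.append(f"urgency/pressure language: {', '.join(hits)}")
    return flags

if __name__ == "__main__":
    msg = "Your Social Security number has been frozen. Act now or face arrest."
    for flag in phishing_red_flags(msg, sender_known=False, was_requested=False):
        print("-", flag)
```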

 

Dawn Kristy  08:36

Yes. And you know, you hit it on the head there. Cybercrime is a crime of opportunity. And so, if the bad actors have an opportunity, it's too lucrative for them not to pursue it. So, you know, you think, well, there's got to be a way to defend against this, and part of the defense is just the awareness, implementing things, and being cautious. And I think there are so many stories every day on the news now, and people are aware of, like, scams with calls to the elderly and what have you. But nevertheless, every day you hear about somebody that was, you know, spoofed, fooled, gave up money, gave up their Social Security number. So, you know, the battle rages on.

 

Debbie Reynolds  09:23

Right. I can't remember who gave me this story; I think it was on a podcast I was on. You know, he's a cyber expert, and he lives in a different town from his mother. He was just having a call with her, like, what are you doing? Well, I'm on my way to the police station. It's like, what are you going there for? I guess someone had scammed her and said that she was going to get in trouble if she didn't give them money, and they were going to meet her outside of the police station so she wouldn't get arrested. She was like, I just didn't want to get in trouble. And thankfully, her son had called her and said, no, don't do that, don't do that. So yeah, you just never know. I mean, it can happen to anyone. Especially,

 

Dawn Kristy  10:01

It could be kids. It could be seniors. It's all ages, Debbie.

 

Debbie Reynolds  10:06

Yeah. And then, too, I'd like to talk about executives. So I've had a lot of experience with executives in organizations that don't want to follow the same cybersecurity training or advice that is being given. A lot of times they're super high targets because, unfortunately, a lot of executives end up getting more access than they need, more than they actually use. And then they're using assistants or whatever, you know, so they're a big target, where someone says, oh, John said do this, and people are afraid to question it. So they go ahead and do the thing, and it's a big deal. But I mean, how many times do we have to fall for that trick before we know that we need to change?

 

Dawn Kristy  10:46

And also, you know, habits die hard. So you've got this idea, or this attack vector, of business email compromise, and somebody told a story where someone did actually get a request to change a bank account, went to a senior manager, could have been the CEO or whoever, and said, did you make this request? Yeah, yeah. And didn't even really look at it. And then, you know, a month later, they realized the money had gone astray. That was the two-factor check that was sort of brushed by the wayside. And you know, the person handling the bank accounts tried, but if somebody says, yeah, yeah. So again, it's top to bottom, bottom to top, people have to be aware of it. And if somebody is changing bank details, I mean, I am so risk-averse. I drive my circle of friends and family crazy because I don't click on or do anything. It's like the other extreme. But if somebody wants to change a bank account, I mean, that's a red flag.

 

Debbie Reynolds  11:47

Oh, totally. Absolutely.

 

Dawn Kristy  11:50

I would do three-factor, you know, whatever it takes, right?

 

Debbie Reynolds  11:53

Yeah, probably. Right. I think also, one other thing is that you have to sort of switch up the mode of communication. So if they send you an email, call them, or, you know what I'm saying, call and personally verify; or if they called you, email them, you know, so switch it up. Good point, yeah. Because they like to continue: if they're texting you, they want to continue to text. They don't want to do anything else. So try to be savvy about that. And, you know, I like your idea about three-factor, maybe four-factor, you know, especially when it comes to a bank, banking, anything like that. It makes perfect sense.

 

Dawn Kristy  12:26

You know, we look at all the authorizations we have in the corporate world, in the business world. And, you know, Sue has access to this, Jim has access to this. But when you're talking about wiring millions of dollars, there should already be a queue of authorizations anyway, and in particular if there's going to be a change of the account number. So this is just kind of, like, I don't know, business best practices?
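
A side note for readers: the control Dawn is describing can be sketched roughly as a dual-approval gate with an out-of-band callback before payment details change or a large wire goes out. The class, roles, and threshold below are hypothetical; a real treasury workflow would involve more checks.

```python
# Sketch: a change to bank details (or a large wire) requires an out-of-band
# callback to a number already on file plus, above a threshold, two distinct approvers.
from dataclasses import dataclass, field

@dataclass
class PaymentChangeRequest:
    vendor: str
    new_account: str
    amount: float
    callback_verified: bool = False          # verified by phoning a known, on-file number
    approvals: set = field(default_factory=set)

    def approve(self, approver: str) -> None:
        self.approvals.add(approver)

    def may_execute(self, threshold: float = 10_000.0) -> bool:
        """Allow execution only with callback verification and, above the
        threshold, at least two distinct approvers."""
        if not self.callback_verified:
            return False
        required = 2 if self.amount >= threshold else 1
        return len(self.approvals) >= required

req = PaymentChangeRequest(vendor="Acme Ltd", new_account="****1234", amount=250_000)
req.approve("accounts_payable_lead")
print(req.may_execute())   # False: no callback yet, only one approver
req.callback_verified = True
req.approve("treasury_manager")
print(req.may_execute())   # True: callback done and two distinct approvers
```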

 

Debbie Reynolds  12:54

I think so. I think so, especially with a high amount. Yeah, but I think, too, you know, they're taking advantage of the culture of companies, where some people, depending on who it is, may be afraid to question, you know, something. So like, they get an email from their CEO, saying, do this, you know, they want to be a good employee, they want to be a team player, so they want to do it. But part of that is just the psychology of people and like the culture within the organization.

 

Dawn Kristy  13:27

Yeah. And the other example that we heard, and that I used in an alert I wrote, was the long weekend. We have Memorial Day coming up, right? And actually, a law firm in Chicago did a skit on this one, which was fabulous. So the CEO is on a flight to the Bahamas, it's a Friday, and you get the call that you're going to change the bank account. The bank manager of the company, or accounts payable and receivable, leaves a voicemail because the CEO is in flight. And that's the second factor, and the money gets wired. And of course, Monday everything is shut, so on Tuesday they all discover it. Right? So, you know, the best intentions, the best-laid plans, right? You really almost need to have a substitute if one person isn't available. And this happens a lot on long weekends, holiday weekends.

 

Debbie Reynolds  14:20

Oh, that's a good point. And then one thing I always say is, whenever someone is trying to do this, they want you to do it really fast before it gets found out. So if you just wait, it'll drive them totally crazy, because they want you to do something really fast. So if you're not sure about what to do, don't do anything. Just wait, wait it out and see. You know, wait an extra day, or wait a few hours, or see if they call you back, you know what I'm saying. If a cybercriminal sees that you're not acquiescing to their request, they'll typically move on to someone else who is much easier.

 

Dawn Kristy  15:01

Yeah, and I also think we all have to take individual responsibility, but also believe that this is not insurmountable. Some of this is just, I hate to use that phrase because it's used over and over, but best practices, whether you call it cyber best practices or cyber hygiene. Some of this is almost common sense after a while, and maybe because we work in the sector, it seems that way. But for somebody who's experiencing it for the first time, you just hope there's someone they can ask or reach out to, to say, this sounds weird to me, even if it's on your home computer, you know. And so that's the thing: there almost has to be a collaborative effort to spread the word. It's all we can keep doing. Yeah.

 

Debbie Reynolds  15:48

And I highly recommend, like you said, that people talk to their families as well. So a lot of us are in tech, you know, or legal and stuff, and we understand this because we live it every day, but your grandma may not, or your parents or your children may not. So you're able to talk to those people. Like, we had a family call recently, you know, just trying to catch up with each other on Zoom and stuff, and I just did, like, a little cyber thing with them. And they were like, oh, that's really helpful, like, don't click on links that you didn't ask for and stuff like that. And I typically wouldn't do that, because those calls are typically about, like, who did what, or who's getting married and stuff like that. But I threw that in because I feel like it's really important, and if I didn't say it, I would feel bad if someone was harmed because I could have told them that.

 

Dawn Kristy  16:44

It's interesting because you bring up a point that I've discovered doing all this networking and these Zoom meetings. And the other thing is terminology. I had someone say to me after a networking event, sort of offline, I don't know what cyber is. And I mean, we assume everybody knows what that word is, or what cyber insurance is, or a cyber attack. This is why I keep going back to basics: to think, if somebody landed from Mars and didn't know all this terminology, how would you explain it so that everyday people can protect themselves? And so that's the tack I'm taking. I'm going to be doing a talk for a 65-and-over group in May at one of the universities here in Florida. So I'm getting in the mindset of, like, senior cyber. But also, with all of the kids working and studying at home with their parents at home, I thought the other thing might be to actually try to give a talk to kids, and then it sort of trickles up to the parents. You know, you'll see a child say, well, look at this, look at this, look at this, and they could actually teach their parents as well. So it's almost like picking those two age groups and then filling it in in between.

 

Debbie Reynolds  17:59

Yeah, it's kind of weird, because you think about kids who are in college right now. They've always had the Internet, so they don't know what it's like without it. So yeah, it might be more comfortable for them. You know, they're not as wary as maybe someone like me is about that.

 

Dawn Kristy  18:16

But I think, you know, your specialty being data privacy, I also think a lot of them are not so aware of the privacy that they've surrendered by growing up with the Internet, the way those of us who are older are, because we knew it before everybody knew everything about us, right? You know, pre-social media and all that. So it's kind of like, you could say, have they let their guard down when it comes to their privacy?

 

Debbie Reynolds  18:43

Right? Oh, absolutely. And, you know, I tell my nieces and nephews, don't announce where you're going to go. Don't say, I'm going to Vegas this weekend. You know, don't do that. Maybe after the Vegas weekend, you could post a picture or something like that. But I mean, it's too much information, really. I would love to talk about this movie that just came out called Coded Bias.

 

Dawn Kristy  19:07

Oh, yes.

 

Debbie Reynolds  19:08

So I thought you saw it. Let's talk about it. Yeah.

 

Dawn Kristy  19:14

So I was shocked to discover it through a sort of LinkedIn post, maybe a couple of months back. And I'm not 100% sure how to pronounce her name, Dr. Joy. But she was not really an AI specialist. She was an MIT student who was working on a project and discovered that, because she had a Black face, the algorithms in this AI program, which as I understood it was supposed to help with dealing with, you know, bias and prejudice, would not recognize her. And she literally put on one of those white plastic hockey masks that we know from the Halloween movies, and then it recognized her. And then she took it off, and it didn't recognize her, the facial recognition. So it's funny how one thing like that created this whole movement for her to get involved in. So she pivoted and started really focusing on this, and got involved with, you know, different action groups and things, and really unveiled the fact that the bias of the programmer, or the lack thereof, is going to be in the AI. And that was, like, a lightbulb moment for her. So what are your thoughts on it?

 

Debbie Reynolds  20:42

Yeah, the movie is really great, because it explains it, like you said, from the inception: what incident happened that made her see that this is a problem, and that had her go on and write papers about it and talk about it around the world, as well as, you know, forming the Algorithmic Justice League, right? Yeah, I follow her work quite a bit, actually. Dr. Joy, yes, she and Timnit, who was formerly at Google, had written a paper many years ago. I highly recommend that people go look at it. It's not long, I think 18, maybe 20 pages long. But they were explaining bias in AI, and they gave very specific examples about, you know, the color coding of faces, the very narrow spectrum that they were using in that. And a lot of times, because they aren't coded for darker skin, you know, these AI systems are being sold as if they work for everyone when they have never really been tested on everyone. And there's kind of a lack of diversity in the people who are involved. So there are people, you know, like me, brown people, that are interested in this area, and we can definitely help with that, because you don't want people to be harmed as a result of that, right?
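
A side note for readers: the kind of audit referenced here, in the spirit of the Gender Shades work, comes down to reporting error rates per demographic subgroup rather than one aggregate number. Below is a minimal sketch with invented records; the subgroup labels and data are hypothetical, purely to show the disaggregation step.

```python
# Sketch: evaluate a classifier's error rate per subgroup instead of overall.
# The records below are invented purely to illustrate the disaggregation step.
from collections import defaultdict

records = [
    # (subgroup, true_label, predicted_label)
    ("darker-skinned female", 1, 0),
    ("darker-skinned female", 1, 1),
    ("darker-skinned male", 1, 1),
    ("lighter-skinned female", 1, 1),
    ("lighter-skinned male", 1, 1),
    ("lighter-skinned male", 0, 0),
]

totals, errors = defaultdict(int), defaultdict(int)
for group, truth, pred in records:
    totals[group] += 1
    if truth != pred:
        errors[group] += 1

overall = sum(errors.values()) / sum(totals.values())
print(f"overall error rate: {overall:.2%}")
for group in sorted(totals):
    rate = errors[group] / totals[group]
    print(f"{group}: {rate:.2%} ({errors[group]}/{totals[group]})")
# An aggregate number can look fine while one subgroup's error rate is far higher.
```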

 

Dawn Kristy  22:22

Sure. And one thing, you were saying they showed, and again, this is going back a few months, there's a public interest advocacy group in the UK, you may know the name of it, but they show this guy and the facial recognition cameras somewhere in the UK. You know, the police were surrounding this man, and it ended up that he was not the right person. But also the fact that the system can be faulty.

 

Debbie Reynolds  22:50

Oh, yeah, definitely.

 

Dawn Kristy  22:51

So this guy was being, you know, surrounded by police, and there was somebody trying to help him. And so, you know, you really got the feeling of, like, Big Brother, and that you could be mistargeted and caught on this facial recognition camera and stopped and go through all of this stuff, and you may or may not have your civil rights protected. So you do wonder, you know, where are we going to be in 10 to 20 years? Is this going to become more and more prevalent?

 

Debbie Reynolds  23:19

Yeah. It's problematic. Actually, there's a study from the New England Journal of Medicine, and it's about the pulse oximeter device. It's a little thing that looks like a clip that they put on your finger if you go to the hospital, and it's supposed to check the oxygen in your blood. And they were saying that they had done a study in the US of over 10,000 patients who used this thing, and they said that people of color had an error rate three times that of anyone else, and it gave a false positive reading. So the reading made it look like the person was healthier than they were. You know, one doctor said it would have been the difference between being admitted to the hospital or not. So it would give a reading that was too high; it made it seem like the person's oxygen levels were higher than they were. So there were a lot of people who were sent home that should have been admitted to the hospital because of it. So I think, as we're seeing, and as people are able to test these algorithms, or look at them and see that there's a problem, I would like to see more of an impetus to make those changes or corrections to AI. Right now, the law isn't forcing companies to do that, so they sort of have to decide on their own whether they do it. But I think there are a lot of gaps here, right? And one of the really big gaps is, how is it possible that you can sell something as if it works for everyone when it doesn't? Like, that's a huge problem.

 

Dawn Kristy  24:54

And you know, it goes back again to my product recall days. I mean, you would think, and it also brings up two things: product defect, but also, you know, this idea of security by design and privacy by design. And you could go from there to, why don't modems have security by design? Why do their default passwords exist as they do? Isn't there a way that the technicians could make these devices of all types better? And so with that, because, you know, if there's not a legal obligation to do something, there is human nature: produce quickly, let's get them out there, and if they're not accurate, we'll figure it out on the fly, right? And how does it change? Well, in a litigious society like the US, it'll change because there's a product liability suit or a discrimination suit, you know, or a false arrest suit, and people sue for everything here. So, you know, unfortunately, we use the legal system to make changes through the back door, so to speak.

 

Debbie Reynolds  26:01

Well, the problem with AI and stuff is that the harm can be rapid, and it can be completely devastating, so there almost is no acceptable redress. You know, you're arrested or killed because the algorithm said that you're a suspect or something.

 

Dawn Kristy  26:19

Yeah. So I mean, this is why we need the next generation, which is probably a good segue into my other passion, which is getting more people involved in STEM, and particularly girls and women, and I can give you some statistics on that. You know, I say to people, I don't have children, but to my friends that have kids in college, I'm like, please have them take at least one computer science course. There are so many open jobs in the STEM area, particularly cyber, and, you know, my pitch is always, we need our brightest minds to go into STEM subjects; they can't all go to Wall Street, right? We need scientists of both genders. And the statistics are basically kind of shocking when it comes to girls, and then the falloff from that of women leaving the STEM fields. But a couple of things that I noted here: the Girl Scouts actually track STEM, because they actually have cyber badges and things. They didn't back in my day, but they do now. And the girls show a great interest in STEM subjects, and then it falls away. And they said that without the encouragement of role models or mentors, the focus shifts away. And it's this whole idea of, again, back to basics: who are the role models for the middle school girls who might like science, but nobody in their family studies it? They're discouraged from doing it, and so they don't pursue it. And then when you go to adults, you look at women, who make up 28% of STEM fields. 28%, right? So, you know, this idea of the first woman astronaut, the first women who are doing all these firsts right now in sports and in science and all of that, let's get to the point where it's not just the firsts, where people reach back and bring more up. You know, it's a big part of our population; let's make sure that they are contributing to these issues that we're discussing right here. We need scientists. We need innovators. We need people that are going to come up with solutions for this in the next 10 or 20 years.

 

Debbie Reynolds  28:36

Totally. And then we also need, you know, men as advocates to open those doors and bring people in, because some of the most impactful mentors in my career have been male. Thankfully, you know, they were very kind to teach me; I was really hungry to learn. They taught me a lot of things and gave me great advice. So, you know, we definitely need more and more women and girls in technology. Absolutely.

 

Dawn Kristy  29:07

Yeah, and just encouragement in general. There was an interesting article in the Princeton Review where a math professor gave his perspective, and he said women should not be intimidated. At this point, STEM involves learning how to break down a problem, analyze it, and solve it in a systematic way. It's not really about calculation; it's about critical thinking. And my thought was, we women do this every day in our personal lives. So.

 

Debbie Reynolds  29:40

Absolutely.

 

Dawn Kristy  29:41

You know, we get a problem, we break it down, we figure out a solution, and there we are. So, you know, we need the encouragement, we need the role models, and also we need the financial support. Right? You know, the idea of taking a bite out of the elephant. This is accomplished by including more girls and women in the process. Changing the education system is very difficult, but you can have programs and academies that focus on girls and women in STEM. And that's another thing that I would like to be involved in. Maybe we can address that in a separate podcast focused on that exclusively, Debbie.

 

Debbie Reynolds  30:24

Well, sure. Absolutely. We've done things, we've collaborated in the past, so we can definitely do that. Tell me a bit about this law in Mexico that you were talking about.

 

Dawn Kristy  30:37

Well, it literally just crossed my LinkedIn page today. I love when you get these prompts on LinkedIn, and it's like, oh, wow. So basically, they just passed a law creating a National Registry of Biometrics. And what they decided was, because of the frequency of crime, extortion, kidnapping, and even, like, using cell phones in the prisons, they decided to have people, new cell phone customers, not the current ones, give their eye scans and fingerprints to get a cell phone. And again, it's supported by this need to reduce crime, and it passed. But, you know, their version of the ACLU and their advocates have said, you know, this is way over the top. They're crying foul over this. And so the first person that I thought of was you, in terms of the privacy, because we've talked before about Illinois with the biometrics and some of the California laws that are addressing biometrics. So I think this was a shocker that this was done, and I think it's unique in terms of countries doing this. As far as I know, there might be something like this in China. I'm not sure. Yeah. You know,

 

Debbie Reynolds  31:59

Yeah. Yes.

 

Dawn Kristy  32:01

To this level of saying, you know, here's your Verizon cell phone, right, put your eyes here and your fingerprint. I mean, we can't imagine doing that.

 

Debbie Reynolds  32:09

Yeah, I think so. Biometrics are very tricky. So what we're seeing is a push towards finding a reason to be able to collect biometrics. One of those reasons, I think, is going to be COVID in the future. That's what we're talking about with these health or immunity passports, about who's been vaccinated and who hasn't, and all that stuff. So we're going to get there. But, you know, I think what Mexico is probably saying is, we want this information, period, and then let's make it about a cell phone. So eventually they'll expand it in some way. So, yeah, this, to me, is somewhat similar to what we have in the US with TSA PreCheck, which was, you know, give us all this additional information so you can keep your shoes on as you go through security or something. And I haven't done that yet. No, I haven't either. Yeah, I'm like, I'm good, I'm good. To me, that wasn't a good enough reason. But, you know, the thing that concerns me about TSA PreCheck is just this kind of massive data gathering. So you're basically putting this information in a database, and then if anything happens, they look at that database first before they go to other places. So the issue that I have with this type of massive data collection is, if you're trying to find a needle in a haystack, why create a bigger haystack? That makes no sense. I need a better reason for people to collect this data; that was not a good reason.

 

Dawn Kristy  33:57

It made me think, too, of you being in Chicago, and that it was, I think, at O'Hare Airport that Delta, a year or so ago, started facial recognition for the international flights based on your passport photo. I couldn't help but think of The Bourne Identity, where somebody has, like, ten passports. And that's right here in the US. I don't know if that was just a pilot they tested or if it's still there; I haven't flown Delta international out of O'Hare. But, you know, there are these little snippets of companies trying this. And is it assessed? Is it used? Do they run it through sort of the privacy data mapping standards that are out there? Or do they just wing it?

 

Debbie Reynolds  34:45

You know, some of it has to do with consent. With consent, the sky is almost the limit in terms of what you can consent to. I always tell people it is illegal to sell your limbs, right? But that's pretty much it. So if they can get you to consent, regardless of how crazy it seems, especially with free services, I think we see this a lot, where they're saying, we're doing all this work for you, and we're doing this stuff for free, and then we get to decide how high the value is based on what you can give me in return. So that's kind of a problem. And a lot of times, once this data is collected for one thing, it is almost always used for something else. So I was really happy to see that sort of purpose limitation in the GDPR, and they're very strongly pushing things about when data goes to a third party: you have to get consent, or you can't transfer it. And we're starting to see that seep into a lot of the privacy laws in the US, like the CCPA and CPRA, about third-party data transfer. That's like the hot-button thing.
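
A side note for readers: purpose limitation, as described here, can be sketched as a check of the intended processing purpose against the consents on record before any reuse or third-party transfer. The record shape, purpose names, and function below are hypothetical, not an implementation of the GDPR, CCPA, or CPRA.

```python
# Sketch of a purpose-limitation gate: data collected for one purpose is not
# reused or shared with a third party unless that purpose was consented to.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    subject_id: str
    purposes: set = field(default_factory=set)   # e.g. {"order_fulfillment"}
    third_party_sharing: bool = False

def may_process(record: ConsentRecord, purpose: str, third_party: bool = False) -> bool:
    """Allow processing only for consented purposes, and block third-party
    transfer unless it was separately agreed to."""
    if purpose not in record.purposes:
        return False
    if third_party and not record.third_party_sharing:
        return False
    return True

consent = ConsentRecord("user-42", purposes={"order_fulfillment"})
print(may_process(consent, "order_fulfillment"))                     # True
print(may_process(consent, "marketing_analytics"))                   # False: purpose not consented
print(may_process(consent, "order_fulfillment", third_party=True))   # False: no third-party consent
```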

 

Dawn Kristy  36:02

Yeah. And as the laws spread east, you know, you're getting hybrids of all these different things, which, from a business owner's standpoint, makes it very difficult to map out. So it's like, do you take the least common denominator, or do you take the most severe and comply to that level? Because it becomes this hodgepodge. And I'm always looking for maps that show the different standards in the US, so you can say, this state has this, that state has that. But it's not easy to comply.

 

Debbie Reynolds  36:33

It's extremely hard to do. So Jeff Jockisch actually was on my show several weeks ago. He compiled a really good one, and it looks like a quilt, actually; it's color-coded. You can just go on his profile. It is really hard. The one thing that I try to tell people, instead of kind of lurching from one law to the next: I know it is complicated, but it doesn't have to be as complicated as people make it. One thing is, you, as a business owner, need to realize that when people use your services, they own their data. They're giving it to you on loan, and you're like a steward of the data. So if you understand that you're a steward of the data that they give you, it changes your whole perspective about how you handle data. So I try to make the analogy of a bank. Let's say you gave your bank money, and then you said you want to see your balance, and they say, well, we can't show you your balance. You'd hit the roof, right? You'd be so upset about that. So think of data that way. Think, you know, the person gave it to you for safekeeping, to use for a particular purpose, and they have a right to see it and the right to know what you're doing with it. And you need to be responsible.

 

Dawn Kristy  37:56

Yeah, yeah, that's a great analogy. I love that story.

 

Debbie Reynolds  37:59

Yeah. So I would love to ask you, if there were a world according to The Cyber Dawn and Dawn Kristy regarding Data Privacy or Cybersecurity, what would be your wish, either in the US or around the world?

 

Dawn Kristy  38:17

I go back to this idea of starting small and building from there. So, increasing the awareness of it, not just relying on nightly news programs. You know, I love the idea of the collaboration that's going on with some of the Cybersecurity experts and firms around the world. But really, it starts almost at the grassroots level, for people to understand this isn't just a risk on the news; there are things that you can do. And I like what you said about understanding Data Privacy, because it's the same thing with Cybersecurity. So in the ideal world, increase awareness, but have the people that hear the stories and get the awareness training act on it. You know, awareness on its own doesn't do anything; you have to take responsibility and take some action in your household, in your business, in your affairs. So that would be one thing. And then the other, you know, kind of comes with a bigger theme of helping and protecting, and this is the idealist world you're asking about, but it goes to this idea of helping each other. This is one area where we can do that, by protecting our information, our privacy, our irises, our fingerprints. I mean, we are not at the point where we've surrendered everything just yet.

 

Debbie Reynolds  39:46

Right?

 

Dawn Kristy  39:46

There's still a chance to stop the flow, so to speak. So improve the training, improve the action on the training, and then also stop the flow of our information going to anybody for any purpose, and get control of that.

 

Debbie Reynolds  40:06

Yeah, definitely. Well, that's wonderful. Thank you so much. So this has been great.

 

Dawn Kristy  40:11

Thanks for having me.

 

Debbie Reynolds  40:12

It's been a great episode; it's always fun to talk to you. We could probably talk for hours about stuff like this, so we'll have you back on again. We have to talk about some more stuff. Sounds good, I'd love to come back. Yeah. Well, thank you so much, Dawn. This is great, and we'll talk soon.

 

Dawn Kristy  40:29

Okay, take care, Debbie.

 

Debbie Reynolds  40:30

Thank you very much. Bye-bye.