"The Data Diva" Talks Privacy Podcast
The Debbie Reynolds "The Data Diva" Talks Privacy podcast features thought-provoking discussions with global leaders on data privacy challenges affecting businesses. This podcast delves into emerging technologies, international laws and regulations, data ethics, individual privacy rights, and future trends. With listeners in over 100 countries, we offer valuable insights for anyone interested in navigating the evolving data privacy landscape.
Did you know that "The Data Diva" Talks Privacy podcast has over 250,000 downloads, listeners in 114 countries and 2,407 cities, and is ranked globally in the top 2% of podcasts? Here are some of our podcast awards and statistics:
- #1 Data Privacy Podcast Worldwide 2023 (Privacy Plan)
- The 10 Best Data Privacy Podcasts In The Digital Space 2024 (bCast)
- Best Data Privacy Podcasts 2024 (Player FM)
- Best Data Privacy Podcasts Top Shows of 2024 (Goodpods)
- Best Privacy and Data Protection Podcasts of 2024 (Termageddon)
- Top 40 Data Security Podcasts You Must Follow 2024 (Feedspot)
- 12 Best Privacy Podcasts for 2023 (RadarFirst)
- 14 Best Privacy Podcasts To Listen To In This Digital Age 2023 (bCast)
- Top 10 Data Privacy Podcasts 2022 (DataTechvibe)
- 20 Best Data Rights Podcasts of 2021 (Threat Technology Magazine)
- 20 Best European Law Podcasts of 2021 (Welp Magazine)
- 20 Best Data Privacy Rights & Data Protection Podcast of 2021 (Welp Magazine)
- 20 Best Data Breach Podcasts of 2021 (Threat Technology Magazine)
- Top 5 Best Privacy Podcasts 2021 (Podchaser)
Business Audience Demographics
- 34% Data Privacy decision-makers (CXO)
- 24% Cybersecurity decision-makers (CXO)
- 19% Privacy Tech / emerging Tech companies
- 17% Investor Groups (Private Equity, Venture Capital, etc.)
- 6% Media / Press / Regulators / Academics
Reach Statistics
- 256,000+ Downloads
- We have listeners in 114+ countries
- Top 50 in Business and Management 2023 (Apple Podcasts)
- Top 5% in weekly podcast downloads 2023 (The Podcast Host)
- 1,000 to 1,500 - Average Weekly podcast downloads
- 2,500 to 5,500 - Average Weekly LinkedIn podcast post engagement
- 12,450+ Monthly Data Privacy Advantage Newsletter Subscribers
- Top 2% of 3 million+ globally ranked podcasts of 2023 (ListenNotes)
Debbie Reynolds, "The Data Diva," has made a name for herself as a leading voice in the world of Data Privacy and Emerging Technology with a focus on industries such as AdTech, FinTech, EdTech, Biometrics, Internet of Things (IoT), Artificial Intelligence (AI), Smart Manufacturing, Smart Cities, Privacy Tech, Smartphones, and Mobile App development. With over 20 years of experience in Emerging Technologies, Debbie has established herself as a trusted advisor and thought leader, helping organizations navigate the complex landscape of Data Privacy and Data Protection. As the CEO and Chief Data Privacy Officer of Debbie Reynolds Consulting LLC, Debbie brings a unique combination of technical expertise, business acumen, and passionate advocacy to her work.
Visit our website to learn more: https://www.debbiereynoldsconsulting.com/
"The Data Diva" Talks Privacy Podcast
The Data Diva E187 - Becky Gaylord and Debbie Reynolds
Debbie Reynolds "The Data Diva" talks to Becky Gaylord, Head of Client Projects, Risk and Security, Gaylord Consulting. We discuss various aspects of data privacy and cybersecurity, including car vulnerabilities and privacy concerns. Becky shares her career trajectory and emphasizes the relevance of her journalism background in blending consulting and communications with technology.
We discuss the importance of strategic communications in handling data breaches and the challenges consumers face in understanding and protecting their data. We advocate for better security standards and industry-led solutions, highlighting the need to categorize and protect sensitive consumer data. The conversation also touches on the concerns about collecting and using data without consent, particularly in public spaces and by law enforcement. We stress the importance of regulation and education to address these issues and emphasize that basic steps can be taken to protect personal information without requiring advanced technical knowledge. We highlight the tactics used by scammers, such as creating urgency and fear to manipulate victims, and stress the importance of pausing to break free from the emotional manipulation. Additionally, we discuss the difficulty in recognizing fear, urgency, and doubt during a scam and the need to overcome social constructs that make individuals vulnerable to manipulation.
Debbie Reynolds and Becky Gaylord delve into the complexities of addressing privacy and cybersecurity concerns within organizations. We stress the importance of open communication and greater awareness to combat cyber threats while expressing the desire for privacy to be respected and for individuals to have more control over their personal information. The potential impact of privacy breaches on companies' bottom lines and the need for a shift in mindset to prioritize privacy and cybersecurity are highlighted. We also discuss the importance of effective communication and brand building, highlighting the need for women to support and uplift each other professionally, and Becky's hope for Data Privacy in the future.
37:47
SUMMARY KEYWORDS
data, companies, happen, privacy, call, breach, protect, situation, awareness, organizations, urgency, consumer, people, true, technology, realize, information, deep, put, shopping cart
SPEAKERS
Becky Gaylord, Debbie Reynolds
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. I have a special guest on the show all the way from Ohio, Becky Gaylord; she is the Head of Client Projects, Risk and Security at Gaylord Consulting. Welcome.
Becky Gaylord 00:39
Hi, Debbie. It's great to be here.
Debbie Reynolds 00:42
Well, you and I have had the pleasure of having chats and some calls, and I've always thought you were such an interesting person with such an interesting background. Your work in this area of cyber and privacy and consulting, and also the strategic communications we have talked about, is really fascinating. But please do tell the audience about yourself; we'd love to know your career trajectory and how you got interested in privacy.
Becky Gaylord 01:16
Sure, well, thanks again for that intro; I really appreciate that. I love seeing your posts and the interactions we've had. In my career, I've had more than 20 years in information security. I am a trusted adviser and a collaborative leader, have my own consulting LLC, and also hold certifications in privacy, cyber, and cloud, including CISSP. But I didn't break into cybersecurity; cybersecurity broke into me, as did Data Privacy issues. Because 10 years ago, not long after I started the LLC, my website host and the outfit that did my backups, a managed service provider, we call them MSP, got hacked and suffered a data breach. I only started to realize at that point how little I knew about just protecting assets, protecting my network, and segregating. I just didn't realize what I didn't know, and instead of going back out to the market and finding another one, perhaps stupidly, I decided to take that over myself. So, I stumbled my way through disaster recovery and business continuity. I just wanted to also, briefly, talk about how, in hindsight, this all matched so well with the earlier portion of my career, which has two other phases. Consulting is the middle one, but the earliest phase was in journalism. I was a financial investigative journalist and worked for the publisher of The Wall Street Journal. In Washington DC, I was involved with the economics team, who would do the economic indicators like the unemployment report, the things that come out about how our economy is doing, GDP. They have lockups with government officials, where they would collect our phones, not allow us to have open lines, and have these really serious forms that we had to sign saying that if we were to disclose this information, our news organization would be banned. So I had this early sense of the CIA triad, the confidentiality, integrity, availability, and it turned out that was really helpful in a way to blend in with the consulting and the communications. I find it's a really fascinating way to blend all those in, to bring the tech to the people, so that we can try to educate, inform, and engage people around issues that otherwise can feel overwhelming and daunting. In fact, it affects our daily lives, if we can make it accessible and we can understand things like that.
Debbie Reynolds 04:07
I agree with that. I love your background. I love the fact that you and I have had some conversations about strategic communications. It's so funny because every time I see someone having a data breach, I see the bad message that goes out. Oh my god, they need Becky's help.
Becky Gaylord 04:26
Well, it's funny; if you would indulge me, I'll just say super quickly, just yesterday, one broke about Verizon, and the only reason they even announced it, and you would appreciate this, is because Maine has such a good Data Privacy law, and they had 63 employees based in Maine. Well, this letter was not written for those employees; fully half of their employees had their data leaked. It was not just the regular stuff, which is bad enough, but it also was Social Security numbers. It was your union membership. It was your gender. It was everything. Then the letter said something like, basically, we don't know that it's been used outside. Wow, how about just saying we're really sorry? We are looking into how this happened and what we need to do to do a better job because we fell really far short. Anyway.
Debbie Reynolds 05:17
Isn't it interesting? Obviously, there's some legal maneuvering there. But then the fact that we have such a patchwork of State laws, some laws have different reporting requirements. Some laws are different in terms of what they consider personal data and what they consider a breach. So you really do have to look at these different States. So if you are in a State that doesn't have those same breach requirements, you may have been breached, but they may not have had the obligation legally to let you know that.
Becky Gaylord 05:48
Right, for sure. Yeah. In fact, I don't know if that would have even been announced if it wasn't for Maine. But the other thing that was interesting was that it had happened three months prior. So that's a lot of time to give a bad actor to start to phish you or to do other stuff with it. That's another issue: notification is great, but it's got to be actionable. It's got to basically arm us with information that's going to let us protect ourselves. So yeah, good point.
Debbie Reynolds 06:18
Yeah, not to harp on it too much, but I thought one of the more surprising reactions to a breach was 23andMe.
Becky Gaylord 06:27
Yeah.
Debbie Reynolds 06:28
Where they basically blame the consumer.
Becky Gaylord 06:32
I actually wrote about that on LinkedIn because I thought that was just so crazy, and technically, it wasn't a breach. They were trying to really split the hairs about it, or their lawyers were trying to split the hairs. I've got so much deep respect for lawyers, but you really need a communicator to be involved with writing what you're going to say, or at least someone who's got some empathy, because that just came across as so tone-deaf. The fact is, it was a case of their users signing up for 23andMe with credentials that they had used on other systems, and then there was a breach of those other systems. It's very easy to automate now; for threat actors, as we call these folks, it's easy to take that breached data and run it through all these other sites to say, hey, where else do they log on with the same username and the same password, which is common. About 60% of people will actually admit to doing it; it's probably more, because we somehow know we're not supposed to, but even to just admit to that many. So the access came through because of credential reuse. But still, not to geek out about this, but the National Institute of Standards and Technology guidelines, the NIST guidelines, say the verifiers of these log-ons can run passwords and login credentials against block lists. If they've already been in a breach, or if they're on these dictionary lists like password123, they can say, no, you've got to try again; here's why, and we're going to help you get one that's safer. So they didn't use block lists, and they didn't actually encourage multi-factor authentication, though they now require it. So just take the easy steps first instead of blaming your people. Now, the last I heard, there were 30 lawsuits going against 23andMe. Because that data is the most sensitive; can you imagine, it's like your DNA? It's everything. So that has a higher standard, I think, for protecting us. It should, anyway.
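To make the NIST-style block-list idea concrete, here is a minimal, hypothetical Python sketch of how a verifier might check a candidate password against a breached-password list. It assumes the publicly documented Have I Been Pwned "Pwned Passwords" range API and the requests library; it illustrates the technique Becky describes, not how 23andMe or any particular vendor implements it.

```python
import hashlib
import requests

# Illustrative sketch only: check a candidate password against a breached-password
# list, in the spirit of the NIST 800-63B "block list" guidance discussed above.
# Assumes the publicly documented Pwned Passwords range API, which takes the first
# 5 hex characters of a SHA-1 hash (k-anonymity) and returns matching suffixes.
def is_breached_password(candidate: str) -> bool:
    sha1 = hashlib.sha1(candidate.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()
    # Each response line looks like "SUFFIX:COUNT"
    for line in resp.text.splitlines():
        if line.split(":", 1)[0] == suffix:
            return True
    return False

if __name__ == "__main__":
    for pw in ["password123", "correct horse battery staple"]:
        print(pw, "-> breached" if is_breached_password(pw) else "-> not found in list")
```

A verifier using a check like this would reject "password123" at signup and explain why, which is exactly the "try again, here's why" experience described above.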
Debbie Reynolds 08:39
Yeah, I say if you can't protect it, don't collect it. Love that. I want your thoughts about, well, before I go into that, I'll definitely talk about UX design and deceptive design and manipulation and stuff like that. But one thing I'm noticing that all these things we're talking about have in common is that they try to put all the work on the consumer, and the consumer is not educated enough to know what to do, and they don't have the tools and the knowledge to figure out how to help themselves. What are your thoughts?
Becky Gaylord 09:17
That is just so true, and it's a great segue, actually, from what we were just talking about. Because anyone who's been online for more than a few minutes, or at least gone shopping once online, has probably experienced situations where they were looking at an item or they had something in a shopping cart and then decided not to check out. Then, unless you've got your settings really locked down and are preventing the sharing of cookies, sessions, and other things, it's almost creepy the way that particular retailer will chase you onto other sites as you browse and show you the same shoes you were looking at, or even ping you with emails to say come back to your shopping cart. So, the fact is that persuasive UX is used regularly. They certainly know how to do it for marketing and for sales. So, how about doing it to help protect us, with a gentle nudge as you're signing up? Like I was saying, use a pop-up that says, hey, more than 80% of breaches involve weak passwords and/or reused credentials, so this is why we want to differentiate ourselves, to help you understand we're trying to protect you. Here's the little strength meter that goes from low to high, and well done; just like when you're completing something on a form, it shows you how much you've got done as you go across. There are well-established best practices for user experience in design, and there's absolutely no reason why we couldn't adopt those for these purposes, but we don't. You've talked about this pretty regularly, but even just the way that data notices are written, and the way they're often buried, they're not accessible. But it can be done, not just with the text but also in the design. Lastly, I'd say on this point, the disconnect is that instead of treating it as a factor that perhaps your users could use against you, which is why I think places like 23andMe don't use those, I believe they should seize that proactivity and say, hey, we're out here on your side. If the data does get into the hands of bad actors, but you've got strong passwords and you haven't reused them, it's almost useless to them if they get something they can't crack, and there are really easy automated tools to help you create a strong one. If the password they get is bad, it's going to end up exposing us. The retailers or the companies, the collectors, could say, we're actually on your side. If you really wanted to, you could still let people go all the way through several screens of that and still set a weak password, but at least allow them to have knowledge around it and say, okay, all right, I'm not going to do it. We don't have that experience like we do when we leave a shopping cart abandoned.
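As an illustration of the signup "nudge" described here, below is a small, hypothetical Python sketch of a strength meter plus a friendly message shown before the user submits a password. The scoring heuristic, labels, and wording are assumptions for demonstration only, not any real product's logic, and a real flow would pair this with a breached-password check like the one sketched earlier.

```python
import re

# Hypothetical sketch of a signup "nudge": a simple strength meter plus a message.
COMMON_PASSWORDS = {"password", "password123", "123456", "qwerty", "letmein"}

def score_password(pw: str) -> int:
    """Return a rough 0-4 strength score (illustrative heuristic only)."""
    if pw.lower() in COMMON_PASSWORDS:
        return 0
    score = 0
    if len(pw) >= 12:
        score += 2
    elif len(pw) >= 8:
        score += 1
    if re.search(r"[A-Z]", pw) and re.search(r"[a-z]", pw):
        score += 1
    if re.search(r"\d", pw) or re.search(r"[^\w\s]", pw):
        score += 1
    return min(score, 4)

def signup_nudge(pw: str) -> str:
    score = score_password(pw)
    labels = ["Very weak", "Weak", "Okay", "Good", "Strong"]
    meter = "#" * score + "-" * (4 - score)
    return (f"[{meter}] {labels[score]}: most breaches involve weak or reused "
            f"passwords, so a longer, unique passphrase helps protect your account.")

print(signup_nudge("password123"))
print(signup_nudge("purple-otter-baking-12!"))
```

The point of the sketch is the experience, not the scoring: the user gets immediate, plain-language feedback framed as "we're on your side" rather than a buried policy page.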
Debbie Reynolds 12:22
Now, I think the problem and the gap are that the ROI to businesses is not evident on a balance sheet, so it's not clear why they would do that. They're thinking about how to make money, not how to spend money. So maybe they think about it as a cost upfront and not really a benefit. But if you could look back with a time machine, I'm sure 23andMe right now wishes they had done that.
Becky Gaylord 12:47
That's exactly right. I think you've nailed it; the ROI is a little difficult. But we can do an equivalent with, say, insurance; we wouldn't take the risk of leaving ourselves uninsured, even though it might look like a cost. The risk of not doing that is just too significant to bear, and companies do the same thing. I think that when you see what they're going to be spending on lawyers to defend themselves through this, it's going to be considered almost like a CYA, cover your you-know-what, around this, because the cost of not doing it is just too great. And they have done it; it's now mandated for everyone who signs up. So it just shows you that it could have been done before.
Debbie Reynolds 13:32
Yeah, that's a really sad situation. I feel like the 23andMe breach is probably the worst that I could possibly imagine because of the long-term impacts on a person: they can't just get credit monitoring, and someone could use their DNA against them for a job, insurance, or health care. I mean, it's outrageous. But I'm hoping that more organizations see this as an opportunity to not fall into the same trench this company fell into. And to me, and I want your thoughts, I feel like the more sensitive in nature that data is, the more those companies need to have even better security. So, I think they should be held to a higher standard.
Becky Gaylord 14:20
I totally, totally agree on that front. As we continue to watch the situation here in the US, really, there's nothing that's going to be happening anytime soon. I'm wondering if maybe there isn't an industry-led solution, for example, like what we have around the payment card standard, because there are real teeth around that. It's not a law, but there are real teeth, and there are the required scans. There's lots of respect, is what I would say, when you are adhering to that. It's like a Good Housekeeping seal of approval. There's no reason why there couldn't be an industry-led movement to say, hey, sure, we could get in front of this, because I agree with you completely; especially with the most sensitive data, we have something similar with HIPAA, but we just don't with consumer data, which in this case can have even more, with the level of detail that they've collected, blood samples and biometrics; it's almost mind-blowing. So why not take the lead? I actually think that will also perhaps educate consumers in the same way that it did when we went from tearing up those old carbon copies to doing what we do now, seeing the numbers on a credit card masked except for the last four; it's almost like it helps inform people who wouldn't have otherwise realized that it was a risk not to have that masked.
Debbie Reynolds 15:52
Yeah, and I think one of the things that has changed in businesses is that businesses have always been very good at trying to protect what they thought was most valuable within their organizations, so their business secrets, their business processes, and things like that; those things have very high security. They hadn't really thought about the information of people, or about trying to grade data by how sensitive it is for the consumer. So I think, hopefully, we're going to see more organizations try to do that. I think some of these State laws are forcing companies to do that because what they're bringing in is a categorization of data, saying that certain data needs better protection, different protection, and then also creating a situation where companies can't justify keeping data forever, especially if they no longer have a purpose for it.
Becky Gaylord 16:50
I think that's very true. It does come with a mindset shift. I mean, this idea that, for example, someone else gets to assess what we think is sensitive or what we would want to protect really needs to allow a user opt-in aspect, which is also related to user experience. But it's even just the hurdles that are placed for someone who wants to go the extra mile and lock down more of the sharing, selling, or trading. I just opened a new account not long ago, and to be able to forbid the sharing and the selling, even with affiliates, I had to go to two separate screens and then actually call, pick up the phone, and speak to a human, which I was willing to do. But I thought, wow, it is my data, yet I'm having to go through all these extra steps just to say you can't do anything with it because it's not yours. Yeah, I understand that; that's a profit-driven aspect. But who's got the ownership of that is really tied up with it. I think it's partly because it's still relatively new, considering we didn't even have these discussions until relatively recently.
Debbie Reynolds 18:06
So what's happening in the world right now, what's concerning you most as it relates to privacy?
Becky Gaylord 18:14
Oh, my gosh, I don't mean to sound obsessive or like I'm speaking in hyperbole here, but so many things: the amount of data that gets collected without consent, cameras, and how that is used, especially considering the false positives. It's not just that it's gathered, but the action that's taken; there's a whole series of steps that happens that we're just not informed about. That includes our cars and often where we are in public, and of course, when we're in public, we have to presume that we're going to be watched; it's just how ubiquitous it is. Then you hear of situations where that's funneled to law enforcement, or with Ring recently, with the doorbell, there was a button that could be pushed, and then law enforcement was sent this information. I guess the theme is the collection and then the use without our consent, and the fact is, it happens in so many spheres now that, to me, it's really concerning. It doesn't even give us the option to say, no, I don't want that. Or, if you're going to collect it, let me know what you're going to do with it and why and how long you're going to hold on to it, that kind of stuff, without making me look in six-point font, three pages down on a website. Be transparent about it.
Debbie Reynolds 19:37
I agree; it has to become a company's problem and not just the consumer's problem.
Becky Gaylord 19:43
Right.
Debbie Reynolds 19:44
So that's why I think regulation can really help in that regard if that ever happens. Yeah. Give me your thoughts about awareness and education.
Becky Gaylord 19:58
There are so many overlaps between cybersecurity and Data Privacy and protection. This is one of those areas where I think it can seem like rocket science to people; both of these areas can go quickly into really deep topics, machine learning and AI. I think most people don't seem to understand that you don't need a Ph.D. in mathematics to take some basic steps to protect what's out there, and why you should, instead of seeing it as a hassle. I've got two teenage sons; I mean, they self-report that they don't look at the questions about sharing, don't care what's traded, and will sign up for any form. I love the saying that if you're not paying, you're the product; it's just so true. The fact is that there are often ways we can just not share our information, and it doesn't have to be complicated. I'll actually just give one really quick example that, I think, is awareness. When I signed up for a flu shot not too long ago, I didn't leave my Social Security number; I didn't even put my address down; I put my doctor and my hospital. Yeah, sure, send it to them so that they can correlate that; I didn't put another thing there. Now, I would have been willing to show my ID and verify with the pharmacist that I am who I am. But no one's telling us that kind of stuff, and it was a paper form. So I didn't know where that was going, who was going to see it, who was going to take the information off, who would have done something with it. I think there are a lot of really bad people out there walking around, even if most people have good intentions; you really only need one case where it's not shredded, it's on a desk, it gets picked up, and then someone else can set up a new account. I happen to have a credit freeze in place. But if I didn't, next thing you know, that's how identity theft can happen. So just letting people know, hey, you have the right not to give your Social Security number at the doctor's office when they ask, even if they can ask for it. But I never give that. Just these little tidbits that I try to share, especially with older people; the awareness doesn't have to be really complicated to make a difference, and I think that's empowering.
Debbie Reynolds 22:26
I think that's true. Every day, we hear of a new threat that comes out. I know people are really heated about this situation in Hong Kong, where someone was on a Zoom meeting with fictitious characters that were deepfakes, who convinced them to transfer all this money from a company, and people are all in an uproar about that, as they should be. But I think, basically, it is an extension of social engineering. You're convincing someone to take an action that they probably would not have otherwise taken, probably asking them to do something quickly, like there's a sense of urgency there. These are all the markers of fraudulent behavior. What are your thoughts?
Becky Gaylord 23:13
This is as old as humankind: that hark, who goes there, the validation and the verification. For sure, that's relevant in this situation; the technology has been advancing so fast that I think most people, if they're not in this space professionally or following it, would almost not even realize that this level of capability exists. But the same could be said for most people, who I don't think realize what the data brokers are scraping from the Internet, which is another reason why I really try to lock down my stuff, what I share online. And yet, even on something that complex, I think we can make awareness almost transcend the specific technology changes by having what the IT folks or the cyber folks would call an out-of-band verification. Look, when you've got your company people on a Zoom call, it's going to seem maybe really weird to think you would send that same person a text message, but if my memory is correct, it was something like a $25 million transaction; it was enormous. So maybe it leads to tiers: if it's over a certain amount, you get a second signature; if it's over a certain amount, you verify by sending a text message. But even having something, because you have these horrible stories of grandparents called with deepfaked calls from their grandkids saying, I'm in trouble, can you help me out? Something as simple as a code word: I say pineapple, you say pizza. You can actually teach a little kid that, and it doesn't even have to be privacy as much as just safety. A little kid, if someone says, I'm here to pick you up, didn't your mom tell you that I'm here to pick you up? You can teach anyone to listen for the word: what's the word? If that's not said, but the pressure and the urgency are there, you just really pick up on it. It's just like high-pressure salespeople; we go into that lizard-brain part and just think we have to act. And you also can't do that during the call if you haven't actually stopped and thought about, hey, this stuff is really getting a little crazy. Maybe the people who have the approval to sign checks that big need to have another kind of control; not everyone's going to need it.
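Here is a minimal sketch of the tiered-control idea described here, assuming hypothetical threshold values and a simplified transfer-request object: transfers above one amount need a second signature, and above another they also need an out-of-band confirmation such as a call-back or code word. This is not any real payment system's workflow.

```python
from dataclasses import dataclass

# Assumed policy values for illustration only.
SECOND_SIGNATURE_THRESHOLD = 100_000
OUT_OF_BAND_THRESHOLD = 1_000_000

@dataclass
class TransferRequest:
    amount: float
    requested_by: str
    approved_by_second_signer: bool = False
    confirmed_out_of_band: bool = False  # e.g., a call-back, text, or code word exchange

def may_execute(req: TransferRequest) -> bool:
    """Apply tiered controls before releasing a transfer."""
    if req.amount >= OUT_OF_BAND_THRESHOLD and not req.confirmed_out_of_band:
        return False
    if req.amount >= SECOND_SIGNATURE_THRESHOLD and not req.approved_by_second_signer:
        return False
    return True

# A $25M request initiated on a video call alone would be blocked until both
# the second signature and the out-of-band confirmation are recorded.
req = TransferRequest(amount=25_000_000, requested_by="finance-ops")
print(may_execute(req))  # False until the extra controls are satisfied
```

The design point is simply that the control lives in the process, not in the judgment of the one person on the call, which is exactly what defeats urgency-driven manipulation.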
Debbie Reynolds 25:31
Yeah, almost like the nuclear thing where there are two keys and they're far apart, right? One person can't do the thing, something like that.
Becky Gaylord 25:43
When the stakes are high enough, if the approval that someone could do goes up to 25 million, it's actually worth putting your heads together. What's the belt and suspenders that we can pull into this? Because it's worth it?
Debbie Reynolds 25:58
Yeah, I totally agree with what you say about a code word. I tell people that all the time. So I think companies should have code words as well. Also, parents and families should have them, especially with your younger people and the older people. There are so many different threats out there. I know someone personally who was fooled by a deepfake voice message. A call sounded like her son, who is away at college, and she ended up actually transferring money for him before she figured out that it was fake. So this is real. This is not sci-fi. This is truly happening. She's not even the first person who told me they got a call like that; one person actually was able to thwart the attack, because he got a call saying his mother's car broke down, she was crying, and she needed money, and he needed to do whatever. He just thought, this just sounds so crazy. So he hung up the phone and called his mother's job. She was there. She was like, what the hell are you talking about? So this is scary.
Becky Gaylord 27:06
It's scary, and it is real, and it's hard to get that stuff back. It's difficult once you've been scammed, and it's violating. Before this kind of technology existed, there were obviously other ways of feeling like we were taken or that we'd been had. But it's happening in a totally different way now that really screws with your trust; it would be difficult to feel safe again.
Debbie Reynolds 27:31
Yeah, I think one thing about cybercriminals is that they don't like to wait. If you pause, or you stop, or you wait, they will go to the next person, because they know the jig is up, this person's not going to fall for all this other type of stuff, or whatever. So sometimes it makes sense to pause, even if it feels like it's a super urgent situation. That's why they bring in urgency, because that urgency gets you out of your normal thinking process; you go into emergency mode, you go into Mama Bear mode, Papa Bear mode, you want to do stuff to protect people. So it's very important to take that pause, because the cybercriminals know they are not going to be able to manipulate you, and they want to go to the next person, because that person will probably just go forward.
Becky Gaylord 28:25
It's very true, and speaking of awareness, when I give training, I talk about this concept of FUD: fear, urgency, and doubt, because social engineers use that. So it's difficult, if you're in the midst of the fear, urgency, and doubt, to step back and actually notice it. But if you can, say, aha, those are the foundational aspects of a successful scam in this realm. Then it's worth, back to the verify and validate, asking to put the person on the phone; maybe they'll have that snippet that's a deepfake, but at least at this point, having an ongoing interactive conversation is not possible. So just verify in some way, or call, as your friend did; pausing is definitely a saving grace in these situations, for sure. You're exactly right. I think that's where the safe word or phrase can help, because you can almost do it in real time, or at least it becomes apparent in real time if they're not saying it, as opposed to saying, hang on, I'm gonna text you. It's because we want to be nice; we want to be accommodating. It's hard. We've got these social constructs built in. If it's your boss and then your boss's boss, it's easy to see how it can happen.
Debbie Reynolds 29:17
Yeah, and then another thing is that when a scammer contacts you in that way, they want to keep you on the mode of communication that you're on. So, just like the Zoom call, if you said, I'll call you back, they would have probably said, don't end the call, right? They don't want you to do that. Because once they have you online, they don't want you to do other things. So they don't want you to email. They don't want you to hang up. They don't want you to call anybody else, because they know that they can't sustain that fraud without you being on that particular mode of communication. Yeah, especially in an organization. So, I think these cybercriminals take advantage of dysfunction within corporations.
Debbie Reynolds 30:32
Yeah. Where someone from accounting is too afraid to go talk to their boss, because somebody else told them to do something, when the right answer is to go talk to your boss. I think companies need to be open, to make people feel comfortable having those conversations, because they'd much rather have a conversation than have $25 million wired out of their organization.
Becky Gaylord 30:55
That's very true. If I could just make a comment on that, I think the awareness comes in with that aspect, too, because there's such underreporting of these things that what we hear about is really still the tip of the iceberg. Jen Easterly, at the Cybersecurity and Infrastructure Security Agency, and Christopher Wray, the head of the FBI, just last week testified before Congress; it was a committee, I believe the House committee. But Jen, in particular, was saying, any time you've been affected by these things, anytime, report it, because that allows us to better track it. It allows us to better educate, too, but there's a sense of shame, I think, that happens for folks. Really, it's through that awareness; it's like an arms race, and criminals keep escalating. So we've got to make sure that we're aware of what their latest tricks are so that we can match them.
Debbie Reynolds 31:51
That's true. So, if it were the world, according to you, Becky, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior, or technology?
Becky Gaylord 32:06
This is almost going to sound like such a fantasy dream, but my wish would just be that our privacy is really respected, without it seeming weird. When I talk about the different things I do right now to protect my privacy, I'm the weird one. What I would love is to see the balance switch so that it's expected; we're respected as private citizens, as private people, so we get to choose what we disclose. That's pie in the sky. But that would be my wish.
Debbie Reynolds 32:40
That's a good wish. Heidi Saas put out a post not long ago about some of the things that she does to try to protect her privacy, and some of it can be exhausting, can't it?
Becky Gaylord 32:52
Yeah, it can be. I actually saw that post. Yes, it can be exhausting. I think it's also part of the mindset, because right now that balance is not with us; we're having to go to greater lengths. I believe we wouldn't have to if that were shifted; if the mindset shifts, like these things we're talking about with the user experience, where they meet us more, maybe the respect will come, not so much through legislation as from the horror stories that come out.
Debbie Reynolds 33:26
I agree, and I hope that comes true, because we totally need that. As someone who does this every day, it is tiring for me. So, I can just imagine someone who doesn't care about this area as much as I do. People just want to live; they just want to live their lives; they don't want to have to go through all these hoops. So I would love to see more balance there. I'd love to see more education and training, and businesses stepping up and understanding that this can be a benefit to your bottom line or a detriment to your bottom line, because people aren't going to be bothered with companies in the future that do a poor job of protecting their data.
Becky Gaylord 34:03
Yeah, I'm not trying to beat up on anyone here, but since we did mention 23andMe, you know what, I think the punishment is happening as a result of the lawsuits. It had some issues already with its share price if you look back, but I specifically looked back just from the time of this issue, which happened in October, and it experienced a steep decline in its stock price. So I think maybe that impact is really encouraging, maybe that's the wrong word, heartening, maybe, for those of us who care about these issues, because that might actually finally get the attention of companies in a way that nothing else quite does: bottom-line impact. Right. Even if it's not because you really are altruistic or you really do want to help people, but because it's going to cost you if you don't, one way or the other.
Debbie Reynolds 34:57
Yeah, and this type of story could be the ending of a company; shareholders should care about this, because it means they're losing money and the company isn't doing the right things with people's data.
Becky Gaylord 35:15
Yeah, very true. Ultimately, as far as that legislation goes, I think what we're going to finally see change is when people with a high enough profile, and probably politicians, are the target. I'm not advocating for anything like that; I just want to be clear. But if someone who has the influence and is in a position to actually make a law about it finds themselves or someone they love in the crosshairs of one of these scams, or something that butts right up against these issues, then, when it becomes personal, I think we'll see it get attention, because it can seem like it can happen only to other people until it does happen to you. I think that's one of the other things about keeping it quiet. The more we share, the more we realize this is really widespread; it's actually not isolated, but it can seem isolated if we're not talking about it and if the awareness is not commonplace.
Debbie Reynolds 36:13
Yeah, and I guess a perfect example of that is this Taylor Swift deepfake thing that happened, where many of us have been talking about this for years, many years. So, when it happens to a famous person, and now it's in all the newspapers, now we have legislators who are really paying attention. Unfortunately, or fortunately, I'm glad that people are really paying more attention here. But also, I'm dismayed that something has to happen to someone famous for people to really pay attention.
Becky Gaylord 36:45
It's a great example. It's exactly a perfect situation. It is unfortunate, and yet, it seems like it has to rise to this level that it didn't before on a wider scale.
Debbie Reynolds 36:59
Yeah, I agree. Well, we'll keep on moving forward. Definitely keep going with your advocacy. I love the work that you're doing in cyber and privacy and really talking about this communication stuff. That's really key, where I feel like people aren't really great communicators, aren't very empathetic, and need to really care about people more.
Becky Gaylord 37:21
Right, right.
Debbie Reynolds 37:22
That needs to come across in people's communication. So, thank you so much.
Becky Gaylord 37:26
Thank you, Debbie. I really appreciate it.
Debbie Reynolds 37:29
Yeah, this was fun. Well, we'll talk soon.
Becky Gaylord 37:32
Great.
Debbie Reynolds 37:33
All right. Bye bye.
Becky Gaylord 37:34
Bye.