"The Data Diva" Talks Privacy Podcast

The Data Diva E162 - Justin Daniels and Debbie Reynolds

December 12, 2023, Season 4, Episode 162
"The Data Diva" Talks Privacy Podcast
The Data Diva E162 - Justin Daniels and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, “The Data Diva”, talks to Justin Daniels, Corporate Mergers & Acquisitions Counsel and Cybersecurity Expert, Baker Donelson LLP. We discuss the importance of privacy and security awareness in the context of increasing digitization and regulation. Justin explains the new cybersecurity regulation passed by the SEC, which requires publicly traded companies to report material data breaches to the SEC within four business days and to describe in their annual reports how they manage security risk.

We discuss the challenges companies face in cybersecurity, including the lack of awareness among C-suite executives and the credibility gap between what leaders say in public and the resources they actually allocate to privacy and security. We emphasize the need for education and general awareness to prevent the human error behind most cyber breaches, and we discuss the double-edged sword of emerging technologies and the education and awareness needed to mitigate their cybersecurity risks.

The conversation also touches on balancing public safety with privacy and cybersecurity concerns, particularly in the use of digital cameras and facial recognition technology in public spaces and schools. We explore the limitations of current remedies for privacy violations and suggest that more severe consequences, such as criminal penalties, may be necessary to incentivize companies to prioritize privacy and security. Overall, the conversation highlights the importance of proactive measures to address cybersecurity threats rather than accepting the status quo of frequent hacks and breaches, and Justin's hope for Data Privacy in the future.



SUMMARY KEYWORDS

privacy, people, cybersecurity, talk, companies, security, debbie, cyber, sec, data, breaches, rule, podcast, technology, regulation, digital cameras, work, hacked, ransom, give

SPEAKERS

Debbie Reynolds, Justin Daniels


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, all the way from Atlanta: Justin Daniels. He is a corporate mergers and acquisitions and tech transactions lawyer at the law firm of Baker Donelson, and he has a very deep background in security. He is also co-author of the best-selling book Data Reimagined: Building Trust One Byte at a Time. Welcome.


Justin Daniels  00:54

Hi, Debbie. Nice to be with you today.


Debbie Reynolds  00:56

Well, I'm super excited to have you on. First of all, we know each other because we both have a podcast; I look at a lot of the work that you do and the things that you post, and it's always spot on. So we decided to do this podcast, hopefully one of many things we'll do together. I would love for you to talk through your journey, how you got into your career, and your interest in privacy.


Justin Daniels  01:29

Alright, I will try to give you the CliffsNotes version; we don't want to be in the desert for 40 years. So, I started out as a corporate M&A attorney, doing all kinds of transactions. Around November of 2014, right around when Sony got hacked, I was working on a tech transaction, and I literally hosted the Israeli version of Steve Jobs, who came into Atlanta for 24 hours to talk about cybersecurity and the connected car. And when I was done that day, he said, you know, you have a really good ecosystem for cybersecurity in Atlanta; you should get the word out. And it was like a light bulb went on. That's when I started to narrow my focus to gaining real subject matter expertise in cybersecurity. Fast forward six years: I did a Cybercon conference, I did Atlanta Cyber Week, I've traveled to Israel with delegations, I've traveled to the UK. And of course, as you know, Debbie, cybersecurity and Data Privacy are like the peanut butter and jelly of the technology industry; it's kind of hard to talk about one without talking about the other. So I really started to focus in that area and really enjoy it, because privacy and cybersecurity overlay any technology we can talk about, from AI to social media to autonomous vehicles to drones; you pick it, privacy and security are an important part of the equation. So I'm excited that, as privacy and security professionals, we get to delve into all this cool stuff. And there's never a dull day, which makes it interesting.


Debbie Reynolds  03:11

Absolutely. And this is a good segue to talk about your podcast, She Said Privacy/He Said Security, with Jodi Daniels, who's your wife and an absolute doll; I met her in Washington, DC. I highly recommend that people check out the podcast, and the book as well. Also, Jodi, with her company, does a really great newsletter that comes out on LinkedIn; it's fantastic, so definitely check that out. I love having you here because you understand the legal part, you understand the technical part of cyber, and you understand how privacy intersects there. What don't people know? What do you find when you talk to audiences, clients, or different people? What baffles people about cyber?


Justin Daniels  04:33

Well, to narrow down your question, I'm going to start with talking to C-suite executives. I think what baffles them about cybersecurity is, in my view, the continued lack of awareness. What I mean by that is, I meet all kinds of CEOs, and they talk about privacy and security and how important they are. But then, when you start to peel back the onion, you learn things like, oh, well, those multifactor authentication rules don't apply to me, I'm the CEO; or you hear, oh, we're the security team, we really don't have a budget. So what I typically see a lot of times is what I'll call the credibility gap, meaning you hear things in public from leaders because they're saying what they think needs to be said, but when you peel back the actual allocations of resources, people, and funds, you find them to be lacking. And when you see that, you're not really surprised when you read headlines about this data breach, or that this company isn't taking advantage of privacy in some way, because companies still are not routinely making privacy and security a feature or design element of their product; instead, they treat it like an afterthought.


Debbie Reynolds  05:50

I think that's true. Also, I want your thoughts about having cybersecurity folks at a board level. I feel as though companies have treated cybersecurity as if it were the fire department, right? You don't think about it until something bad happens. And we know now, from a lot of the breaches and problems that companies have, and as the regulations are being put together around cybersecurity, that regulators are tired of hearing "I don't know". They're tired of companies making these statements and not really being able to show their work. And we'll talk a little bit about the SEC rules in a few minutes. But what are your thoughts about really raising the visibility of cyber folks within organizations, making sure that they have their budget, making sure that they have the connections, and, I guess, the influence within organizations? What are your thoughts?


Justin Daniels  06:52

So I'll give you a good example. A couple of weeks ago, we actually had the CEO of my law firm come on our podcast to talk about cybersecurity. One of the things he talked about was, I don't claim to be an expert in cybersecurity, but we made sure we hired a really good CISO. He has direct access to the board, meaning he's a C-level officer who comes in directly and speaks to the board; he's not speaking through the CIO, the CTO, or the general counsel. So my first point is that your security leader needs to have direct access to the C-suite and the board; it can't be filtered through other people; security should be its own important domain. But also, from a security professional's perspective, you have to, in a way, level up your game. What I mean by that is, having really great security but narrow expertise isn't going to cut it when you want to have a wider purview into the company. From a security perspective, if we have the best security but the company can't operate efficiently and can't do what it needs to do, it may not be very profitable, and ultimately that doesn't work. So how do you, as a security professional, start to understand marketing imperatives and business operations to figure out, okay, given what the company wants to do, how do we best slot in security within the context of the strategy of the organization? So I think it's twofold. One, the security people need to have direct access; and two, security professionals need to up their game at strategic thinking, understanding security in the wider context of how the business needs to operate and the risks it may need to take.


Debbie Reynolds  08:34

You know, I wrote an article called Layer Cake, or the Data Privacy Layer Cake, with three layers to the cake: the bottom layer was data, the middle layer was operations, and the top layer was regulations. I feel like a lot of companies think, okay, if I handle the regulation part, then magically operations and this other stuff will sort itself out. And what we're seeing in these breaches, and I read all the news articles, is that almost all of them are failures in operations, not that the companies didn't understand the law. It's like, how do you operate? How do you change the way that you operate so that you can reduce your risk? What are your thoughts?


Justin Daniels  09:19

Well, my initial thought is that the company has to have a culture where security matters. I work at a firm where we have all kinds of phishing training, but we also get articles from our security team about, hey, this is how you can protect the online privacy and security of your family and your kids. That indicates we care about what you do beyond the office, because if you care about what your employees are doing in their personal lives, the idea is that it flows over into what they're doing at work, because, as you know, Debbie, most cyber breaches typically happen from human error. So how do you start to do that? It's not just training; it's building a general awareness. In my view, it's: how do we make Data Privacy and security the digital seatbelt of the 21st century? Because I bet you, Debbie, when you were growing up, like me, my parents never wore seatbelts. And now, if you and I get into a car, what's the first thing we do? We don't even think about it; we buckle up. Well, what changed? One, there was education; part of why we're doing this podcast today is to make people more aware that privacy and security are really important. And two, laws were passed requiring us to buckle up. So to me, where we go with privacy and security is really about helping to educate our employees. And, as we're going to talk about, the regulatory landscape is shifting. We now have 12 states that have privacy laws, and we have different cybersecurity laws, the SEC's being the latest, because you're seeing this huge trend in the last 10 years of digitization of assets and migration to the cloud, and alongside that, increased privacy and security regulation.


Debbie Reynolds  11:03

And actually, to follow your seatbelt analogy even further: the reason why we wear seatbelts, mostly, well, I'm sure we wear them because they make us safe, right?


Justin Daniels  11:13

Yeah.


Debbie Reynolds  11:13

But the impetus, typically in a newer car, is that the car will not stop beeping unless you put your seatbelt on, right? So that's like a reminder; maybe that's an analogy for awareness.


Justin Daniels  11:26

No, you make a good point. I know, a couple of times when I don't have mine on, the car's beeping, and it's like, how annoying. Alright, we'll just buckle it in. It's a good reminder.


Debbie Reynolds  11:36

Absolutely. So, you've read through the new cybersecurity rule for public companies coming out of the SEC. We're seeing a lot of this bubble up in different sectors around data: government regulators want companies to be transparent and to take a risk-based approach to how they handle cybersecurity. So, first of all, tell us about this rule people are talking about, which goes into full effect in December of 2023.


Justin Daniels  12:14

So on, what was it, July 26th, the SEC met in open session and passed the Securities and Exchange Commission's new cybersecurity regulation, and it targets publicly traded companies. At a high level, the rule requires that if a publicly traded company has a material data breach, it has to report it to the SEC within four business days. Another interesting part of the rule is that, in every annual report, public companies now have to talk a little bit about how they manage security risk and its impact on the organization. The thinking and rationale behind the rule is that Chairman Gensler wants companies to have transparent and consistent reporting about cybersecurity, because the SEC's view is, as we know, breaches happen all the time, they can cost tens if not hundreds of millions of dollars, and that can materially impact an investor's decision to buy or not buy a stock. What I think is really interesting about the rule, Debbie, is that a lot of the companies you and I might deal with that are not publicly traded are definitely in scope. And you're thinking, well, wait a second, Justin, I'm not a publicly traded company. No, you're not. But if your customer is a publicly traded company, the SEC rule is specific in saying we don't care whether you own your data or your network or it's hosted by some third party; it's in scope, because one of the biggest issues the SEC has is how do we manage third-party cyber risk. So if you're a privately held company that is a vendor, you should expect your publicly traded customer to foist a bunch of very onerous contract terms on you about your cyber hygiene, how quickly you have to give them notification of a breach, and how much information you have to share with them, because if you get breached, that could trigger a material reporting requirement for the publicly traded company within four business days.


Debbie Reynolds  14:20

Correct.


Justin Daniels  14:21

Yes.


Debbie Reynolds  14:22

Yeah. That was a good summation. That's a long document, though, like 178 pages, I think.


Justin Daniels  14:27

Don't read it near sharp objects. Yeah.


Debbie Reynolds  14:30

Well, let's talk a little bit about something that's in there as well around the people who are responsible. So they're also saying you should hire a cyber expert or a cyber person. Can you talk a little bit about that requirement?


Justin Daniels  14:48

Sure. So one of the things the SEC wanted to do was make sure you had very specific cyber expertise on the board. That did not end up in the final rule. What did end up in the final rule is that, as part of your cyber disclosures, you have to identify, hey, do you outsource your cybersecurity to a third-party vendor, and if so, who do you use? So companies that outsource are now going to have to disclose all of that. The SEC declined to make it a requirement that board members have cyber experience; that was in the proposed rule, but there was intense comment, and they declined to adopt that particular part of the rule.


Debbie Reynolds  15:30

Yeah, even though they did not adopt that part of the rule, I highly recommend that companies have folks on the board, or at least people with direct access or a voice to board members, around cybersecurity risk. All companies know they need accounting; they understand that they need insurance and things like that, right? But some just feel like, if I pretend the cyber risk isn't there and hope everything goes fine, it probably won't happen to me. And that's just not the reality.


Justin Daniels  16:13

I think that's right. I think you have a bunch of regulators and the government now saying, we can't let this go on the way it has. Because as much as I would love to see market-based solutions, I can tell you that, even if you are eligible to get cyber insurance coverage, the premium goes up on average 79 to 80%. So now companies are faced with the determination: do I spend it on cyber insurance, or do I take the savings from the crazy premium and invest it in other kinds of prevention activities? But I think you put it really well; people think cybersecurity is like the fire brigade; you get serious once you've had a breach. And to me, it goes all the way back to when we talk about startups. Startups only care about two things: a minimum viable product and being able to get customers.


Debbie Reynolds  17:03

Talk a little bit about the recent MGM hack; what's happening there? What lessons can be learned from what we know so far?


Justin Daniels  17:14

So, I want to draw a broader context. In September, both Caesars Palace and MGM got hacked, and MGM had been hacked before. It got a lot of news because it's the casino and gaming industry, so you'd think they have very high levels of security. But what appears to have happened is there's a group, I think they're called Scattered Spider, young folks. And did they hack into the network with some cool intrusion? No, they basically impersonated the IT department and called an employee whom they had found by doing open-source research on social media, and they were able to get that employee to give them their credentials, and they got right in. What's interesting is MGM did not pay a ransom. It appears Caesars did, to the tune of about $15 million. So from my perspective, what's interesting is how the SEC is going to look at these companies that pay ransoms. Because both the Federal government and the FBI say, we don't want you paying ransoms under any circumstance, because all you're doing is funding cybercriminals. But I can tell you from personal experience that 80 to 90% of companies pay the ransom because they look at it as a business decision. So one thing it'll be interesting to see when the SEC rules go into effect is how the SEC treats companies who have paid the ransom. Do they say, okay, it's a disclosure, great? Or do they say, well, wait a second, why did you pay the ransom? Was there another way to work this out? And we're going to ding you with a bigger fine because of it? I don't know the answer to that. But to me, those are some of the interesting questions that the SEC rule brings up, because what are you going to tell the SEC? We paid the ransom, and we're trusting the threat actor isn't going to disclose all that personal data on the dark web.


Debbie Reynolds  19:06

Let's talk about social engineering.


Justin Daniels  19:09

Okay.


Debbie Reynolds  19:10

Right. So, just like you said, I tell people a lot of times, when people think about cyber, sometimes they think about these far-fetched scenarios where it's Mission Impossible and Tom Cruise is hanging from the ceiling. But the reality is that a lot of breaches happen as a result of human error or human mistakes, and a lot of it is not malicious, right? The person in this MGM breach was fooled by a cybercriminal who was able to elicit their credentials, and that's how they broke in. So, talk to me about why it is important that companies teach people about social engineering.


Justin Daniels  19:58

I think it's important because, as we talked about earlier, cybersecurity is really about three things: people, process, and technology. And people are a big part of that equation, because we're emotional; we do things we shouldn't sometimes because we're just people. And I think you need to educate people; it's really important from both a privacy and a security standpoint. Because one, like the seatbelt, if you don't know any better, you don't know any better. But if you start to get educated, you start to think differently; the seatbelt goes from an afterthought to I just buckle up, because that's just what you're supposed to do. And I also think it's important to teach people about it, because it's like going to a restaurant, and when they come with the bill, they're like, oh, here's our mobile app; if you just give us your email, we'll give you the receipt electronically, and we can save the environment. But they don't tell you what they're going to do with that email; you and I both know what they're going to do with that email; they're going to market to you. Education helps people shift their mindset about when I want to allow people to unlock the door and get at my digital identity. That goes to privacy, but it also goes to security. And that's why education and training are so important: because we have to shift people's mindset away from what you talked about earlier, the fire brigade, and toward asking, hey, how are privacy and cybersecurity a fundamental part of my digital life? Because increasingly, my digital life is really who I am. That's what we have in the 21st century. And that's why I think this education and training component, your podcast and my podcast, that's why it's important.


Debbie Reynolds  21:35

Very good. What's happening in the world today that's concerning you that has a privacy impact? What are your thoughts?


Justin Daniels  21:45

From my own experience right now, I would have to tell you it is the proliferation of digital cameras, especially as it pertains to public spaces. On top of that, you've been reading about retailers getting ripped off and wanting to use digital cameras, and then in schools as well. But here's the thing with that, Debbie: as we talk about digital cameras, it's hard to have that conversation and not also have a conversation about facial recognition and AI. Because once you combine those two technologies, you go down a route toward surveillance that people don't think about, because everybody wants to have public safety. With the gun culture we have in this country, I think the only thing people can agree on is we should have more cameras in schools. But when they do that, we aren't having the discussion about, well, where should the cameras be? What kind of video should they take, and when? And how does that impact people's privacy, from both a surveillance and a security perspective?


Debbie Reynolds  22:44

I agree with that. I'm glad you brought that up. Facial recognition, or biometrics, is an area that I work in very deeply. And just from a data perspective, if you want to find a needle in a haystack, you don't create bigger haystacks, right? So basically, what a lot of surveillance is doing is creating a data set of mostly people who are innocent, right? And so it makes it harder for you to actually solve those problems. So I think this is not an issue where collecting more data is going to actually help you solve the problem. It may make people feel better, but it doesn't really help to alleviate the concerns that people have. What are your thoughts?


Justin Daniels  23:26

So another story I wanted to give you, Debbie, on this is that I live in a little city called Dunwoody. And I was reading that our city council is considering a camera law so that in areas of high crime, they'll have cameras at the businesses and that kind of thing. And I'm probably going to write the mayor an email as a concerned citizen and say, well, if you're going to do this camera ordinance, what about facial recognition? Because it's legally allowed. If you put that in place, a business uses it, and something happens, because it wrongly identifies somebody who's then unlawfully accused of something they didn't do, and facial recognition is routinely wrong when it comes to minorities, you're going to have a public relations nightmare on your hands. Even if technically, legally you can do it, from a public policy perspective, that doesn't mean you should without an open and honest public debate about what it means. And I just find it interesting, in the context of what we're talking about, that so many cities or municipalities have these conversations, and nobody's speaking up about, hey, we need to really think about the privacy and surveillance and cybersecurity implications. It just doesn't come up.


Debbie Reynolds  24:35

Yeah, I think emerging technologies like this are a double-edged sword, right, where they can have benefits, but people often are so beguiled by a benefit that they're not really thinking about the downside or risk. And then, maybe like with cyber, people have a mentality of, oh, this won't happen to me; this bad impact that you're talking about is going to happen to someone else and not me, so I don't care about it as much.


Justin Daniels  25:02

Like, I've asked people, well, how would you feel if anyone could read your text messages for three months or some period of time? What would they learn about you? Because people write all sorts of interesting things, you know, and I'm involved in matters where we have to preserve evidence, and you see what people put in writing. I tell people, anytime you write an email, you have to assume that that email has a life of, like, forever. So don't put something in writing that you don't want somebody to read, because that can happen. So I try to be somewhat careful about what I write, particularly as a lawyer in certain contexts. But a lot of people just don't have that mindset, which goes back to your point about education and training.


Debbie Reynolds  25:39

Yeah, education and awareness are really key. And it's something that can't just be done once a year; you can't check it off on a box. It has to be continual, because the threats are increasing and the technology is getting a lot more sophisticated and complicated; otherwise, there are definitely going to be gaps. My concern is we're entering an era with AI and all these other advancements in technology, which is going to make computing a lot more complex and raise a lot more privacy and cyber risks. But we still have people who are using 12345 as passwords. Where do we go from here? What do we do?


Justin Daniels  26:29

It's funny; someone this morning, when I was being interviewed on a show, said, look, my email's out there; they get hacked all the time. Well, what are we going to do? We're just kind of stuck. And I had to pause for four or five seconds. And I'm like, well, to do nothing guarantees more of the same, and even worse. So how do we start to really change our mindset and, as we talked about, make privacy and security part of the design? Meaning, if we're going to do the digital camera ordinance, how are we weighing the perceived benefits of stopping crime and better public safety against what we're giving up in privacy? And I don't think the average American appreciates how much of their privacy they are truly giving up. Why else would you get all of these free services from Facebook, Google, and everyone else? Because they want your data, which is incredibly valuable. And I don't know how you feel, Debbie, but when I watch all of those executives parade to the White House to talk about how they want to be regulated with AI, I just, yeah, your smirk says it all. I'm like, really, you want to go behind closed doors and talk to the White House about how you want to be regulated? When I really think all they want to do is influence the regulation they know is coming so that it really doesn't have any teeth. Because I'll be honest, where I struggle sometimes, and I'd love to flip the script and ask you this question, is that a lot of privacy legislation founders because of a private right of action. Companies fear a private right of action, meaning individuals can sue in class action lawsuits. But with the reality of that kind of remedy, the only people who really benefit are the lawyers; you and I get our, what, $2.95 from having our privacy violated. And I'm just trying to figure out what would be a better way to have enforcement that companies would actually pay attention to, besides the one that seems to enrich a very small group of people at the expense of all of us consumers who had our privacy or security violated.


Debbie Reynolds  28:28

Wow, that's a really deep question.


Justin Daniels  28:30

I don't know; I thought maybe "The Data Diva" would have a view because I'm confounded by it. I don't have a good solution just at this moment.


Debbie Reynolds  28:37

You know what, one thing that the FTC has been doing, not necessarily on a consistent basis, but interestingly enough, in a couple of cases around AI and data collection: in some rulings, they've actually made companies disgorge their data or their algorithm. I think that may get people's attention. If you capture someone's data that you shouldn't, or do something that you shouldn't have done, just paying fines is probably not great. And having a very, very deep-pocketed corporation go into litigation for as long as they can afford, which is forever, I don't think that's a great thing either. But something that hits the business model and changes something, maybe that will get people's attention.


Justin Daniels  29:29

I just think it's an interesting debate to have, because all of the regulations that you and I deal with every day are only as good as the perceived enforcement of them. And class action lawsuits, okay, it's a remedy, but I don't know that it's the best remedy, for all the reasons you talked about. So anyway, I didn't mean to throw a crazy question out there. It's just one I've been trying to think about, and I'm struggling with it.


Debbie Reynolds  29:55

I know, that's actually a pretty good question, right? A lot of people like us deal with these privacy regulations, and they're wacky; we would love to see more harmonization in those laws, because we're pulling our hair out. But you're right. I don't know if that's the best remedy. I almost think some type of business practice change would be more terrifying than actually paying a legal bill.


Justin Daniels  30:24

I mean, Debbie, think about this: when we went through the whole financial crisis, if they had held one CEO accountable, indicted them, and convicted them, that whole thing would have changed. Because, to your point, if you're a big company, a $500 million fine is a cost of doing business. You take someone's freedom away; that's a wholly different thing. And so again, that might seem a little harsh; I'm just trying to figure it out, because at the end of the day, I think what will influence companies' willingness to care more about privacy and security is knowing that there are real consequences for not caring, no different than if you go out and injure someone intentionally. That's called a crime, and there are real consequences for that for most of us. So anyway, it's just something I've been thinking about.


Debbie Reynolds  31:13

Yes, sure. Yeah. Well, you know, in Asia, they're going that route, right? There are a couple of countries in Asia with criminal penalties for privacy mishaps. I don't know that that will come over to the US at all. But I think you're right that class action, or the private right of action, may not be the best deterrent, so to speak, especially when these companies are very well funded; that's really not a deterrent at all. If it was the world according to you, Justin, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether that be technology, human behavior, or regulation?


Justin Daniels  32:04

I guess my wish, Debbie, would be that we could reach a place where we use technology in a way that allows us to connect better and really mitigate all of the division that we have in society nowadays. I feel like a certain level of civility has been lost, because we can hide behind a social media screen name and say things that we would never say to somebody in person. In my perfect world, I'd like to see us get back to a point where we have a much higher level of civility in how we talk to and relate to each other. We've all been through a lot in the last 20 years, particularly with the pandemic and 9/11 and a variety of other things. I'd just like to see technology be more of a force for benefit and do a better job of mitigating these downsides on the front end. We really have no excuse after social media. I mean, all the stuff with AI; really, guys, have we not learned anything from what's going on with social media? Hello, out there.


Debbie Reynolds  33:06

That's a great wish. Yeah, bringing more humaneness to it. I feel like we're entering a place where people don't care about one another as much as we should, right? It's more like, you see someone taking a video of someone being assaulted; it's like, put the video camera down and help somebody. Maybe you should do that instead of recording the altercation, you know?


Justin Daniels  33:32

Correct. Your story encapsulates how I feel: quit being on your phone taping everything, get out there, and help.


Debbie Reynolds  33:39

Yeah, totally. Well, this is great. Thank you so much, especially around the SEC rule coming into effect. Not enough people are paying attention to it, so the fact that you're writing and talking about it is very important. It's going to be really important as that rule kicks in in December of 2023 and people start to really feel its impacts. So yeah. Thank you.


Justin Daniels  34:06

Thanks for having me.


Debbie Reynolds  34:07

Yeah. So we'll talk soon for sure, hoping to hang out and do other things together. Have a great one.


Justin Daniels  34:13

Absolutely.


Debbie Reynolds  34:14

All right. Thank you. Talk to you later.