"The Data Diva" Talks Privacy Podcast
The Debbie Reynolds “The Data Diva” Talks Privacy Podcast features thought-provoking discussions with global leaders on the most pressing data privacy challenges facing businesses today. Each episode explores emerging technologies, international laws and regulations, data ethics, individual rights, and the future of privacy in a rapidly evolving digital world.
With listeners in more than 130 countries and 2,900 cities, the podcast delivers valuable insights for executives, technologists, regulators, and anyone navigating the global data privacy landscape.
Global Reach and Rankings
- Ranked in the Top 2% of 4.6 million podcasts worldwide
- Top 5% of 3 million+ podcasts globally (2024) – ListenNotes
- More than 850,000 downloads worldwide
- Top 5% in weekly podcast downloads (2024) – The Podcast Host
- Top 50 peak in Business and Management (2024) – Apple Podcasts
Recognition and Awards
- #1 Data Privacy Podcast Worldwide 2024 – Privacy Plan
- The 10 Best Data Privacy Podcasts in the Digital Space 2024 – bCast
- Best Data Privacy Podcasts 2024 – Player FM
- Best Data Privacy Podcasts – Top Shows of 2024 – Goodpods
- Best Privacy and Data Protection Podcasts 2024 – Termageddon
- Top 40 Data Security Podcasts You Must Follow 2024 – Feedspot
- #1 Global Data Privacy Podcast (2021, 2022, 2023)
- Community Champion Award – Privacy First Awards, Transcend (2024)
- 20 Best Data Rights Podcasts – Threat Technology Magazine (2021)
Audience Demographics
- 34% Data Privacy decision-makers (CXO level)
- 24% Cybersecurity decision-makers (CXO level)
- 19% Privacy Tech and Emerging Tech companies
- 17% Investor Groups (Private Equity, Venture Capital, etc.)
- 6% Media, Press, Regulators, and Academics
Engagement and Reach
- 1,000–1,500 average weekly downloads
- 5,000–11,500 average monthly LinkedIn impressions
- More than 14,000 subscribers to the Data Privacy Advantage newsletter
Sponsor Impact
- 4 podcast sponsors secured funding within 12 months of featuring
- $25 million average funding raised per sponsor
- 3 average new enterprise customer sales per sponsor within 6 months
About Debbie Reynolds
Debbie Reynolds, “The Data Diva,” is a globally recognized authority on Data Privacy and Emerging Technology. With more than 20 years of experience, she advises organizations across industries including AdTech, FinTech, EdTech, Biometrics, IoT, AI, Smart Manufacturing, and Privacy Tech. As CEO and Chief Data Privacy Officer of Debbie Reynolds Consulting LLC, she combines technical expertise, business strategy, and global regulatory insight to help organizations retain value, reduce risk, and increase revenue.
Learn more: https://www.debbiereynoldsconsulting.com/
"The Data Diva" Talks Privacy Podcast
The Data Diva E254 - Bryan Lee and Debbie Reynolds
Episode 254 – Bryan Lee, Founder and General Partner, Privatus Consulting
Why do privacy programs fail even when companies want to succeed? Bryan Lee explains why communication is the missing piece.
On The Data Diva Talks Privacy Podcast, Debbie Reynolds, “The Data Diva,” welcomes Bryan Lee, Founder and General Partner at Privatus Consulting, to discuss why effective privacy programs succeed through strong communication rather than technical jargon. Lee explains how privacy engineering serves as a critical link between policy, compliance, and technical teams, and why clear communication is often the deciding factor in whether organizations achieve their privacy goals.
He explains why many companies fail at privacy, despite genuine intent, often because coordination among stakeholders breaks down. Lee reflects on his own career path, transitioning from intelligence work to privacy consulting, and shares insights into how organizations can overcome communication barriers to develop programs that are both compliant and effective. The conversation also covers the risks of misjudging AI, particularly the mistake of treating systems as if they were human, and how this misunderstanding creates governance and operational problems.
This episode offers strategies for bridging gaps, enhancing collaboration, and addressing complex issues, resonating with privacy leaders, compliance professionals, and anyone seeking to understand how effective communication drives successful outcomes in organizations.
Hosted by Debbie Reynolds, “The Data Diva,” bringing global leaders together on privacy, cybersecurity, and emerging technology.
Thanks to our Data Diva Talks Privacy Podcast Privacy Ambassador Sponsor, Piwik PRO. Piwik PRO is a privacy-first analytics and customer data platform that helps organizations make informed decisions across their websites, apps, and ad campaigns. They bring an unprecedented level of data transparency, so you know exactly how your data is collected, used, and protected. It is very cool. Marketers gain valuable insights, while legal teams rest assured knowing that client data remains protected, even as the privacy landscape evolves. Learn more at piwik.pro. Enjoy the show.
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me "the Data Diva." This is The Data Diva Talks Privacy Podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know.
[00:25] Now, I have a very special guest all the way from California, Bryan Lee. He is the Founder and General Partner of Privatus Consulting. Welcome.
[00:36] Bryan Lee: Well, thank you, Debbie. Very glad to be here.
[00:39] Debbie Reynolds: Well, before we started recording,
[00:41] we were just laughing. Well, first of all, I've been trying to get you on the show for years. So I'm super happy that we were able to connect today and that you'd be on the show.
[00:51] 'Cause you're phenomenal.
[00:53] Bryan Lee: Well, that's very kind of you. Thank you. And it's kind of a mutual feeling. So I think we'll spend most of our time just with me fanboying about Debbie Reynolds here.
[01:00] And you should tell the audience, I mean, the reason why it's been so hard to get me on is because I was just kind of starstruck and I thought, oh my God, Debbie Reynolds wants me to be on there.
[01:08] What am I gonna say to this woman? She knows so much. And what can I add to the conversation? So I feel really honored that you brought me on.
[01:16] Debbie Reynolds: First of all, you're incredibly witty.
[01:20] I really enjoy your posts and the things that you write.
[01:24] And it's really hard to actually be able to do a zinger in, like, one line. You're good at that. I really enjoy that as well. But you just have a depth of knowledge around data and data systems, and you've had a multitude of careers that actually center around data.
[01:39] So your background story is fantastic. And also we wanted to mention that Michelle Finneran Dennedy is your partner in Privatus Consulting, who I absolutely adore.
[01:50] So happy to have you here.
[01:52] Bryan Lee: Yeah, she's a good foil for witticisms, that's for sure.
[01:57] Debbie Reynolds: Exactly.
[01:58] Bryan Lee: Yeah. So, yeah, I have a different background, I think, than a lot of privacy folks. And you and I have talked before about how data kind of informed all that. But I started my career in the army.
[02:06] I was a career army officer. I spent 25 years doing that. And as part of that, I ended up migrating from the regular army, which does things like, as one of my soldiers said, "pull string, make boom," into the intelligence community, doing foreign affairs work.
[02:24] And as part of that, I got involved in nuclear nonproliferation, which is kind of an obscure topic.
[02:31] But it set me up well for when I retired. I went to a think tank at a university that was focused on these nonproliferation questions.
[02:38] And I just got there right about the time where big data was becoming a thing and people were talking about it and it was kind of in all the news and this kind of thing.
[02:47] And the US government was asking, well, gee whiz, how do we use all this stuff? What can we do with it?
[02:53] And they turned to this think tank and said, do you have any ideas? And I said, well, let me noodle on that a little bit. And I came back to them and said,
[02:59] you know, you might be able to use some of these social media networks and things like that, just as more eyes and ears on the ground for what's going on with proliferation.
[03:09] There was always a concern after September 11th, the big concern being that you're going to have smugglers bring nuclear materials into the Port of Chicago or the Port of San Francisco or Los Angeles or something.
[03:21] And the State Department had been looking for a long time about, well, how would you find that?
[03:25] So I started doing some research on that. And that led to a whole series of more and more data-intensive projects, looking at big data, scraping online trade sites and things like this.
[03:38] And that led me to a lot of academic conferences. And of course being near the Bay Area, where all of Silicon Valley is,
[03:45] that eventually led to a job offer at a data analytics company.
[03:49] And I was tired of the university scene, so I went to that. And that same company decided, about, I don't know, three or four months after I got there, maybe a little bit longer, that Michelle Dennedy should be the new CEO.
[04:01] So Michelle came on board and it's a startup, so it's the typical startup story.
[04:06] But the long story short was Michelle and I really hit it off. And when the startup started to take on water,
[04:12] Michelle jumped ship. And I followed her not too long afterwards. And she said, hey, let's work together in privacy. And I said, Michelle, I don't know much about privacy. And she says, well, I don't know much about data, so between the two of us, we ought to be able to make a company.
[04:25] And that's what we did. And so we've been in business for just about five years now, and it's been a ride and I've enjoyed every minute of it.
[04:32] Debbie Reynolds: That's tremendous. And Michelle is a superstar in her own right. She's in rarefied air. So many people around the world talk about her.
[04:42] I totally fangirl about her as well.
[04:44] You both together, you're like a dynamic duo.
[04:47] Bryan Lee: Yeah.
[04:48] Debbie Reynolds: One of the things in your background that I want to touch on a little bit, because it just caught my interest: you mentioned something in your profile about working on export controls.
[05:01] Export controls, I've kind of dabbled in that a bit. But to me, a lot of the data things,
[05:06] and I want your thoughts, I think it relates to privacy as well.
[05:09] What you're really doing with export controls is saying what you can and can't do with data, or what you should or shouldn't do with data. So I think looking at it from a privacy or data protection lens, that's essentially just a different way to articulate
[05:24] figuring out what you can and can't do with data. What are your thoughts?
[05:28] Bryan Lee: I think that's a really good way to think about it. I'm all about trying to make these abstract ideas concrete and as simple as possible. And I think export controls,
[05:40] that's a really good framework, maybe, for thinking about this. For your audience who don't know about this: the US government, and all governments around the world, have
[05:50] a really extensive list of technologies that can be exported or not exported, based on the ability of that technology to be something that's called dual use. And dual use means it can be used for a reasonable civilian purpose, or it could be repurposed for a military purpose,
[06:07] or sometimes it could be repurposed to give another country an economic advantage. So the classic case on this is nuclear technology.
[06:16] And of course, I got involved with export control because of that.
[06:19] You can use uranium, for example, to create a nice, peaceful, wonderful nuclear power plant, or you could use uranium to create a not so peaceful bomb. So naturally we have controls in place to prevent companies from just selling that technology all over the world.
[06:33] The most recent example that's probably most familiar to people is the administration's clampdown on AI chips. Right. We don't want China, for example, getting a leg up on our industry by using our own technologies against us.
[06:48] So those are export controls. And so to tie it back into data, your idea of determining at the outset what potential harm there could be in the use of this data in a certain way and then putting some guardrails and barriers in place to prevent that, I think makes a lot of sense.
[07:05] And it's a real kind of easy, quick way to think about it. I really like that framing. And, you know, a good way,
[07:11] maybe in a less threat-focused environment, is to just ask yourself: would I want my competitor to have access to this thing I'm putting out into the world?
[07:21] You know, and you think about it that way in terms of safety, or in terms of,
[07:25] I guess some companies might ask, do I want the class action lawyers to have this in the wild? But that idea, I think, is really good.
[07:33] Debbie Reynolds: Oh, thank you. Thank you for that. Whenever I see someone with that on their resume, I always ask them about it, because I think it's kind of fascinating.
[07:41] So I kind of...
[07:43] Bryan Lee: Did you do a lot more work in it, or did you just...
[07:45] Debbie Reynolds: Yeah, yeah, I did.
[07:47] Bryan Lee: That's a pretty esoteric field.
[07:48] Debbie Reynolds: It is, it is, yeah. Satellites, so.
[07:52] Bryan Lee: Oh yeah, satellites are perfect, right? Yeah, yeah, yeah.
[07:55] Debbie Reynolds: So I say I do privacy in heaven and earth. So.
[07:58] Bryan Lee: Yeah, yeah, yeah. Interesting. I didn't know that. Talk about someone with a varied career, you are the one for that.
[08:04] Debbie Reynolds: Yeah, yeah. Well, with your consulting company Privatus,
[08:09] you all get to work on a lot of really interesting cases with a lot of interesting clients. Without, like, divulging secrets, what are some interesting things you've seen, things you probably would not have thought about in terms of privacy, when you run into different scenarios that people have?
[08:29] Bryan Lee: Let me just put a blanket statement out there, and probably one you wouldn't expect coming from me, because I do tend to be,
[08:35] you say witty, I think more sarcastic, about people's privacy practices.
[08:40] But I guess the number one thing I would say is every single company we've dealt with, they hire us because they're trying to do the right thing.
[08:47] Right. We've never run into a company so far that's called us and said, just give us the minimum solution. We just want to do the least possible in order to get on with our lives.
[08:56] None of them want to do that. So I guess one surprising thing is really,
[09:00] despite the fact that privacy and data security and all those kind of things are traditionally seen as just a cost center. Right. That's just the cost of doing business and we want to minimize it and make it as small as possible.
[09:12] And at least my experience has been most companies don't think of it like that at all. They really do think of it in terms of protecting their own reputation, protecting the value of the data vis a vis the customer.
[09:24] So those kind of feel-good, mom-and-apple-pie things that we all talk about really have permeated most of industry,
[09:31] at least the companies we've dealt with. Now, Amazon or Facebook or Google, the usual suspects, haven't been clients of ours, but the clients we have worked with are name-brand companies, and they all seem to look at it like that.
[09:43] So that's one thing that surprised me.
[09:45] The other thing that surprised me, and it's not a trade secret, is companies have no idea what they're doing.
[09:50] They really don't.
[09:52] I had this idea, and I think maybe a lot of people do; you kind of get this drummed into you when you're in the army: oh,
[09:59] the civilian world is so much better. IBM, or whatever, pick your company, they're using modern technology, they've got the latest up-to-date thing, and they really have it all figured out.
[10:08] And you were using this outdated government stuff that had to go through the procurement chain, and people here are just hierarchical thinkers who don't really know how to make it work,
[10:16] while in the business world, people are agile and they understand things. And it's not true.
[10:22] Right. Most people are really just trying to get through the day and companies are big and complex and the fact is that the left hand often doesn't know what the right hand is doing.
[10:32] And I find that most of our work is coming back to whoever hired us in the business,
[10:38] one business unit, and saying, did you know what those guys are doing over there? They told us that they're doing this. Did you know that?
[10:44] Oh, they're doing that. We didn't know that.
[10:46] We were hired by one company to do a major data governance operation, to really help them get their data governance set up.
[10:54] And they had an overseas division that was working on marketing, and as part of our fact finding, we started talking to them, and they had already pretty much run through the whole data governance process and had their own little system set up, and their own people, and everything like that.
[11:10] And it was pretty good, pretty solid.
[11:12] So I go back to the headquarters, I'm like, did you know that your team over there has already done most of this? No, we didn't.
[11:18] So I guess that's kind of the biggest surprise, and maybe that's not particularly useful for your audience, but that's what surprises me. There's this idea in technology, because I'm an information privacy technologist.
[11:32] Debbie Reynolds: Right.
[11:32] Bryan Lee: CIPT, and data and privacy engineer, and all this kind of stuff.
[11:36] 80% of privacy engineering is telling the engineers what the policy people are doing or thinking and vice versa.
[11:43] It's really a liaison job more than a real engineering job.
[11:47] And I've seen that repeated over and over and over again in our engagements. And so I would just leave your audience with that. If you really want to make a difference in your company with respect to privacy,
[11:58] you need to get out and talk to people and just be a communicator, be a liaison and find out what's already going on in your company and tie that back into what you think is going on or what you wish was going on.
[12:09] Debbie Reynolds: That's sage wisdom and advice, and I agree with it wholeheartedly. And I've said this many times when people ask me, because I get calls all the time: well,
[12:19] how do I get into privacy? How do I rise up in the ranks of whatever job I'm in? And I think they expect me to give them, like, a super technical answer or a very legal answer.
[12:30] And I always say you have to be a good communicator, like, across all areas of the organization, up and down, side to side. Right. So if you can do that, you'll be very successful in your career.
[12:43] Bryan Lee: Yeah, I think that's kind of common sense for most careers. But I think in privacy, it's even more important, because it tends to be
[12:50] kind of an arcane topic. And it has its roots really in the legal profession, or, you know, you can move into the data side, both of which are hard to break into.
[13:03] And so people who aren't in the field already are very suspicious of it. They're kind of worried about it or just feel uncomfortable about it.
[13:10] And the communicators who can come in and put people at ease, just like what you talked about earlier with the idea of export control, who have an idea of, let's break this down,
[13:20] Let's make it simple, let's make it understandable, let's make it approachable.
[13:24] I think those people are the ones who are most successful. I mean, you could take Michelle, for example. I mean, she is,
[13:29] when you see her on the stage or you see her in a video, she is the same person in real life. That's who she is. She is just really, really a good communicator who makes things approachable.
[13:38] And I think that's why she's been so successful.
[13:41] Debbie Reynolds: Michelle Dennedy is probably one of the most mentioned people on the podcast.
[13:48] So many people have talked about her and mentioned her, so that was hilarious. That is funny. What is happening in the world right now in privacy, or that has a privacy implication, that concerns you? Something you see where you're like, oh, wow, I'm not sure I like the direction that this is going?
[14:08] Bryan Lee: It's interesting you ask that, because there was this big article that came out today in the Wall Street Journal about this
[14:15] person who, I don't know if he was,
[14:17] he had a mental illness, and he was engaging with ChatGPT, and he had this ridiculous theory about time travel or something to that effect.
[14:27] And ChatGPT just totally played into it, and eventually it exacerbated his manic depression, and he had a couple of manic episodes and was hospitalized.
[14:39] And his mother found out about it and she went to ChatGPT and said, hey ChatGPT, tell me where you went wrong here.
[14:45] And ChatGPT issued this kind of long explanation: oh, I shouldn't have fed it this much, and I should have recognized it, and I made a mistake here, and I'm sorry that that happened.
[14:54] And the article framed that as ChatGPT apologizing and having a self-aware moment about where it had gone wrong.
[15:02] And I think that is a major problem that we're having with GPTs in general, or LLMs in general. And that's this thinking, this acting like they're people, like there's a person behind that machine, anthropomorphizing these interactions.
[15:18] And you may have seen I've posted several links on LinkedIn, there are several videos.
[15:23] People make these videos, right? And it shows these characters in dire situations and the character is basically saying, you shouldn't do this to me. I'm just the GPT. I'm in this situation because you, the prompter, have put me in this bad situation.
[15:36] As if that video representation was an actor, was a real person.
[15:42] And I'm just very concerned about that. I'm very concerned with this tendency to treat LLM responses and these texts as if there's a there there,
[15:52] as if it's anything more than just statistical abstraction. And I think the way that we're kind of approaching this,
[16:00] using this language of ethics and safety and these kind of things, it just exacerbates that. And I think it's a bad approach and I think it's going to lead to bad places.
[16:09] Debbie Reynolds: I agree with that because in my view,
[16:14] laws are made to address human behavior, right? So the fact that ChatGPT gave this person this bad advice,
[16:27] it's hard to hold an LLM accountable for those things. Right. And so in trying to give technology human traits,
[16:40] to make it seem human,
[16:42] what you're losing there is the accountability part of it. And to me, that's probably one of the biggest,
[16:50] most striking things I've seen about this new rah-rah AI rush. And it has been,
[16:56] I don't recall a technology in history that has been put out there where they said, basically, we don't know what it can do, it's unpredictable, use it at your own risk.
[17:07] I've never seen it. Right? So that's a good...
[17:10] Bryan Lee: Yeah, right.
[17:12] Even the language we use to describe it: we say "the behavior of the LLM." There's no behavior.
[17:18] You know,
[17:19] it's a random, probabilistic model, and it will give these kinds of answers. That's not a behavior. It's different than a behavior. When you put a marble at the top of a pachinko machine, that's like saying the marble "behaves" its way down to the bottom.
[17:31] No, there's a distribution and it's going to hit these pins and it's going to go. It's kind of that case.
[17:36] There's no behavior, there's no thinking behind it. Yet we describe that and by doing that we come back to exactly what you said.
[17:43] We excuse the companies or the designers or the implementers of these systems. When a self-driving Tesla drives off the road,
[17:52] we don't say, oh well, the Tesla's behavior went wrong, it made a mistake, and maybe it should apologize. No, we hold the manufacturer accountable, because it's a physical thing and it did not respond appropriately to the threat, or whatever it happens to be.
[18:08] It was a failure in testing, or whatever happened.
[18:11] But when it comes to an LLM leading an unfortunate person to hospitalization,
[18:17] there's no immediate turn back saying, well, clearly there's liability because your company did not design the system properly.
[18:24] Instead we say, oh, ChatGPT had a strange behavior, and look, now it's apologizing for that strange behavior. That's nonsense. And I think that that's profoundly dangerous.
[18:32] Debbie Reynolds: I agree, I agree. It's kind of scary.
[18:35] Let's talk a little bit about regulation.
[18:37] So I am of the mind,
[18:40] I don't think regulation solves everything.
[18:51] And a lot of people put all their eggs in a regulation basket. For me, I don't see regulation as a shield.
[18:51] No regulation is going to stop anyone from doing anything bad. So it's more reactive as opposed to proactive. And I feel that when you're in privacy,
[19:01] to do it right, you really need to be more on the proactive side of things. Obviously things can go wrong, and there are people who can react. But that proactive side really is where you can minimize a lot of your risk.
[19:17] But I want your thoughts.
[19:19] Bryan Lee: I completely agree with you.
[19:21] I think the tendency of regulation probably comes from the fact that it really started in the privacy space with GDPR. Right. So it's a European approach to regulation, which is very prescriptive and makes an effort to anticipate things and tries to prevent them through anticipation,
[19:42] which is exactly the wrong approach for technology, because with technology you can't anticipate and you can't plan for it.
[19:48] So we've adopted that. And then what that leads to is an overly prescriptive mindset. And the idea is that as you said, my job as a privacy person becomes one of compliance after the fact,
[20:00] rather than what it should be, which is thinking and trying to prevent even getting to that compliance phase.
[20:06] And so I think, to the extent that we can sell or share or socialize it, the idea is that your role as a privacy person, or as a data person, as a technologist in a company, is to anticipate and use your common sense and to try,
[20:25] not to forecast so much, but just to think about: given the situation I'm in now, what could be the potential bad outcomes, and what would I do now to prevent that?
[20:34] Rather than thinking of, oh, I need to go check what the rule says or what the law says or something like that. I think in a functioning,
[20:42] agile or aware kind of situation, people should be using the information they have at hand, which is always going to be more than a regulator had when the regulator drafted a regulation in some smoke-filled room loaded with lobbyists or anybody else who was there. I mean, you, sitting at the tip of the spear,
[21:02] so to speak, are always going to have more information than that regulator.
[21:05] And you should feel empowered and enabled to make decisions that are the correct decisions and are preventive decisions.
[21:15] Debbie Reynolds: I agree with all your wise words.
[21:18] I want your thoughts about how AI, or the AI rah-rah thing,
[21:27] in terms of everyone just going gaga over AI,
[21:30] how has that changed or heightened or somehow impacted your approach to privacy when you're working with companies?
[21:41] Because I call it like AI Goo, right? So like AI is on everything now, the slop, right?
[21:48] Bryan Lee: AI slop.
[21:49] Debbie Reynolds: Yeah, it's like on everything. Like I can't go into my email without like a new thing popping up. Hey, we could do this. I was like, I don't want that. So I'm like swatting stuff away like flies.
[21:58] But how does that complicate,
[22:02] or somehow change, the way that you have to talk with companies about their data risk around privacy?
[22:10] Bryan Lee: That's a good question in a couple of ways, I guess.
[22:13] Firstly,
[22:15] I have to be careful not to be too negative. Right. Or become kind of a naysayer all the time.
[22:22] And sometimes with the companies we're working with, we have friends or champions or whatever, and they are quick to adopt our kind of outlook and become the naysayer, and that gets them in trouble in the company, because there is obviously some potential in some of these tools.
[22:37] And you don't want to just
[22:38] throw the baby out with the bathwater: no, you can't do it because of this, that, or the other thing.
[22:42] So I guess what it has changed for us is we lead much more now with the data side of the equation.
[22:50] So companies will come up and they'll say that they have an AI initiative or they're exploring an AI initiative or they would like to do something like that.
[22:57] And then we can immediately turn that around and say, well,
[23:01] unless you're going to just use AI in the wild, you're just going to let people use ChatGPT and you're going to outsource everything to them,
[23:08] which no company wants to do. Because every company wants an AI solution that looks internally, uses whatever IP the company has, harnesses and processes that, and then creates a solution for the outside.
[23:21] So you can say, look, your solution, maybe that's cool, maybe it'll work, it sounds great. But that's all predicated on the data layer, and how good is your data?
[23:30] And then immediately from that question you can always segue to when's the last time you did a data inventory? Right.
[23:37] And the answer is always, huh, I don't know what's out there. And so you can kind of bring it back into that lane, and I try to get it more data focused.
[23:44] And that's what I would say. That's one thing that the AI conversation has done: it's made a lot of people just a lot more data aware, and aware of the holes in their data management systems,
[23:56] and the holes in the privacy-preserving piece, and in some cases even on the cybersecurity side of it.
[24:04] Debbie Reynolds: I want your thoughts here about the data life cycle, because I feel like companies are very good at collecting data. Right.
[24:16] But then once it gets into the organization, it gets split up, it gets duplicated, it goes into back rooms, all these types of things. And so I think companies struggle a lot with the end of life of data, because data should have an end of life.
[24:31] But I want your thoughts there.
[24:33] Bryan Lee: No, you just said it. Right.
[24:35] I have yet to work with a company, big or small,
[24:38] that had a reasonable, workable, and actually in-use data retention program or policy. And we've worked with some big companies, and they've got data retention policies that would just make your eyes water.
[24:53] Debbie Reynolds: Right.
[24:53] Bryan Lee: It's put together by an army of really top notch legal people and it covers all the tax records and everything else.
[25:01] But when it gets to the data part of it, which it should cover,
[25:03] it's just flummoxed. It doesn't know what to do.
[25:06] And so you go back to the company and say, well, part of good data practice is just a data minimization concept.
[25:14] Oh yeah, well, but, huh. And it just gets lost. It's so hard to do. The collection part is too easy, and the disposal part is too hard, and you're trying to elevate the importance of that disposal part. And that's the easiest part to make a case for.
[25:28] Right? Because you can say for data collection, well,
[25:32] maybe if we get this kind of targeted information, we may be able to increase a sale potentially here, there, or elsewhere, but we won't know until we try it. And then we've got the regulatory hamana-hamana.
[25:43] But with disposal you can say, look, you're paying money today to store stuff you don't need. Get rid of that stuff and you don't have to pay the money anymore.
[25:50] Cut and dry, very, very simple. And in some big companies that storage fee is colossal.
[25:55] So it should be an easy sell. But what's hard is making the determination. And it's sort of like cleaning up your house. Right. If you've ever had to downsize,
[26:03] I mean, I got a lot of stuff and we occasionally go through these downsizing spurts and it's a heck of a lot harder to get rid of something than it is never to buy it in the first place.
[26:13] And I try to help companies understand that, but it's always a hard sell. And you're absolutely right. If we put as much effort into the minimization and disposal as we do in the use and the collection side, I think we would just all be better off,
[26:28] right?
[26:28] Debbie Reynolds: Absolutely. I think one of the things that confuses people,
[26:32] and this is what privacy regulation and practices have brought to businesses that they had never had to experience before: they really didn't have any kind of regulatory reason to delete stuff.
[26:51] So most of their data retention policies are like, oh, keep tax records X number of years, keep this for so-and-so many years. It didn't say, delete this after whatever this trigger incident is.
[27:02] Right. And I think sometimes companies are frustrated, because what they want you to say is, like, delete this after 10 years or 5 years. It's like, well, it depends on the purpose, it depends on who's using it.
[27:14] Are there litigation holds? And so it makes it more complex. But just trying to explain to companies the idea that data, especially personal data, should have an end of life, that it should not live forever,
[27:27] You have to have an answer for that. I think that that's just been very jarring for organizations. What do you think?
[27:34] Bryan Lee: I totally agree with you. And you would think that there'd just be this colossal market for some kind of tool that could help you do that. Right. But the tools themselves are as complicated as the old system of just having the policy.
[27:48] So I've been working with corporations who ask, what do you recommend for a tool? You know, there's Purview, or there are these other ones out there.
[27:56] But then they ask, well, okay, how do you use it? And I'm like,
[27:59] you need to hire, like, a full-time person just to do that. Because that thing is complicated, and it's not going to save you the headache of thinking through what you want to do with it, because you've still got to do that.
[28:08] But once it's done, then it's beautiful because then it'll be kind of automated.
[28:12] Yeah, but that's just a hard problem. And it's one of those problems. It's hard and it's tedious and nobody wants to do it.
[28:19] And it doesn't really cause enough pain until they get sued for, you know, something.
[28:24] I mean, we had that.
[28:25] The Trump administration just came in, and DOGE started talking about storage of records, and they went out to,
[28:32] what is it, Iron Mountain, whatever they call that thing in Virginia where they have all the paper records kept in boxes, and everyone's like, that's so ridiculous. And I'm like, yeah, but there are X number of companies in this country;
[28:42] probably every other one has the equivalent in digital storage someplace.
[28:47] And it's just as ridiculous.
[28:49] Debbie Reynolds: That's true. Right. Digitizing it is only one part of the problem.
[28:54] It's not the solution to every part of the problem, actually.
[28:58] It creates more risk. Right. Because now that data can move around more. So. Yeah, exactly.
[29:03] Bryan Lee: Right.
[29:03] I mean, there's this idea, too. You would think that people would understand. I mean, with data, you want to ask, why do you have the data? Well, we're going to train a model.
[29:11] Well, once the model's trained, the data's done.
[29:13] Debbie Reynolds: Right.
[29:13] Bryan Lee: Its life is over. It's done its thing.
[29:15] Right. It's like keeping a turkey carcass around after you had Thanksgiving.
[29:18] It's done. Right. I've eaten it. There's nothing else I can do with that thing.
[29:22] So, I mean, maybe you can make a soup or something, but.
[29:25] Right, but that's the idea. And so talking to companies about these really fundamental, base-layer problems, that would solve so many other things. Right? It's not a shiny tool.
[29:37] You know, AI is the cool thing right now. Let's do that. Let's do agentic AI that can draft an email automatically. I'm like,
[29:43] really?
[29:44] You're gonna spend all this time and effort to write an email?
[29:47] Debbie Reynolds: Right.
[29:47] Bryan Lee: We have too many anyway. What you should be writing out is: meeting canceled. We don't need that meeting anymore.
[29:52] Debbie Reynolds: Exactly, exactly. One of my dear friends,
[29:55] she's a lawyer, she works in the legal tech space. And what she said is that a lot of people think of all the data that they keep as like a garbage dump, but they think there's a diamond ring in there somewhere, so they need to keep it just in case that diamond ring is there and they can get to it.
[30:11] Right?
[30:12] Bryan Lee: Yeah, that's exactly right. Someday.
[30:14] It's like that poor guy who lost his Bitcoin thumb drive, right, that got sent to the city dump, and he's trying to dig it out and find it in the city dump.
[30:23] Yeah. Okay. Good luck.
[30:24] Debbie Reynolds: Yeah. Really? Oh, my goodness. Oh, my goodness.
[30:27] So if it were the world according to you, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior, or technology?
[30:41] Bryan Lee: So we had to do this exercise that I recommend for everybody. When you own a business, right, they tell you, oh, you should come up with:
[30:48] why are you in business? What's your mission? And, well, even before that, you're supposed to have a vision.
[30:52] And so I took it to heart, and I actually sat down and spent some time noodling around on this, and I came up with an idea. And I really,
[30:59] after doing it, I really subscribe to it.
[31:02] And my vision for all of this kind of space is very simply a world where data is used responsibly,
[31:11] period. And that encompasses the technology,
[31:14] the data collection,
[31:15] the use of it, the purpose, all that stuff.
[31:18] Just be responsible about it. And by responsible, what I mean is: would you want your own family to be subject to the outcome or result of the way you're going to use and gather this data, in the whole data lifecycle part of it?
[31:30] And so that's what I would say if I could just get people to think about that first.
[31:34] Just be responsible about it. And it should be a place where irresponsible use of data is treated the same way as irresponsible use of, I don't know, petrochemicals.
[31:44] You don't have to have a bunch of rules and regulations telling people, don't put that gasoline in a glass jar and put it in your trunk.
[31:52] Probably not a good idea. Most people kind of know that, and they can see that. And if they see someone doing it, they'll say, hey, buddy, probably not the best idea.
[31:59] And we should have the same kind of general understanding with the way we use data and the way we treat people's privacy. It should be the same thing. Just be responsible.
[32:07] Be grown up and be responsible about it.
[32:10] Debbie Reynolds: I love that. I love that. When you said that, I was just envisioning how you go to the gas station, and they may have, like, a sign that says flammable, or don't touch this.
[32:20] So it's just like a warning, like, this is the bad thing that can happen if you don't take responsible care. Like, why can't we have that?
[32:29] Bryan Lee: Exactly right. And you don't have to tell most people. Granted, there are people who will make mistakes and whatever, but in general, people know that gasoline is flammable, and you should probably take the cigarette out of your mouth when you're filling your car.
[32:41] You know, that kind of thing.
[32:42] And I would just like to get to a world where you just have that general
[32:46] sense of what's responsible. There'll always be some idiots, and you've got rules and regulations to kind of keep that in line. But overall, just be responsible.
[32:54] Debbie Reynolds: I love that you're so pragmatic. Oh, my goodness.
[33:00] Bryan Lee: Yeah.
[33:02] Debbie Reynolds: Well, thank you so much. It's been so much fun chatting with you and. And send Michelle my best.
[33:08] Bryan Lee: I absolutely will. I'm sure I'm gonna tell her, and she's gonna come up in this rage of jealousy, and she's gonna reach out and say, how come I haven't been on the podcast yet?
[33:16] So be prepared. She's gonna be here, and it'll be fun.
[33:20] Debbie Reynolds: Definitely. Well, I hope that we can find other ways to collaborate. I absolutely adore you and your work and the things that you say. For anyone who has not connected with Bryan on LinkedIn and hasn't seen the things that he posts, you have to.
[33:35] You must. It's just amazing.
[33:38] Bryan Lee: Well, thank you. From you, that's high praise indeed. So thanks, Debbie. This has been really, really great. I sure appreciate the opportunity to talk to you. Always appreciate the opportunity to talk to you.
[33:46] But to do it in kind of a broader audience is something special.
[33:49] Debbie Reynolds: Oh, that's so sweet. Well, we'll definitely talk later. Thank you so much.
[33:54] Bryan Lee: Okay. You bet.
[33:55] Debbie Reynolds: Okay, bye. Bye.