"The Data Diva" Talks Privacy Podcast

The Data Diva E236 - Michael McCracken and Debbie Reynolds

Season 5 Episode 236

Send us a text

Debbie Reynolds, "The Data Diva", talks to Michael McCracken, a Data Privacy Strategist and Operational Expert. We discuss his career trajectory, transitioning from accounting to IT audit, which led him to specialize in data privacy, particularly concerning children's information and neurodivergence. He emphasizes the significance of inclusivity in the privacy sector, arguing that diverse teams enhance the effectiveness of privacy strategies.

The conversation highlights the complexities surrounding data collection, especially regarding minors. Michael argues that businesses should prioritize individual privacy and not rely on exploiting data. Debbie points out the risks associated with unnecessary data collection, which can lead to breaches and erode trust.

They advocate for treating data as a valuable asset requiring careful stewardship rather than a commodity to be exploited, and discuss the importance of passion and advocacy within organizations to bridge communication gaps and enhance collaboration on privacy initiatives.

Michael raises concerns about the inadequacies of current legal frameworks in protecting children's privacy, warning that companies often prioritize legal defenses over accountability. He and Debbie agree that a multifaceted approach is necessary for effective privacy regulation. They emphasize the need for transparency and organizational responsibility in privacy practices, particularly as large companies struggle to adapt to regulatory changes. The discussion also touches on the impact of GDPR on U.S. business practices and the challenges posed by the lack of a cohesive national privacy framework.

In closing, Michael articulates his vision for a unified privacy ethic that prioritizes individual rights and ethical considerations. Debbie echoes this sentiment, advocating that organizations adopt clear ethical frameworks as guiding principles.

Support the show

Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

Now, I have a very special guest today, Michael McCracken. He is a Data Privacy Strategist and Operational Expert. Welcome.

Michael McCracken: Happy, stoked to be here. Absolutely stoked to be here, Debbie.

Debbie Reynolds: We've had some great conversations and chats over LinkedIn.

Michael McCracken: Oh yes. Oh my gosh.

Debbie Reynolds: It's been quite a journey, and so I'm happy that we were able to get this on the schedule so I could talk to you. You have so much energy, passion, and excitement around data and data systems and how to best manage that, not only from a company perspective,

but also a human perspective. So you've had a pretty storied journey in technology. Why don't you tell us how you became a data privacy strategist and operational expert?

Michael McCracken: I will say I did not take what anybody would consider a traditional path, which actually meant sticking to my guns and really developing my own instincts in my career; I stuck to what I felt was the right path for me.

It started with a few years in accounting, fresh out of school and fresh out of the last housing crisis, the financial crisis of 2007-2008; I'm out of school in 2010. Well, okay, everybody's going into accounting, so to speak, or that's how it seems.

So I started to cut my teeth there and realized, okay, I have much more of an affinity for technology. What does that look like, and what's a holistic perspective here? I didn't want to lose any of that experience with financials or business, or being able to understand your basic financial statements or even read an audit report.

And that's always come in handy, even now.

Fast forward: I began to cut my teeth around major ERP systems and a lot of controls, typical SOC audits, so on and so forth, which then actually opened the door to what was emerging at the time, GDPR. With my background experience around accounting, but then also ERP systems, there were a lot of migrations happening at the time, and a lot of that also ends up touching on things like financial currency translation adjustments, which end up flowing through places like Cork and Luxembourg.

So you kind of wouldn't think it at the time, but starting out on the accounting side of things, then translating that and building on that in IT audit and IT advisory, and then into ERPs, actually ended up starting me on that path into privacy and GDPR at the time.

And that brings me to where I am now,

where I like to focus on a lot of things, really. Just how do we embed privacy? How do we approach it in an actionable way, from, like, the user experience?

How are we connecting that to your privacy notice and then your consent statements and so forth?

Debbie Reynolds: I find that people who wind up in this area, and by the way, I love your career path, and we could talk a little bit about that in terms of other people trying to pivot into privacy.

But I think one of the things that I find really interesting as I talk to people, when they find themselves in this area,

they tend to have some type of expertise, whether it's like legal or technical acumen, but then they also have like a passion for humans.

Tell me a little bit about that.

Michael McCracken: Oh, gosh. Well, so for me, it starts with children. It starts with kids. It starts when you learn what things like COPPA are really intending, which I think has been discussed extensively across various podcasts, including yours.

But when you look at this and you look at how a lot of things are kind of reactively crafted and so forth, there's not a proactive measure taken that places the individual first, much less children and minors especially, not to unpack a whole can of worms here.

But when you look at that ed tech space, there's just a whole lot to unpack there. And when we're talking about minors and children, I don't know,

there's a lot of places you could go there, I guess. But for me, you know, having children myself, it's something I hold very close, and they are very easily influenced.

And that leads into so many other aspects when you're talking about apps or mobile apps or otherwise. And so there are all these rabbit holes we can go down, with all the ways children are impacted and how that plays itself out long term, that it's just probably one of the most critical areas where we ultimately need to place the individual unilaterally first,

if for no other reason than for our children. And when I say that, it can be divisive, but I mean it in such a way that there shouldn't be a business or other consideration.

When you're talking about, like, children's privacy, there always seems to be this pushback: well, what about the business? Well, if their business model requires that data,

why does that inherently mean it's a valid business model? Because, to me, I'm of the mindset that you need an actual product to offer,

like an actual tangible good or service to offer. If you don't have that and you need data to offset things in the meantime, do you really have a good business model?

Especially when we're talking children. So that's really, at a foundational level, what it's about for me. And you get into other aspects where we're talking ethics; again, with kids,

business ethics shouldn't even be on the table. It's about ethics, period. So again, you can go down a whole lot of different routes here. But then also, look at where you have things like public-private partnerships, and this is my inner auditor coming out:

when you follow the money, it's got to lend some skepticism. Is it there, on an individual, child-by-child basis? Are the children the unilateral sole focus, without regard for anybody who is invested in it?

Like, that shouldn't even be a factor. And many may want to argue with me there, and that can get divisive. But at the end of the day,

we've seen how well the public sector is at protecting data, so I'll leave it at that.

Debbie Reynolds: Yeah, I agree. And I think one thing that I like to say is that the farther away you get from a business case that benefits the individual, the more challenges or privacy problems you're going to have.

Right? Because the person doesn't know how you're using their data. If the data use doesn't benefit them, a lot of times they're not aware. So that gap in awareness and transparency means you have a huge gap in the way that you're doing business.

And I agree with you. I mean, I think maybe I'm just hopping into another subject here. But a lot of times when we talk about regulation, and a lot of companies obviously don't want privacy regulation, they just don't want regulation in general.

But my thing is, are you going to give up a dollar to make a dime? Right? So if you don't have a good business case, if you don't have a good product or service that you're offering, people aren't going to trust you and they're not going to want to use your product.

So then it doesn't matter about the dime that you made on someone's data when you sold it.

Michael McCracken: I think that's a very matter-of-fact segue to another one, if you will. So we've gone to a very, very sensitive topic with children's data, but let's get very matter of fact.

So many orgs have so much data. Somebody's paying for that; somebody's paying to store that. There is a money trail there somewhere. There is money being made on that excess data.

What is that doing besides creating more to clean up, more to sift through, more to just deal with? Whereas if you go from a privacy-centric or a data-centric perspective, you'll have the purpose over here, you refine those, and that will ultimately drive it.

Okay, well, now we have a model that says we have a defined business purpose, related to our business model, for every data point we're collecting. Practically speaking, it seems like there's a lot less noise.

When you think about it from that perspective too.

Debbie Reynolds: Yeah. I think you save money because you're probably not collecting stuff that you shouldn't have. You probably have a better eye towards, you know, what you need to keep and why.

So you're not burning money just storing stuff. And a lot of the stuff that companies collect becomes risk. Right? Because especially if the data's aged out and it doesn't have that high business value, chances are it's probably stored someplace that isn't as secure and as protected as it should be.

And I think a lot of the data breaches that we're seeing today, and I look at these very closely, a lot of times involve kind of legacy data, stuff that the business didn't really care about anymore.

And no one ever thought to get rid of it or anonymize it or protect it in some other special way. What do you think?

Michael McCracken: So I think you're actually segueing into something I often think about, and that's data as an asset to steward. I don't know if I've heard that somewhere

or if that's just a way that I think about it. But if I think about data as an asset to steward, I've tied it to a purpose, and obviously I can quantify its value in that regard.

So then if you have all this legacy data, all these assets, so to speak, well, are they really generating any value for you at that point? Is it really generating any value?

Can you honestly tell me that? And I think if you view it as an asset to steward, then you're going to be actively managing, monitoring, and having some oversight and a transparent line of sight into every data point you're collecting.

Because you're going to want to know the value that each point has because that's also going to drive, in the event that it's compromised, that's going to drive your risk, that's going to drive the impact.

In my mind, I agree with that.

Debbie Reynolds: I want your thought about career path. So you've had a very interesting career path into privacy, as have I and most people that I talk to and it really isn't a straight path because privacy wasn't even a career many years ago.

Right. So like all of us have kind of fallen into it from different ways. But give me some for the listeners, people who are really interested in this area and they want to pivot into this area.

What advice would you give them?

Michael McCracken: Please do pivot into this area. No, the advice I would give, I'll start with the one caveat, and I think you touched on it: it's really worth it when you're truly invested in it.

I think that's a common thread that runs through the privacy community no matter where you come from. And it's that passion that ultimately brings people over to privacy in general in whatever facet, be it in the legal or marketing or any other industry, operations, technology or otherwise.

And so I think at a core gut check level,

I would say the questions to ask are, do you find yourself drawn to aspects of it? Are there aspects that just kind of really clue you in and you are passionate about that?

Okay, how does that translate to what you're doing now? I've always viewed my career not as stepping stones, but just as holistic blocks being put in place, and also being open to where my career may take me versus where I intentionally planned to go.

So a little bit of openness as well.

I also think there's a very practical way to do this. In a lot of orgs, the folks that I think we, as privacy professionals, personally have a need for are those champions:

they care about it, they're interested, but they don't want to do it full time. That's a perfect place to start for any person, because those are the people. I think we've seen it plenty on LinkedIn and other areas, I can't remember who it was that posted this, I think it's one of the usual suspects: those translators.

We act as translators, but I often think we can only do that when we have those champions we can work with directly, who help us continuously refine the messaging.

Because they're going to know that with their people, given whatever silo it is. Because, as outdated as the management theory is where you have all these silos in an org,

that's typically how a lot of orgs are designed. And so you need those champions, because that's ultimately how you're going to drive the buy-in across the org. So from a practical standpoint, I think there's a way for a lot of stakeholders to get some aspect of privacy, see how it relates to what they're doing, and really drive that within their org.

Maybe they don't want to switch to it full time, but they have an interest in it and want to figure out, how can I do more with this? I do run into that enough, folks who may not be interested in privacy full time, but we have tons of one-off conversations about it.

They're always interested. So I want to empower them. I then want to invite them to be champions. And that's honestly what I would call the open invitation for anybody to be a privacy champion.

And I think also that's an open door for folks who are interested and want to figure out how to kind of ease that transition and test the water.

I think that's another great route,

LinkedIn Learning or Udemy or, I mean, I don't know if I'm allowed to say actual names, but there are just so many; I've used so many different ones. Cybrary is another one that I've personally used quite a bit; that's a great one just to get exposure to a whole lot of different things.

But there are just so many different routes there. I think, again, though, it's that passion, that personal drive or motivation about that niche that you just want to dive more into.

Debbie Reynolds: I love that. I love the way that you put it, because you said a lot of really important things. Probably one of the things that piqued my interest is what you said about this kind of outdated business thinking.

The business training, for me, I like to call it Santa's workshop, where everyone does their one little part and they don't care about the rest. Right? And that's just not real.

That's not the way data works. That's not the way data and organizations work. So being able to have someone who's like a translator, or have people who understand the other parts of the business and how their part plays into it, and having those champions matters, because in privacy, you have to touch and talk to so many different people within the organization.

And you have to use your skill, your ability to communicate and translate with all these different people so that they understand. And then, as you say, I think people who do well at these jobs will find those champions within those different groups so then they can,

you know, communicate with one another and do things that can really help the organization become more mature in privacy.

Michael McCracken: You touched on a great point. It brings to mind those change management folks, those organizational change management folks, those underrated roles that, from my days in internal audit, you often see get cut in a lot of orgs when they're cutting budget.

And yet those are so instrumental for helping drive adoption of anything across an org, be it privacy or otherwise. So I think when you're talking about driving that change, I think that's also where those champions clue in.

And I know personally, when I'm able to work with somebody else who doesn't necessarily think about things the same way. Like, for me personally, I have ADHD, but what that means is that I've got one bucket up here.

But when I can dialogue back and forth with somebody, it actually gets very efficient at arriving at like, what are our deliverables going to be? How are we going to approach this thing?

A lot of times, though, it's super valuable for me to actually have those folks, the change management folks, or project managers or otherwise.

It's great for me to have them because now they can help me tailor whatever it is we're working through, whatever initiative or project or otherwise. Those are the folks who help me tailor the messaging.

Those are the folks who help me translate things in an actionable way. Those are the ones who can actually level set and say, this is realistically probably how long X, Y and Z are going to take.

There are just so many important aspects here. And I often think of them as almost like these augmenters; for me to be able to do my job well,

it's helpful to have those champions and those other people to dialogue with. Because throughout my career I've never really been siloed, I've never really understood how, in a lot of orgs, people don't tend to think across the org or about the implications.

And so I think it's important to open those lines of communication across the org. Again, it's happening through all those various parties and groups and initiatives. Whether you're talking privacy or anything else, there's obviously ever-emerging technology that is all the rage.

How do you drive adoption of all the emerging things? Well, you need those people, those champions, to drive that. Privacy is playing a huge part in things like AI governance, for instance.

So I'll say this a different way. When you look at a lot of the regulations that are coming out across the globe, it is no longer the prescriptive world

that most companies are used to. There is not a prescriptive approach. There's how many, what is it, something like,

I don't know the exact number, so please don't quote me, something like 8 or 10 various AI governance frameworks globally that are emerging now. And there's value in all of those.

There are different approaches, different perspectives there. But how do you approach any of that in these orgs? So I just think there are a lot of competing factors, and you need a lot of these folks to work together to drive things forward that are ultimately dependent on things like privacy and data security and so forth.

But I think it's hard to drive a lot of that, especially if you're not seeing things in a holistic way across an org, because then you'll often have silos all doing the same thing, say, implementing the same tool, yet not doing it in a unified way, and they can't understand why there are hurdles.

Debbie Reynolds: Valid point. So what's happening in the world right now that concerns you as it relates to either privacy or technology?

Michael McCracken: Ooh.

I mean, I can speak as a parent: deepfake technology, that concerns me. Anything around data with kids concerns me. It doesn't help that most of our laws are not equipped to truly address children's privacy in the world that we are in.

I think it's unfortunate; in a lot of respects it is good that the legal system moves slow, but in a lot of other respects it's unfortunate. And now, from a business perspective, you have all these different laws in the US emerging, for instance.

So how do companies keep up if they're not proactively addressing any of that? And that's what concerns me: if the companies can't keep up, are they going to be constantly reactive?

There are going to be issues, there are going to be things that happen, there are going to be major events, and we're constantly going to be in reactive mode. And again, I go back to this as a parent when our kids are involved.

That's just so hard for me to wrap my head around. It's so hard for me that

we have to view things through a lens that puts our children secondary in any respect. That is so hard for me personally.

Debbie Reynolds: Yeah, I guess the thing you pointed out is something I grapple with, and I think about this a lot. The challenge that I have with regulation is that I feel like people think, oh, we have regulation, that'll fix everything.

And we know that that's not true. There's a lot more to data than regulation. And also a lot of the harm that we're talking about can be prevented or should be prevented.

Right. It shouldn't be, oh, well, let's let this bad thing happen to Johnny, and now we're going to have Johnny's Law because this bad thing happened. Right.

So we're in a place where, with digital transformation and the way that companies are using data with AI, things are just rapidly going ahead.

And I feel like there will be no adequate legal redress for some of the harms that we see. So we have to find ways, as you say, to be more proactive about it, whether that's standards or frameworks or, in my view, just business.

Right. So to me, it should be part of your business that you not do things that will harm people.

Right. That should be part of the ethos of your organization, not that you're going to do something, not care what happens, and then go to court afterwards.

So what do you think?

Michael McCracken: Oh, you're unpacking it.

My brain immediately goes to, yeah, but we have shareholder primacy, and at the end of that you can unpack a whole can of worms. But my only response to what you're saying is that the only thing that will determine whether companies have to care is where they're incorporated. And if shareholder primacy is there,

then, beyond shareholders making it happen, companies likely won't have any obligation to do that. I don't mean that to be inflammatory,

and I don't particularly have any concern if it is. I'm just trying to be very matter of fact. We have things like that that people might conflate in a political way.

But when we view it through the lens of protecting children here, I'm sorry, that's secondary. So I'm kind of with you in a lot of respects here.

In a lot of regards, the law, and we've seen this throughout history. Go back: you can see this in the book The Hacker Wars. You can see it in the zine

Phrack. You can see it across any number of other sites. The M.O. has been that companies get to lawyer up before they have to be accountable.

That's been the M.O.: we will lawyer up against, what was it, kids who happened to outsmart major corporations. The route was to lawyer up and prosecute teenagers,

and that has continued to occur. And so there's an aspect here of, well, what can we do? If our hands are effectively tied by the law, and the law can't keep up when it comes to actual protections in the world we're in,

then what routes do we have?

It's something I think about frequently. I don't know what the answer is there without getting into things that we don't have time to unpack today.

I mean, we're getting into ethical considerations. Let's ask the question: why do we have GDPR now? I don't think we need to unpack that here, but we can say that GDPR, from what history says and what is documented, was a direct response by Europe.

That's what I've gathered. I think there was a book, Of Privacy and Power, that goes into it a good bit, about all the ramifications and legal aspects of the transatlantic struggle around privacy.

But it's effectively security and privacy, and the overarching aspect is that ultimately they really are at odds when you're talking at a global scale. So you have those things to consider here, too.

The question I would ask is, can we really say security is absolute?

So then, can we operate under that guise? And we have companies that do operate under that guise, that more data collection will mean more security, when we can see over and over that that is not the case at all.

So I don't know what the answer is. I could speculate a lot; I could unpack a lot more, go down more rabbit trails here. But I also don't necessarily think this is the place to get super divisive or sensitive on a lot of that, just because there's so much information that folks can read on their own in those areas.

But I think there are a lot of competing interests when it comes to privacy. And really, history would also say that it's the privacy activists

who are the ones driving this forward. In any book I've read on privacy, historically, it's globally the privacy activists pushing back, constantly, over and over, in this area.

So I think that's what we can do.

Debbie Reynolds: Yeah, right. I agree with that. And actually, the author of that book, Privacy Is Power, Carissa Véliz, has been on the show.

I don't know, I guess I feel like there has to be a multifaceted approach and I do want to go back to GDPR by the way, so I won't forget that.

I think there does have to be a multifaceted approach, and that's kind of why, when people think regulation is a panacea, like it's the end-all, be-all, I feel it has to be many things, right?

So think about traffic laws, right? Let's say a law is written, a law is legislated, it's put in a book. But then you also have stoplights or stop signs, and you also have training manuals and people who teach you how to drive, right?

So it's this multi-level thing, and then there are also norms in society. So when I think about privacy, I feel like we focus so much on the law in a book, on legislation, and we also need to think about these other multi-level approaches to make it more tangible and more real to people in their everyday lives.

So it's like a muscle I feel. But what do you think?

Michael McCracken: No, I would agree with you on that. The multifaceted approach that you touched on, I think, is key. I think that is the one aspect that does make it difficult. At least here in the US, you have to come at it almost like a lot of the historic campaigns that we've come at things with.

And forgive any old references I make here, but I immediately think of Smokey the Bear, that's just the immediate one that comes to mind, or anything around those collective efforts to instill those values in children and in people and in us as a culture.

Because it's not just the kids. We're not talking indoctrination here; we're talking about allowing people to be informed enough about what is actually happening. And I think really it's about ensuring transparency there, too.

I mean, I think one of the aspects is, how do we get transparent if people can't easily understand, and the burden is then pushed onto the individual by the company?

I mean, I can't tell you how many privacy notices are out there. When you get to the section on children, it's like, what, three sentences, four sentences maybe? And it's effectively an afterthought.

But when you look at the actual regulations, it's like,

well, it's only a matter of time before the AG sees this. Right? I mean, the way they approach it is not the way anybody interpreting the regulations would recommend writing it.

And so you have aspects like that where they don't approach something like that as transparently as they need to. And further, they don't take responsibility.

A lot of orgs don't. And a lot of it can be that they don't necessarily know. That's the aspect: a lot of orgs are so big, with set ways of doing things, that it's like trying to turn the Titanic in a lot of cases.

So that's kind of where my head goes with a lot of that.

Debbie Reynolds: Yeah, I agree with that. What are your thoughts about GDPR? I've been thinking about this. I was on PBS in 2018 when GDPR came out, and it's funny because I still go back to that video and I'm like, hey, this is a big deal.

Even though GDPR isn't a US law, it's going to have a huge impact because we are a global world. We do business across borders. The Internet doesn't really have a border.

Right. And so what we've seen play out has been a lot of that influence, whether that be influence and how laws are made on state level or different other jurisdictions.

But then also, I think the influence has been a lot on business practices. So even for companies who aren't doing business in Europe, let's say you did business with a company who is doing business in Europe, and they say, hey, we want you to align with X,

whatever X is. Right. And so I feel like, because the US hasn't developed our own kind of national framework, we're sort of dancing to the tune of these other countries and other jurisdictions.

And I think it's hard to get out of that, because if we want to do business with companies doing business in these other countries, without us having something we can stake our claim to and say, hey, this is how we're going to do X,

we kind of dance to the tune of these other nations. What are your thoughts?

Michael McCracken: Yes,

I respond like that because that's the point. I think that's my opinion. That's the entire point. And so I found the book here of Privacy and Power, the Transatlantic Struggle Over Freedom and Security.

That book goes into a lot of this and what GDPR was a response to, and so on and so forth. And I think that's the point.

You might hear it said that you shouldn't be able to buy data that you would otherwise need a warrant for.

That's effectively what this boils down to:

no, you cannot use private companies for that.

Governments in the public sector cannot use that if they would otherwise need a warrant for it. They can't use that as a route to avoid the legal routes to obtaining data.

So I think that's a big response to it on a global scale. And then you have that tug of war between privacy and security. In a lot of cases, within the global community, some are pushing for levels of surveillance and others are pushing against that, pushing against more of that happening, and pushing against countries in general being able to

utilize the private sector as a route to, again, circumvent the laws in any number of countries that would otherwise prevent them from obtaining said data or information. So, I guess to go back:

yes, I think that's the entire point. I think the point is to disrupt that. So I don't know if I want to touch on warrant canaries or not, because I think that gets a little too close to the edge.

I don't know your thoughts there, but this is one piece where I don't know if I want to cross that line, because I'm trying to avoid it.

Debbie Reynolds: You don't have to. It's not necessary.

Michael McCracken: Okay, so I'll go back. So as far as a national framework,

I think this is where we can pick up: as far as a national framework,

I don't know how likely it is to happen in a way that would be commensurate with something like GDPR,

quite frankly. Now I will say, you mentioned going back to 2018. I think back then nobody knew: is the US approach to privacy ultimately going to win out, or is the EU approach to privacy going to win out?

Fast forward to now, I think it's clear when you look globally at the influence, and even over into California: GDPR won. I think there's an aspect where now more and more companies are accepting that.

I think up until maybe the past couple of years, two to three years, I don't know, and I'm speculating there, so don't quote me on that, but that's my impression just from things I've seen and folks I've talked to, there has been pushback. But I think it's been progressively lessening, because there are so many emerging laws and regulations, and they are written in a way that, in my opinion,

requires you to be proactive. There's no more asking forgiveness when you have to be proactive. And so now I think companies are coming around to accepting that as well. So if they're already doing that, will the US see a reason to craft our own?

I don't know.

There's other people that could speak to that a lot better than I could.

Debbie Reynolds: Yeah, let's think about the US level for a minute.

When I think about data breach, for example, the state of California was the first state to have a data breach law, and as of 2019, all states have data breach laws.

Right. And so California was very influential.

Even though all states have data breach laws, they're different in every state. And I think we're following the same path with privacy in the US. So I would say even though a lot of companies thought GDPR was a pain, and they felt like they couldn't do business or whatever,

I think they'd be relieved to have some type of reference that was more united, as opposed to divided at a state level.

Michael McCracken: I'd agree with you there.

I think where we're at now, it has matured. There are interpretations of it, there are standard ways of applying it. It is now being referenced by other regulations, or other regulations are being inspired by it.

And if your regulation inspires other regulations, I'd say it's at a mature enough point that a lot of orgs are going to be more conservative, especially more legacy orgs. That was my experience explicitly: they are going to have people in the background working on it.

They just may not have things that are explicitly established. Yes, they're moving in the background, and when they need to do something about it, they will. But in other cases you have smaller and smaller companies trying to use that as a frame of reference.

They don't have those teams of attorneys or risk folks or business operations, or you can insert the title, but effectively those folks across the business who became part of this initiative: we know GDPR is coming, we need to be compliant, going back to

2017, 2018.

But not every company was doing that. I think now it's been established enough, and it's starting to impact enough companies,

that it is the best route to go. It's the best because you start there. If you're already going with the most established, most conservative approach, et cetera, then at that point you're likely going to be okay.

Now, what do we have to respond to on a tactical basis? Or at least that's how I think about it. But a lot of companies don't approach it that way.

A lot will just want to, quote, get compliant. Right? I'm sure you've heard that. I'm sure a lot of our listeners have heard that.

I think in the world of privacy, you don't just get compliant. There's no "we're going to go get compliant." There is not a checklist that can be handed to you prescriptively.

I'd be skeptical of one. Granted, if you've already worked with folks to tailor it to your org, that's one thing. But if it is a prescriptive checklist and it's related to privacy, or even AI governance, I don't want to say it's the wrong approach.

I would just say I would be skeptical of that approach being able to respond to most of this.

But I think there are some aspects we can unpack there with how this has affected smaller orgs, mid-sized orgs, global orgs. I also think there are emerging aspects around

how this is going to affect things longer term from a global aspect, and companies, so to speak, playing nicely.

Debbie Reynolds: Yep, I agree with that 100%. So if it was a world according to you, Michael, and we did everything you said, what would be your wish for privacy anywhere in the world?

Whether that be regulation, human behavior or technology.

Michael McCracken: I would wish for a unified privacy ethic.

I think we see the makings of a lot of that, but I think there has to be a foundational,

"the individual comes first, period" aspect to it, and then you build from there.

If I could wave my magic wand, it would be a universal, unified privacy ethic that is accepted and is focused on people first and foremost.

Debbie Reynolds: That's a good one. I have not heard anyone say it that way, but yeah, I agree. Right. Something ethical where we say this is our ethos, this is our North Star, this is the way we're going to do things and then coalesce around what that idea is for the organization.

Michael McCracken: If I think about it practically, you're taking your principles, but you're making them centered around the individual: protecting the individual and placing the individual's interests first, to the extent that it's not coercive or otherwise harmful to another individual.

And you're focused on all aspects of that individual. Right. You have those principles; now it's how that works itself out practically. And that's ultimately it. People could debate me on the technical definitions around an ethic, principles, and so forth, but I think it's a unified privacy ethic

that drives things, that you can point back to.

Debbie Reynolds: Well, thank you so much for being on the show. This has been great. I love the way that you think, I like the chats we have on LinkedIn, and the excitement and passion you have for this area.

It's tremendous.

Michael McCracken: Thank you for the opportunity. It's so good to be able to catch up with you here and really have a face to face. It's a lot more fun than just messaging back and forth about these fun topics.

So thanks again. I so appreciate it. Debbie, this has been awesome.

Debbie Reynolds: Thank you so much. This is amazing and I'm sure we'll find ways we can collaborate.

Michael McCracken: I hope so. Thanks again.