"The Data Diva" Talks Privacy Podcast

The Data Diva E178 - Carey Lening and Debbie Reynolds

Season 4 Episode 178


 

Debbie Reynolds “The Data Diva” talks to Carey Lening, Privacy Professional and writer of Chronicles of the Constantly Curious (Dublin). We discuss her journey into privacy, highlighting the significance of the GDPR and the shift in attitudes toward data protection. The conversation delves into the limitations of applying a property rights model to personal data, with Carey highlighting the challenges and nuances involved in understanding and regulating its use, as well as the broader implications of data ownership and the varying interpretations of privacy rights. We emphasize the need to recognize the complexities of data protection laws and the conflicting nature of individual rights in a global context, advocating for a shift away from absolutist beliefs toward a more grounded and pragmatic perspective that acknowledges the inherent difficulties and trade-offs in addressing privacy concerns. We also discuss the impact of privacy regulations, focusing on the aftermath of the European rulings against Google Analytics, the need for pragmatism in compliance decisions, and concerns about applying regulations broadly to all companies without considering nuanced differences. The conversation closes with the complexities of cookie laws, data sharing without consent, and Carey’s Data Privacy hopes for the future.


Many thanks to "The Data Diva" Talks Privacy Podcast Sponsor and Privacy Visionary, Smartbox AI, for sponsoring this episode and supporting our podcast. Smartbox.ai, named British AI Company of the Year, provides cutting-edge AI, helps privacy and technology experts uniquely master their Data Request challenges, and makes it easier to comply with Global data protection requirements, FOIA requests, and various US state privacy regulations. Their technology is a game-changer for anyone needing to sift through complex data, find data,  and redact sensitive information. With clients across North America and Europe and a major partnership with Xerox, Smartbox.ai is bringing their data expertise right to our doorstep, offering insights into navigating the complex world of global data laws For more information about Smartbox AI, visit their website at https://www.smartbox.ai. Enjoy the show.
 


41:11

Many thanks to "The Data Diva" Talks Privacy Podcast Sponsor and Privacy Visionary, Smartbox AI, for sponsoring this episode and supporting our podcast. Smartbox.ai, named British AI Company of the Year, provides cutting-edge AI, helps privacy and technology experts uniquely master their Data Request challenges, and makes it easier to comply with Global data protection requirements, FOIA requests, and various US state privacy regulations. Their technology is a game-changer for anyone needing to sift through complex data, find data,  and redact sensitive information. With clients across North America and Europe and a major partnership with Xerox, Smartbox.ai is bringing their data expertise right to our doorstep, offering insights into navigating the complex world of global data laws For more information about Smartbox AI, visit their website at https://www.smartbox.ai. Enjoy the show.

SUMMARY KEYWORDS
people, privacy, data protection, data, cookie, law, thinking, organizations, regulators, absolutism, talk, companies, europe, dealing, problem, google analytics, rights, bad, personal, postal union
SPEAKERS
Debbie Reynolds, Carey Lening

Debbie Reynolds  00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest all the way from Dublin, Carey Lening. She is a wonderful Data Privacy expert, and I really love her writing. She's also the author of a really great newsletter called Chronicles of the Constantly Curious. Welcome.

Carey Lening  00:48
Thank you so much, Debbie. Thank you for having me on. I’m really excited to be here.

Debbie Reynolds  00:53
I'm excited to have you here. Love your writing.

Carey Lening  00:57
Thank you.

Debbie Reynolds  00:57
I love the fact that you do deep dives into things that some people try to cover in soundbite moments and that maybe need a little bit more exposition. You're also a part of the Support Pool of Experts for the European Data Protection Board, which I'm really happy about. But before we get started, I would love for you to talk about your journey and your trajectory into privacy.

Carey Lening  01:26
Sure. I'd like to say I kind of rolled into it by not figuring out what I wanted to do when I grew up. So I started out as a lawyer; I went in originally with this grand notion that I wanted to go help people who were involved in crimes, particularly hacking and tech-related, tech-adjacent crimes, and realized very quickly that wasn't really for me. When I got out of law school, I had notions, as they say here in Ireland. I did stick with tech and IP; privacy or data protection in the United States at the time wasn't really a thing, but I did stick in those adjacent spaces. I practiced for a couple of years and then just got completely and totally fed up and burned out. As I tell everyone, I am now a recovering lawyer, which is a much better place to be, because you learn how to think like a lawyer, which is a very important skill for this sort of nuance-heavy world that we live in, but I don't have to spend all my time doing billable hours and dealing with clients who don't want to pay and other terrible things. So once I left law, I bounced around; I was a journalist for a couple of years, and I worked in Information Security, Competitive Intelligence, Research, and things like that. Then I eventually made it into data protection and privacy. California was an early adopter, as many of us know, of data protection laws or privacy laws, and I started working with organizations, segueing from the security thing into data protection. Then, when I moved to Europe in 2017, when I moved to Ireland, I picked all that up with gusto, because the GDPR had finally come into force and everyone was really happy and excited about data protection. This time, unlike under the earlier manifestation of the data protection laws in the EU, the GDPR had teeth, so everyone was taking it seriously. That's what I've been doing ever since. It's been fun. I do a lot of writing, I do a little bit of evangelism, and I think about very, very hard problems on a regular basis.

Debbie Reynolds  03:51
I love that. I love that we get to have a front-row seat as you talk about some of these concepts that are really fascinating. Being an American who lives in Europe, I think you probably have a very unique perspective. I recently saw, and I think I chimed in a bit on, a post that you had done about the difference between data protection and Data Privacy. I think it's very much a cultural difference. What are your thoughts about that?

Carey Lening  04:23
Yeah, I think it's really interesting. You raised some good points. I had written a recent post about data ownership, whether or not you can own your personal data, and what that means. It was inspired a little bit by Jeff Jockisch's post asking the same questions. You're right, I have a weird perspective, because I have seen both sides and I have intimate familiarity with both ways of thinking. The thing that I have realized is that, fundamentally, it comes down to how you view data protection, and data protection and privacy are different. Privacy is rights around autonomy, the right to be left alone, and rights around control and choice, not just over your data but over your physical personage; it deals with a lot of things around reputation and other rights to be free to do things of your own choosing without interference from others. Data protection has a lot of overlaps with privacy, but it's really focused on how organizations and other people deal with your personal information, your personal data. I think that distinction is really important. The big reason is that the data protection laws are, in many respects, about forcing organizations to treat you like a human being. That's an easy way to think about it: you're not just your data. So that is an important thing to recognize: those data protection laws are designed to smack the Googles and the Facebooks and the other data brokers and everyone else around and say, hey, it's not just numbers on a screen; these are people. The thing that I realized differs, and why I think a lot of US folks in particular tend to conflate those two concepts, is that they don't look at it the way Europeans look at it. Fundamentally, in Europe, data protection and privacy are enshrined as fundamental human rights. It's like the First Amendment in America, the rights of speech and association and religion; we recognize those as rights. America didn't really have that foundational principle in the same way for privacy. Privacy came about as a penumbral right that manifested through court decisions and things like that, with some interpretations of the First, Fourth, and Fourteenth Amendments, but it wasn't a recognized, codified right in that way. What's happened is that a lot of the case law has taken the approach of privacy being a thing grounded in tort, grounded in consumer protections, protecting you against harms, which is important, but not recognizing it as a fundamental, intrinsic right. So it's treated differently in law, and it's treated differently by people. I think that mental framework, the ability to think about it in these two different ways, is what really distinguishes the laws in Europe from the laws in the United States. It makes it fun for some nerd like me, who actually understands both sides of the story, to write about and pontificate on LinkedIn.

Debbie Reynolds  07:51
Perfect. You touched on the post that got us chatting about data protection and Data Privacy, but the initial post you were talking about was around data ownership, which was amazing. Let's talk about that. I think one great point that you made, and it's so funny, because every time someone says this, I shake my head, like, oh God, I don't have time to explain this whole thing: a lot of times, when people try to talk about privacy in the US, they try to articulate it as if it were a property right, which is problematic for many reasons. But tell me your thoughts on this; I thought your summation of this issue was spot on.

Carey Lening  08:36
The problem with property rights as a model, again, goes back to how the case law was developing in the United States; they were trying to glom this onto other things, onto intellectual property, physical property rights, or consumer rights. So you have invasion of privacy as a tort, or reputational rights, or defamation; a lot of these have overlaps with privacy, or now, more likely, data protection. But being grounded in a property right, or in that idea of a consumer protection type thing, means that a lot of people have a very warped idea of how much control and how much possession they have of their personal data. I pointed out in my post that, in many respects, it doesn't make a whole lot of sense to think about it from a property perspective. Unlike typical ownership, when we talk about, say, ownership of land or ownership of a physical good, where what that means in practice is that you have the ability to control what you do with that thing: you get the choice of whether you want to sell it to someone, whether you want to share it with someone, whether you want to destroy it. You can't do that with your biometric data. You can't tell the tax man, as I say, I'm exerting my property rights over my personal data, and I'm not going to share it with you. The IRS will just laugh at you and tell you to go to hell. That understanding makes the property rights argument, the idea of ownership, such a non-existent thing. A lot of people have this idea; I picked on Jeff a little bit here with the post, but Jeff is actually pretty nuanced about it. A lot of folks have a very, very, almost libertarian position about this property right; they think that they really can just consent to things, or that they should have the ability to monetize their data. The way I always look at it is, okay, that might work if you're just talking about Facebook or some service. It becomes increasingly more challenging when you're talking about all the other myriad times when our personal data is being used by others. I use the example of opinions. In GDPR land, we think of personal data very broadly: anything that identifies or can identify a living person, a natural person. That's really broad. That's way broader than most conceptions of PII in the United States. It covers things like inferences, it covers things like opinions, it covers statements made by others about you, because if it's a decision, or some kind of statement being made about you, that can be personal to you, that can identify you. And, unfortunately, or fortunately, probably, depending on what world you live in, you don't have rights to people's thoughts; you don't have rights to inferences that people might make about you. If I decide that I don't like how Sally looks and I don't want her to be my friend, there's personal data about her that I'm making a decision around, but she doesn't have any ownership right to that; she doesn't have any control over that. And the GDPR kind of recognizes that; the GDPR understands that these rights are not absolute. The control model, or the ownership model, sometimes misses that quite a bit. Now, to be fair, I have some counterparts in Europe that also seem to miss that bit as well, so it's not just that I'm picking on the United States here. There are definitely some privacy absolutist folks out in the world, and they're thinking a little bit prescriptively.

Debbie Reynolds  12:18
Yeah, well, you're segueing into a point I definitely want to talk about, which is privacy absolutism, in two ways. One is the thought that, somehow, privacy is an absolute right, which is not true. The other part is kind of the churchiness of privacy, where some people have to come, like Moses, from on high with all these rules, and only they can interpret them, and things like that. So, two different ways to think about absolutism. What are your thoughts?

Carey Lening  12:53
In an earlier post, I talked about GDPR myths that persist. One of those myths was the idea that data protection or privacy is absolute, that you have, again, full control over your data. There are a lot of absolutists out there, and the absolutism sometimes comes from a good place, if that makes sense. There is the ideal versus the real; a lot of people, I think, on the advocacy side kind of live in the ideal. That's good. We do need those people; we do need to move the needle towards our better angels, as they would say. But we also need to be pragmatic about the fact that we live in a world with billions of people, and our rights bump up against one another all the time. You cannot have one person's right be supreme over others', because those are the kinds of totalitarian or evil dystopian nightmare-type scenarios that most of us don't want. I don't think anyone actually wants that. So recognizing that balance, and recognizing how that needs to play out in practice, is really, really important. There's the other problem of absolutism, which comes from just a lack of fundamental understanding about things. I talked in the myths article about the idea of consent being your only legal basis. If you think that all data protection, or all rights to your personal data, should be grounded in consent, you're going to think that you can revoke consent for everything. That's just not real, and the law, at least the GDPR, recognizes that there are boundaries, that there are points where, no, you don't have an absolute right to delete your data; you don't have an absolute right to access everything. That's different from what a lot of controllers or organizations share with people, and they're not necessarily doing it right either, but it's not absolute. These are all balances. For me, I want to move away from the Church-on-High holy data protection beliefs as much as possible, towards something that's more grounded in reality and pragmatism, thinking about the fact that these are hard problems that don't have simple binary solutions.

Debbie Reynolds  15:21
Absolutely. I'll give you an example of something I wanted to talk about: the time when Europe had ruled against Google Analytics. When this ruling came out, there was a rush of people saying, hey, that was illegal use of Google Analytics, so you need to use something different, right? What I saw was a lot of companies not doing anything. I think part of that pragmatism needs to be realizing that companies make choices, whether you like them or not, and that you, as a consumer, can definitely make a choice about which companies you want to support. But just because a ruling came down about a particular application doesn't mean that companies are going to rush to change to something else. What are your thoughts?

Carey Lening  16:09
I totally agree; there's no way to not agree with that, because that is the reality of where everything is. I think there's an interesting discussion there, which is that the companies are not all necessarily bad for not complying. Sometimes it is just not feasible. Sometimes it is just not realistic. And sometimes, to be honest, especially with the Google Analytics cases, the rulings were troubling because they were absolutist; they were just purely absolutist. Yes, in the strict sense, these companies were violating the law, because they were transmitting IP address information in cookies and things like that to the United States. At that time, there was no adequacy arrangement, and the standard contractual clauses were a bit of a mess, and all these other things, because of the Schrems II decision. That's all true. But if you think about it at a logical level, it's also completely absurd to think that the level of risk of Google obtaining your IP address is so bad that it warrants shutting all of that down. I think one area, maybe attention blindness on Europe's part, is that they are so focused on Big Tech, so focused on going after the repeat offenders, that they cast a very wide net in terms of how they apply their regulations. They do it to everyone, regardless of risk, regardless of the situation, regardless of looking at nuance, or the factors that very much distinguish a Google from a small mom-and-pop website that may be using Google Analytics. The small website that's using Google Analytics is not sending information to the NSA. To be perfectly honest, Google's ad campaigns and cookies are also not sending things to the NSA; the way the NSA is getting data is by buying it from data brokers. There was a recent article that just came out about that, right? They're not getting it from Google Analytics. So the idea that we spent all this time focused on that, screaming at small controllers because Max Schrems decided to go on his crusade, was really misplaced. We should have been focusing on much more pressing concerns; we should have been focusing on things like Clearview AI and facial recognition, or the fact that law enforcement is doing all sorts of dodgy things all the time. But they didn't. They focused on the fact that it was Google, and Google bad. That's deeply unfortunate. So I tend to rail a lot in my articles about that problem.

Debbie Reynolds  18:48
Yeah. I want your thoughts on something that has always concerned me: all the cookie laws and cookie legislation and cookie talk, because a cookie, to me, is just a vehicle for something, right? A lot of the laws, and a lot of the people railing against cookies, what they really should have been railing against is this sharing of people's data without consent. What we have now is companies saying, okay, we're going to move away from cookies. But what they didn't move away from was the sharing, which should have been the point, the thing that we should have been focused on, right?

Carey Lening  19:34
The cookie thing is hilarious to me. The law in Europe is the ePrivacy Directive, so it actually has less to do with the GDPR and more to do with a much older law from 2002. Believe it or not, that deals with the whole cookie thing, and cookies are so stupid. The cookie pop-ups make my head hurt; I cry every single time and die a little. The thing is, we got here because a handful of organizations, including the IAB, the Interactive Advertising Bureau, tried to be clever. This could have been very easily solved in the browser. If organizations had accepted Do Not Track, which is what California is actually trying to get done now, I think that would have been the right approach; it should have been a technical solution to a very technical problem. Because cookies do have benefits. There are reasons why the law is as convoluted as it is; especially with recent interpretations, it creates a distinction between strictly necessary, sometimes called essential, cookies and kind of everything else, right? The IAB and entities like Google tried, again, to play fast and loose with that, to play shenanigans and try to get around it. All of that screwing around for the last 15 or so years is why we have endless amounts of cookie banners and pop-ups, and everyone is sick of it. It's not solving any problem. It's not actually protecting anyone's data. Because you're right: the main risk, the threat, is the sharing, not the cookies. What we've done now is load a bunch of obligations onto individuals instead of having a technical solution to a technical problem. We've made it a thing where we have to see the cookie banner pop up and click OK, OK, OK. Or, if you're more diligent, you go and try to find the reject button, and woe betide you if you cannot find the reject button; then you just have to silently gripe to yourself. The cookie thing is, in some respects, a law problem. But it's also, in many respects, an industry-trying-to-be-clever problem. If they had spent less time trying to be clever, we would not be having this debate now in 2024. So those are my thoughts on cookies.
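To make the browser-level approach Carey describes concrete, here is a minimal sketch of what honoring an opt-out signal server-side could look like. The header names are real (DNT for Do Not Track, and Sec-GPC for Global Privacy Control, the signal California regulators now expect businesses to honor), but the server, cookie names, and logic are hypothetical, for illustration only.

```typescript
// Minimal sketch: honoring DNT / Global Privacy Control before setting
// non-essential cookies. Everything except the header names is illustrative.
import { createServer, IncomingMessage, ServerResponse } from "http";

// A browser signals an opt-out with "DNT: 1" or "Sec-GPC: 1".
// Node lowercases incoming header names, hence the lowercase keys.
function userOptedOut(req: IncomingMessage): boolean {
  return req.headers["dnt"] === "1" || req.headers["sec-gpc"] === "1";
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  // Strictly necessary ("essential") cookies are fine either way.
  const cookies = ["session=abc123; HttpOnly; Secure"];

  // Analytics/tracking cookies are set only when no opt-out signal is present.
  if (!userOptedOut(req)) {
    cookies.push("analytics_id=xyz789; Secure");
  }

  res.setHeader("Set-Cookie", cookies);
  res.end(userOptedOut(req) ? "Tracking disabled.\n" : "Tracking enabled.\n");
});

server.listen(8080);
```

Handled this way, the opt-out is expressed once, in the browser, and applies everywhere, which is the point being made here: no banner required.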

Debbie Reynolds  21:54
I agree wholeheartedly. What's happening in the world right now that concerns you most as it relates to privacy?

Carey Lening  22:02
I think the thing that concerns me is that we, and by we I mean both regulators and lawmakers, but also even the pundit class, like myself and others, are not really thinking about what the big picture is and what it's likely to be. We are very focused on the shiny objects. Right now, it's AI, or facial recognition, or cookies, or whatever the new shiny is at the moment. Next week, it'll probably be quantum computing, or the Apple Vision Pro will scare everyone into submission again about VR; who knows. We focus on the shiny objects, and we don't look at the larger picture; we don't look at how our responses to putting out what seems like an endless number of fires caused by bad behavior are going to have consequences down the line. I've talked in a couple of articles at this point about the idea of fractal complexity, and complexity theory in general. I think most people don't look at problems that way. Complexity theory says, look, you frequently do not have a single-answer solution to a problem. A problem is based on many, many different failure points, many different issues, many different complications, and many different factors. You'll find that if you dig into a problem long enough, you start to see, okay, this choice was made by this organization to do this thing to achieve this goal. That goal might have been good; it might have been a very noble, reasonable goal. Then that choice influenced other subsequent choices, and those subsequent choices may have negative downstream effects that no one was anticipating. No one was thinking far enough through to realize, okay, this might be a problem. AI is a really good example here. I think that, fundamentally, the folks who were working on generative large language models were thinking of the benefits to society. They weren't thinking about the harms nearly as much as they probably should have, and they certainly weren't thinking about the harms that were more likely. There are a lot of accelerationists and AI people who are more than happy to think about whether or not the robots will turn us into paperclips, but not so many people thinking about, oh, hey, how do we deal with this copyright thing? Or how are we going to deal with this deepfakes issue? They weren't thinking about that at the design stage, so we now have the consequences of that. That's a big problem, and I see it frequently in how we are trying to legislate solutions to problems. The EU has come out with, God, six different laws or something over the past couple of years, all related to the constellation of dealing with data. Probably more like ten laws at this point. They keep coming up with new laws all the time, with very big fanfare. Everyone talks about the new law, but no one actually really reads the law, or few people read the law. The bigger thing is that very, very few actually look at how one law impacts the others. And that's just in Europe. You start multiplying that by the fact that we live in a globally connected world. We have China's laws, we have America's laws, and we have the individual State laws within America that are going to be different from the Federal laws. Then we have Brazil, and we have India, and it just keeps going. Those laws are not well integrated. They all have different opinions and beliefs about what, say, personal data or personal information is.
They all have different thoughts about what is legally acceptable, what is permissible to do with personal data, and whose rights are paramount. They all have different interpretations of what obligations fall on individual organizations. The problem is there's a lot of conflict between them: whose law takes precedence and priority, those kinds of questions. I'm worried they will cause everyone to just give up, because how do you comply with conditions that are mutually exclusive? It's not really possible. There's a term I take from computer science, from the world of code: when you write some piece of software and you have to add new pieces of code to it, and you're not necessarily looking at the old code, not necessarily thinking about how your new code affects your old code, and you're adding workarounds to solve a problem immediately instead of thinking bigger picture about the larger problems you're trying to solve, that creates something called code debt. Well, the law is the same. There's a lot of legal code debt floating around right now, and I worry that's just going to get worse.
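For anyone who hasn't run into the software concept being borrowed here, a tiny hypothetical sketch (not from any real codebase) of how code debt accumulates, with each new rule bolted on as another special case that nobody reconciles with the ones before it:

```typescript
// Hypothetical sketch of "code debt": each new legal requirement gets
// patched in as one more branch, and nobody reconciles the branches.
interface DataRecord {
  region: string;      // e.g., "EU", "CA", "BR"
  hasConsent: boolean;
  isAnonymized: boolean;
}

function mayProcess(r: DataRecord): boolean {
  if (r.region === "EU" && !r.hasConsent) return false;          // original rule
  if (r.region === "CA" && !r.hasConsent) return r.isAnonymized; // workaround #1
  if (r.region === "BR" && !r.hasConsent) return false;          // workaround #2, copy-pasted
  // Workarounds #3, #4, ... each new law adds another branch. Conflicts
  // between branches, and regions nobody wrote a branch for, go unnoticed.
  // The accumulated, unreconciled patchwork is the debt.
  return true;
}
```

Each fix is locally reasonable, but the function as a whole no longer expresses any coherent policy; that is the debt Carey sees accumulating in the law itself.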

Debbie Reynolds  27:03
I agree with your concern. I have a lot of concerns in that area, but one thing that really sticks out to me, and I think we saw part of this in some of these cookie laws, is that the way some of these laws are written, the people writing them don't really understand the technology.

Carey Lening  27:16
Oh yeah, that's a huge problem. There's a big disconnect between technologists understanding the law and legislators and regulators understanding the tech, and they do not communicate in the same language at all. That's a big problem. So I think we need more folks who are fluent in both. I'm not just talking about people posting that they're AI experts on LinkedIn; I'm talking about folks who actually fundamentally understand the limits of what the technology can and can't do. AI is not an all-seeing, omnipotent thing that can do whatever you want all the time; there are limits, there are things it can do really well and things it can't, but a lot of people don't think like that. Then, similarly, a lot of technologists, and I know because I've worked with many and I am married to one, expect a binary answer; they want the law to be certain. Because with computers you can, at least in context, get a binary answer, a zero or a one. Not everywhere, not all the time, but you can get a little bit more certainty. In a lot of things in the law, especially the law lately, dealing with these dynamic problems, this fractal complexity, you're really not getting that; you're getting a lot of nuance. You're getting a lot of, well, it depends on this situation or that, and engineers hate that. It's never going to be a happy world until we start getting more folks who can translate between those two universes and try to explain and smooth over the uncertainty and the problems that exist there.

Debbie Reynolds  28:53
I agree with that. I want your thoughts about complexity; you brought it up a bit, and I want to talk about complexity as it relates to the enterprise. I feel like a lot of companies are run, and maybe this is the way they used to teach MBAs how to run a company, like Santa's workshop, where everybody has their own little thing that they do, and then at the end, some magic toy comes out. But I feel like the complexity of computing is such, especially around the way companies handle data, that if you're thinking about it in terms of a Santa's-workshop type of deal, where someone has their blinders on and they only do one thing, they don't understand all the different ways that data is used in the organization. I think companies like that are going to be in big trouble. What are your thoughts?

Carey Lening  29:02
Okay. If that's true, we're all in trouble, because I don't think anyone has a perfect understanding of how they use data. I think there are a lot of people who think they do. But it turns out this sort of thing is really tricky unless you have a very small organization, and by small, I mean less than a dozen people, doing some sort of widget manufacturing, where you're not really dealing with people so much as with other entities, selling widgets or physical goods. If you're doing anything online, you're probably going to be processing people's data, and understanding exactly how that works, exactly how the systems that you use and rely on are taking that data and using it, how your cloud storage, or your cloud services, or your website, or the billing program that you're using to invoice customers, or your HR program, or your third parties that are providing you services, how all those organizations are using that data in those specific ways, is just insane. It's wild; it's impossible. I think that we all have to, in some respects, be okay with that. Because we're human, and we're not perfect things. We're not all-knowing, all-seeing, omnipotent. I've been watching Star Trek lately, so: we're not Data from Star Trek; we don't have an immediate connection to all the things and all the data. Being okay with that uncertainty needs to become something that everyone is comfortable with, to a certain extent. That is not the same thing as saying we should just be okay with companies doing abysmally terrible things with our data; we need to look at it from a risk perspective. The GDPR did one thing really, really well, or at least it tried: it took the position that we cannot, in fact, know everything all the time. We cannot, in fact, solve all the problems or be 100% compliant. What we can do is take a risk-based approach to looking at a problem. That risk-based approach says, hey, look, if you're doing stuff that's creepy, invasive, and really affecting people, stuff that could have a profound impact on their lives, if you're making assessments about whether or not a person should get a loan, or whether or not a person should be denied health care, or using sensitive characteristics about them to make a profile so that you can sell it to data brokers, yeah, you should have to think more critically; you should have to do more work to protect those people's data, because that's going to have an impact. I was reading about the 23andMe breach, for instance. 23andMe should absolutely get nailed to the wall, because they were dealing with extremely sensitive genetic information about millions of people. It wasn't just the roughly 14,000 accounts that had passwords breached; it was the 6.9 million relations of those people that also got impacted. 23andMe just didn't think about risks; they didn't think about impacts, or at least they didn't think enough about them. I'm sure someone on their compliance team thought about some of these things. But they should get nailed to the wall.
Because that has an impact on people. You can't change your genetic code; you can change your email. Impact needs to be a consideration, and regulators really shouldn't treat every organization the same: an organization whose privacy notice isn't 100% perfect shouldn't be getting nailed to the wall in the same way that 23andMe should be getting nailed to the wall. We should be looking at things like cookies in a more proportionate light relative to their impact on individuals. I think we should refocus and prioritize a little bit more, and some regulators are better at this than others; some of them actually do look at high impacts and risks when they're making decisions. Others go for the low-hanging fruit. Others look at, well, this organization gets a thousand complaints because they have shitty customer service, and people like to complain, so people use data protection as a mechanism to get a regulator's attention. Sometimes that's grounded in a reasonable bit of frustration. But sometimes the regulators look at it and think, oh, well, this is a clear violation, so this is easy, let's just go after that, and they prioritize those kinds of efforts. It's really unfortunate, because they're not actually protecting people. They're just racking up fines; they're just going after an organization for superficial reasons. I don't think that's a particularly productive use of time. I would much rather see regulators actually spending their time going after the worst offenders in the world, for the things that are actually the worst offenses. I would have loved to have seen more action on the Cambridge Analytica thing, a more concrete response to the Cambridge Analytica thing. We had the decision in the United States, but not really so much in Europe; Meta has been getting beaten down over things that are, in my opinion, much smaller and less impactful to individuals' rights and freedoms, and that's unfortunate.

Debbie Reynolds  34:56
So if it were the world according to you, Carey, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be law, technology, or human behavior?

Carey Lening  35:18
If I could just get people to stop being crappy to each other, that would be great. But in a more pragmatic sense, I would love it if we started to develop laws along the lines of global standards and global agreements, done more like treaties than individual nation-state laws, at least for these very global enterprises. My legal code debt article really goes into this in a lot of depth. But I really do see this becoming a thing that will cripple a lot of organizations, a lot of industries, or worse; it will lead to people just giving up and throwing up their hands, and to the loss of privacy and data protection, because you can't comply with all of these conflicting things. I would love to see all the laws stitched together, where we try to make sense of definitions and terms of art and things, so we all agree on a common standard. I joked in one of my articles about something I found out when my mom sent a package to me for Christmas. I always tell my mom never to send packages to Ireland, because it's a disaster, and this time it was absolutely a disaster. The package she sent sat in Jamaica for, like, a month. No one knew where it was. I was deeply frustrated, because Ireland didn't know where it was, the United States didn't know where it was; no one knew where anything was. I was on the phone with a customer service person for An Post, which is the Irish post office, and he says, oh, well, we need to get the Postal Union involved. I'm like, the what? Postal Union? I thought he was blowing smoke at me. It turns out that there is a Universal Postal Union that was formed back in 1874. It's one of the oldest intergovernmental entities in existence. What was fascinating about that is that the Postal Union evolved to solve a global problem. It was designed to deal with the fact that tracking, and getting payments for tariffs and all these other things for mail, was really hard when you were shipping it across countries and continents. So countries got together and said, all right, we need to set up a common set of standards, a common set of rules, baselines of what we want. I would love to see privacy and data protection go in that direction. I would love to see the United States get on board and actually recognize human rights, for one, including a right to data protection and privacy. I would love to see countries start to develop common standards; the GDPR is a good start for a lot of things, and there are parts of US laws that are really, really good starts for other things. I would love to see people come together and actually come up with something that people could follow, a baseline, a common framework; the UPU of data protection would be really nice, because I think then we would actually start to see behavioral change, because it would be discernible, and it could be something that's actually achievable. Right now, it's just a mess; I want it to be a little bit clearer for everyone. So that's what I would do if I were queen for a day, for data protection.

Debbie Reynolds  38:36
I love that. That's something I wish for too. Before the GDPR came out, I thought, maybe foolishly, that someone in The Hague would decide they wanted to do something internationally about data protection, and that never happened.

Carey Lening  38:53
There was, like, a convention way back; there were efforts to get human rights in as a concept, and I think that's a foundational approach. You really do need to change the narrative on that, because otherwise we're not going to get anywhere. I agree, there were efforts, but I think it's just really hard to get countries to support this sort of thing. It's all political now, even worse than it was in the 50s and 60s and 70s.

Debbie Reynolds  39:18
Yeah, well, we'll move forward and definitely keep fighting, as this is a worthwhile endeavor for sure.

Carey Lening  39:25
Hope so. Or I'll just go crazy, and then I'll be gibbering to myself with my cats. It'll be great.

Debbie Reynolds  39:34
Oh wow. Well, thank you so much for joining me. I really appreciate it. You're always illuminating. Definitely, everyone, check out Carey's Chronicles of the Constantly Curious on Substack. It's wonderful.

Carey Lening  39:47
Thank you so much, Debbie. I appreciate you having me on. I loved having this conversation. It's always fun.

Debbie Reynolds  39:54
Thank you so much. I'll talk to you soon.