"The Data Diva" Talks Privacy Podcast

The Data Diva E219 - Robert Bateman and Debbie Reynolds

Season 5 Episode 219


Debbie Reynolds, “The Data Diva,” talks to Robert Bateman, Owner of KnowData Ltd in the United Kingdom, for an in-depth discussion on the evolving complexities of data privacy regulations worldwide. Robert begins by sharing his privacy journey, starting during his law studies when he began analyzing the GDPR and CCPA. His experience writing extensively about these topics laid the foundation for his role as a leading voice in the privacy community.

The conversation highlights the unique challenges of the U.S. privacy landscape, where fragmented state laws and federal legislative debates create an unpredictable regulatory environment. Robert and Debbie also tackle the intricacies of international data transfers, particularly the tension between the EU and the U.S., as seen in the Uber fine controversy. They explore the confusing regulatory landscape caused by conflicting interpretations of data transfer mechanisms and the far-reaching implications for businesses.

Additionally, the discussion delves into the influence of global standards organizations such as IEEE, ISO, and NIST on shaping privacy and safety frameworks. Robert and Debbie discuss how emerging technologies like AI further complicate privacy matters and stress the importance of responsible AI decision-making, emphasizing the need to retain human oversight in critical processes.

Looking toward the future, Robert advocates for formally recognizing privacy as a fundamental human right globally. He highlights the progress as businesses increasingly take privacy compliance seriously, even amidst the challenges. Debbie adds her insights on the evolving regulatory landscape, predicting heightened activity and complexity in privacy law as new technologies and regulations emerge. Robert shares his hope for Data Privacy in the future.


[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:26] Now I have a very special guest all the way from the United Kingdom, Robert Bateman. He is the owner of KnowData Ltd. Welcome.

[00:35] Robert Bateman: Great to be here, Debbie.

[00:37] Debbie Reynolds: Well, I've known you over the years. We've gotten to know each other over LinkedIn over a number of years because of a lot of your great writing and advocacy around privacy.

[00:48] You do a lot of deep dives. I love the videos that you do, the presentations that you do. I actually had a chance to meet you in person in London at one of the Global World Forum events.

[01:01] I think. I don't know if it was Risk. I can't remember which one it was. But anyway, that was fun to meet you in person and I love the fact that, you know, you are a LinkedIn top voice.

[01:11] You do really put out a lot of content. That's very helpful. And I always make sure that when I'm talking to people and they say, who should I follow? Like, definitely follow Robert.

[01:21] Like, he brings, brings a lot of knowledge and a lot of information to the space. But welcome to the show.

[01:28] Robert Bateman: Thank you, Debbie. That's so nice to hear. And yeah, I recall we met quite briefly in the audience at a London event.

[01:37] One of those put on by my old employer; it could have been PrivSec or Risk. So that was nice. And yeah, I've been following you on LinkedIn since I started thinking about this stuff, really, and very much appreciate your work too.

[01:52] Debbie Reynolds: Excellent, excellent.

[01:54] Well, you are very prolific in what you write and the things that you present on privacy, and you do a lot of writing with, you know, some other companies as well.

[02:06] And you know, I just want to get your thoughts on, first of all, how did you get into this? How did you become such a champion for privacy?

[02:16] Why is this important to you?

[02:18] Robert Bateman: Well, it started when I was doing a law degree and I guess it would have been 2017.

[02:26] I started during the break.

[02:29] I wanted a kind of side hustle. So I started writing for a privacy service provider about the GDPR, which was quite new at the time, and the CCPA, which was also very new at the time; that was 2018, of course.

[02:47] And I used my knowledge from the course, you know, law, and found it very easy to write about this topic, and found it quite fascinating.

[03:00] So they wanted articles about every single aspect of the GDPR. You know, there was even a kind of article-by-article breakdown of the whole law. And just by doing that, I sort of embedded this knowledge about the field in my head.

[03:15] I then did research projects, as part of the law degree, about the UK's Data Protection Act. There was a provision called the immigration exemption, which allowed the government and private companies to refuse subject access requests on very broad grounds of sort of maintaining immigration control or something like that, which I felt was very problematic under human rights law.

[03:45] So that kind of helped me appreciate how important this stuff can be.

[03:50] And since then I have done lots of little jobs and lots of writing.

[03:57] Then I moved into training and I was working with this events company that you mentioned earlier, putting on events about data protection, meeting lots and lots of people from across the industry.

[04:08] Now I focus mostly on consultancy and also training.

[04:12] So yeah, that's, that's how it started and where I'm at now.

[04:17] Debbie Reynolds: Well, you do a great job. You do a great job of really reaching out and being able to share your knowledge with people. I think one of the things I admire about you as well is that I feel as though not a lot of people who are outside the US really understand the craziness of the way law works in the US, and I think that you understand that really well.

[04:42] Robert Bateman: Yeah, that has been a real interest for me over the past few years. Less so right at this moment because I'm mostly advising clients and it's slightly tricky to do that, you know, in certain areas of U.S.

law. But the U.S. privacy scene is endlessly fascinating and, you know, developing so quickly and so dramatically. I think, what are we at now? 21 states with a comprehensive privacy law, and lots of enforcement from the FTC, much more active than many regulators here on this side of the Atlantic.

And of course the sectoral considerations; it's really, really messy, with overlapping obligations and exceptions and carve-outs, and the various obligations from different parts of the law, federal law, state law.

[05:41] It really gives me a lot to think about and lots of new developments to follow. The EU privacy law, data protection law can go around in circles a little bit.

[05:52] You know, five-year-long debates about the meaning of a particular word in a particular paragraph. Whereas in the US you haven't quite got to that stage yet, and the law is still developing in some really interesting ways.

[06:05] Debbie Reynolds: Yeah, the US is very confusing. It's a very confusing privacy landscape, you know, especially because we have, you know, federal rights, we have states' rights, and those things are very incongruent.

[06:20] And then we have these pushes that have been happening that probably will continue even more forcefully to try to get some type of legislation on the federal level that tries to preempt some state laws, even though not all state laws can be preempted in that way.

[06:37] So I think the next few years will be very interesting, to say the least.

[06:43] Robert Bateman: In the US, you pointed something out to me about preemption of state laws, I think a few months ago, and the fact that, well, I kind of knew this in principle, but the American Privacy Rights Act, how it would preempt some parts of, for example, BIPA in Illinois and not others.

[07:03] And it's just, you can see why I think privacy law is more so the domain of lawyers in the US than in Europe. I think there are other reasons for that too.

[07:19] It's so hard to grasp these subtleties without having some good legal understanding of the different levels of power and federal, state and so on the way litigation comes into it too.

[07:34] And also it's kind of a little bit more tenuous, because you have these cases that can knock down a law on federal constitutional grounds, like is happening with the age-appropriate design code in California, where, what are they called, NetChoice or something like that, are bringing a case saying that the law itself is unconstitutional, or certain provisions of it are.

[08:01] And so I always wonder about all these new state laws. Is it kind of a house of cards waiting to be blown down, which keeps it kind of dynamic and interesting as well, if very confusing and uncertain.

[08:14] Debbie Reynolds: Yeah. And I think one of the things that we lack here in the US is we don't really have a body that can work on privacy that is not tied to elections.

Right, Right. So, you know, like I wish we had something like the EDPB, like a board, you know, it's like, okay, this is the board and they're working on privacy, and they're not trying to get elected or afraid that they're not going to get re-elected.

[08:45] So a lot of the push and pull that happens in the US around privacy and laws happen around like who's in charge or who's going to be in charge or who's not going to be in charge.

[08:58] Right. So even, even the most recent push in the US to try to get a law out on a federal level, you know, one of the senators that was pushing the law, she was retiring, so she thought, well, I could do whatever.

[09:14] Right. Because I don't have to worry about trying to get elected. And so, you know, I think that tension is always there in the U.S.

[09:21] Robert Bateman: Yeah, on many levels, too. And I guess the FTC has some degree of independence, but Lina Khan won't be there forever, and she has changed the face of that agency so much, it could be really dramatic when her post comes to an end.

[09:40] Of course, then you've got the Supreme Court, too. We don't know what they're going to do with various rulings. And what was the case a couple of weeks ago, Chevron, which talked about whether courts have to follow regulatory interpretations of laws.

[09:56] Those kinds of things are really easy to miss, but have huge implications for the everyday work of privacy professionals. I suppose it's just very difficult to predict how things are going to go and how these changes will manifest, particularly in the US, I think.

[10:14] Debbie Reynolds: Yeah. So what's happening in the world right now that's concerning you most about privacy?

[10:22] Robert Bateman: Well, we're speaking on the day that, I think, it's official now, the election of Donald Trump. So I was thinking about trying to write something about what could happen with AI, privacy, kind of digital policy under him.

It's just so hard to say. Again, I mean, I saw an article a few months ago by someone talking about the EU-US Data Privacy Framework, the data transfer mechanism, and how Trump's re-election could have implications for that.

[11:01] Because of course, it's all based on the rule of law. If we take things seriously that he said during campaigning, then maybe there are some concerns around, you know, the kind of robustness of democratic institutions in the US that could have implications for that adequacy decision.

[11:22] That's the kind of first thing that came to my mind when you asked, because it's so relevant as we're speaking, perhaps not as people are listening to this, but the other developments, I guess this side of the pond.

The UK has just begun its second attempt, or third attempt technically, at reforming its data protection and privacy framework. The EDPB that you mentioned earlier has been very, very busy bringing very strict interpretations of the GDPR and the ePrivacy Directive.

[11:59] The AI Act is starting to kick in in the EU, so there's no shortage of things to discuss.

[12:08] I can't really pick one as the thing in which I'm most interested. I do this kind of newsletter for a client every other week, and it's really a matter of choosing what not to write about, because I never struggle for topics.

[12:23] There's so much going on.

[12:24] Debbie Reynolds: Yeah, I guess if I had a crystal ball, what I think the implications of the election will be is that they'll probably throw up a federal bill, but it would be like a really wafer-thin bill.

[12:41] And because they have the Senate, they may be able to vote in a bill that's like very weak and very thin so that they can do at least some level of preemption because that's what a lot of the corporate people have wanted for a while.

[12:55] But again, because states are a bit different, I think that there are still going to be a lot of states that are going to pass their own laws and they're going to probably start to pass more of those general applicability types of laws that cannot be preempted.

[13:10] So that's kind of what I think is going to happen with privacy. So I think it's going to get a lot more crazy.

[13:16] Robert Bateman: Interesting. So they've got the Senate, and they could introduce a new bill to replace the APRA, you think, Debbie?

[13:29] Debbie Reynolds: Oh yeah.

[13:30] Robert Bateman: Doing something that pays lip service to the idea of a federal privacy law.

[13:36] I don't know whether that would be better or worse than nothing at all. Some people say that's what the APRA was, you know, the American Privacy Rights Act. I think that was a little unfair.

I mean, it was quite, it is quite a meaningful bill. But if you, as in the US, pass a law that looks like a federal privacy law but doesn't have much effect, then I suppose some people might consider the debate closed for a couple of decades.

[14:06] So it might actually not represent much progress legally.

[14:12] Debbie Reynolds: I think what I'm seeing, and I want your thoughts about this, I think it's really interesting. So I think just like the GDPR was very influential in the US on privacy, I think the EU Artificial Intelligence Act will be similarly influential.

[14:29] You know, I think we're still seeing that on the state level, and I think that will continue to happen. Also, there are some regulations happening in Europe around things like the Internet of Things, about labeling and how those device companies need to share information about data. And because of international trade, and because we don't have anything like that in the U.S.,

[14:55] I think that's going to push companies to try to align with some of those things even if there isn't regulation in the US, because it's going to be a barrier to commerce, I believe.

[15:08] Robert Bateman: Yeah, I think you could probably be right there. I mean, the U.S. actually managed to, well, Biden managed to pass the AI executive order before the EU finalized the AI Act, and of course it's not as significant or as difficult to implement as the AI Act in the EU.

[15:30] But that was interesting that they were really quite proactive on AI.

[15:35] And of course I wouldn't bother trying to predict what Trump would do in this area. I think he has given mixed messages about AI regulation.

[15:47] But the point about international trade, you might be right. I mean, the legislation could catch up with changing business practices, for example the one you referenced there. I think I get these ones confused.

But maybe the Data Act or the Data Governance Act, one of the two, which applies to Internet of Things providers and forces them to share data in particular ways, including U.S.

[16:15] products on the EU market. It might not be worth U.S. providers kind of designing EU-market-specific products, especially when it's so core to the product itself. You know, with the GDPR I suppose you can do a bit of tweaking to ensure people have access to the data and can delete it and so on, based on where they are, but this is something you really do have to bake into the product itself, as applies with the AI Act and the Data Act and so on.

[16:48] Perhaps it will change company practices so much that they say, well, we might as well legislate for this now, because people are doing it anyway and it's not harming their companies.

[17:00] We could have these standards in the US as well.

[17:03] Maybe you're right.

[17:04] Debbie Reynolds: Yeah. I think the next battleground, in my view, will be around standards. So whether that be IEEE or ISO, you know, these organizations are very active in the standards space, especially as we have more products that are collecting data.

So, you know, even NIST, I think that guidance will be very influential on how we kind of move forward in the future as we're using more things that are Internet-connected or things that are collecting personal data.

[17:37] You know, I think that guidance, because it's not in the political platform, and you know these projects go on for many years, I think hopefully there'll be more kind of privacy by design, or, as I like to say, safety by design.

[17:52] Things that happen where, you know, there'll be an impetus to move in those directions, because again, it makes trade better. It may make people trust what companies are doing with these things.

[18:08] And these standards organizations don't really wax and wane with who's in office at the time.

[18:16] Robert Bateman: Yeah. And we do have several references; off the top of my head, I think it's Tennessee, in the Tennessee Information Protection Act, and also the Colorado AI Act, and also that Biden executive order I mentioned.

They cross-reference some of these voluntary standards to kind of give organizations a safe harbor of some sort if they are accredited under them or they comply with them.

So maybe that could be, yeah, I think you're right, that could be increasingly important going forward as companies look for ways to kind of harmonize their operations on both sides of the Atlantic.

And it's interesting to see these standards referred to specifically. It's always as an example, like an equivalent to ISO 27-whatever. But they do give companies an impetus to comply with those standards.

And they are. You know, I've seen a few companies, a few clients recently going for ISO certification, and with a good certification provider, you know, suitably rigorous,

[19:24] it really is very meaningful. You know, it can really bring an organization's standards up. I used to think of it as a bit of a checkbox kind of thing, but it can be a very detailed and rigorous process and do a lot of good.

[19:37] So maybe that is a good way forward for companies in jurisdictions where they don't have direct legal obligations to do certain things.

[19:47] Debbie Reynolds: Yeah. What are your thoughts about data transfers? Whether they be in the US, I don't know, state to state, but like the international data transfers; this is an area that you write about a lot.

[19:59] This is some of your most interesting writing to me because I feel like that's probably the most complex part of privacy. You know, how to square the different laws and different jurisdictions and how you share data across.

[20:13] So I want your thoughts on that.

[20:15] Robert Bateman: Yeah, it's a really interesting area for me and I do find it very frustrating at the same time.

But I find it hard to settle on a good kind of policy position or opinion as to what we should do. Because, well, obviously in the EU we had the Schrems cases, and we had the very strict interpretation of those cases by the EDPB, which meant that for a period of time, before the EU-US Data Privacy Framework was agreed, as we were discussing earlier, millions or billions of illegal data transfers were occurring every day.

I think because people don't quite appreciate how strictly worded the law is and how strictly it's interpreted by the regulators and, arguably, the court. So there was a period of time where, I think, well, we know we had the Google Analytics decisions in the EU, where there were about 10 or 15 cases in which website operators were deemed to be in violation of the rules on international data transfers by using Google Analytics, because Google would, in the course of providing its analytics service, transfer some data to the U.S.

Now, the reason I raise these cases is because the types of data Google transferred to the US via Google Analytics were about as kind of technical and potentially benign as you can imagine.

[21:53] You know, it was bits of people's IP addresses, which websites someone associated with that IP address might have been to, and maybe some association with a Google account, depending on the implementation.

[22:06] Now, I'm not saying that Google is a completely benevolent organization as far as privacy goes, but those sorts of data transfers were about as kind of low risk as you can get, I think.

[22:21] And yet they were still deemed to be in violation of the data transfer rules, you know, consent or no consent at that time, there was no real way to conduct those data transfers legally.

And I just wonder who, really. And there might be someone, some activist or, you know, disadvantaged person, but who is likely to be harmed by that type of activity? And on balance, I think nobody.

Well, you know, maybe some people, but it's difficult to imagine how the NSA would be interested in that sort of data or would use it in any kind of malevolent way.

[23:01] But, you know, with the law as it stood and as it was being interpreted, those things were pretty much illegal in the same way as transferring people's credit card numbers directly to the Kremlin would have been.

You know, these were treated as equally serious violations of Chapter V of the GDPR, just because the law is written in such a kind of blunt way and has been interpreted by the authorities in such a blunt way.

But if you pressed a switch and automatically, across every company in the EU, all those unlawful transfers were shut down, it would have been total carnage. You know, hospitals would have struggled to keep the lights on, I think.

And, you know, air traffic controllers might have had issues, because data just flows internationally by default now. So the kind of disconnect, I think, between what the law said and what was happening in reality was quite fascinating to me at that point.

[24:03] And, you know, it still applies to some extent, but less so at the moment.

[24:07] Debbie Reynolds: Yeah, actually, it's interesting that you brought that up, because when those rulings came down, I was very concerned about it, as were you, in terms of the banality of some of the things that they were trying to stop.

I understand it in principle, but also the thing that concerned me about that is that Google Analytics is such a widely used product; you know, I have clients in Europe as well, and they never stopped using it.

Right. So to say that you need to stop using a product because it's illegal in some way, but then not really give someone an alternative, you know, as to what they should do.

I think it becomes a concern because it makes it seem, in my view, like businesses think, what's the risk to me? Or maybe I shouldn't comply with this.

Right. So I think what you really want in laws is to make something meaningful, but also something that a company can really implement.

And so that was my concern with that, where I felt like a lot of companies have been so embedded over decades, probably, you know, as long as Google Analytics has existed, and for them to really rip out that tool when there wasn't really a comparable alternative for them makes it hard, you know. Because I feel like businesses want to comply with laws, but if you make it too hard for them, I think they won't comply.

And then it makes it hard for them to take the regulator seriously when they do things like that.

[25:50] Robert Bateman: I agree. And it wasn't just Google Analytics, of course, as you know; it was pretty much any US-based service provider. And the argument in favor of having strict data transfer rules like that, I mean there is one, of course. You go back to Snowden and the kind of surveillance-industrial complex, or whatever you want to call it. You've got

[26:14] the NSA, which can make very good use of metadata, as we know.

I don't personally care if they are able to associate my activities on, you know, a random Amazon website or whatever, with my IP address, but some people do, and they, I suppose, have the right not to have to worry about being surveilled.

But the cost-risk analysis didn't come out in the EU's favor, I don't think. For most businesses it's hard to explain what the risk to them is, and it's even harder to explain what the risk to anyone else is. Because if you're, as a random example, a carpenter with Google Analytics on the website, how do you explain the risk associated with someone visiting that website and their data being sent to the US?

I mean, there isn't one. It just doesn't exist. So I do think that strict interpretation of the law, not necessarily incorrect based on the court's reading of the law, but the implications were pretty bonkers, I think, in large part.

[27:33] Debbie Reynolds: Yeah, yeah. One case that you talked about, and I would love to get your thoughts about, maybe just for the audience, give like a brief overview of what happened with that case. I saw your writing on this, and this was the Uber, the huge Uber fine.

[27:48] I think it was like over $200 million. They were fined about a data transfer mechanism. And I thought your point of view was really interesting there because I think that they were being slammed because they didn't have.

They weren't using standard contractual clauses; they were using a different mechanism for data transfer. So tell us a bit about this case and your point of view there. I thought it was really interesting.

[28:11] Robert Bateman: Yeah, I had a bit of a hot take on this one, as they say. I think it was 288 million euros Uber got. So let's, let's talk it through. So in late 2021, the European Commission came out with its new standard contractual clauses.

[28:29] And Uber previously had the old SCCs in its agreement between its EU entity and its US entity, and they were joint controllers. So they covered transfers between those two entities with the old SCCs.

When the new SCCs came out, the Commission, which publishes the SCCs, drafts them, and gives them legal effect, published an FAQ document, and indeed said similar things in the actual legal decision that gave effect to these new SCCs: that they are not to be used for transfers to an organization that's directly covered by the GDPR under Article 3(2).

[29:14] And you can see why. Because with these SCCs, the whole point is, if a company is outside of the EU's jurisdiction, these clauses kind of put the GDPR requirements on them contractually.

[29:28] So there's no statutory requirement for them to, for example, help facilitate data subject rights.

[29:37] So therefore you put a contractual requirement in place. If the company in the US is covered directly by the GDPR for whatever reason, then, the Commission said, there's no point kind of duplicating these requirements in contracts under the SCCs, because they might even contradict what the GDPR says directly, which is true in a couple of cases.

There is a bit of tension. So Uber says, okay, fine, so we won't use the SCCs then to transfer data to a US entity, because it's covered by Article 3(2) of the GDPR.

Now, this was a mistake on Uber's part, and they got some complaints via France. The Dutch Data Protection Authority looked into them and discovered that Uber did not use SCCs; Uber didn't consider these to be international data transfers that required SCCs, because of what the Commission said, both in its FAQ and also in the implementing decision, to which the SCCs are an annex.

So they said, basically the upshot is, it's not that we disagree that Uber US is covered by the GDPR. We just disagree with the Commission.

[30:47] The EDPB's position is that regardless of whether the importer is covered directly by the GDPR, the exporter still needs to put SCCs or something in place.

And the problem, another problem, is that the EDPB did not state this position until about six or seven months into the investigation into Uber. So Uber didn't even know, really, that that was the EDPB's position.

[31:16] They were supposed to just read the law themselves, come to their own conclusion about what it meant.

And the fact that the correct conclusion, according to the EDPB, contradicts that of the European Commission, which kind of wrote this law, didn't count for anything. So it was, for me, an example of how confusing this area can be and how the kind of tension between these EU institutions can cause real problems.

[31:43] And the Commission said actually, in addition, it was working on SCCs to cover those sorts of transfers between EU entities and non EU entities that are directly subject to the GDPR.

[31:55] They haven't done that, but they did start doing it about a week after Uber's 288 million euro fine was announced. So they have started doing that now, two years after they said they would.

[32:08] Three, in fact. So it's a real mess. And at the end of the day, if Uber had put those SCCs in place, I don't really see what difference it would have made to anyone.

You know, it doesn't affect anyone. Uber US presumably felt it was legally obliged to do these things anyway. It's just a piece of paper.

[32:30] So it kind of made it all seem a little bit absurd and pointless.

[32:36] Other people disagree with my take on that, but I think I still stand by it.

[32:43] Debbie Reynolds: I liked your hot take on that. You know, here's what I had been seeing, because companies were sort of confused about this.

[32:51] Some of the companies that I knew, the bigger companies, they did a belts and suspenders type of thing where they did both.

[32:58] Robert Bateman: Yeah, I always.

I wasn't advising a lot at that time, but I would say just do it anyway. Just put them in, just in case. So that's what Uber should have done. But, you know, the fact that they should have done that for practical reasons, I don't think it detracts from the point that the ambiguity and the lack of certainty is a problem.

I think, under the rule of law, you know, you're supposed to be able to predict how laws will affect you. And this contradictory information from the people that wrote the law and the people charged with enforcing it.

[33:32] I don't really think that's good enough. You know, it shouldn't be happening like that. I know there's not much sympathy for US tech companies, but that's beside the point, I think.

[33:41] You know, it could be a US tech company today or an individual person tomorrow. The law needs to be clear and applied kind of fairly and in a predictable way.

[33:52] And I don't think it was in that case in particular.

[33:54] Debbie Reynolds: Yeah. And where is that case? Is it being appealed? Like what's happening with it?

[33:58] Robert Bateman: I'm pretty sure Uber are appealing. And then we're going to have these new SCCs eventually released from the Commission to cover those sorts of circumstances. That will make things even more confusing, I think.

[34:09] Debbie Reynolds: Yeah.

[34:10] Robert Bateman: Because then it will be about establishing whether the company is in scope of the GDPR directly, which set of SCCs to use, and whether the EDPB is even going to recognize this new set of SCCs.

[34:21] So I think the confusion is not going away.

[34:25] Debbie Reynolds: Yeah. Well, if it were the world according to you, Robert, and we did everything you said, what would be your wish for privacy or data protection anywhere in the world, whether that be regulation, human behavior or technology?

[34:38] Robert Bateman: I would like to have a formal recognition of privacy as a human right everywhere, which it isn't, you know, including, to some extent, in the U.S. And I would like that to be compatible with all the great technologies that we use day to day. You know, I do indulge in the use of US service providers and data-hungry services myself, and I would like to be able to choose to do that regardless of the fact that there is a kind of floor of protection for privacy as a human right.

[35:24] I have been observing, and this is almost a cliche now, but businesses are taking this stuff more seriously. I really do believe that. I have seen it even among, you know, the kind of naughty companies that are constantly getting fined, your Metas, your Googles and so on.

[35:43] They're spending a lot of time and money thinking about how to do this stuff right. Whether you think that's cynical or they're just trying to avoid it, you know, it really is a discussion on the table now.

[35:54] So I really hope that continues to be the case and is even more so.

[36:02] And also I hope that we do not place too much trust in AI systems as they develop and are integrated more and more into daily life. I hope that things like Article 22 of the GDPR and some of the protections in the AI Act are taken seriously and implemented properly and extended to other parts of the world where AI-based decision-making is liable to become more prevalent.

[36:34] So a little bit of a vague jumble of things there, but I think that recognition of data protection and privacy, how that can feed into strong AI practices, not letting AI decision-making take over. It would be a total disaster at the moment because it's not reliable enough.

[36:57] But even if it were, I would still like to see that human element retained in important decision making.

[37:05] Debbie Reynolds: I share your dream, your wish for privacy to be a fundamental human right everywhere, especially in the U.S. And I agree. I mean, this is a complicated future that we have with the way the technology is developing and what that means for privacy.

[37:22] But I think, you know, it's just going to be a bit of a strange period, probably for the next few years, as we kind of navigate this rapid advancement in implementing new technology and what that means for personal data.

[37:38] Robert Bateman: We live in interesting times, which some people see as a curse. But it does mean there's lots of stuff for me to write about, at least.

[37:48] Debbie Reynolds: You have no shortage of things to write about, I think. So, yeah. Well, thank you so much again for being on the show. It's a pleasure to be able to have you on the show.

[37:57] And thank you so much again.

[37:59] Robert Bateman: Great to talk to you, Debbie. So glad I could come on. Thank you.

[38:03] Debbie Reynolds: All right, I'll be following you and everyone. Please follow Robert on LinkedIn. His profile is outstanding. He does a lot of really deep dives. I like the videos that you do as well, so people can get a bit of education when they look at your profile.

[38:18] Robert Bateman: Thanks again, Debbie.

[38:19] Debbie Reynolds: All right, bye.