"The Data Diva" Talks Privacy Podcast

The Data Diva E67 - Ralph O’Brien and Debbie Reynolds

February 15, 2022 Season 2 Episode 67
"The Data Diva" Talks Privacy Podcast
The Data Diva E67 - Ralph O’Brien and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, “The Data Diva,” talks to Ralph O’Brien, Founder and CEO of Reinbo Consulting Ltd. in the United Kingdom. We discuss his journey into Data Privacy, the most surprising thing in Data Protection and Data Privacy since GDPR was enacted, the need for a multidisciplinary approach to Data Protection, his primary concern about Data Protection, Data Protection in the UK after Brexit, the impact of Schrems II and EU adequacy on the UK, law enforcement uses of data vs. the GDPR, the geopolitical Data Privacy landscape of the EU, US, and UK, the Personal Information Protection Law (PIPL) in China, the trend toward more data localization and criminal penalties in Asia/Pacific, where penalties may be harsh and swift, the importance of the UK Lloyd vs. Google case, the concern of big corporations over class-action lawsuits for privacy infractions, and his hopes for Data Privacy in the future.

1:03:54

SUMMARY KEYWORDS

uk, privacy, data, people, law, eu, organizations, adequacy, data protection, regulator, europe, business, work, bit, actions, standard, company, case, industry, google

SPEAKERS

Debbie Reynolds, Ralph O'Brien


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds, they call me "The Data Diva". This is "The Data Diva" Talks Privacy Podcast, where we discuss Data Privacy issues with industry leaders around the world with information that these businesses need to know now. I have a special guest from the UK, United Kingdom, Ralph O'Brien. Ralph is the Principal at Reinbo Consulting Limited in the UK. Thank you for being on the show.


Ralph O'Brien  00:43

It's absolutely my pleasure. Thank you for inviting me. It's an honor to be in front of "The Data Diva" herself.


Debbie Reynolds  00:48

Well, thank you, thank you. So your profile has always attracted me because I see you commenting a lot. I think I'm attracted to people who have their own views, who put out content and comment a lot; you have a lot of deep experience in privacy. And in general, I like to talk to people in Europe about privacy, because I think for a lot of people in the US, especially before the GDPR came out, privacy wasn't a huge thing unless you were dealing with, like, finance or health or something like that. So I feel like people in Europe are so steeped in the culture of the EU about privacy. And so I love to have that kind of deeper, richer perspective, and you're in the UK too, with post-Brexit and all that stuff. So I'm sure you have really interesting things to talk about. Give us a little background about you and your journey into privacy so that the audience can get to know you.


Ralph O'Brien  01:57

Well, I'm sure, like, everybody sort of grows up wanting to be a privacy lawyer. No, I had a bit of an odd journey, to be honest. My educational background was in law. But I actually ran away; it wasn't until I was at university studying law that I kind of went, what am I doing here? When I was younger, I wanted to be a policeman, believe it or not. So I kind of ran away a bit, joined the army, and ended up in communication security in the army. And they sent me somewhere hot and sandy. So I ran away from that as well. And ended up, ironically, joining the UK public sector in the late 90s, when the Data Protection Act 98 came in. So sort of the second generation of UK privacy, because we had an 84 act before that, based on Convention 108. So I came in 1998. Somebody saw law and computing on my CV, along with communication security, and said, go away and learn about this data protection thing. And as a sort of 18, 19, 20-year-old, you know, I kind of shrugged and went, okay. But then I went away, and I fell in love. And I've been in love ever since. It was a privilege to kind of begin in the UK public sector, looking at, you know, the needs of the government, the needs of the many, as opposed to the needs of the few, or the one, for the Star Trek fans out there. And then, really, when I was in the UK public sector, I actually ended up working for the police, doing a lot on transfer agreements and the way data moved around the public sector. There was actually a set of murders. A gentleman, and I use the term loosely, called Ian Huntley in the UK, who killed a couple of young girls at a school. And it came down to the way he was vetted. He'd moved around the UK, and because he'd moved, his records hadn't moved with him.
And what really made my career was looking at the way the UK handled criminal records, the way they weighed the individual's right to privacy against the right of the employer to know about your background, your past. So we wrote a number of standards, we wrote a few audit manuals for the police and later for the ICO. And then, that led me to sit on a number of international standards bodies that make standards like the 27000 series for information security and privacy. And then, one day, I was arrogant enough to think that I could go and tell other people how to do it. So I went from being an assessor, an auditor, sort of a creature of standards in the privacy world quite early in my career, to helping organizations get those badges or to assess themselves against those standards. And, you know, I honestly wake up every day passionate and so happy that I work in the industry I do because, you know, my raison d'être, if you like, is to make a difference, is to help. And so that's been my sort of guiding light throughout my career: to help make a difference.


Debbie Reynolds  05:10

And you do succeed and make a difference. I think you've made a very big impact on me and other people on LinkedIn by, you know, just sharing your knowledge, sharing information, sharing updates about what's happening, thinking about standards. So it must be kind of interesting. As a result of GDPR coming out, so much more attention and focus has come to data protection. And so the space is getting a lot more crowded than it was before, when you were already working in it. So what, I guess, has been the most surprising thing that's happened after GDPR came out, for the industry as a whole, do you think?


Ralph O'Brien  06:04

Oh, well, it certainly popularized data protection and privacy. I mean, I often say that my twenty-odd years in data protection have been sort of fighting the rearguard of a losing action, right. You know, technology's increased, the scale of what we're doing has increased. But when the GDPR arrived, you know, there was certainly a lot more management focus, certainly a lot more popularity in the industry. The industry has grown massively, with a lot more technology companies moving in. Sometimes that's not always a good thing; the industry is slightly more competitive than it used to be. I mean, I'm here to collaborate. I still think it's, you know, privacy professionals trying to work together to make the world a better place. But sometimes, I think the industry's gotten a little bit more competitive. You know, people are a little bit more guarded with their methodologies, perhaps. But the industry is still a lovely place to work, generally speaking, for the greater good. I mean, there's certainly been an explosion of privacy professionals, some of whom are brilliant, some who know enough to be dangerous, who don't have the history, who, you know, come in from a slightly different background, perhaps it's been sales or marketing, or there's a lot more lawyers involved. You know, the UK privacy profession back in the 90s was kind of records managers, people with their hands moving through the files, looking at ground-level things like retention periods, and accuracy, and the way data is accessed and shared. But nowadays, there's a lot more focus on things like international transfer, and contracts, and the legal side of the profession. Whereas, you know, myself, I've always come from a place of wanting to have my hands on the data, looking at data quality.
And what I really love about the industry, looking forwards, is that there's now this focus on privacy by design, actually making sure that you build privacy into new products and services. So GDPR, massive, massive change. And now we're going through another revolution again. I actually think, personally, that we're now at the stage where businesses are bigger than countries, you know, the company is bigger than the country. And therefore, the regulators are struggling to maintain control. You've got these big global companies, these big global technologies, but these little national laws. So, you know, you've got these massive global conglomerates which are more powerful, perhaps, than the ability of a country's regulator to call them into check. So then we begin the tech ethics journey and the data protection by design journey, which, again, is another fascinating step in the evolution.


Debbie Reynolds  09:04

I agree with that. So yeah, you pointed out some really interesting things about the industry. And, you know, I like to talk with people from all sorts of backgrounds who come to privacy, which is really cool. And so I think one big misconception, and maybe this is because of some of the organizations that have really come into prominence as a result of privacy, is this idea that privacy is more of a legal thing that the lawyers handle. That's not a true statement. You know, I think it takes people from all different areas to really make a difference in privacy.


Ralph O'Brien  09:57

Yeah, it isn't. I mean, we've all got a role to play, you know, from the regulator's side, from the lawyer's side. You know, I'm not saying lawyers don't have a role to play. I mean, lawyers are brilliant when it comes to contracts and legal advice. But you know, sometimes different people come at it from a much more practical perspective, actually handling the data. There's the technical IT perspective. There's the policy perspective, and there's the software engineering perspective, you know, all of these different perspectives, audit, even international politics now. You know, I speak to you from a country that has actually left the EU now. Not my fault, I promise. I was on the other side, and I always have to say this: I was a Remainer, or a Remoaner, as we're now called. But, you know, as a country that's left the EU, the UK at the moment is sitting between, let's say, America and Europe. We share a common language and culture with America. But at the same time, we kind of have a shared history and values with Europe, and a government at the moment that is trying to, you know, make the UK a very business-friendly place and trying to distance itself from Europe, thinking about deregulating or removing red tape. So how close the UK remains to Europe, and how those policymakers and governmental decision-makers approach data protection here, is going to be a fascinating journey to watch.


Debbie Reynolds  11:36

Yeah, I agree. What is for you right now top of mind, in privacy? So what is the thing that concerns you most that you're thinking about right now?


Ralph O'Brien  11:52

Well, I mean, the thing that concerns me, I guess, will be the thing that concerns my customers most. You know, I've got customers coming to me talking about PIPL in China, obviously, as a recent development, and there's still a lot of conversations around international transfer. In the UK, I'm really concerned about this consultation the government has released, called "Data: A New Direction", which, in my humble opinion, is dressed up in words that basically say, oh, we want to get rid of the red tape and the administrative burden that organizations have, in order to make the UK a more business-friendly place. But actually, what it looks like to me is they're removing some of the more administrative sides of the GDPR and bringing it back to the principles. It's almost going back to something like the 1984 Data Protection Act, as opposed to, you know, the GDPR we have now, almost backing us up and removing some of those human rights, some of those protections that the GDPR brought in. So I think it's going to be really interesting to see where the UK goes, because it's been granted an adequacy agreement by the EU. And if it then begins to grant its own adequacy to countries that the EU hasn't, or it begins to deregulate data protection here in the UK, you know, we could then lose those adequacy arrangements with the EU. So I think we stand on a really interesting balancing act, as always, between business and the individual, between the EU and America, between the devil and the deep blue sea, perhaps, right? To me, data protection has always been a balancing act: the needs of the many, the needs of the one, the needs of the government, the needs of the citizens, and then contrast that with the scale that technology brings you these days. I don't think we've ever been in a more interesting and fascinating place.
So I think data protection is more relevant now than ever, more important now than ever. Yet we're at this crossroads where you've got these global technologies and these little national laws, and the one can't quite keep up with the other. So, I mean, it's never been more fascinating, and I've never been more passionate to be in data protection.


Debbie Reynolds  14:11

Yeah, I like the way that you describe it. So I was wondering about this, and I'm glad you touched upon it. The UK is kind of in the middle, in a way. I feel like the EU and the US are probably pretty far apart on how we approach data privacy and data protection in some ways, and you guys in the UK are in the middle. Because you were part of the EU, a lot of your laws are parallel in terms of data protection, but then, as you said, as a result of Brexit, you want to do these different deals. The UK can grant its own adequacy, and that was my concern. I was praying that the UK got adequacy from the EU, and I'm happy that happened; that directly impacts me and my clients who are in the US and in Europe. But then if the UK gave the US adequacy, I can imagine that would raise alarms in Brussels about that relationship, because we in the US have not changed. So I was like, okay, what would be the basis for that? Maybe it would be something more narrowly focused; I don't know, this is just my thought, so I'd love you to jump in on this. A lot of the sticking points between the US and Europe about how we handle data protection have to do with indiscriminate data collection and surveillance and things like that. The GDPR doesn't really deal with the law enforcement part or the surveillance part. And in the US, we have laws that say, okay, you know, it's national security or whatever, and we do all these other things you all don't do. So in my mind, I feel like we should separate the way data is handled in law enforcement actions from how businesses handle it. I can't stop the FBI from busting into a company's office and taking anything.
So no matter how many contracts I write, or how many standard contractual clauses, whatever, I just can't stop that. There's nothing a business can do about that. So I feel like that's the issue that we have right now. And instead of saying, okay, I'm in the US, and I'm going to tell someone in the US, give me this data because we need it for law enforcement, I feel like the governments should be working that out, and then they work with the company, as opposed to putting the onus on the company and saying, okay, well, you have to be responsible if the FBI comes in and takes data from your organization.


Ralph O'Brien  17:40

Yeah, I don't disagree. Actually, I think the problem is somewhat of a political one. I mean, actually, the UK has been granted two adequacy agreements, because we were granted the GDPR one and the Law Enforcement Directive one; the Law Enforcement Directive is a separate law alongside the GDPR. Actually, anything law enforcement isn't covered by the GDPR; it's covered by the Law Enforcement Directive, a completely separate law. And then again, the EU is not allowed to touch national security either. So actually, anything law enforcement or national security is kind of exempt from the GDPR. Where I think Europe and America differ is, you know, Europe has got the fundamental rights approach, so human rights; America, more of a consumer rights approach, more of an opt-out versus an opt-in approach, perhaps. And I think, you know, there are lasting wounds in America that have come from September 11, 2001. That includes things like the Patriot Act, and FISA, and the presidential executive orders. The way they differ from the way we do things over here, perhaps, is that they're a little bit more secretive; you know, you can gag an organization to kind of go, well, we're going to come and take your data, but you're not allowed to tell anybody about it. So there's a little bit less transparency, I think. Now, I'm not suggesting that the UK or Germany or France doesn't do national security, that, you know, the Secret Service and MI6 and GCHQ here in the UK don't collect data; of course, they collect data. But I think what differs is perhaps the scale. Rather than doing indiscriminate mass collection, it's more targeted, there's more of a justification, there's perhaps more checks and balances and safeguards. And those checks and balances and safeguards matter to the European courts; they matter.
You could argue, yes, your data could get intercepted wherever you go. But there's more safeguards, more balances, more ability for an individual to seek redress, more ability for an individual to, you know, stick their hand up, be aware that this is happening, and be able to fight it or whatever. Whereas the US perhaps has a bit more of a predilection towards national security being able to do what they like when they like. I'm paraphrasing, of course; there are controls, but not the same. So for me, what it comes down to is a political problem. As you say, the problem at the moment is they say, look, if you're going to transfer data to, say, the US, or anywhere else with a mass surveillance regime, well, it doesn't say no; it just says you need to make sure that you have a transfer mechanism, such as standard contractual clauses or something similar. BCRs, who knows. But then it says, but they're not enough. As you've just said, Debbie, it's not enough to write it in a contract; you can't do that, because American law will still apply, and they'll still be able to collect it. So then it starts talking about adding in supplementary measures as sort of a new thing, post the Schrems II decision. And the supplementary measures they talk about making it okay are things like encryption: you encrypt it when you send it to the US, but you hold the keys over here in Europe so that the US organizations can't be compelled to hand them over, and you never decrypt it while it's in the US. Well, doesn't that defeat the purpose of the transfer? If it's never decrypted while it's over there, what was the point in sending it to the US? So, you know, really, there aren't any good options, and to put it on the average organization, the average company, to say you have to understand the law of the country you're sending it to, you have to put in additional measures that will defeat that national security access.
It's kind of nonsensical, and there's no good option. The answer can only be a political one. But I can't see it being resolved anytime soon; you know, the US has been pretty stringent on its national security over liberty. The EU is very much stringent on its liberty over national security. And, you know, whilst the two are battling, bashing heads like that, they might well come up with another agreement, a Safe Harbor, a Privacy Shield version three, but you'd instantly know that would be followed by a Schrems III, some sort of challenge. So, you know, I don't see this problem going away. I wish my crystal ball was working; I wish I could suggest a solution. But unless America suddenly turns around and passes the GDPR, or an equivalent to the GDPR, tomorrow, or adds some additional safeguards on the national security side, which I don't see happening, nor do I see the EU backing down and saying, well, the American approach is okay, so where does that leave the UK? We're sat in the middle, right in the middle. We want to work with partners. The UK wants to change its brand, to be a sort of new, exciting, cool, sovereign nation who can deregulate away from the horrible EU rules. That's not me saying that, by the way. Who can grant trade deals and adequacy agreements, all of these cool things. I mean, you know, the DCMS, the Department for Digital, Culture, Media and Sport, has on its agenda granting adequacy agreements to Australia, Singapore, Taiwan, Dubai, the US; all of those nations are not on the EU adequacy list. This was actually asked at IAPP Brussels last week, to the DCMS: what happens here? What happens when you grant adequacy to countries that the EU hasn't? And the arrogance of the answer will go with me to my grave. Bearing in mind they were sitting in front of an EU audience, the head of data transfers at the DCMS said,
well, it could be that we've done the homework for them. It could be that we have looked at something, and if we say they're okay, why shouldn't you say they're okay? And there was a level of arrogance there that I found unrealistic, let's say. I mean, just to clarify that, I'm taking longer in my answer, but to clarify: when the adequacy agreement first came through, you only had to look at the two press releases. The press release from the EU said, look, we've granted you adequacy, but bear in mind, it's a limited thing. We've got to review it every four years. If you change what you do, that adequacy is going to go. It was very much couched in, you only barely got this, and we could take it away anytime we wanted to, right. And the UK, in its press release, kind of said, we were rightfully granted adequacy because of our brilliant privacy laws; now we can go away and change them; we can unlock the power of data across the UK. Looking at the two press releases side by side, it was kind of a strange, strange day.


Debbie Reynolds  24:46

Yes, yes. This is so complicated.


Ralph O'Brien  24:55

There's no right or wrong answer. Yeah. It's just going to evolve. And, you know, this is what makes it such a fascinating space to work in. I mean, advising clients, for the first time, I actually found myself advising clients to, well, not break the law exactly, but I was saying to clients, well, you're not going to stop transferring data to the US, and there's kind of no good option. So you're just going to have to do your best, which is put in standard contractual clauses, try and work out what the supplementary measures mean to you, and sort of add them in. But, you know, if the business necessitates that transfer, you might just have to live with the risk.


Debbie Reynolds  25:34

Yeah, right. And a lot of it is, you know, and I'm glad you mentioned that about advising a company. For me, it's very important to educate them, like, okay, here is the deal, here are the cards that you've been dealt, okay, and you have to decide for yourself how you want to play this. So, you know, it's not me saying, don't do this, don't do that. I'm like, look, this is the deal. This is how things are; here are your options, and whatever you decide to do, we can go that route. So I think that's all that you can do. You know, because supplementary measures may mean different things to different organizations. If you're Google, your supplementary measures may be very extravagant, you never know, but for a little mom-and-pop shop down the street, their supplementary measures may be more simple. So I think it's about understanding what is at play here, what the risks are, and then what is reasonable. What is considered reasonable for your organization?


Ralph O'Brien  26:48

I mean, data protection law is full of those sorts of words. The one thing I actually do like about US law, and I know US law gets a lot of criticism, and I know there's only sort of three states, Virginia, Colorado, California, that have actually got data protection laws as such, limited as they are, is the sectoral approach. I do like the fact that you've got HIPAA and FERPA and COPPA, laws that can be a bit more specific about their particular industry sector. The problem we've got over here is the big omnibus laws like the GDPR, a law that's going to cover all organizations in all industries of all sizes. You know, we can't be specific in those laws. They've got words like adequate and necessary and appropriate and relevant and proportionate, all of these wonderful legal words. You know, they don't say make sure your door is made out of metal, or make sure it's encrypted, right. They say, make sure your security is appropriate. They don't say, this is the retention time. They say, keep it no longer than necessary. And what that means is these are all risk management decisions, because even if you, as an organization, have taken good risk management decisions, the data subjects, the individuals, might disagree on what's necessary; the regulator might disagree on what's necessary for the purpose. And that's kind of why I love it, in a way. That's why it's so interesting: it's all an argument. It's all proportional; it's all a risk management decision. It's up to the organization to decide how much it wants to follow the law, how far it wants to go down this road. And what I think is interesting is the motivational side, which I see changing. It used to be that people would come to me and say, hey, how do I avoid a fine? What's the minimum I've got to do to hit the legal red line? What are we going to do to comply with the law?
And now I'm seeing the motivation totally change. It's, how do I use this to make the best product? How do I use this to achieve a competitive advantage? How do I be better than my competitors to gain a good reputation in the industry? How do I do my best? And I think that's a much better position to be in, personally.


Debbie Reynolds  29:12

I like the way you said that. I agree. So one thing that I'm seeing, which is a good change, is, as you say, it's no longer people saying, okay, I want to avoid a fine, so tell me the minimum I have to do, even though we know companies do that, sure. And it's good when a company says, okay, I want to make a better product. But what I'm seeing really getting the attention of organizations is that if they want to expand and do business in different locations, they literally cannot sell their product if they can't answer those questions. It's like, okay, you have the best product in the world. You have these investors, and they're ready to sign on the dotted line, and they're like, well, so what's your plan? Do you have a policy? Do you have a procedure? Do you have people in place to handle data protection? So I think it's interesting to see that people put so much focus on, okay, let's make this product so awesome, and then leave Data Privacy as something they think about last, when literally, they can't reach the finish line unless they deal with that issue.


Ralph O'Brien  30:28

Yeah, I totally agree. And I see that as companies expand globally. I mean, actually, a lot of my customer base, even though I'm based here in the UK, is US-based. There are companies who are using me because they're expanding into Europe, or they've acquired a business here in Europe, and therefore they have to deal with these new regulations for the first time; or even a UK company is going into China and has to deal with the PIPL. And so you've got this idea of internationality, this idea of global data transfer, but this idea of national law. So, how do we do that? It's very tempting to kind of say, well, I want to take advantage of the laws in different countries. So you could equally go, well, I don't want to give GDPR rights to everybody, or I want to take advantage of a country which doesn't have a privacy law, and therefore I can do what I like, right. But when you do that, you create different standards across your business. And that makes it sort of harder to be interoperable, for want of a better word. A lot of the companies I deal with have now taken the position of going, I don't actually care what goes on in the UK; if the UK reduces its standards, we don't care, because we are taking this sort of global gold standard, right? Therefore, it doesn't matter if your country gives people rights of access or erasure or portability. We're going to give them that, not because the law says it, but because that's our company's policy. Right. And I think that's really powerful, when, regardless of what the country says, a company says, we want to get to the highest global gold standard, so it doesn't matter where we operate; these are our standards. And ethically, I think that will be a much more positive place to be. You know, if the UK does derogate away from EU law, well, what are you going to do?
Are you going to offer a dual standard, one standard for the UK, one standard for the EU, or are you just going to adopt the highest standard and let the data flow? I'd do the second, personally, you know.


Debbie Reynolds  32:42

Yeah. Great, great commentary. Let's jump a few thousand miles away from the US and UK, and let's spend some time talking about the PIPL. So the PIPL, I think, went into effect on November 1, 2021, if I'm not mistaken, and is the Personal Information Protection Law in China. I think it is interesting. I'm familiar, from many years of following it, with data protection and the way they handle things in China. I think some people were shocked that China actually did this law. And so, for me, it really is them putting their stake in the ground and writing out, for people who did not know before, what their expectations were for businesses that handle consumer data, the data of consumers in China. So it, to me, is not that different. Actually, if you look over the years at the way China has handled data businesses, you'll see in the law that that's what they were doing. Maybe people didn't understand it at a high level; they were kind of looking at it on a case-by-case basis, but I think it's good they put this in writing so that people understood it. So the thing that I was thinking, and I just want your thoughts: people that I know who do business in China, they knew all this already. I don't think this law told them anything new. But people who want to do business in China, who did not know how to navigate it before, now have better clarity about what would be expected. What are your thoughts about that?


Ralph O'Brien  34:36

Yeah, I agree. I mean, again, it's certainly not a carbon copy of something like the GDPR — it's definitely got a Chinese flavor to it — but there's a lot of commonality. You start looking at this, and nearly all data protection laws across the globe are based on the OECD privacy principles of 1980, right? Nearly all of them: purpose limitation, quality, security, individual participation, accountability, data minimization — all of those old principles from the OECD in 1980. What I actually love is they really haven't changed; in all of the laws across the globe, we see that commonality. Then you get international transfers, rights, and data protection impact assessments — there's a huge sort of GDPR flavor being pushed into this Chinese law. So, yeah, if you're an international company and you're doing most of these things already, the gap isn't going to be that big, you know? Yes, there's a Chinese flavor to it. Yes, you are going to have to look at the detail, because there is nuance. But for most organizations that have done a sort of GDPR privacy management program that moves you up towards that standard, you're not going to have to do huge amounts of change to be on the good side of the PIPL. I think it's still developing. There are a couple of bits in it that refer to practices we haven't really seen yet. It'll be interesting to see some enforcement, because they didn't give a huge window; they kind of said, hey, the law's here. There wasn't an "oh, by the way, this is going to be enforced in two years' time." So I think that's really interesting — they kind of just threw it out there, and a month later, it was in law.
But yeah, I don't see organizations that are already up to the GDPR level having to make huge amounts of changes to the way they do business in China. I mean, there is a little bit in there about data localization — keeping data local — which, you know, if you're up in the cloud, might be interesting. But yeah, from the first of November 2021, it's already in law, so we will see.


Debbie Reynolds  37:08

I think, also, one trend that I see in the Asia Pacific region in privacy regulation — not just with this law, but over the years — is that they're starting to create regulations that go towards criminal penalties, as opposed to civil penalties. And then another trend I see in the Asia Pacific is more data localization requirements.


Ralph O'Brien  37:53

Yes, definitely. So, yeah, it's almost what I call counter-globalization. The more we globalized data through all these big Internet companies — your Googles or Facebooks or Twitters, your Apples and Microsofts — as soon as they operate across the world, and we can truly, as a human race, be thinking more worldwide, there's almost this counter-movement. Brexit is a good example of it, actually. There's almost this sense that the world is a threat somehow, and we need to look inward and take care and be a bit more xenophobic. And certainly, for a government and the government security services, the idea that the data is somewhere else on the globe where they can't access it and can't see it is frightening — it's a barrier to law enforcement. So yes, there's this big idea now of, well, yes, you can take the data, but you've got to keep a copy locally, just in case we want to look at it, right? Which is kind of the opposite of the way the world is going in terms of cloud and data migrations and single global approaches. So you have this real dichotomy with the way the law goes. When it comes to criminal sanctions, it was always going to be the case in China. I mean, you mess up in Europe or in America, yes, you might well have a lawsuit against your organization; yes, there might be financial penalties. But in China, you're appearing on national TV apologizing. You're literally prostrating yourself on your knees and asking for forgiveness. It's a very different culture in terms of the personal accountability of the management.
Yeah — in fact, I'm running a four-day training course this week in British law. And, you know, even though we have got some personal liability, it's more for criminal offenses like unauthorized access to computer systems, unauthorized modification, or taking data without the data controller's say-so. So there are some personal liability offenses. But when it comes to the regulator, when it comes to actions in court, they're always going to be against the organization. So yeah, I think personal liability should be pushed for, in my personal view, rather than corporate liability.


Debbie Reynolds  40:30

Yeah, I think another thing with these Asia Pacific regulations is that the penalties can be harsh and swift. It's not something that's going to take years and years to go through a court process. This law didn't come out of nowhere — they'd been working on it for years. But if you think about how long it took the GDPR, and the Data Protection Directive, and even the UK Data Protection Act, it takes many years for this stuff to develop. So in comparison, this law came up quite fast; it was developed quite swiftly. And in the past, even before the PIPL, if you look at the way China treated companies it felt ran afoul of something, the penalty was very swift.


Ralph O'Brien  41:26

Well, there are pros and cons to democracy. Democracy is fantastic, and I'm pleased to live in a democratic country, though obviously we've had to live with a vote that hasn't gone the way I wanted it to recently. But the idea of democracy, of the will of the people, means you've got political parties who have to deal with the will of the people. It means that the political parties can't just set an agenda and do it; they've constantly got to be looking over their shoulder at what people think and what people feel. You've got to consult and ask nicely, and then water it down for people who don't like what you're doing. Whilst democracy is a fantastic thing — don't get me wrong — it does slow down the wheels while you have to consider the will of the people. I suppose the advantage the Chinese state has got going for it is long-term planning: the government doesn't change, and the people can't suddenly kick you out. So that does mean you can pass things swiftly. It does mean penalties can be harsher. It does mean that you don't have to worry so much about what people think and feel; you can just follow your agenda. Now, I'm not suggesting that's a good thing. But at the same time, it certainly means you can get swifter movement and harsher penalties. One of the things that bothers me, actually, is the ability of a regulator, like the DPC in Ireland or the ICO in the UK, to actually take effective enforcement action, because their budget is not the same as Google's or Facebook's. Google and Facebook can appeal, they can take you to the tribunal, they can throw good money at it that you don't have as a regulator. They've got more money than you at the end of the day, more resources than you, a deeper bench of lawyers than you have.
So a big global, multinational company will have the ability to frustrate a country's regulator's efforts. And I think it's only by joining up, by pooling our resources, or even by appealing to their better nature — talking about data ethics rather than compliance with the law, pointing out the competitive advantage to them — that even the largest companies will ever toe the line, so to speak.


Debbie Reynolds  43:56

It's fascinating. I could talk about this for hours. Why don't we talk a little bit about the Lloyd v Google case? We chatted about that before the session. First of all, just introduce us to what this case is and what it's about, and how the UK plays a major role in it, and the impact.


Ralph O'Brien  44:22

Okay, yes. So, Lloyd v Google was a court case decided in the UK recently. It was brought by a gentleman called Mr. Lloyd, who was actually representing 3 million Apple iPhone users. And the case was brought both in the US and in the UK, interestingly enough. In the US, it was settled out of court under a consent decree. But in the UK, it actually went to court. Now, it wasn't really about the facts; the facts of the case are pretty obvious. Google had a workaround to get around the iPhone's Safari security in order to get personal data on people. There's no doubt that Google put a workaround in place — a hack, if you like — that got people's personal data in a way that Apple and Safari didn't want it to. And therefore Mr. Lloyd brought this case on behalf of 3 million iPhone users to say, hey, we should be entitled to some sort of compensation for damage or distress. That's what it says, actually; in fact, it wasn't even done under the GDPR. It was actually under our previous law, our Data Protection Act of 1998. So those are the facts of the case. Nobody is denying that Google did this. So you could argue, well, if Google is so guilty, why shouldn't people get compensation? Well, it came down to the fact that the UK is actually quite resistant to what you in the US would call a class action. They kind of don't like them here. Here in the UK, we don't have that ambulance-chasing lawyer culture, and the courts have traditionally been very resistant to it. So what did they say? They said a couple of things. One, they said: could you really say that all of these 3 million people had suffered the same level of harm, which is needed for a representative action here in the UK? You need to be able to say that all the claimants had the same level of harm applied to them.
Now, clearly, you could say the same thing occurred to them. But whether it was the same level of harm — you probably couldn't say that. You couldn't say that one person would suffer the same harm as another, in privacy terms, from Google exploiting their data in this way through the Safari browser; the impact might be different on each individual. And secondly, they said, well, were people really hurt? You've got this concept of damage or distress. And what Lloyd argued was that the loss of control of the data was damage in itself: me not knowing where my data was, Google having my data and taking my data without my say-so — that was damage or harm to me. And the courts ruled otherwise, in both respects. They said, well, you can't prove that everybody was damaged — the class, if you like, doesn't really work, because not everyone has suffered the same damage. And secondly, they said, is loss of control damage? We don't think it is. We don't think you can prove that you've been damaged just because somebody else has got data that you didn't want them to have. It's almost the case that if Lloyd had argued that what Google did with the data harmed people, he would have been more likely to be awarded damages. Just saying it's the loss of control — someone else having your data you didn't want them to — that's not damage; it's just someone else having it. It doesn't physically cause you damage or distress. So it's made it a lot harder here in the UK to do a class action, and a lot harder to claim damage or distress. Because the problem with these representative actions, in my mind, is this: if you're claiming against, I don't know, Google or Twitter or someone like that, the level of harm each individual has suffered probably doesn't make it worth going to court. You could spend thousands on the court case, but the damages could be less than a thousand.
So a representative action, where you've got a law firm acting on behalf of a greater number of people, is the only way to make it economical to take that to court. So I do kind of agree with their decision. But I don't agree with the consequence of it: that Google gets off scot-free, that people don't get the compensation, that it makes it harder for people to take action. I don't agree with the consequence, but I think in terms of UK law, it was actually a sound decision; I can see their reasoning. Whether this holds up under the GDPR is another matter, because it was done under the previous law — the 1998 Act, not the 2018 Act, which implements the GDPR. It's complex. It's complex again. But it certainly made class actions, or representative actions, a lot harder to do here in the UK.


Debbie Reynolds  49:43

Yeah. This is a big case, I know. A lot of big tech companies are breathing a sigh of relief about how this came out, because they're concerned about having millions of people in class actions and having to pay some type of redress. So what do we have in the US? Well, first of all, we don't have this high-level strategy on Data Privacy like the GDPR; we have states and, like you say, sectoral laws, and those will result, or have resulted, in class action-type lawsuits. And while those are capped at a certain amount, the question of whether harm is tangible or intangible has to get resolved. Like you said, in those class actions, for the most part, you have to show some type of harm. The only law that I can think of right now that people are concerned about, where you don't have to show tangible harm, is the Biometric Information Privacy Act in Illinois. That one went past everyone when no one was caring about Data Privacy — Illinois just snuck that law in. And it has played a big role, and I know a lot of people are looking at it very closely for that reason: they don't have to show tangible harm. There is no cap, really. And the damages are per data collection, so every time you collect someone's biometric data, it adds up. But I think another difference between the US and Canada or Europe, in the way that we think about data and the way that the laws are written, is that in the US it is based very much on a consumer right, as opposed to a human right. For people who think about it as a human right, that right would be cradle to grave, and there's some intangibility there, right?
So with that, I think that's what makes it more complicated. In the US, it's like, okay, I bought this thing, and we're going to calculate these fees, and all this type of stuff. So I think it's harder to express the human part of the law. It makes it harder to do.
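[Editor's note: the per-collection accumulation Debbie describes can be sketched numerically. This is an illustrative back-of-the-envelope calculation only, not legal analysis: BIPA (740 ILCS 14/20) provides liquidated damages of $1,000 per negligent violation or $5,000 per intentional or reckless one, and the scenario figures below — employee count, scan frequency, and the assumption that each scan counts as a separate violation — are hypothetical.]

```python
# Hypothetical BIPA exposure: liquidated damages of $1,000 per negligent
# violation or $5,000 per reckless/intentional one (740 ILCS 14/20), with
# no statutory cap -- so repeated scans of the same person can stack up.

def bipa_exposure(people: int, scans_per_person: int,
                  reckless: bool = False) -> int:
    """Worst-case exposure if each scan counts as a separate violation."""
    per_violation = 5_000 if reckless else 1_000
    return people * scans_per_person * per_violation

# Example: 500 employees clocking in by fingerprint twice a day,
# 250 working days a year (all figures invented for illustration):
print(bipa_exposure(500, 2 * 250))        # 250000000  -> $250M negligent
print(bipa_exposure(500, 2 * 250, True))  # 1250000000 -> $1.25B reckless
```

This is why "the damages are per data collection" matters: the multiplier is scans, not people.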


Ralph O'Brien  52:19

I think it does. I mean, as I understand it, a lot of things have been done under Section 5 of the FTC Act, which covers unfair and deceptive business practices, right? So it's less that you have breached someone's human rights and more that you have broken your promise to your customer. That's more where US law is based, as I understand it, right? Yeah, exactly. But there's also this culture of settlement, of consent decrees — settlement before it goes to court. Now, we haven't really seen much of that over here, though we did see it in the British Airways case here in the UK. British Airways had a data breach; it was fined by the regulator a large amount of money. But then, equally, there was a very similar case brought against it, which British Airways settled out of court — I think it was about $750 per person, where they were going for almost $2,000 per person in the court case. So we have started to see that little bit of settlement. Me personally, I'd be more worried about the other route. Not a lot of people realize the GDPR has actually got two separate streams of financial penalty. It's got the regulator's fine, which can be 20 million euros or 4% of global annual turnover — everybody knows that one. But that doesn't actually recompense the data subjects; it just goes into the public purse. That's just a fine, a penalty from the regulator. But we also have this compensation for damage or distress — that is, individuals accessing justice through the courts to gain damages. Now, we haven't seen a lot of those. And I think that's a more interesting route, because where the regulators fail, where the regulators have been tied up for a long time, I actually think the class action route is more likely to achieve success.
And, you know, 4% of global annual turnover or 20 million euros — even though it sounds high — if you've got 20 million customers and you've damaged them for, say, $1,000 each, that's a lot of money, right? So the size of those class actions against a big tech company is certainly much scarier than even a regulatory fine from a supervisory authority here in Europe. So I think those class actions will be the more interesting thing to watch. So yes, Lloyd v Google, to me, I don't think was wrongly decided, but it was disappointing in terms of how I would personally like to see these larger organizations held to account.
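[Editor's note: the arithmetic behind Ralph's comparison — aggregate compensation claims dwarfing even a headline regulatory fine — can be sketched as follows. The only fixed input is the GDPR Article 83(5) fine ceiling, the greater of €20 million or 4% of global annual turnover; the turnover, claimant count, and per-person damages figures are hypothetical, chosen purely for scale.]

```python
# Compare the two GDPR "streams" Ralph describes: the regulator's
# administrative fine ceiling (greater of EUR 20M or 4% of global annual
# turnover, GDPR Art. 83(5)) versus aggregate compensation claims
# (Art. 82 damages pursued collectively). Figures are illustrative.

def max_admin_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of the regulator's fine: whichever figure is higher."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

def class_exposure(num_claimants: int, damages_each_eur: float) -> float:
    """Aggregate exposure if every claimant recovers the same damages."""
    return num_claimants * damages_each_eur

turnover = 100_000_000_000                  # EUR 100bn: hypothetical big-tech scale
fine = max_admin_fine(turnover)             # 4% of turnover = EUR 4bn
claims = class_exposure(20_000_000, 1_000)  # 20M claimants x EUR 1,000 = EUR 20bn

print(claims > fine)  # True: the class route dwarfs the regulatory ceiling
```

On these made-up numbers, the compensation stream is five times the maximum regulatory fine — which is Ralph's point about why class actions scare big tech more than supervisory authorities do.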


Debbie Reynolds  55:04

This is fascinating. We'll have to talk more about other cases that are coming up. So you have a front-row seat to all the wacky geopolitical things that are happening.


Ralph O'Brien  55:20

Well, one of the great things about being based here in the UK, with all our history and things like that, is we do have large global organizations — organizations that route their data through the UK. We might be a small country, but what's fascinating about being here is that we are the English-speaking part of Europe, so America likes us for that reason, and we're also still a part of Europe, so Europe likes us for that reason. We've got a huge financial community here. Most large global organizations will probably have an outpost here in the UK, at least. So being here in the privacy industry is a real privilege, actually. I've worked with some amazing organizations and terrible organizations over the years. And it's just been a fascinating privilege to be on this journey, and to see where it takes us. I can only hope that in the future the UK government doesn't degrade individual human rights and privacy rights in an effort to be more business-friendly, like the consultation paper suggests. But, I mean, who knows — I don't have a crystal ball. We'll see what happens. And it'll be a privilege to find out.


Debbie Reynolds  56:30

Sure. That's true. So if this were the world according to Ralph, and we did everything that you said, what would be your wish for privacy anywhere in the world? Whether it be in tech, law, or anything?


Ralph O'Brien  56:50

Wow, the world according to Ralph? Hey, don't do what I say, okay? I mean, I'm a servant, right? It's a privilege to help organizations, to assist them and guide their risk. I don't have any desire to lead or to impose my will on anybody. So I'm not really sure — should I plead the fifth, as you say? But, I mean, it's respect, for me, right? We live in a global world with global business. And I think what bothers me — you only have to look at a privacy notice, or privacy policy in American parlance, which starts off with "your privacy is very important to us." To me, that's the same as when an Englishman says, "with all due respect" — you know what's coming next, right? And it's not going to be nice. A privacy policy, to me, doesn't actually describe how you're keeping the data private; it does the opposite. It describes how you're going to use the data, how you're going to move it around, what you're going to do that impacts on people's privacy or removes it. So "privacy policy" is kind of a misnomer. Rather than starting with "your privacy is very important to us," I would rather see more of a value transaction, more honesty: organizations saying, look, this is what you give, and this is what you get, right? And then leave it up to consumers. Don't dress it up behind "your privacy is very important to us." Just be honest. Say, look, this is what we want to do with your data; this is what you're going to give us; here's the value transaction; make an informed choice. Yeah, I'm a bit of a privacy advocate, a bit of a human rights advocate. I do believe in people's privacy rights. In the UK, we tried to pass this thing called the Online Harms Bill.
The Online Harms Bill talks about making providers ensure you can't have anonymous accounts, because you don't want trolls and bullies and people hiding behind anonymous accounts. But for every troll or bully, there's a child who is questioning whether they're straight or gay, who wants to browse the Internet and find out about that culture without labeling themselves, without it sticking to them for the rest of their lives. There's someone who is potentially transgender, who is in an environment or a family where they can't be, or are restricted from being, who they want to be and fulfilling their potential, and therefore they have to go to the Internet, and they can't use their real name, right? So for me, everything is a balance. For every move we take online to protect ourselves from a troll or a pedophile, we're also impacting the rights of people who need to be anonymous for a good reason. There is no easy answer, but there has to be a balance against the slow erosion of privacy and decency. Then again, perhaps I'm just of a different age, where, when I was brought up, I could browse the Internet a little more anonymously. It wasn't tracked and traced forever; I could meet people who disgusted me or thrilled me in equal measure. And I think that's our right as humans: to make mistakes, to not have things held against us and categorized forever for the mistakes we made when we were kids, to not be categorized and followed and surveilled, but to be able to be a bit freer, to live our lives in a way — not without consequence; I do believe in people taking responsibility for their actions — but free from interference until you actually do something wrong. I don't have the answers. I don't have the answers.


Debbie Reynolds  1:01:05

That's a great answer, though. I mean, this is so open-ended — if it were easy, we wouldn't be here.


Ralph O'Brien  1:01:15

Yeah, but this is what I love about the industry. What we're fighting for is the soul of humanity, right? There's nothing worse and nothing greater than humans; we are terrible and brilliant in equal measure. And with all of that terribleness and brilliance, we have to legislate in cyberspace, with technology that is almost beyond our understanding and logic and ability to explain. We're moving into AI and drones and robotics — I can use an app on my phone, and a little trundling robot delivers my shopping to my door. How beautiful is that? That was science fiction when I was a kid. But it comes with data and surveillance, and people being able to track where I am. These things don't come without costs. So the very fact that you and I, Debbie, are able to have this conversation across a transatlantic connection on a web conference system, as if we're in the same room — how beautiful is that? But how scary.


Debbie Reynolds  1:02:18

Yeah. Well, I really enjoyed this; I'm so glad we got a chance to talk and meet. This is fascinating. I'll definitely be continuing to watch you and the things that you do on LinkedIn. Keep up the good work — it takes all of us to share information and get information across. And, as you say, businesses can do business, you can make money, but you can also respect the rights of humans.


Ralph O'Brien  1:02:56

The two are not mutually exclusive. It is a win-win positive-sum; it doesn't have to be a negative sum. I agree.


Debbie Reynolds  1:03:03

Well, thank you so much again, and we'll talk soon.


Ralph O'Brien  1:03:06

The honor is all mine. Thank you, Debbie.