"The Data Diva" Talks Privacy Podcast

The Data Diva E59 - Emma Martins and Debbie Reynolds

December 21, 2021 Season 2 Episode 59
"The Data Diva" Talks Privacy Podcast
The Data Diva E59 - Emma Martins and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, “The Data Diva,” talks to Emma Martins, Data Protection Commissioner at the Data Protection Authority, Bailiwick of Guernsey (Europe). We discuss her journey to data protection and Data Privacy, her most significant concerns about Data Privacy now, GDPR enhancing privacy awareness, third-party Data Privacy concerns, privacy concerns of small and medium-sized businesses, her ethical tests for Data Privacy, European attitudes versus US perceptions of Data Privacy, the challenges of surveillance, the greater scrutiny of the Data Privacy steps needed to be compliant, the problem of privacy in technology implementations, advice for businesses starting their data protection journey, how privacy perceived as a burden can hurt business success, and her hopes for Data Privacy in the future.




46:44

SUMMARY KEYWORDS

data, people, privacy, organizations, regulators, happening, companies, technology, data protection, conversation, business, jurisdiction, question, thought, Europe, fines, world, surveillance, important, understand

SPEAKERS

Debbie Reynolds, Emma Martins


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva." This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show from Europe. Her name is Emma Martins. She is the Data Protection Commissioner at the Data Protection Authority of the Bailiwick of Guernsey. Hello, Emma.


Emma Martins  00:47

Hi, Debbie. Great to be here.


Debbie Reynolds  00:49

Yeah, it's awesome to have you on the show. You and I had the pleasure of doing a panel together for an organization in the UK, I believe, a few months ago, and we had such a great time. I think we're trying to corral that group again to try another thing with Yasmin Hines. And so I thought it'd be a great idea to talk with you as a regulator, a Data Protection Commissioner, to get your thoughts on privacy and what's going on in the world now and what you're most concerned about. But why don't you start by telling people about your journey? And then also, a lot of people may not know where Guernsey is in Europe.


Emma Martins  01:43

Yeah, it's a very small island, small but perfectly formed. We're part of a small group of islands between France and the UK; I can pretty much see France from here. So it's a great location. We're not part of the UK; we're a Crown Dependency, so we have our own legal framework and our own laws. For me, I've been in data protection an awfully long time. I sort of fell into it, really; I think lots of people of my age fell into it, in a way that's different from the people making active career choices now, which is really exciting, I think. But many, many years ago, data protection was sort of an add-on to a job I had working in the public sector. And it just struck me that this was a huge piece of work, but it wasn't really given much attention. I think we've moved on from there, but we've also got a lot of work to do. So I started off being a DPO, and I think that helps me enormously, because I think there's a danger that regulators sit in ivory towers and preach and point fingers and wave big sticks. But the reality on the ground is very different, and the pressures that organizations face, both in the public sector and the private sector, are real. With resource constraints, with skills shortages, these are very real problems. And I think if you're alive to those as a regulatory office, then I think you can engage a bit better. So my journey was starting out on the other side of the fence, if you like, as a data protection officer. And then many years ago, I got the opportunity to work with the regulator and worked my way up, and I've been in this job for a number of years now. As I say, it's a very small office, but I am a firm believer that success isn't always defined by how large the radius of your influence or your action is, but rather whether your own circle is fulfilled. And I think that we work very hard here as an independent legal jurisdiction to put in place a strong regulatory framework around the protection of people's data, with an eye to, and we may touch on it later, ensuring continued adequacy for this jurisdiction, so the free flow of data is important for the economy. But first and foremost, this is a law about you and about me. It's about people, and that's at the forefront of all our conversations.


Debbie Reynolds  04:09

That's great. Thank you so much for that. You have like a front-row seat on what's happening in Europe, and you're seeing things that are happening around the world. Tell me the thing right now that maybe concerns you the most; what is top of mind for you right now?


Emma Martins  04:37

It's a really good question. I probably have to look back a little bit to answer that. Those of us that have been working in this field for some time have seen the storm clouds gathering around the exploitation of personal data and the actual impacts that that can have. And I think they've been quite hard conversations in the past, because it's often quite an opaque subject; about data, people just think of computers and servers, something which is remote from them or unfathomable, purely about technology, and therefore they can't engage with it. But I think something's happened lately. So it's a concern, but it's also, I think, an opportunity that we need to grasp: the conversation is shifting, the narrative is changing, and you're starting to see the way that the big tech companies use and misuse our personal data being challenged and explored. One of the things I think is so fantastic that's happened recently is you're seeing this leap from it being a question of law, a question of technology, to it being a question about our social lives, our values, our culture. And you can see it if you go on to any movie streaming platform: films like The Social Dilemma, Coded Bias, The Great Hack. They may or may not seem important in and of themselves, but for those of us, again, that have been in this field for a while, if you'd said to me 5 or 10 years ago, they're going to make movies in America about this issue, I would have laughed. But the reality is that is what's happening, and I think the reason it's happening is that we're starting to understand the actual impact on us of the way in which our data are being handled. That's not to say that, as a regulator, we think that everything is bad. Sometimes we're pitted against innovation and against progress; that couldn't be further from the truth. But it's about grounding those innovations in human values, values that respect each one of us and give us dignity and autonomy and respect, as opposed to being based on exploitation, treating us as nothing more than a commodity to profit from. Such a long-winded way of answering, probably, but it's a huge, huge concern, this what's called surveillance capitalism. There are some fantastic books on this. But I do think that we're starting to see a chink of hope, in that we're seeing movies made about it, fantastic authors writing books about it, and that then leaps across to the mainstream. It's all very well that those of us working in privacy and data protection get together, have some fantastic conversations, have conferences; but outside of that lovely bubble, what is happening? And what's happening lately is it's starting to mainstream into society in a way that I think will influence and persuade organizations. Big tech is more complicated, but I think this persuasion model is really, really important: that consumers and citizens become aware and put pressure on big tech, and all companies that hold data, to do the right thing. I'd like to think we're at the beginning of that process, and it will continue its momentum.


Debbie Reynolds  07:56

I think the GDPR was very good in terms of bringing more eyeballs onto privacy. Before the GDPR came out, there was kind of a group of people like us that cared about it, but there were very few of us, so we weren't as visible. The GDPR made privacy more of a C-suite issue, something about the way businesses needed to change the way that they operated. And people were slow to take that up, really; they didn't really understand what that meant. But over the years, we see more companies understand what that issue is, and I'm happy to see regulators working with organizations around privacy. From a US perspective, for example, when the GDPR came out in May 2016, I thought I was gonna wake up and everyone was gonna care about privacy that day. And no one said anything, and I was like, oh my God, no one said anything. So I knew that that time was really important for educating people that this was coming, in terms of the enforcement date. Eventually, about two years later, people started to really put it in the news and stuff like that, and I thought that was very important. But I think the way that people in the US thought about GDPR when it went into full enforcement was that, on the first day of enforcement, there would be all these cases where regulators would be issuing fines, and it didn't happen that way. Now we're starting to see more fines and stuff happening. But I think people here were thinking that maybe that sort of took some of the steam out of GDPR. I don't think so, because, as you know, a lot of these cases take many years to go through, and regulators are busy people; you all have a lot of work to do. But what are your thoughts about that transition into having privacy be more visible, I guess?


Emma Martins  10:37

It's such a good point. I mean, we are moving from it being an IT or technology issue. Not that long ago, data protection would just be somebody down the corridor; they would be responsible for data protection. It's now recognized as an entire business performance issue, because it's so wrapped up in trust. And you see real developments around breach reporting; that is all about trust, this sort of transparency and openness, especially when things go wrong. That engenders trust, and organizations want their customers to trust them. So it goes way beyond data protection to how you want to be seen by your clients, the ones that make you profitable or not. It's a different conversation for the public sector; there's a different pressure, of course. But I think this whole drive is, like you're saying, the C-suite recognizing that this is what their clients now expect and demand. So I think we need to, in a sense, mature the conversation beyond "let's just look at enforcement, let's just wait for the regulators to fine." Invariably, that's post-event, and post-event on a data issue means there are data harms. So let's reverse that and act before we get to that point. Let's not obsess about how many millions of pounds or dollars worth of fines there are. Yes, that matters, but what matters even more, in my view, is how we are actively working to prevent those harms from happening in the first place. Set aside this sort of mild obsession with enforcement and look at other areas of our lives where we are required to behave in certain ways. You know, I don't steal from my best friend because I'm frightened of enforcement. I don't steal from her because I'm a decent human being, and I respect her, and I don't want to take that from her. So I want organizations to look at personal data and say, I'm going to handle that well. Not because I'm frightened of fines, but because it's the right thing to do. It's the right thing to do for the people whose data I'm responsible for, and it's the right thing to do for our organization, because that's what builds trust, and that's what builds confidence, and that's what builds a successful economy.


Debbie Reynolds  12:54

Wow, I agree with that. Let's talk about third-party data sharing. What I'm seeing, and I think this is gonna be a wake-up call for third parties: the GDPR and other privacy regimes have always had parts in them about the responsibility of a company that receives first-party data and then transfers it to a third party. We know almost all companies work with third parties, and then it's figuring out not only the best way to do that transfer but also making sure that that transfer is within the realm of the law. But then the thing that I'm seeing now, that is surprising to me, and it's actually a good thing: I'm working with a lot of companies that are, say, trying to get funding, or trying to get big clients and customers, and they do the presentations, they do the marketing thing, they negotiate contracts. But we're getting to a point now where the other side says, okay, everything's fine, we like your company, we like all this stuff, but now you have to answer all these privacy questions. And that didn't happen before. In the past, privacy was kind of an afterthought, like, okay, we'll do this thing, and then maybe later down the line we'll think about privacy or something like that. But now it's literally becoming a stopping point. I've literally had organizations come to me and say, we can't sell our product because we can't answer these questions about privacy. And I think that's great. What are your thoughts?


Emma Martins  14:44

Oh, I totally agree. And it's so interesting to hear you talk about the influences you're seeing in your field. They're not coming from the regulator; they're coming internally. Those pressures are coming from businesses, from commercial partners, from others, because they recognize the importance of getting this right. And the person asking the question, again, is putting that pressure on. And I think that's what we talk about when we talk about cultural change: it becomes embedded, it becomes normalized, rather than constantly looking over your shoulder and saying, well, what does article this or section that of the GDPR say? Instead it's: what does the business need to do to get this right? So that, to me, and I know I'm a bit of a nerd on this stuff, is terribly exciting. Because you move beyond the regulator just waving a stick, "if you don't do it right, I'm going to fine you," to a sort of community conversation about really high standards of data governance.


Debbie Reynolds  15:41

You know, a lot of times in the press, even though I'm happy that people are talking about privacy a lot more, the bigger companies are getting a lot more of that attention, right, especially because of regulation and regulators. But what are the things that small or medium-sized businesses need to really think about that they may not be thinking about? I feel like, let's say, Facebook or Google have hundreds of lawyers, and not every organization has a lawyer or has people in-house that can sort of do this role. And unlike the US, where our privacy laws are mostly consumer-based, so they're not human-based, and there are quite a few gaps there as a result, in places like Europe your privacy laws are more based on the human, and there are very few exceptions. What are things that you think smaller and medium-sized companies need to think about when they're thinking about their data protection, as opposed to, oh my God, it's this totally burdensome thing? What is some advice that you would give them as a starting point when they're asking, how can I be better and get more mature in data protection?


Emma Martins  17:16

Yeah, I mean, one of the things I think is quite important for us when we're looking at smaller businesses is to not disenfranchise them from the whole conversation, where all that privacy people and data protection people talk about is big tech. Now, that really matters for us and our future, but if you think about the breadth of data protection regulation, it's not just big tech; it's your hairdresser down the road, it's the accountants down the road, and it's everything in between. So if we constantly talk at one sector, at one group of businesses, one type of organization, the rest just feel neglected. They either think there's nothing they need to worry about, or they disconnect completely and say, well, I'm not even going to bother. So I think it's really important, and it's probably easier for a smaller jurisdiction like ourselves, to properly engage with everybody, and that includes the whole plethora of different businesses that are out there. And I think the single most important thing that I've learned over the years is that most people want to do the right thing. Give them a chance, help them understand what the right thing is, and help them through it. Because, again, if you focus entirely on enforcement, if they're only threatened by enforcement, they will feel defensive and they will feel disengaged. Whereas if you say, listen, think about your relationship with your clients; they're human beings. Does it matter to you? Do you want them to trust you? Forget the rather difficult language data protection law sometimes uses, "data subjects"; they're human beings. And pointing back to your earlier point, it is a very European attitude, but it stems from a very human problem. After World War Two, there was this sense that human beings had been so terribly treated, with tragic outcomes, in the way that data were handled in that context. So it has historical roots, all of that. But I think that encouraging smaller businesses to really engage with the value of looking after personal data, and how it can improve their brand, not just avoid fines, matters, because people really do want to trust organizations. And the organizations that we deal with that have made mistakes often just say, I just didn't think, I just didn't realize, help me. So we work quite hard in our jurisdiction to put out engaging content, not just full of legalese; we work in plain English to try and make the language accessible. And I'm a great one for ethics. It's such a broad subject, I mean, people study degrees or Masters on it, but if you could distill it, I say to people, there are a few ethics tests to take yourself through. One: what if everybody did what I was about to do? What would the world look like? Then the sort of mum or best friend test: if you told them what you were doing, would they say, oh, that's a great thing to do, or, maybe that's not quite so good? The front page test: if a newspaper or media outlet got hold of what you're doing and published it, would you be happy with it? The stench factor: really, how does it make you feel? You see how some organizations really do exploit data badly, and that really doesn't feel good, does it? And the last one: what if it was me? What if it was my data, or the data of someone I loved? That's really salutary; we can all really think about that.
And there's often a sense of detachment when we're talking about vast datasets especially, but each record in those datasets is a human being. I think we forget that at our peril. So I try and encourage smaller organizations not to fret about the arguments going on in the big tech space; they're interesting, yes, but for those organizations, look at what data you have, look at what your business objective is, and understand how much the question of trust and confidence is now linked to how you handle people's information.


Debbie Reynolds  21:08

I love it. I'm gonna have to re-listen to those ethics tests again; that's smart. And it does bring it back to a human level, because we are at the end of the day talking about people; this is about humans and what we do and how we live our lives. I guess I'm gonna switch a little bit and talk about surveillance. And before I do, I want to pick up on a point that you were talking about, which is the genesis of the feeling in Europe about privacy and data protection and how that's part of your constitution. Privacy is a human right in Europe, and in the US, privacy is not a fundamental human right; it is mostly codified in our laws as a consumer right. And so, to me, there are two different things that have happened, and I think this may be some of the friction and difference between Europe and the US. As a result of World War Two and the atrocities that happened, Europe took the stance that if we collect less data and protect people's privacy, we can have better security, better ways to handle the rights of individuals. Whereas in the US, I think just naturally, and then also because of 9/11, we said, okay, we need more data because we want to be secure. So I think that when we talk about things like Privacy Shield, we come at it from such a different perspective. We both understand that privacy has importance, but how we go about it is very different. So what are your thoughts about transatlantic data transfer? As we know, the Privacy Shield was invalidated last year, in 2020, and we don't yet have a new agreement there. But what are your thoughts about the things we in the US and Europe can maybe harmonize on to get to a better place in terms of that transatlantic data sharing?


Emma Martins  23:56

Goodness, that's a big old topic, isn't it? But I would start by saying that we, as in Europe geographically, and the US have much more in common than there is that separates us. And you don't have to look too far in this world to see really oppressive governments using the data of their citizens in the most appalling ways. So our two geographical areas of the world start from a place, I think, of relative openness. Being a democracy is an important foundation for these conversations. So we may start from different historical perspectives, and that's entirely to be expected. If you read the history of Europe at the end of World War Two, and what lessons were wanting to be learned as it moved towards a union of countries, you just understand the emotion of that better, and I think it will be important that we look back to look forward. So rather than saying, well, and I hear it said a lot, the European obsession with privacy and data protection versus the more consumer-focused US, I just don't think we should be pitted against each other. I think there are differences. Some of them are incredibly nuanced, some of them are incredibly legal, and some of them are only working through the courts now, but I think fundamentally we start from the same place, which is respect for the human. So yes, of course, these things are complicated. And if you look at the wording around the GDPR especially, things like how to identify special category data, that's all feeding from what happened to certain parts of the population during World War Two. So you can see that direct link, and it just helps you engage a bit better. It's a terribly difficult conversation when you've got slightly different perspectives, but it's too easy to make more of that difference than there actually is. We see some of these very complex legal challenges playing out, but let's celebrate the fact they're being played out. Because in some jurisdictions, it's just a given: you just get told what your government is doing, or you sometimes don't even get told, it just happens to you. At least we can be part of the conversation. But I think it is difficult for organizations for whom, as part of their business, data needs to flow freely across the globe. It has made people think much more carefully about where they're exporting their data to and what sort of protections and controls are in place for that data. And I'm not saying it's easy for some of those organizations. But I think that we are committed across our jurisdictions to finding a solution for businesses to be able to have the free flow of data for the benefit of the economy, but also without compromising on individuals' rights.


Debbie Reynolds  26:53

Wow, I love talking to you about this. You have such a level-headed way of thinking about it. So let me switch to surveillance. You and I did a panel together for the National Liberal Club's Defence and Security Circle, and our topic was sort of the surveillance state: who is watching? And I'm just going to give a shout-out to our other panelists: Daniel Cope, who works in global Data Privacy at HSBC; Helen Beveridge, who's a Data Protection Officer; Yasmin Hines, the global privacy lead and legal counsel at Pontoon Solutions; and me. It was a great session. I think it's online, so people can look at it, and I may put a link in the transcript for this particular session. But it was really great, because surveillance is definitely what's happening now, and it's what's happening next. For me, it's like, okay, we see all these new regulations that are coming out, and will continue to come out, in different jurisdictions. And then, on the other side, we have all this technology just really speeding ahead so fast. And so much of the technology being developed will result in surveillance, whether it's intentional or not, because there's just so much more data that is going to be collected as a result of that technology. So I would love to know your thoughts on where we're going, or where we need to go, or what we need to be thinking about in the future as it relates to surveillance. To me, it's more about how companies implement the tools that they're using and making sure they respect the privacy of individuals, because obviously companies are going to make money from selling these technologies, but it really is incumbent upon the user to figure out and navigate what's the best use of them, especially in different jurisdictions. And I feel like that's the place where companies struggle.


Emma Martins  29:29

Yeah, I totally agree. And a point I would like to make, which I feel quite strongly about, is that we need to be honest about the fact that technology moves faster than regulation and law can deal with. So if you're always looking to regulators and law enforcement to deal with some of these challenges, it's going to be too late. And surely we are more intelligent as a human race than to let it get to that stage. We need to have these conversations; we need to build in human values, human conversations, around all surveillance, especially AI. The questions are: who is surveilling? Why are they surveilling? How are they surveilling? The terrible truth, really, is that each and every one of us is carrying around the most extraordinary surveillance device without giving it a second thought. And I think one of the things that those of us in this community are working hard to do is just raise awareness of that sort of day-to-day surveillance that is actively going on. It's one thing to talk about government surveillance, our different governments implementing surveillance for national security purposes or border control. But who is surveilling via my smartphone, collecting the data on my phone and my tablet? Where is it going? What are they doing with it? Where are the transparency and accountability? Because if you're in a democracy, at least in theory, there is an accountability mechanism through the ballot box. But if it's some anonymous tech organization that is geographically remote from you, how do you get into that if you've got a problem? So I think it's a much broader conversation. The technology is there; it doesn't mean to say we should always use it. And the danger is we see these nice, shiny new tech items and say, yes, we'll have a bit of that, let's worry about the data issues later. That never, ever works. You see it in the GDPR standards: you need to build in privacy by design from day one. And that is not just about the technologists, who are doing phenomenal stuff; they can't be the only ones in the room. There have to be lawyers, technologists, ethicists, social scientists. We all need to be part of the conversation, because it's not just what technology can do; it's what it should do, what we want it to do. We need to take back control and ask ourselves, what do we want, and how do we want to achieve it? The danger is these decisions are taken out of our hands, and we just gallop away simply because the technology can, and then picking up the pieces later on is incredibly difficult, if not impossible.


Debbie Reynolds  32:08

I agree with that. I agree with that. You know, I see a lot of cases in the news happening with regulators, and this happens a lot. I'll give a good example, a real example. I live in Illinois, in Chicago, and there was a bakery in Chicago many years ago that implemented a fingerprint scanner for people to clock in and clock out at work. I'm sure what happened is they saw it advertised somewhere and thought it would be so cool to do this, or whatever. And what I think that company didn't realize is that there's a very strict law in Illinois called the Biometric Information Privacy Act, and they were supposed to let people know, before they collected the data, what they were gonna do with the data and how long they intended to retain it. Because they didn't do that, an employee filed a case against them, and the employee won. The redress was quite high because of the way the court in Illinois quantified the data capture: the company was in violation every time they collected the fingerprint of this person. And that can add up, right? Let's say someone punches in and out like four times a day over many years. So it was a pretty significant fine for this company. But what I'm seeing in the news a lot of times is companies will implement some cool software or technology, and then it does something or collects something that maybe they shouldn't have collected, and then a regulator comes along and says, okay, first of all, I want you to delete this data that you're retaining, and then stop doing this process. And some of it is, maybe there's a notice or consent that you have to get from a person, or it may be a situation where there's a part of the technology that you're using that you just don't use, that you turn off, in a given jurisdiction. So that, to me, is putting the ball in the court of the user or implementer of the technology. And I think people need to really think it through: if you have something that does 10 things, are those 10 things even legal in your jurisdiction? If not, you turn those certain features off or find a way to mitigate that risk.


Emma Martins  34:56

Well, it's such an issue, and I love that story; the point you're making is really so important. But I'm pretty sure that if you'd spoken to that bakery six months or a year before and said, right, do you want to set up a system that's essentially illegal, essentially immoral, essentially oppressive to your workers, they would have probably said, no, thanks. So the question again, for us, is how do we get that front-loaded? How do we get those questions asked by those organizations at the beginning of the process, rather than after, when they've got a multimillion-dollar fine, or whatever it is, to their name? And that's the big challenge for all of us who care about this, not just regulators. But I'm pretty sure, going back to my earlier comment, that most organizations, whether it's a bakery, or a hairdresser, or a coffee shop, or a bank, want to get this right, either because they respect the values that it represents or because they just want to avoid fines. The motivation is different across the board, isn't it? But I'm sure most of them do want to get this right. So the question for us has got to be, how do we help them do that, rather than just lying in wait for when things go wrong?


Debbie Reynolds  36:00

Right, right. I don't even know how to talk about this gap, really. I guess the thing is, in the example, a company sold this technology to someone, and then it kind of becomes the problem of the company that implements the technology. But I feel like there's such a gap in the middle there, in education even. I think some people think, okay, because it's sold somewhere, it must be okay, and they don't understand their responsibility. So they may say, well, I bought it from Amazon, it must be great, it's out there, let me implement it. And they're not really thinking about the harm to the individual, because they think it's someone else's responsibility. What are your thoughts about that?


Emma Martins  37:00

Yeah, and again, it goes back to the sense that just because the technology is there doesn't mean to say we should engage with it, and we need to change our mindset around that. But I think you're right that there is a huge gap, and it's for us to fix. And I think, like anything in life, if you're faced with a challenge or a problem, you can just despair about it and sit in the corner and do nothing, or you can say, listen, I can't fix the world, but in my little sphere of influence, I'm going to try and fix this area. That's certainly true for us in a smaller jurisdiction, and I totally understand that this is probably easier in a smaller jurisdiction, because we have proximity to our regulated community; we can access them more easily. But the question still applies: how do you engage? I mean, I can guarantee, outside of the very lovely privacy and data protection community, if you say to people what you do for a living, there's a little bit of rolling eyes to heaven, an "oh, that sounds really fascinating... not." And when my kids ask me what I do for a living, it's only now becoming interesting to them because they've seen some movies about it; in the past, it was like, really? Can't you do something exciting with your life? Think about how data affects every single one of us every single second of the day. Why aren't we more interested? That's a question we have to find an answer to; we have to get a solution. And it's not to be patronizing, but it's to say, in real terms, how do we get people to care? To care whether they sign up to whatever new messaging system, to read the new terms and conditions, and, if they're not happy, to know there are alternatives. A sense of empowerment, a sense of engagement, a sense of caring is what we need to work towards, I think. And it's happening, just not as fast as we would like; but it is happening. So I think the thing is, when it does happen, to celebrate it, really celebrate it, and celebrate the brilliant people that are in this community, but recognize that we've got a long, long road still ahead of us.


Debbie Reynolds  38:53

Well, I agree with that. I agree with that. What would be your thought for, let's say, an executive at a company that's listening to this, whether it's the barbershop down the street or the accountants? Let's say they haven't done anything really on data protection; where would you recommend that they start?


Emma Martins  39:20

Every year in this office, somebody comes around and makes a note of the equipment, the things of value, and that's for insurance purposes; it's an inventory of possessions. Do the same with your data. Take the data away from any organization and it will not be able to function; it's the most valuable non-consumable asset an organization will own. We need to start recording it, looking at it, treating it in that way. If you, as a business, suddenly start to really think about the value of the data, it's like a gold bar. If you have a bank, you're not going to put the gold bars out on the street for someone else to look after; you're going to put them in a vault and look after them. It's the same principle with data: it needs to be looked after. So the very first questions are: what data do you have? On what basis do you have it? How open are you about the way in which you collected it? Start to look at it through the lens of value, and importantly, not just economic value. The economic question is terribly important; I've heard data described as the fuel for the economy. It's more complicated than that, but you get the point. But it's also, importantly, about each of us as human beings. So it's not just an economic conversation. We're not just commodities to be profited from. We're individuals with lives, with hopes, with fears, with expectations, with rights, and they need to be respected too. The huge irony of all of this is that everybody wins if we get it right. There are people profiting from the exploitation, but at the expense of a lot of other people. If businesses build their business models on treating people well, everybody can benefit from that. And I think that's a sort of state of utopia that I would look towards, and I hope we get there one day; you never know.


Debbie Reynolds  41:10

Yeah, yeah, I agree with that. I always tell people I try to make privacy a business advantage. Some people think of it as a tax, kind of an extra burden that they have. But I feel like, for companies in the future, if you can't show that you can respect the privacy of other people, you're not going to be in business. Because if there are two barbers down the street, and you have to choose between the two, one cares about data protection and the other doesn't, a lot of customers will go to the one that does. So, to me, it can be a business differentiator for organizations.


Emma Martins  41:55

I totally, totally agree. And I really do want to shout this from the rooftops. I've used these words before, and they're simple words, but they're powerful: trust and confidence. If an individual has trust and confidence in a brand, they're more likely to deal with them. And what erodes trust and confidence very, very quickly is misuse. And when we say misuse of people's data, it is a misuse of people, because we are our data; we are increasingly our data. So what happens to our data happens to us, in real time, in the real world. It's not just a database or an Excel spreadsheet; it has consequences. So with those companies, you can start to see it; you start to see advertising around "we respect your privacy, your privacy matters to us." Now, some of those ring true, some of those less so. But again, it's a move towards a narrative whereby that's a commercial advantage. Now, whether they're moving that way because they think it's the right thing to do and they're an ethical organization, or whether they're just responding to client and customer pressure, I don't really care, because they're still going in the right direction. I'd like them to be engaging on an ethical level, and great if they are. But if they're just responding to their clients' needs, their customers' needs, then super, because we are influencing the client and customer narrative itself, what people are expecting and demanding of the companies that they interact with, and that's pressure that will have real consequences and real impacts.


Debbie Reynolds  43:23

Excellent. So if it was the world according to Emma, and we did everything that you said, what would be your wish for privacy or data protection anywhere in the world, whether it's technology, people, or regulation? What are your thoughts?


Emma Martins  43:41

When we step on a plane, we don't balk at all the health and safety, the guys checking the engines, and the money it's cost us to pay for well-trained pilots and all the rest of the things that go into making that plane safe. I'd like us to move to a world where we don't balk at the legislative framework that sits around data. We don't balk at what can be an additional cost to make sure data governance is in place. We don't balk at all the things the law gives us; we celebrate it, and we are grateful for it, because it makes our lives better. I heard privacy described as a little like oxygen: you don't really realize it's there until it starts being taken away. And I think we're so privileged in our democracies to be talking about this and to have a framework. We have slightly different frameworks, you and I, in our lives around data, but nonetheless we have a voice, and these discussions are being had, and privacy and data protection are being taken seriously. There are many, many people across the world that don't have that luxury. So my vision would be that this right, which is so wrapped up in human dignity and equality and fairness, is one every citizen has, a right that so many of us take for granted and so many of us roll our eyes to heaven when we talk about, but which actually, when you think about it, is the foundation stone for us living free and fulfilled lives.


Debbie Reynolds  45:05

Wow, that's a really great answer. Oh, wow. Well, that's something to think about for sure. You left me with a lot of things to think about; I'm gonna have a noodle on some of this stuff. Well, thank you so much for being on the show. This was great. I love to hear your perspective, and you can explain these things in ways that I feel like any organization or any business can understand. And that's really important. Because, as you said, and as I say too, this is a human problem, right? It's something that anyone and everyone needs to understand. We all have a stake in the future.


Emma Martins  45:52

Sure do.


Debbie Reynolds  45:54

Excellent. Excellent. Well, I will be excited for us to do some other collaboration together. And I'll talk to you soon.


Emma Martins  46:03

Thank you so much for the opportunity, Debbie; really good to speak to you.