"The Data Diva" Talks Privacy Podcast

The Data Diva E170 - Dr. Valerie Lyons and Debbie Reynolds

February 06, 2024 Season 4 Episode 170
"The Data Diva" Talks Privacy Podcast
The Data Diva E170 - Dr. Valerie Lyons and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, “The Data Diva”, talks to Dr. Valerie Lyons, Chief Operating Officer of BH Consulting and author of “The Privacy Leader Compass”. We explore the connection between Environmental, Social, and Governance (ESG) and privacy, discussing how privacy can contribute to the social element of ESG through initiatives like corporate social responsibility reports. We also discuss the structure and unique features of "The Privacy Leader Compass", a practical guide for building and leading privacy teams. The book uses the 7S model from McKinsey for organizational effectiveness to make the privacy team and program effective. Dr. Valerie Lyons highlights the significance of developing executive presence and soft skills for effective leadership. She shares her experience of learning to communicate effectively with senior executives and emphasizes the importance of brevity, bullet points, and knowing what senior executives are interested in. Debbie Reynolds and Dr. Valerie Lyons discuss the importance of understanding motivation and finding the key to change for the people they are communicating with. Dr. Valerie Lyons delves into the complex relationship between AI and personal data. She highlights the power of AI in detecting medical conditions through mouse movements but also expresses concern about the invasive nature of AI inference and the lack of transparency in the process. Lyons emphasizes the need for ethical considerations and prudence in utilizing AI while acknowledging its potential for progress. Overall, she raises important questions about the implications of AI inference on privacy and personal autonomy and shares her wish for Data Privacy in the future.



SUMMARY KEYWORDS

privacy, ai, csr, data, organization, piece, ethics, legislation, work, inference, esg, talk, book, robust, skills, governance, create, information, world, started

SPEAKERS

Debbie Reynolds, Valerie Lyons


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.


Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. I have a very special guest on the show, all the way from Dublin. This is Dr. Valerie Lyons; she is the Chief Operating Officer of BH Consulting. Welcome.


Valerie Lyons  00:44

Thanks very much for having me. I'm delighted to be on your show.


Debbie Reynolds  00:47

Yeah, I'm very happy to have you on the show. So, we know each other via LinkedIn. You and Todd asked me to collaborate with you on a book that has come out; I actually got it in the mail recently, and it's really good. It's called The Privacy Leader Compass, and it's a very popular book. But in addition to the book, I want you to tell me a little bit about your background and your trajectory into privacy and what really got you interested in it.


Valerie Lyons  01:22

Thanks for that introduction, Debbie. So, what got me into privacy? Well, I suppose I had been in cybersecurity for very much the first part of my career. I wasn't a very good programmer, and so that led me to fall into systems admin, which eventually led to cybersecurity and security leadership. I was the CISO in KBC Bank for just under 15 years, which is a long time to be in a CISO role. And then I decided that I wanted to do a PhD. I spoke to a professor in Trinity College, which is where my undergrad was from, and he said I should look to my past and then look to the future, and let the PhD be the funnel between the two. So, I decided that I liked privacy, and I wanted to do a little bit of work on privacy. And I thought that privacy was the future, because GDPR was all the talk and all the rage in 2016. So I started a PhD in Information Privacy at Dublin City University at that time. But that morphed into a PhD in privacy as an ESG and privacy as a CSR, which was really, really topical by the time I completed the PhD. So my research primarily focuses on the value of privacy as an ESG: what things can an organization do that get them the most bang for their buck in terms of privacy as an ESG or CSR? During that time, separate to my PhD, I also moved roles, and I moved into BH Consulting. At the time, there was no consulting department or team for privacy; it was a very new thing that was established after 2016, when Europe started to demand DPO as a service and CPO as a service. So essentially, we set up that team, and I led that team until the team became too big, and I became COO about three years ago in BH Consulting. So we now have two teams, one in data protection and one in cybersecurity. And then, for myself, what brought me to writing the book was that the PhD was now completed, and I had always wanted to write the book. I knew I had learned so much information over the past seven to eight years, in fact it was the guts of 10 that I had been working in privacy at that point, and I wanted to distill it down so that other CPOs and DPOs could leverage the information I had gained over the previous decade. So that's how I got to be where I am today. I always say that my passion is privacy, whereas cybersecurity was what I did for a living; now I feel like I've found my tribe, if that's the right word.


Debbie Reynolds  04:23

That's tremendous. Thank you so much for sharing that. So you have this background in cyber, this deep interest and background in privacy, and then ESG. I love the way that you connect those together, because a lot of people, I don't think, see that connection. Let's talk a little bit about the connections. I think when people talk about ESG, they think about other societal issues. They don't really think about privacy as a societal issue that you need to really think about, but tell me about how those two connect.


Valerie Lyons  04:57

So ESG has three dimensions: the environmental, the social, and the governance. Everybody thinks about privacy as a governance issue, so the G is well established. The E, which is the environmental piece: really, there's pretty much zero link. I've seen some tenuous stretches where people have started to link encryption and Data Privacy with the E, how much carbon footprint that generates, and things like that, but I really think it's a bit of a tenuous link. I've seen better, more recent links on the E side, within that dimension, like the whole working from home, which is enabled by privacy professionals. We enable that to happen securely and safely together with the cybersecurity people, and the two work hand in hand to make that happen properly. So what carbon footprint are we saving? That would be a good one for the E: by allowing remote working, are you reducing the carbon footprint of your organization in a way that could be attributed to the E? Again, I don't think it's very robust. But the S, the social element, absolutely is. We see the social element being promoted through corporate social responsibility reports; that's the best insight we get into the S piece. Now, ESG is typically quantitative, and CSR, the S piece, is typically qualitative. We're seeing those merge, where you're getting quantitative and qualitative results put together more often than not. And this is where what you're saying comes in about people referring to ESG and privacy; it's a new thing. More often than not, when they're referring to it, they're talking about governance window-dressed as the S piece. But actually, the S piece is far more than governance. I've analyzed, in my research over the years, multiple CSR reports, and typically what the CSR reports contain is greenwashed compliance: saying things like, all our people have been trained in Privacy Awareness Training, but GDPR says they have to do that anyway. So it's a marketing spin on legal compliance. But many of the large tech organizations actually do S, more real S, real social contribution from a privacy perspective. We see them doing things like Cisco, for instance, writing amicus briefs and fighting governments against privacy legislation that isn't robust enough for the consumer. We see Microsoft, for instance, saying that they're implementing Privacy Awareness Training across the whole organization, regardless of the requirement to do so, and implementing GDPR levels of data protection, regardless of the requirement to do so. Now, the cynic could argue that, well, actually, it might be easier to do that than comply with a whole panoply of legislation from around the world. But at the same time, it's something they don't have to do. We also see conferences being run by organizations; we see organizations pulling together privacy standards and, now filtering into that, AI governance standards for the public to use; we see white papers. These are all social pieces of privacy, truly social, in that they are beneficial to the public. There are multiple reasons why organizations do this, and obviously, much of it is about consumer trust.
So I don't really care what the motivation is for the organization to do the S piece, to actually do privacy as a CSR; I kind of see it as a means to an end. I like the end; the end is that we get better privacy. The means to it, and the motivating factors they have towards getting there, are most likely increased reputation, sales, all those kinds of elements. It doesn't matter to me as long as the end product is more robust privacy.


Debbie Reynolds  09:03

I agree with that, right; you want the benefits, right? The actions that people take, and why they take them, may not necessarily be important to me; I agree with you. That's great that they do it. As you were talking about the S in ESG, it occurred to me that part of the S could be ethics, and I just want your thoughts there. Because I feel like when a lot of people talk about privacy, they only think about the legal aspect. So it's like, okay, we're not going to do anything until there's a law passed, and then we're going to put all these programs in place. But there's more to privacy than compliance with regulatory elements, and to me, outside of that can be ethics, right? Why you do what you do. Why do you care about the individual? Are you thinking about the harm that could be done to that person? As you were talking, I'm thinking, maybe that's where ethics goes; I don't know, what are your thoughts?


Valerie Lyons  10:08

You're absolutely right. To me, ethics does, to some degree, fit into the S piece. If we look at the definition of CSR, CSR is considered to be all those activities that an organization undertakes that exceed regulation, that they don't have to do, and that contribute to society in some shape or form. So there are two qualifications, and both elements have to be met, because you can do something that exceeds regulation, but it's not CSR if it doesn't contribute to society in some way. Now, society can be the consumer; it doesn't have to be, like, the world in general. It can be just as simple as the consumer, or the shareholders, the wider stakeholder community; it can even be governments and policymakers. So it's just wider society. Ethics, if we think about it: no legislation, GDPR in particular, can be effective if we don't adhere to the spirit of the law, not just the letter of the law. That piece is ethics. Ethics is also, as you say, just because you can doesn't mean you should. And it's in that space of just because you can doesn't mean you should, in other words, we're going to look at potential harms that can be caused by the activities that we undertake, and we're going to avoid those potential harms. That is definitely CSR. I can give you examples of CSR where organizations are trying to avoid those harms. Tencent, for instance, in China: they create games, and they embedded an ability within their games to counter addiction; it can create a four-hour pause so that the game doesn't work for the rest of the day. The idea of that was so that your child wouldn't become addicted to it. So they're building that in, but they're disconnecting themselves from getting more use out of you, because they know that it's better to not do harm. So we see this element of avoiding harm. But it's very hard to say it's ethical when you have a piece of legislation like the Digital Services Act now coming at you, and that tells you there's actually a code of things you need to do now. So we see ethics being a fluid thing: sometimes these activities sit in the ethics space, and sometimes they don't, because they sit in governance in some jurisdictions. So the Do No Harm piece is actually coming through in Europe through the Digital Services Act, and in other jurisdictions, we might see it more as a CSR element, because there is not yet a set of similar regulations. So the ethics piece can be sometimes in the CSR space, and sometimes in the G space, if it's moved into a regulation.


Debbie Reynolds  13:12

I agree; I love this. I love to talk with people who have dealt with these issues on a deep level, as you have with your research and your work. You can explain, just as you have done, in a very deep way and a very explainable way, why these concepts are important and how they apply to people. Let's talk a little bit about the book, The Privacy Leader Compass. It's a comprehensive, business-oriented roadmap for building and leading practical privacy teams, and it includes insights from 60-plus top privacy experts and privacy leaders; you were gracious enough to ask me to contribute to the book. The book is very good, and I'm not saying that just because I'm in it. But the thing I like about this book, and I see lots of privacy books, everyone sends me their privacy books to read, the thing that I think is really unique about this book is the way that it is structured, where it has a lot of practical tips and takeaways that people can use right now in their job, whether they're on the legal side or on the data side. Tell me about the structure of the book, and how you mean for people to consume and use it in their work.


Valerie Lyons  14:35

So you're absolutely right, actually; I think the structure is something that's unique to the book. When we looked originally at writing a book, and I always consider myself a bit of a hybrid, I looked at the literature that was out there, and I found it a bit jumpy; I didn't find it structured enough for me, for the way I think and the way I need to see things presented to me, just in terms of how my brain works. Academics, when they encounter that, tend to look for a structure in another field and use that structure. And essentially, that's what we did; we took the 7S model, which is a model from McKinsey that was used for organizational effectiveness, and we looked at the privacy team and the privacy program as an organization. To make it effective, we use the 7S model, and the 7S can apply to any type of organization. So there's strategy, which obviously applies to privacy, because you do have privacy strategy, particularly in larger organizations. There's structure: obviously, you're going to have privacy structure. Is it going to be top-down? Are you going to have a DPO? Are you going to have CPOs? Then you've got systems, which are the frameworks that you're going to use to structure your privacy program. You've got style; that's your leadership style. If you're leading a team, what style do you have, and what implications does that style have? You've got staff: how are you going to manage the privacy team? The book uses Belbin's team roles, because it's a particular favorite of mine; my master's is in leadership, so I kind of leverage the key things I picked up during my master's that I've applied in my work. And then we have skills: what skills do you need? You can find lists of skills for DPOs and CPOs on the Internet, and they usually relate to the knowledge that they have to have. But that's not what I focus on in the book; what I focus on in the book is the soft skills that you need to lead a team of people, and those people are typically privacy people. So it's the skills that you need to lead those people. And then shared values: the laws that we have to understand. That skill set obviously is important, understanding the laws and the jurisdictions that your business is operating in, but also the ethics piece. So I cover ethics and ESG hugely in that chapter, because it's so important, in my view, and obviously in yours.


Debbie Reynolds  17:13

That's tremendous. Thank you so much for that; I highly recommend the book. This is one of those books that you just have to keep around, because there's always going to be a nugget of something in here that you may need, something that you're thinking about that you need to consider, that you can go back to time and time again. I think that's really great. One thing that you said that I want to expand upon is around soft skills. I get this question all the time from people: how do I get into privacy? How do I improve my skills? How do I get traction within organizations? Here's an example of something I've actually heard people do that does not work, right? If you are a privacy person, and you come into a meeting, and you're like, I'm in charge here, and I'm going to do X, Y, and Z, you're just not going to be successful in that organization. So, it definitely takes that soft-skill part, and the ability to talk across different silos within organizations. But talk to me about the strength of those soft skills, because I feel like some people think, okay, I went to the IAPP, I got a certification, I got this degree, and now I'm going to be the best privacy whatever, whether that's a Data Officer or Privacy Officer or a Data Protection Officer. But some people don't know how to talk to people in different areas, and they don't know how to get their trust and their buy-in, and without that, you will not succeed. But tell me your thoughts.


Valerie Lyons  18:48

It's a brilliant comment, Debbie, and it actually relates back to your contribution to the book: getting buy-in from senior leadership teams. How do you do that? I've heard this question being asked so many times: I have everything that I need, all my qualifications, I've got everything; how do I do that? And what is executive presence? What does it look like? How do I establish it? They're not easy skills. It's much easier to learn the subject matter expertise than it is to learn how to behave during a data breach, how to keep the team going during a crisis. The skills that are required are not specific to privacy; they're much, much wider and much, much broader. I would say to everybody, go do a master's in leadership, go do an MBA, go to something that's going to give you an outlet to practice those skills and learn those skills, if you can't learn them on the job. They're so important: how to speak to senior executives. I had a very early experience many, many years ago when I was working in the bank, my first time speaking to the board, about an email system that I wanted to introduce, which the board wanted to hear about. My then boss, who was very, very good about mentoring me, said, just keep it baby language, just baby language, no fancy words, nothing fancy, just baby language. And I started off by saying I wanted to discuss the complexity of the information assets in the organization. And down the table, I saw him put his head in his hands, and I knew that meant I had gone wrong already; I had literally not even finished my first sentence. And then I changed course, and I just said, okay, I'm going to use baby words here. But actually, I've learned over the years that it's not about the words; it's about brevity. It's about knowing exactly what it is they're interested in. They're interested in profits, they're interested in share value; they're not interested in the tiny minutiae of why a machine didn't work. And so it's brevity, bullet points, knowing exactly what those bullet points are before going into the room. And know your bullet points always, because you could have an opportunity, that elevator pitch, in the lift. At any point in time, when you meet someone on that team, you need to be prepared with that one bullet point: what we're doing this year, what's important to the privacy team this year, what is the biggest challenge we face this year; just one bullet point that you always have in your head. I go through the soft skills as I see them in the book, and I analyzed each one; we talked to an awful lot of people when we were creating the book about what those skills were, the skills that they thought they had, or the skills that they felt were missing in a lot of leaders. And so the idea is, it's a compendium of behaviors that we should see in leaders, and no one is going to have all of them. But you can look at the compendium and say, I'm actually not too bad at, let's say, 75% of these qualities; these are the ones, obviously, that I need to work on. So you can see from the compendium of these skills where maybe your areas to develop are.


Debbie Reynolds  22:22

I love that; I love it. I love to talk to executives who have been in corporations or working with corporates for many years, because you can hear the maturity and the learning there. I agree very much; you definitely have to figure out the motivation: what motivates the people that you're talking to? What are they most interested in? That, I always say, is very important. So if you're going into a meeting with the board, think about what they want, think about what they want to hear, and try to make sure that your message resonates with them. As for the best way to alienate the person that you're talking to, especially if they're not a privacy person or legal person: the last thing they want you to do is spout out article numbers and talk about GDPR and different acronyms. That's like the worst you could possibly ever do.


Valerie Lyons  23:15

That is so true.


Debbie Reynolds  23:17

I always say you have to be able to, especially in privacy, because there's a human issue, you have to be able to talk to someone who's 80 or 8. So when I'm talking with people, I try to make it as simple as possible and try to frame it around the things that people care about. I have a specific example I want to give you about this: I was talking to a lawyer about some system that we wanted to implement, and this person did not want to go along with the suggestion. And I realized that this person worked tons of hours and had young children and things like that. And I said, do you want to go home at 5 o'clock? And that resonated with her. And I'm like, okay, if we do this this way, you won't have to work 12 hours a day; we have to do something different. So being able to find those things, those interests, things that will resonate with the person you're talking to, will really help you get that buy-in.


Valerie Lyons  24:21

Such a good example as well. It's just finding that key. There was a great book called Change or Die, and the guy who wrote the book said that the only way to create change was to figure out what people's keys were, to find their key, a key to change for them. And everybody's key is different. So you're right, find their key.


Debbie Reynolds  24:42

Excellent. I want to talk a little bit about the EU; you're included in the EDPB's expert pool. And I know the AI Act isn't directly privacy, quote unquote, in a way, but it is, because so much of the data we're dealing with is data of humans, and that raises the risk in privacy. So, to me, those things are connected. But give me your idea about what you think the impact of the EU AI Act will be. Before you answer that question: as someone who saw the GDPR when it first passed in 2016, I thought I would wake up the next day and people in the US would just care about privacy; this is the thing I had been working on for many years, and nothing. There were no articles; there was nothing in the papers or in more popular media thinking about it or whatever. And I was crestfallen; I thought, this is so important; this is going to change so much of what we're doing. Two years later, once it actually went into enforcement, then you started to hear people talking about it. And since then, we've seen how influential GDPR has been across different jurisdictions around the world. So with that backdrop in mind, and with you sort of at the center of what's happening in the EU, with the EU being the first jurisdiction in the world to have comprehensive regulation of AI, what do you think the impacts are going to be globally?


Valerie Lyons  26:22

I don't think it's going to be like GDPR; I don't think it's going to have that same impact. But I think we actually don't really know yet, because we know that this piece of legislation was produced very, very quickly. That doesn't mean it's not a robust piece of legislation; it is a good piece of legislation, and a huge number of people invested time into getting it across the line quickly. But I think that rather than the legislation, it's the other piece; it's actually the AI. We don't know where that's going, and we don't know how it's going to take off, and I'm not too sure that the legislation knows where it's going to be in five years' time. GDPR knew that it was dealing with personal data and what was going to happen with it, but it didn't know about AI in the way that we do today, which necessitated an additional component, which is the AI Act. I don't think the AI Act actually knows what lies ahead, because of AI. I mean, we can say, well, we know there are going to be problems with training data and with copyright and with creators' patents; there are so many issues that lie in the AI space. But we didn't know when Twitter started how addictive likes would be. We didn't know when Facebook started that it could be used for bullying. We didn't know these things when those social media apps started; that became apparent as the apps evolved. And so now we find ourselves, I think, with an AI Act that is only going to address the birth of the AI world; I don't think it's going to be able to cope with the teenager, if you want to call it that. I just wonder about the process that we take, which is: we watch, we observe, we see the issues, we create legislation that takes years to come to the fore, and then we watch, and we see, and then we create. We're always firefighting; we're always behind the issues and the problems, and then we're trying to create solutions through legislation. And I don't know whether that's the right way forward as data becomes so big. I think maybe we have to radically rethink how we approach protecting privacy, and by that I mean thinking about things like federated identities. And that will only fix the personal data piece of AI; there's lots of AI that's not related to personal data. But that would be my view on our approach. I think the AI Act is necessary; I actually think it's vital right now, and I'm delighted that we have it. It does mean more work for privacy professionals, absolutely, enormous amounts of work. Not if you were good, though, if you know what I mean. If you had all your boxes ticked, and you were already addressing the ethical components, the fairness components, the transparency components, the necessity and proportionality principles, if you were already good in that space, I think you're going to be okay with AI. But if you're weak in that space, and if you leverage a lot of AI, then you're going to find a lot of work needs to be done with the AI regulation for your organization. I think it's back to the same thing as with GDPR: there were an awful lot of organizations that were not compliant with the directive and had to make a huge jump to become compliant with GDPR, and so they complained that it was such a big job. But actually, they should have already been so much better under the directive; because it had no stick to beat them with, they never bothered.
And I think it may be the same with AI: they haven't been good with the ethical component, haven't been very good in that space, and the AI regulation may move in and become more robust in that space. But we've yet to see; we have to wait.


Debbie Reynolds  30:48

I agree with that; I love that point of view. When I think about AI, the blind spots I see are the weak areas. And I agree with you; companies that are very good and robust in these other areas can add on the considerations for AI, and it's probably just that much of a step, right? They're just adding on; they're building on their sound foundation. But for companies that haven't done that, it'll be a heavier lift within their organizations. When I think of AI, I think about two things that will show whether companies are ready to handle the AI Act or different AI legislation as it comes into force, and those are data governance and data lineage. A lot of what AI regulation calls for is a level of transparency in how the data of individuals is being handled, and a lot of that stems from the lineage of data. What is the data you have? Where did it come from? Why are you using it? How are you using it? To me, part of that task lives, or will live, in governance within the organization, but it's broader, because before, governance was like, we don't care where the data came from; this is what happens to it once I get it. From an AI point of view, part of that governance is understanding where the data comes from. Do I have the right consent? Do I have a legal right to use this data? Even if I have a right to use this data in one sense, can I use it in this different, other way? What are your thoughts?


Valerie Lyons  32:28

I think there's a monster of complications when it comes to AI; it's not straightforward. Training data, for instance: that's a monster. Is my data going to be used for training data? In fact, it probably already has been. And how do I say no, now that it's already in there? That's a very difficult thing to do. How do I know what the AI is creating in terms of information about me? AI is a wonderful thing, and I'm a positivist, so I always come at these things from the angle of what are the amazing ways it can be used, and then prudence in how we do that, rather than let's all be scared about AI. Obviously, there are some scary things, but if we were to take the scary approach to it, we'd never have gone beyond candles to electricity. It does represent progress. For me, the big issue when I think about my own personal data with AI is the knowledge of what it's created. Microsoft has created tools that can detect early onset Parkinson's through your mouse movements using AI. That is incredibly powerful. But what else can they infer? So, it's this space of inference that I find almost disturbing. I have a wow factor with it as well; I move between two spaces, where I go, wow, that's incredible, but also, wow, that's really a little bit invasive, you know, my mouse movements, and detecting early onset Alzheimer's. And they can do it to the same accuracy as medical tests; they can also do early onset Parkinson's. So there are so many different things that are positive, but you can see straight away the invasiveness in the inference, and the inference that AI is capable of is far greater than any inference a human being could ever make, because it's a pattern matcher; it can match patterns. And so, what does AI know about us? I can't make a subject access request if I don't know that you've created that information about me. Traditionally, I've known what information you have because I actually gave it to you. I know you have my name and address because I gave them to you, or because you sent me something. But what do governments know? What do large corporations know about us through inference? It's this capacity to create inference that I feel very uncomfortable about, and I'm not too sure the AI regulation is going to fix that. I think that in this space, in the inference space, we're going to see many breaches of legislation. We have tiny little nuggets happening all the time; we saw a really interesting court case, not related to AI, in Lithuania, where a Lithuanian woman's spouse's name was demanded by her government for records it was keeping. That was sensitive, not normal, personal data, because if her name is Louise and her spouse's name is Mary, therefore you know that her sexual orientation is homosexual. So they weren't using the right legal basis; there were a lot of complications. And that's not including AI, but that's an inferential piece of data that they hadn't considered. Some argue, which is fair, that a spouse's name has always inferred heterosexuality too, but we never thought about it; sexuality was always inferred by your spouse's name, and we just never thought about it before. I was at a workshop and somebody made this comment, and I thought it was an interesting one, which is true: governments have had our spouses' names for many, many years, so they've been able to infer our sexuality.
It's just now that marriage is allowed for everybody that they can infer homosexuality, and so, with the minority element, we have to be careful. But that's where I see the most dangerous space in AI right now, in the bit that we know. I know there's a bit that we don't know, that none of us can anticipate, in what the next five years look like.


Debbie Reynolds  36:58

Wow, I agree with that. I actually did a video a couple of years ago about inference. And that is the problem: like you said, it's not something you gave someone; it's something where they connected the dots, and they may use that in some way that could possibly be harmful, which is a challenge. So if it were the world according to you, Dr. Valerie, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior, or something in technology?


Valerie Lyons  37:32

I personally think there's a massive disparity in terms of privacy across the world, and I'm not just talking about the US versus the EU versus Eurasia. I think privacy has become almost a commodity of the wealthy states and nations. I'd like to see privacy become more of a social expectation, that people are entitled to privacy. We do have privacy as a right, but it's a different thing to data protection. So I'd like to see that in all countries; I don't expect them all to have robust data protection. But certainly, I know, for instance, that in markets in Africa, you'll find the iPhone doesn't have the same market share as other phones, because the other phones are cheaper. So, therefore, it's like the data of the poor is leveraged, and that I'd like to see change, though I don't see much motivation to make that change; I suppose we've got bigger fish to fry in a lot of ways. Obviously, I think it goes without saying that everybody would like to see a federal privacy law in the US, but we can bin that one for today. It would be nice, though; it would be fantastic for professionals in the US to just have one piece of legislation that is harmonized across the States. I think it would be great to have that. There are also other laws; you've got, for instance, the Illinois Biometric Act. There's an awful lot of talk about Europe saying, oh, well, you know, the US should take those European laws, but I think there are US laws we should be looking at and saying, that's a really fantastic law, and we should look at that law and expand it, because I do think it's a really robust piece of legislation. I think the world would be an awful lot easier if our legislation across the world were much more similar. It doesn't have to be exactly the same, but much more similar. And particularly in the US, I think data subject rights need to be stronger. I hear a constant diatribe about, you know, the cost of data subject rights and delivering data subject rights, etc., being high for capitalist organizations. And I get that, but that's actually the cost of processing data; you just happened to be getting it cheap, is my argument. So I'd like to see that change. I think the ethical approach to privacy as a CSR is becoming more and more important. The research shows that 10 years ago, privacy as a CSR was only being spoken about and discussed in CSR reports by large tech companies; now, it's pretty much the Fortune 500. They're all talking about privacy. So we're seeing it go in the right direction. That would be my Santa's list.


Debbie Reynolds  40:30

That's a great Santa's list; I cosign your Santa's list. I agree with that as well. Well, thank you so much for doing this; thank you for doing this from Dublin. I really appreciate your point of view, and I really appreciate the mark that you're making on the privacy industry. We need more people like you who are really willing to share their knowledge and wealth of information with all of us, because I think we learn so much from one another. And I think that you're really ahead of the game in terms of leading and showing your leadership in that area.


Valerie Lyons  41:08

Thank you so much, Debbie; I very much appreciate it. And thank you for your wonderful webcast as well. Yeah, it is my favorite.


Debbie Reynolds  41:21

Thank you so much. I really appreciate it. Well, we'll talk soon. I really appreciate the talk today. This is fantastic.


Valerie Lyons  41:30

Thank you so much for having me.