"The Data Diva" Talks Privacy Podcast
The Debbie Reynolds “The Data Diva” Talks Privacy Podcast features thought-provoking discussions with global leaders on the most pressing data privacy challenges facing businesses today. Each episode explores emerging technologies, international laws and regulations, data ethics, individual rights, and the future of privacy in a rapidly evolving digital world.
With listeners in more than 157 countries and 3,594 cities, the podcast delivers valuable insights for executives, technologists, regulators, and anyone navigating the global data privacy landscape.
Global Reach and Rankings
- Ranked in the Top 2% of 4.6 million podcasts worldwide
- Top 5% of 3 million+ podcasts globally (2024) – ListenNotes
- More than 1 million downloads worldwide
- Top 5% in weekly podcast downloads (2024) – The Podcast Host
- Top 50 peak in Business and Management (2024) – Apple Podcasts
Recognition and Awards
- #1 Data Privacy Podcast Worldwide – Privacy Plan
- 5 Best Data Privacy and Data Protection Podcasts for 2025 – Velotix
- Best Data Privacy Podcasts 2026 – RadarFirst
- The 17 Best Privacy Podcasts To Listen To 2025 – bCast
- Best Data Privacy Podcasts 2025 – Player FM
- Best Data Privacy Podcasts 2026 – Goodpods
- Best Privacy Podcasts 2026 – Feedspot
- #1 Data Privacy Podcast Worldwide 2024 – Privacy Plan
- The 10 Best Data Privacy Podcasts in the Digital Space 2024 – bCast
- Best Data Privacy Podcasts 2024 – Player FM
- Best Data Privacy Podcasts – Top Shows of 2024 – Goodpods
- Best Privacy and Data Protection Podcasts 2024 – Termageddon
- Top 40 Data Security Podcasts You Must Follow 2024 – Feedspot
- #1 Global Data Privacy Podcast (2021, 2022, 2023)
- Community Champion Award – Privacy First Awards, Transcend (2024)
- 20 Best Data Rights Podcasts – Threat Technology Magazine (2021)
Audience Demographics
- 34% Data Privacy decision-makers (CXO level)
- 24% Cybersecurity decision-makers (CXO level)
- 19% Privacy Tech and Emerging Tech companies
- 17% Investor Groups (Private Equity, Venture Capital, etc.)
- 6% Media, Press, Regulators, and Academics
Engagement and Reach
- 1,000–1,500 average weekly downloads
- 5,000–11,500 average monthly LinkedIn impressions
- More than 15,000 subscribers to the Data Privacy Advantage newsletter
Sponsor Impact
- 4 podcast sponsors secured funding within 12 months of featuring
- $45 million average funding raised per sponsor
- 3 average new enterprise customer sales per sponsor within 6 months
About Debbie Reynolds
Debbie Reynolds, “The Data Diva,” is a globally recognized authority on Data Privacy and Emerging Technology. With more than 20 years of experience, she advises organizations across industries, including AdTech, FinTech, EdTech, Biometrics, IoT, AI, Smart Manufacturing, and Privacy Tech. As CEO and Chief Data Privacy Officer of Debbie Reynolds Consulting LLC, she combines technical expertise, business strategy, and global regulatory insight to help organizations retain value, reduce risk, and increase revenue.
Learn more: https://www.debbiereynoldsconsulting.com/
"The Data Diva" Talks Privacy Podcast
The Data Diva E274 - Liz MacPherson and Debbie Reynolds
Episode 274 – Liz MacPherson, Deputy Privacy Commissioner, Office of the Privacy Commissioner, New Zealand
In Episode 274 of The Data Diva Talks Privacy Podcast, Debbie Reynolds, The Data Diva, talks with Liz MacPherson, Deputy Privacy Commissioner at the Office of the Privacy Commissioner of New Zealand, about how privacy functions as a critical guardrail for innovation rather than a barrier to progress. The discussion focuses on New Zealand’s purpose and context-based privacy framework and why strong privacy foundations enable faster, safer, and more trustworthy data use across government and industry.
The conversation explores a landmark case involving the use of facial recognition technology in supermarkets, where regulators, businesses, and independent evaluators worked together to test effectiveness, necessity, and proportionality before deployment. Debbie and Liz unpack why biometric data demands heightened scrutiny, how privacy impact assessments and real-world trials can reduce risk, and why facial recognition is not a plug-and-play technology. They also discuss the importance of human oversight, data quality, access controls, transparency to the public, and the risks of bias and misidentification when systems are poorly governed.
Debbie and Liz also examine New Zealand’s Biometric Processing Privacy Code and its role in setting clear thresholds for biometric use, including limits on categorization and inference. The episode highlights why data retention is one of the most overlooked sources of organizational risk, how unnecessary data creates downstream harm, and why treating personal information as a treasure rather than an asset to be exploited builds long-term trust. Liz emphasizes that organizations succeed when they place people at the center of data decisions and design privacy as part of the full information lifecycle.
Become an insider: join Data Diva Confidential for data strategy and data privacy insights delivered to your inbox.
💡 Receive expert briefings, practical guidance, and exclusive resources designed for leaders shaping the future of data and AI.
👉 Join here: http://bit.ly/3Jb8S5p
Debbie Reynolds Consulting, LLC
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:13] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:26] Now,
[00:27] I have a very special guest all the way from New Zealand,
[00:31] Liz MacPherson. She is the Deputy Privacy Commissioner at the Office of the Privacy Commissioner of New Zealand. Welcome.
[00:41] Liz MacPherson: It's fantastic to be here, Debbie.
[00:43] Debbie Reynolds: Well, we were connected on LinkedIn. I saw your name pop up and I was like, oh, my gosh, I'd love to have Liz on the show. And so I was so excited when you agreed to be on the show.
[00:54] It took us a while to get this together, but I'm really happy to have you here.
[00:57] Liz MacPherson: Oh, well, thank you so much. It's a real privilege and I follow you, obviously, and,
[01:02] and just really find the insights that come from your podcast so valuable. So thank you.
[01:07] Debbie Reynolds: Oh, that's so sweet. Thank you so much.
[01:10] Well, I would just love for you to tell the audience your trajectory in privacy,
[01:17] how you got to where you are and what you're seeing in privacy in New Zealand and beyond.
[01:24] Liz MacPherson: Okay. Well, like a lot of people who come to privacy, I've come to it by a bit of a wayward journey, but when I look back at it, everything was connected.
[01:33] So I have been working in and around the public sector in New Zealand for quite some time now, longer than I care to admit.
[01:43] And so I've worked in the public sector in policy roles,
[01:50] governance roles,
[01:52] in the economic space,
[01:54] in the labor market space,
[01:57] and then moved through into leadership roles in those areas at our Ministry of Business, Innovation and Employment.
[02:08] I was at the Department of Labor leading part of that for some time.
[02:12] And then from there I moved and became the government statistician of New Zealand.
[02:21] So that's the equivalent of the chief statistician in the US. And so in that role, I was responsible for all of the national data and statistics provided for New Zealand, an independent role around the production of those data and statistics.
[02:40] And in that role,
[02:42] very focused on the power of data, the fact that information can unlock so much for individuals, for society as a whole, in terms of wellbeing opportunities,
[02:58] managing risks, and a variety of other things.
[03:01] And one of the things that a statistics organization holds very dear is
[03:08] the integrity of the information, but also the protection of that information.
[03:12] So a lot of focus is put on ensuring confidentiality, which is not the same as privacy, but very closely related.
[03:20] We always took huge amounts of care to make sure that individuals weren't identifiable in the information that was provided.
[03:28] And while I was the government statistician, I started working with the Office of the Privacy Commissioner, and the Privacy Commissioner at the time, John Edwards,
[03:36] around some principles for safe and effective use of data and ensuring that it stayed safe while you're undertaking analysis,
[03:48] including thinking about algorithm use.
[03:50] Because the role of the government statistician had expanded to not only being about the statistical information, but also I became the government's chief data steward.
[04:01] So I was responsible for thinking about how we were using, but also keeping safe data across the government sector.
[04:09] Upon leaving the role as the government statistician,
[04:13] John Edwards, the Privacy Commissioner, asked me if I would come in and do some work with them for a while. And I intended it to only be a year,
[04:22] but here I am almost six years later,
[04:25] and I'd have to say that thinking in privacy has become a real passion for me.
[04:33] I can see the critical importance, as I could when I was government statistician, but even more now of personal information,
[04:41] the fact that we're not just talking about data, we're talking about people, we're talking about their lives, and we're talking about information that can unlock doors for them, but also can close doors, too, if the wrong person gets a hold of it.
[04:59] So every day I come to work and there's proactive work that we're doing, but there's also the reactive work that we do to help individuals who have complaints,
[05:09] help them find resolution, but also to focus on compliance action where there's been a data breach,
[05:18] as we just had over our holiday period.
[05:20] But it's a real privilege to work here.
[05:23] Sometimes it feels a bit dystopian when you look at the types of issues that we confront every day on behalf of New Zealanders.
[05:31] But for me,
[05:34] it's incredibly rewarding to be able to do something about this based here at this organization. So, yeah,
[05:41] that's me.
[05:43] Debbie Reynolds: Wow, that's a really interesting segue into privacy. And when you talked about being a statistician,
[05:52] you know, I think automatically about differential privacy.
[05:56] And sometimes people, because they don't understand it, don't know why differential privacy works in things like statistics and maybe not in other parts of privacy.
[06:08] So I think that's really interesting. But one of the things that you said, which I think is very key,
[06:15] is that data use really is about safe and effective uses of data. So regardless of what else people say it is, at the base, I think that's what it is.
[06:27] Liz MacPherson: I agree entirely. And one of the conversations we have quite regularly here in New Zealand, I'm sure it's happening globally. I know it's happening globally is we have been in a world for some time and increasingly in a world where jurisdictions, governments, businesses,
[06:45] individuals themselves want to use personal information data to unlock,
[06:52] you know, innovations, to be able to do things that we haven't been able to do before, faster, more efficiently,
[06:59] to be able to solve problems that we've had.
[07:03] And often you have this conversation which goes along the lines of,
[07:09] I could have done this but for the Privacy Act, or but for the privacy restrictions, or that somehow our focus on keeping information safe is holding everybody back.
[07:25] I guess from our perspective,
[07:28] guardrails are critical to actually being able to innovate safely and with speed.
[07:35] So there's often the conversation about the fact that the innovation that helped cars go faster was brakes.
[07:46] You know, brakes and good tires. Without brakes and good tires, you couldn't have fast cars.
[07:52] So in the same way,
[07:54] the way we look at it is that in order for you to be able to innovate,
[07:59] to achieve all the things that we want to achieve as countries, as individuals, as businesses,
[08:04] you actually need good guardrails. And privacy is a critical part of that.
[08:09] Debbie Reynolds: I'm smiling because the analogy that you use about cars and brakes was exactly the same thing that was on my mind. It's so true.
[08:19] Liz MacPherson: It's so true. If it wasn't for brakes, we'd still be wandering around with red flags in front of cars, and cars would only be able to go four miles an hour.
[08:27] So it's so true.
[08:30] Debbie Reynolds: So true. Well, I want to talk a little bit about foodstuffs and supermarkets.
[08:36] I know that there have been cases that have come up not just in New Zealand, but in other countries, including the U.S.
[08:44] This is actually, I hadn't really thought about the connection here, but companies that sell food, like supermarkets, first of all, they have a lot of interaction with the public.
[08:56] They have a lot of data about the public and how they use tools.
[09:01] They seem to be rapidly going forward into biometrics and other things, and I think sometimes these organizations may think that they're cool things, and they may be cool, but they
[09:16] may not be thinking about the ramifications in terms of how they collect and retain that data. But I want your thoughts there.
[09:27] Liz MacPherson: Well, maybe if I just tell your listeners a little bit about what happened here. So Foodstuffs is one of two major supermarket grocery chains in New Zealand, and the one that we were working with was Foodstuffs North Island.
[09:44] We have two major islands in New Zealand, the North Island and the South Island.
[09:48] And Foodstuffs North Island had been having some issues with respect to retail crime, and they had tried a variety of other tools.
[10:02] So they'd had security guards for some time,
[10:06] they'd had CCTV cameras for some time, and a variety of other things that you'd look at, you know, locking cabinets, locking trolleys, locking any number of things.
[10:16] And this hadn't really worked for them.
[10:19] It's a sort of
[10:25] franchisee-type model. So each supermarket,
[10:28] they belong to the cooperative, but each supermarket is owned by the franchise owner.
[10:36] And they discovered that a group of those supermarkets had gone ahead and put in place facial recognition.
[10:44] Without actually really thinking about it particularly, they just thought it could be a good idea and so they had put it in place.
[10:53] When it was discovered this was happening,
[10:56] they came and saw us and said, look, actually we want to make sure we're doing this right,
[11:02] so we want to work with you on how we could do this in a privacy enhancing way.
[11:10] So we said, first of all, stop everybody who's currently using it, using it while you work this one through.
[11:15] So over a period of time they worked with us.
[11:19] I'll just let the listeners know that at that point in time there was very little use of facial recognition in New Zealand at all, beyond obviously people's mobile phones and things like that.
[11:35] But in terms of businesses,
[11:38] there wasn't really any use of facial recognition technology. We have some at the border, obviously, in terms of people coming and going across our borders.
[11:50] But apart from those sorts of uses, and one particular case in gambling venues for problem gamblers, covered by its own legislation, there wasn't facial recognition. So for us this was a first use of facial recognition in this way, and the fact that it was taking place in an essential service industry made a difference.
[12:16] So they worked with us on their privacy impact assessment. They worked through a privacy impact analysis, and we were basically a sounding board for them
[12:28] as they worked through that.
[12:30] And I think the other thing that your listeners need to know about the privacy framework in New Zealand is that we have one act, the Privacy Act 2020, that governs pretty much all organizations that collect personal information in New Zealand.
[12:50] The key thing that your listeners need to understand is that unlike a lot of other privacy laws globally,
[12:57] ours is not consent based.
[13:00] It's not based on consent, it's based on purpose and context.
[13:06] So you have to have a lawful purpose, connected with your organisation, to collect the information,
[13:12] and it has to be necessary for that purpose.
[13:15] And also,
[13:16] you know, context matters in terms of the sensitivity of the information as well.
[13:22] So we worked all of that through with them and they were able to put in place a variety of mitigations against the privacy risks that were identified,
[13:33] including things like immediate deletion of all unsuccessful matches as somebody came through the door of a supermarket,
[13:42] against a watch list of persons of interest who had shoplifted before or had been violent before,
[13:49] particularly a focus on violence and assault.
[13:53] But they weren't able to give us a critical piece of information, or prove a particular issue connected with that purpose and the necessity for the purpose, which was:
[14:08] is this actually going to be effective in reducing violent crime at all?
[14:12] So for us,
[14:14] when we step through from purpose,
[14:18] particularly with biometrics, because we now have a new biometrics privacy code in place,
[14:23] we ask the question,
[14:25] is it necessary for that purpose and is it proportionate?
[14:30] And so effectiveness is a critical part of that.
[14:34] So they didn't have the evidence to prove this because they couldn't get any, because they hadn't used it.
[14:41] And there wasn't really the evidence that they could provide us from similar situations overseas that demonstrated that it really made a difference.
[14:52] So it was a bit of a catch 22 situation. And the way we worked our way through this was to propose a trial.
[15:00] So the supermarket established a trial with an independent evaluator associated with the trial that took place in 25 of their stores.
[15:10] And the evaluation was set up to determine whether the use of facial recognition was effective in reducing the incidence of particularly violent crime.
[15:25] But at the same time,
[15:27] Foodstuffs also used the evaluation as a way for them to learn as they went through the trial period, in terms of whether the mitigations they'd put in place against privacy risk actually worked or not.
[15:40] So, a bit of an innovative way of approaching this; it was the first time we've ever had a trial running under the Privacy Act.
[15:49] And to ensure that we were keeping everybody safe while we did this,
[15:54] we established an inquiry under the Privacy Act to run alongside the trial.
[16:01] And the inquiry allowed us to basically monitor the trial as it went along.
[16:07] And essentially the message we were giving them was,
[16:11] you have to meet the basic requirements of the Privacy Act when you're doing this trial; the only thing you should be testing is effectiveness.
[16:19] All your other safeguards should already be in place.
[16:23] And so part of walking alongside them was to make sure that they were.
[16:29] We had shop visits during that time to check their security arrangements, to have a look at their training,
[16:36] to test things as we went along.
[16:39] At the end of that period of time,
[16:41] we came to the conclusion that, on balance,
[16:46] given the mitigations that they'd put in place,
[16:49] that this was an acceptable use under the Privacy Act in this particular context, for this particular supermarket and the way they'd done it.
[16:58] And I think there were a number of things that we learned through that that we've built into some information for the sector as a whole,
[17:08] sitting under our new biometrics privacy code, which we can talk about in a bit. So I think one of the key things was we have people who think that you can just pick up an FRT camera, plug it into your system and, hey, presto, you know, everything is fine.
[17:25] The fact is that this is not a plug and play type situation.
[17:30] Facial recognition,
[17:31] you're talking about biometric information,
[17:35] which is some of the most sensitive information a person can have.
[17:39] You can change a password, you can't change your face,
[17:42] you can't change other characteristics that are you.
[17:46] It's not about you, it actually is you in this case.
[17:51] So you have to be very, very careful about this. You have to be sure that there isn't anything else that you could do that would achieve the same purpose at a lower privacy risk.
[18:01] And you have to have thought that through. So I guess the key thing is that careful thought before deployment is absolutely critical.
[18:10] The quality of that list I talked about earlier,
[18:16] I can't stress how important it is. The data quality, to start off with: are your images good quality images, are they actually accurate? How people get on and off the list.
[18:28] So, again, how do people get on? What criteria do you use for putting somebody on a list? Is it because you've got evidence of a serious assault, for example? Is that how they get on a list?
[18:38] Are they put on because they trespassed? What's the evidence for how they get trespassed?
[18:43] Who says they get on?
[18:45] How can they get on? How do they know that they're on the list? How long do they stay on the list?
[18:51] And then one of the things we've discovered over time is access controls. Who within your organisation gets to see this information is also really important.
[19:04] Something else that,
[19:05] as I said, goes back to the plug and play thing:
[19:08] facial recognition needs to be part of a wider operating system, which includes the people in your organisation, your employees.
[19:18] So have you trained them as to how to use this?
[19:22] How does the rest of your system operate?
[19:25] Where do you store things? How do you store things? What is the quality of the facial recognition system you've purchased?
[19:31] What's the quality of cameras, quality of where they're placed?
[19:35] I mean, one of the things that came out of the trial was that where you've put your camera, and the time of day, and the quality of the camera make such a difference to the accuracy, which of course is vitally important.
[19:49] Some other things,
[19:51] telling your customers about the fact that you're using facial recognition is critical. So they need to know before they come in the store that it's being used and they need to know how they would go about getting access to that information for correction purposes.
[20:11] So, for example,
[20:13] I have been in a store and the next time I come into the store,
[20:19] I'm told that I'm not allowed in the store because there was some issue.
[20:24] I need to be able to challenge that.
[20:27] I need to be able to say, well, how did I get on the list? Where's the evidence about how I got on the list?
[20:33] And we did have one situation during the trial where a Māori woman was accused of being someone she wasn't,
[20:46] doing things that she hadn't done.
[20:49] Fortunately, there weren't more of those sorts of cases. But for Foodstuffs, what that said to them was that
[20:55] the training is critically important. The people who were operating at that time weren't the people who were fully trained; they weren't in store.
[21:04] The human in the loop is critical.
[21:07] And if you know that facial recognition, because it is, after all, just a camera that works with light and dark,
[21:15] is not as accurate for people with darker skins,
[21:18] then you need to put steps in place to make sure that you're actually mitigating against that. So there were a whole range of things that we learned through that that we've published.
[21:29] And I guess the last thing I'd basically say is that there's no set and forget here.
[21:35] You have to review and monitor your systems on an ongoing basis. You don't just set them up and think they're fine.
[21:41] You actually have to look across the whole system, their security,
[21:46] how many false positives and false negatives you've been getting,
[21:49] and keep reviewing the effectiveness of the system as you go.
[21:56] So, yeah, I've been babbling on, Deb.
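To make the watch-list mechanics Liz describes a little more concrete, here is a minimal Python sketch of that flow, offered under stated assumptions: the `WatchlistEntry` fields, the `match_score` stand-in, and the 0.9 threshold are illustrative inventions, not the Foodstuffs system or anything published by the Office of the Privacy Commissioner. It shows the shape of the safeguards discussed: documented reasons and expiry dates for list entries, immediate discarding of non-matching biometric data, and a trained human reviewer between any match and any action.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Illustrative sketch only; names and threshold are assumptions, not the real system.

@dataclass
class WatchlistEntry:
    person_id: str
    template: bytes            # stored biometric template
    reason: str                # documented evidence, e.g. "serious assault, incident #123"
    added_on: datetime
    expires_on: datetime       # entries age off the list rather than staying forever

def match_score(probe: bytes, template: bytes) -> float:
    """Stand-in for a vendor comparison function returning a similarity in [0, 1]."""
    return 1.0 if probe == template else 0.0

def screen_entrant(probe: bytes,
                   watchlist: list[WatchlistEntry],
                   threshold: float = 0.9) -> Optional[WatchlistEntry]:
    """Compare an entrant against the watch list.

    Returns a candidate entry for a trained human reviewer to confirm, or None.
    Either way, the probe is not kept: unsuccessful matches are discarded immediately.
    """
    now = datetime.now()
    best: Optional[WatchlistEntry] = None
    best_score = 0.0
    for entry in watchlist:
        if entry.expires_on < now:          # expired entries are ignored (and should be purged)
            continue
        score = match_score(probe, entry.template)
        if score >= threshold and score > best_score:
            best, best_score = entry, score
    # Drop our reference to the probe; a real system must also delete it at the
    # capture and storage layers so unsuccessful matches leave no trace.
    del probe
    return best   # no automated action: a trained staff member decides what happens next
```

Even in sketch form, the accuracy caveats from the trial apply: camera placement, lighting, and higher error rates for people with darker skin mean the human-in-the-loop step is not optional.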
[21:59] Debbie Reynolds: No, this is an incredible case study because I think a lot of people think that they're gonna do something wrong and then the regulator's gonna come to them and then maybe this process will start.
[22:12] But the fact that foodstuffs came to you proactively and said, we wanna do this and we wanna do this in the right way, I think that's an excellent example. And then you've learned so much that you can help other people fully understand it from different angles.
[22:28] Liz MacPherson: Yeah, it is so useful.
[22:31] So, sort of backing up a bit.
[22:35] At the same time as we were working with Foodstuffs, we were also developing a biometric processing privacy code for New Zealand.
[22:45] And again, for your listeners,
[22:48] the New Zealand Privacy Act is an interesting one in that it includes a quite extraordinary power, I think, which is that the Privacy Commissioner has the power to establish privacy information codes which actually modify the Act,
[23:10] and once they've been promulgated, they become law.
[23:14] So we can effectively,
[23:16] after we've gone through the statutory processes that are required, which include a lot of consultation, we can put in place a code which effectively changes our law.
[23:27] It modifies the 13 privacy principles which are the heart of our Act. So in the case of biometrics, back in 2021, we were really concerned because we could see that the future of biometrics was that people were going to want to use it more and more for a variety of different things, and we could see it becoming commodified.
[23:50] So effectively, you know, sooner or later you get a camera and you get biometrics with it. Sort of like, would you like a side of biometrics with your
[23:59] whatever.
[24:00] And because we could see that coming,
[24:03] we decided we needed to strengthen our legislation by looking at the biometrics area. And the reason we wanted to do this was because our Act, again, unlike some other acts, is really focused on purpose and context.
[24:18] So it would essentially say to you, for example, the more sensitive the information you're collecting, the more robust your safeguards have to be,
[24:27] the clearer you have to be about the necessity for collecting this information,
[24:32] et cetera, et cetera.
[24:35] For that reason,
[24:36] we don't actually have
[24:38] sensitive information defined in our Act like some other jurisdictions do.
[24:46] So we decided we needed to strengthen our Act. So we put in place the Biometric Processing Privacy Code, after position papers and four streams of consultation,
[24:56] and it was promulgated in November last year and came into effect pretty close to then for new uses of biometrics, and comes into effect in August of this year for existing uses of biometrics,
[25:11] to give people a bit of time to get their ducks in a row. The key thing about the biometrics code is that it strengthens and clarifies the requirements to assess,
[25:23] remember I was talking about effectiveness before,
[25:26] to assess effectiveness and proportionality,
[25:29] to adopt safeguards to reduce privacy risk, and to tell people that a biometric system is being used before collecting.
[25:37] The other thing it does is it limits some particularly intrusive uses of biometrics.
[25:45] So the biometric processing code regulates organizations that collect, hold, and use physical or behavioral characteristics of a person.
[25:59] That's what we define as biometric information for the purposes of biometric processing. And there are three things that we talk about there. So biometric processing, we say, is using biometric information for verification,
[26:13] so that's one-to-one, I am who I say I am; for identification, so that's one-to-many, I want to identify this person within a collection of people, for example; or categorization.
[26:26] And what we talk about when we're talking about categorization is essentially using an automated process on biometric information to collect, infer, or detect certain types of information, or generate certain types of information, or to categorize somebody.
[26:44] So,
[26:45] for example,
[26:46] trying to infer somebody's personality or emotions or mental state,
[26:53] that's what falls into the categorization space.
[26:58] Trying to determine somebody's health information by the way they walk past. And there are systems that can infer from the way that you're walking some medical conditions,
[27:10] those sorts of things.
[27:12] We've put some limits on when you can use biometric categorization.
[27:22] We've got another code, the Health Information Privacy Code, and if you're covered by that, then you follow the health direction in that. But if you're a business that isn't a health agency, then you can use categorisation for health and safety reasons.
[27:39] So, for example,
[27:40] there are some businesses that use cameras and other things to detect fatigue for people
[27:48] for whom fatigue would actually be a health and safety risk. So, for example, you can imagine somebody like my brother, actually,
[27:57] his job is to drive these huge tankers.
[28:01] If you get tired and fall asleep at the wheel, that's not only a risk to you, but a risk to everybody else on the road. So being able to use that sort of technology for fatigue detection is something that we've said is okay. So we've put this biometric code in place.
[28:19] And one of the things it does, critically,
[28:23] and the Foodstuffs trial helped us with this in terms of just strengthening our thinking,
[28:31] is it effectively says there's a threshold you have to get over before you can use biometrics. And that threshold is that you have to be able to prove, if we came and looked,
[28:46] that your use of biometrics is connected to your purpose,
[28:52] that you have gone through the process of assessing effectiveness to prove necessity.
[28:59] And you've also looked at proportionality. That is,
[29:02] there is nothing else that you could reasonably have used that could have done the same,
[29:09] achieved the same purpose with the same level of effectiveness or similar level of effectiveness.
[29:14] And when you're doing that proportionality assessment, you need to be balancing the risks against the benefits. And learning from the Foodstuffs work,
[29:26] what we built into the code was the ability for people to do a trial.
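As a rough schematic of the threshold Liz outlines, the sketch below chains the purpose, necessity-via-effectiveness, and proportionality questions and separates the three kinds of biometric processing. It is a reading aid only: the field names are assumptions for illustration, not the Biometric Processing Privacy Code's actual wording or a legal test.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Schematic only: field names are assumptions for illustration, not the Code's wording.

class ProcessingType(Enum):
    VERIFICATION = auto()     # one-to-one: I am who I say I am
    IDENTIFICATION = auto()   # one-to-many: find this person within a collection of people
    CATEGORIZATION = auto()   # automated inference of attributes about a person

@dataclass
class BiometricProposal:
    processing_type: ProcessingType
    lawful_purpose: str                       # purpose connected with the organisation
    effectiveness_evidence: bool              # e.g. results from an independently evaluated trial
    less_intrusive_alternative_exists: bool   # could something else reasonably achieve the purpose?
    benefits_outweigh_risks: bool             # outcome of the proportionality balancing
    notified_before_collection: bool          # people are told a biometric system is in use

def clears_threshold(p: BiometricProposal) -> bool:
    """Chain the purpose / necessity / proportionality questions described in the episode."""
    return (bool(p.lawful_purpose)
            and p.effectiveness_evidence                 # necessity: shown to work for that purpose
            and not p.less_intrusive_alternative_exists  # proportionality: no reasonable alternative
            and p.benefits_outweigh_risks
            and p.notified_before_collection)

# A proposal with no effectiveness evidence does not clear the threshold; a monitored
# trial (as in the Foodstuffs case) is one way to generate that evidence.
proposal = BiometricProposal(
    processing_type=ProcessingType.IDENTIFICATION,
    lawful_purpose="reduce violent offending in stores",
    effectiveness_evidence=False,
    less_intrusive_alternative_exists=False,
    benefits_outweigh_risks=True,
    notified_before_collection=True,
)
print(clears_threshold(proposal))   # False until the trial produces the evidence
```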
[29:31] Debbie Reynolds: Oh, that's incredible.
[29:33] I want to talk a little bit about categorization now. This fascinates me because I think that you've addressed it very well,
[29:45] because this is the issue that concerns me a lot, and that is inference. And part of that,
[29:51] as you well know, I'm sure, is what action a company takes as a result of the accusation. What are your thoughts?
[30:01] Liz MacPherson: Well, I mean,
[30:02] essentially we've said that in New Zealand we have a Human Rights Act, and there are certain things that are an absolute no-no under our Human Rights Act.
[30:13] So that is making calls about people because of things you infer about their age or their gender or their sex or a variety of other things. So we haven't said no biometric categorization at all.
[30:29] But for the very reason you've discussed, we've said you can only use this where it is pretty much about health and safety.
[30:38] So we've said age detection,
[30:41] potentially, if it's accurate,
[30:43] for when you need to effectively restrict people's access to things because of age. But even there, you have to be very careful.
[30:51] But yes, I mean, the critical thing here is what are you using this for? It has to go back to a lawful purpose again.
[30:59] And so we have quite definitely placed fair use limits on attempting to infer people's emotions, attempting to infer their sexual orientation, and attempting to infer their health status from facial recognition or biometrics generally.
[31:22] Debbie Reynolds: I want your thoughts about something,
[31:23] and this goes with purpose limitation as well,
[31:27] and that's data retention.
[31:29] And so I feel like this is an area where companies get into some of the most trouble because a lot of times they retain data entirely too long and it creates more risk for them as an organization, but then also for individuals.
[31:45] But I want your thoughts on data retention.
[31:47] Liz MacPherson: Oh, absolutely.
[31:49] I am on record as describing retention as the sleeping giant of cybersecurity.
[31:57] I really believe it is a critical issue.
[31:59] And if you look at some of the big data breaches around the world,
[32:04] in Australia,
[32:06] here and other places,
[32:09] very often when you get a huge data breach,
[32:12] a lot of the information shouldn't even actually have been held.
[32:17] Huge risk to the organization,
[32:19] huge risk to the individuals who potentially didn't even know the data was still there.
[32:24] So. So from our perspective, yes, retention is critical.
[32:28] And the way we think about it is, again,
[32:31] you can only keep data for as long as your purpose allows you to, unless there's another law potentially that requires you to keep it. Like, for example, our tax laws require certain types of information to be kept for seven years or something like that.
[32:48] But if you no longer have the purpose for which you collect it,
[32:53] the information,
[32:54] that information should be deleted.
[32:57] And our advice to businesses, to anyone who collects information, is once you've collected it,
[33:04] you have to protect it. You know, you collect it, you protect it.
[33:08] So the first thing you actually have to ask yourselves is, do I need to collect it at all?
[33:14] For my purpose, do I really need to be collecting this sensitive information,
[33:18] or personal information? Because once I do, I've got to protect it, and part of protecting it is that,
[33:25] as it comes in the door, I should actually have a retention policy for it.
[33:30] Yeah.
[33:30] Which essentially says,
[33:32] after it ages out to this point,
[33:36] I delete it. The person's no longer my customer. After a certain period of time, I delete it.
[33:42] That's absolutely critical. And so many businesses just forget about it.
[33:49] And we've had situations, and I can remember one which was associated with a recreation facility, that not only collected information it didn't need.
[34:02] Essentially, their charging rate said that somebody who was local got charged less than somebody who was a visitor. Right.
[34:10] So they wanted people to prove that they were local.
[34:14] So what did they do? Instead of just asking to see something,
[34:17] they actually took photographs of it. And then they stored driver's licenses and other identifying information in their system, which they didn't need. And Then they forgot they had it when they moved to another system and then their legacy system got hacked with all of that identifying information in it.
[34:36] So these basic hygiene matters are critical.
[34:42] Ultimately it goes to the trust and confidence that the people who use your services have in you. And that should be an organization's bottom line, you know, putting those people first.
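A minimal sketch of the "collect it, protect it" retention discipline Liz describes, assuming invented field names and using the seven-year tax example from the conversation as the only legal hold: a record is deleted once the collection purpose has lapsed and any statutory retention period has run, rather than lingering in a forgotten legacy system.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative only: field names and the seven-year figure follow the examples in the
# conversation, not any specific statute's wording.

@dataclass
class Record:
    customer_id: str
    collected_on: date
    purpose: str                           # e.g. "active customer account"
    purpose_ended_on: date | None = None   # set when the person stops being a customer
    legal_hold_years: int = 0              # e.g. 7 for tax records

def should_delete(record: Record, today: date) -> bool:
    """Delete once the collection purpose has lapsed and no legal hold applies."""
    if record.purpose_ended_on is None:
        return False                       # purpose still current: keep it (and protect it)
    hold_until = record.collected_on + timedelta(days=365 * record.legal_hold_years)
    return today >= max(record.purpose_ended_on, hold_until)

# A lapsed customer record with no legal hold ages out; a tax record is kept
# for the required period and then deleted.
print(should_delete(Record("c1", date(2020, 1, 1), "account", date(2024, 6, 1)), date(2025, 1, 1)))      # True
print(should_delete(Record("c2", date(2020, 1, 1), "tax invoice", date(2021, 1, 1), 7), date(2025, 1, 1)))  # False
```

The recreation-facility story is the failure mode this guards against: identity documents collected without need, carried into a new system's shadow, and breached years later.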
[34:55] Debbie Reynolds: It's so true.
[34:56] And I think over time people leave companies,
[35:00] they make changes to their technology,
[35:03] but their responsibility for that data remains the same. And so sometimes I think people forget that part of it and it becomes a challenge.
[35:13] Liz MacPherson: Yeah, we've put together some guidance, and all of the stuff I've been talking to you about is on our website, by the way, privacy.org.nz. We put together some guidance which we call How to Do Privacy Well,
[35:28] and it takes you right the way through the lifecycle of the information.
[35:33] And what we really suggest to people is they put together what we describe as a privacy management plan.
[35:40] And it has all of these aspects in it, and you keep that plan up to date. And ultimately it goes to treating that personal data, that personal information that you hold, as carefully as you would anything that is commercially sensitive, or more carefully than that.
[36:00] But if we're trying to talk to people in the C suite,
[36:04] this is critical to your ability to not only potentially innovate in order to give your customers better services,
[36:15] but the trust and confidence that your customers have in you.
[36:19] Time and time again we see surveys of the public where people say,
[36:27] if I know that a firm has been hacked and they didn't have the right,
[36:33] you know, safeguards in place and they had a big data breach,
[36:36] I'm going to think about moving, you know,
[36:39] I don't want to do business with somebody who doesn't take care of my personal information, because ultimately that tells me something about who they care about,
[36:49] whether they actually really do put their customers first.
[36:53] So we would say to people, all of those elements of the information life cycle, in terms of a person's information that you're collecting, are important;
[37:04] they're basic hygiene.
[37:06] But also, and critically importantly,
[37:09] you should be thinking about this as an asset that you are protecting.
[37:15] Debbie Reynolds: Absolutely.
[37:16] I like to tell organizations that they need to think of the human as a stakeholder,
[37:24] as if they were a person on the board.
[37:27] Right. So their data is also important not just because you sell them a product or service,
[37:33] but because you're holding information about them.
[37:37] Liz MacPherson: I couldn't agree more. Here in New Zealand,
[37:39] people who have traveled here will know that our indigenous culture,
[37:46] the Māori culture, is critically important to the way we think about things. And here in New Zealand, from a Te ao Māori perspective, there is a real focus on the person.
[37:57] So there's a saying,
[38:00] and our advice always to an organization, for example, who's suffered a data breach,
[38:07] is that if you put the person at the centre of your response,
[38:12] you'll very seldom go wrong.
[38:14] So, you know, really think about that person.
[38:17] And if you treat the person and,
[38:20] you know, their information, their data, as a taonga.
[38:23] Taonga means treasure in Māori. If you treat it like a treasure,
[38:28] that will actually take you in the right direction.
[38:31] Debbie Reynolds: I agree with that completely.
[38:33] Well, Liz, if it were the world according to you and we did everything you said, what would be your wish for privacy or data protection anywhere in the world, whether that be technology,
[38:46] human behavior or regulation?
[38:50] Liz MacPherson: Well, I think we come back to what we've just said.
[38:53] I think we need to have reinforcing systems of incentives,
[38:58] penalties, and education that reinforce the importance of treating personal information as a treasure, as a taonga.
[39:10] Whether we're talking about AI,
[39:12] biometrics,
[39:14] basic personal information,
[39:17] it's about having those systems in place that mean that anyone who has the privilege to look after somebody else's personal information is taking it seriously.
[39:31] Debbie Reynolds: Wow, this has been incredible.
[39:33] Well, thank you so much for being on the show. Your insights are incredible. And people, please definitely check out the website.
[39:41] We are all looking very closely and admiring all the work that you're doing in New Zealand. It's very instructive for all of us around the world to see how creative and pragmatic you've been about this.
[39:55] And I think ultimately that's the way it should be. It shouldn't be a surprise to companies; it's like, okay, this is part and parcel of your responsibility, and this is the way the regulators can help you get there.
[40:07] Liz MacPherson: Yeah, no,
[40:08] I think we need to think about ourselves as a privacy ecosystem. We all work together to achieve the outcomes that we're after. So thank you so much. It's been just a delight to have a chat with you, Debbie.
[40:22] Debbie Reynolds: Oh, it's so great. Thank you.
[40:25] Thank you. It's such a pleasure to have you on the show and hopefully we can find ways in the future to collaborate together.
[40:32] Liz MacPherson: That would be fantastic. And my best wishes to you and the show for 2026.
[40:38] Debbie Reynolds: Yes, thank you so much. Thank you so much. And we'll talk soon.
[40:42] Liz MacPherson: We will. Okay.
[40:44] Debbie Reynolds: Okay, Bye bye.
[40:45] Liz MacPherson: Bye bye.