"The Data Diva" Talks Privacy Podcast"

The Data Diva E282 - Evan Benjamin and Debbie Reynolds

Season 6 Episode 282

Use Left/Right to seek, Home/End to jump to start or end. Hold shift to jump forward or backward.

0:00 | 37:49

Send us Fan Mail

In this episode, Debbie Reynolds “The Data Diva” speaks with Evan Benjamin, President of Tier 3 Inc., about the growing challenges of privacy in AI systems, particularly in relation to inference, agent-based systems, and data lifecycle management.

Evan shares his transition from IT, e-discovery, and information security into privacy, highlighting how the rapid adoption of large language models has exposed gaps in how organizations approach privacy and data protection. The conversation explores the distinction between security and privacy, emphasizing that security focuses on protecting systems while privacy focuses on purpose, data use, and fundamental rights.

Debbie and Evan discuss the risks associated with AI-driven inference, including how systems generate insights about individuals based on context and historical data, often without user awareness or control. They also examine how AI memory and agent-based systems can extend data usage beyond original intent, raising concerns about purpose limitation and data minimization.

The discussion further addresses challenges with data retention, logging, and traceability, as well as the difficulty of deleting data from AI systems once it has been incorporated into model training. Evan highlights the technical limitations of data erasure in machine learning models and the implications for privacy rights such as the right to be forgotten.

Finally, the conversation explores issues related to data processing across multiple systems, including the complexity of managing controllers, processors, and sub-processors, as well as emerging risks related to liability when organizations deploy AI systems and autonomous agents.



By popular demand, Debbie Reynolds Consulting is now offering executive briefings on emerging data privacy risks and how companies can avoid them. To learn more, visit the Executive briefings page on my website.

Support the show

Become an insider, join Data Diva Confidential for data strategy and data privacy insights delivered to your inbox.


 💡 Receive expert briefings, practical guidance, and exclusive resources designed for leaders shaping the future of data and AI.


 👉 Join here:
http://bit.ly/3Jb8S5p

Debbie Reynolds Consulting, LLC



[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:13] Hello,

[00:14] my name is Debbie Reynolds. To call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information the businesses need to know.

[00:26] Now I have a very special guest on the show, Evan Benjamin. He is president of Tier 3 Inc. Welcome.

[00:33] Evan Benjamin: Thank you. Thank you, Debbie.

[00:35] Debbie Reynolds: Yeah, well, we've been connected for quite some time on LinkedIn. You always write very thought provoking posts and sometimes you send them to me and yeah, it's just been fascinating to see the questions that you ask.

[00:50] They're always very deep, thought provoking. They aren't surface things you could just slide past. Right. You have to kind of really look at it and think about it. But I want you to tell me a little bit about your career trajectory and how you got interested even in privacy as a career.

[01:09] Evan Benjamin: Yeah, well, I started, I'm an IT guy, so I do heavy IT and for over 20 years and I fell into like E discovery, electronic discovery and, and forensics and litigation technology and not, not really privacy focused but,

[01:26] but a lot of cloud, network,

[01:28] things like that. Heavy tech.

[01:30] I want to say right off the bat, Debbie, that it helps to be a tech person in AI, but you don't have to be.

[01:38] No, there was more emphasis on security, not privacy.

[01:42] So that helped. A big, big background in information security.

[01:46] Then right after ChatGPT came out, I wanted to learn LLMs and everything. And back then we didn't have agents, we didn't have anything.

[01:57] And I st. We, I, everyone jumped into LLMs.

[02:00] But then I started hearing how LLMs would break privacy rules.

[02:07] People were saying,

[02:08] who's doing LLM security? Who's doing LLM privacy? And I said, wait a minute,

[02:13] is this a thing?

[02:15] So I went and got my CIPPE and, and then I started,

[02:20] I started studying for my CIPP US and I said,

[02:24] look at all this. I don't know,

[02:26] look at all this. I don't know about EU and I know everything about most things about cloud it.

[02:32] Look at all this stuff.

[02:34] Control or processor? What is this? Data minimization, what is this?

[02:38] And I had a great background in information management and even then they were talking about data retention and all this stuff, information lifecycle. And I said, now they're, they're worried about it for LLMs because LLMs are trained on the whole,

[02:56] all this, millions of data.

[02:59] So when I got my cippe, I said, oh, this is great. I'm going to. No one is attacking privacy.

[03:06] I'm going to attack it. I'm going to attack security. And I became. I just threw myself into it.

[03:12] Look what happened now.

[03:14] Agentic, all this stuff, Claude code and all this stuff,

[03:18] they're still breaking privacy.

[03:21] In fact,

[03:22] it's getting worse.

[03:24] So the more I try to teach people and I'm working,

[03:28] there's clients out there, and there's one, I'm helping a client and finding the need to really teach people about privacy and security because everyone's of the mindset. We got to push, we got to push, we got to innovate, we got to get this product out.

[03:43] And then people come back with, then let's put in privacy. Let's do privacy after deployment.

[03:51] So then, Debbie, people would start talking about privacy by design, privacy by default. And I'm a big follower of iapp and they all talked about this. Privacy by design, privacy by default.

[04:04] I also got my IAP has something called the cipt, which I don't know if a lot of people may not know, but it's information privacy technology.

[04:15] So I said, wow, they were teaching how to do privacy by design and default.

[04:21] And I found out that I was going to a lot of places they didn't want to hear it,

[04:25] they just don't want to hear it. And that made me confused.

[04:31] So that's where I am here. And as we go into 20, 26 and beyond,

[04:36] there's more tools out there that is just making it harder to do privacy by design.

[04:43] That's why I'm here.

[04:45] Debbie Reynolds: It's a hard problem, and I'm glad that you saw it and decided to pivot in that way.

[04:51] I want to touch on something that you said fascinates me. So I'm a technologist also by trade, so it fascinates me that you have that background. And I dabble bit in many years and help me implement stuff for electronic discovery in my former life.

[05:10] One thing that you said about,

[05:15] you know, your background in data management and things like that,

[05:19] and I want your thoughts,

[05:21] is that I feel like some people thought,

[05:24] especially in the U.S.

[05:26] it's like, okay, we solve for security,

[05:30] then that automatically solves for privacy. And that's not true, but I want your thoughts about that statement.

[05:37] Evan Benjamin: Well,

[05:38] some people ask,

[05:39] can you have security without privacy or privacy without security if you're protecting a system?

[05:47] Right, I can put controls that protect a system. And people are saying, Well, I got SoC2, I'm good for privacy. I got ISO 2701,

[05:57] but that has nothing to do with data minimization.

[06:01] That has nothing to do with purpose limitation.

[06:04] So Even if you're SoC2, ISO 27001, ISO 42001, you've touched it.

[06:11] But guess what? You're still going to violate GDPR Article 6 and GDPR Article 9 because you don't know the legal basis of processing.

[06:21] That's the problem that security doesn't have the same language as privacy and it doesn't understand that security is protection, but privacy is fundamental rights.

[06:34] You could protect the system,

[06:36] but you're not protecting the fundamental rights of people. That's what security people don't get.

[06:44] You can do penetration testing, you can do red teaming, but guess what? You're not red teaming. Privacy.

[06:50] Your red teaming, prompt injection, your red teaming. I would like to invent a new red teaming for privacy.

[06:56] Let's red team a data breach. No one does that.

[07:00] They wait on for it to happen. And they don't understand that. Look at the fines that GDPR imposes.

[07:06] Who wants to pay that?

[07:07] So security does not understand the need for privacy and they see it as a lesser discipline.

[07:14] And there's no way that you can bundle those two. There's no way.

[07:19] Because when the auditor comes, they're going to ask separate security questions and separate privacy questions. And there's no way that you can integrate it and say we do it all.

[07:27] There's no way.

[07:28] Debbie Reynolds: I love that,

[07:30] the way that you defined it. I know. I feel like I'm always eye rolling when people tell me, oh, we solve for privacy because we have security.

[07:37] I feel like cyber security is about protecting everything.

[07:42] And, and privacy is about like, why are you using this?

[07:46] Evan Benjamin: Right.

[07:46] Debbie Reynolds: What's the point? Like, what's the purpose?

[07:48] Evan Benjamin: What's the purpose? Yeah.

[07:51] Debbie Reynolds: So, yeah. So regardless of what a company has,

[07:56] let's say that this analogy I like to use around a bank.

[07:59] So let's say cyber is like the security of a bank. So you're protecting the outside of the bank, the inside of the bank, everything that goes on in the bank.

[08:09] Right.

[08:10] But privacy is about what's in the safe deposit box and why is it there.

[08:14] Evan Benjamin: Why is it there?

[08:16] Debbie Reynolds: Yeah.

[08:17] Evan Benjamin: And maybe I told you to only save $1,000 of mine and you're saving 2,000. Why? And you're not. And I say delete 1,000 and you're saying no, and you're saying,

[08:27] so that's privacy, you know,

[08:30] and it's just funny that everyone's doing privacy policies and they think that the privacy policy covers them.

[08:38] And Debbie, I bet you and I and another attorney I know are the only three people who read privacy policies. When you read them,

[08:45] you cannot put everything you need in a privacy policy.

[08:50] Right. That if you show that to the auditor, they're going to be, well, you know,

[08:55] right now I'm studying, I want to be an AI auditor, and I'm taking, like advanced classes in AI auditing.

[09:01] And even they try to cover more about privacy security,

[09:05] like the whole 360 view.

[09:08] I mean, does that mean everyone has to be an auditor in order to learn this? No.

[09:12] So I worry everyone's getting excited about Agentic and Claude code because it says, look what he can do, and look what lovable can do. I built a Lovable app. It was to do something simple.

[09:25] But did I check the terms of service for lovable? Am I checking the privacy policy? I don't know what it's doing.

[09:31] So I worry when people get excited,

[09:34] and I don't want to be the person who comes and just quash that.

[09:38] I want to be the person who says,

[09:40] look what you forgot.

[09:41] Debbie Reynolds: I want your thoughts about this thing that I talk about a lot. And it's about inference.

[09:48] Right. So this problem has already always existed with data, but I think it's becoming even more fraught with AI. And that is. Cause when typically we're talking about privacy, we're talking about data about people.

[10:07] Right.

[10:07] Whether it's data they gave you or not, we could talk about. That's a whole other story about how you got the data in the first place.

[10:15] But one of the things that concerns me a lot around AI systems is the ability to create inferences and to make decisions about people with inferences.

[10:24] And a lot of privacy laws don't cover that. Because you as an individual may not own the inference. Right?

[10:33] Evan Benjamin: Right.

[10:33] Debbie Reynolds: Right. You may own the data they initially used to create an inference, but you may not own the inference or the right to the inference. So that becomes a whole other issue, especially in the AI system, when people are making decisions about you based on that inference.

[10:49] But I want your thoughts. There's kind of a gap there. I see. I want your thoughts there.

[10:54] Evan Benjamin: Okay.

[10:54] You and I have an advantage because in electronic discovery, they talked about metadata,

[11:00] and when I was at law firms and they would tell me,

[11:04] produce the data,

[11:06] produce the actual data and send it to opposing counsel,

[11:09] someone else would come and say, what about the metadata?

[11:12] And a lot of people would say, metadata? Well, metadata is information about that data.

[11:17] And guess what? The opposing council wants that.

[11:20] The judges want that.

[11:21] So AI does the same thing.

[11:23] I can put a document and train on that document, but there's what the equivalent is for AI context.

[11:33] So AI now takes everything,

[11:36] it remembers everything about you because it has memory.

[11:39] At Genti can, we're headed toward where AI has more memory now and it has more. It just remembers more about you.

[11:46] That context is metadata.

[11:49] Oh, my goodness. I see people when they tell me how long they've been typing in ChatGPT,

[11:56] and they don't,

[11:57] they never delete that conversation.

[12:01] And three months later,

[12:02] ChatGPT will come back and say, I remember you told me remember the conversation. So, and it's looking at all the context and they're surprised when they say, how did ChatGPT know that?

[12:14] Because it's got memory,

[12:15] it's got context. Because of that, it can infer, it can make inferences. And did you know that someone told ChatGPT about a trip they took one summer and they totally forgot about it and they were asking for places,

[12:32] restaurants to go to or something like that. ChatGPT came back and said,

[12:36] remember when you went to Milan and you like that food and. And I think you would like the new Italian restaurant that opened up on Smith street.

[12:46] Debbie Reynolds: And.

[12:46] Evan Benjamin: And they were shocked because here's AI inferring what you will like just because of where you've been.

[12:55] So because of that metadata, because of that inference,

[12:58] it's.

[13:00] Think about it. AI is not going to ask you, can I delete that inference?

[13:04] So data minimization is also inference minimization.

[13:09] And it's going to start saying things about you that you never ask.

[13:14] It gets worse with agents, because if I do an app with three agents and one agent is going to orchestrate everything,

[13:21] that that agent stands in my place,

[13:24] and it's going to tell one agent to go to the web and do something and say, well, let's do what Evan likes.

[13:31] We know that Evan likes tennis shoes.

[13:34] So when you go OpenAI does E commerce now, right, you can buy stuff,

[13:39] it's going to remember stuff about me and Debbie. That's the problem. That is a problem with this inference.

[13:46] When I say by proxy, it's using information I never told it to use,

[13:52] and it's transmitting that to other agents I did not tell it to use.

[13:57] It's breaking purpose limitation and data minimization by communicating with other agents I did not tell it to speak to. And that's the problem. And I have no idea what it told the other agent.

[14:11] When you look at output,

[14:13] the output makes inferences about you different than what your input is.

[14:19] But no One is checking the output.

[14:22] They're just grabbing it. Especially students who use it for writing research papers. They're grabbing it, writing, and they're not checking, they're not filtering the output. That's my concern.

[14:32] Debbie Reynolds: That's a huge concern.

[14:33] You know, so we should talk about agents. You talked a little bit about it.

[14:37] One of my concerns about agents and being able to have them take autonomous action is that in my view, AI system,

[14:49] a lack of data is also data. Right?

[14:51] So people think, oh, well, I'll just tell the agents to do this and they'll do it. It's like, no, they're inferring things,

[14:59] even by things you didn't say.

[15:02] Evan Benjamin: Right.

[15:02] Debbie Reynolds: So there's a woman, this happened a couple weeks ago. I think she worked for one of the big, you know, tech companies.

[15:09] She had an agent, I think she had, like, openclaw that she had installed on her laptop or something.

[15:15] By the way, no one, please,

[15:17] never do that. But somehow, whatever instructions she gave them,

[15:23] it decided it wanted to delete her inbox.

[15:25] Right. And it started deleting her inbox. It was like, you think of it like, oh, well, I didn't tell it to delete my inbox. I said, but you gave it control to do everything.

[15:35] Evan Benjamin: Everything.

[15:35] Debbie Reynolds: So even when you didn't tell it specifically not to do that, it's making those inferences.

[15:42] Evan Benjamin: Yes.

[15:42] Debbie Reynolds: You know, so I'm telling people,

[15:44] like, the example I gave is,

[15:46] let's say, for instance, you give agents, like, all this control over your, you know, finances and different things like that. And let's say you said, I want to go on a trip to a Milan.

[15:55] And they say, well, I sold your.

[15:57] Because this is how much the trip will cost. And so now you can go to Milan. It's like, wait a minute now, wait a minute, wait a minute.

[16:04] Evan Benjamin: It's making autonomous decisions.

[16:06] You're going to hear a lot of people say, why can't you just trace that?

[16:11] And, Debbie, there's key words that privacy people need to learn, but they're not learning.

[16:18] So if a lot of people on LinkedIn are writing about traceability and observability, and they're saying that if you run an agent, you can trace what it does, you can do.

[16:30] It's like you keeping track.

[16:31] It's like you starting a stopwatch and saying, go. And you keep track of all its steps,

[16:36] and all those steps are written in a log, and you're supposed to save those logs. Okay?

[16:42] Companies don't know how long to save, and they delete it.

[16:46] And if you delete it,

[16:48] you just deleted the only record you had of what that agent did.

[16:53] So we need a retention policy on those logs,

[16:57] not just the data that the GDPR says we got to get rid of data after a certain amount of time.

[17:04] But what about the logs? If GDPR tells me I have to delete the logs, I lose every trace of what that agent does.

[17:13] That's going to hurt me. So we have a conflict because people don't know what to say, then they delete everything.

[17:20] Right?

[17:21] But I mean, it's a conflict because if you work for a, the government,

[17:26] if you work for a really highly regulated agency, they're going to say, keep everything for a long time.

[17:33] I can't say exactly for who, but I work for a big consulting company that managed all these agencies.

[17:40] And we didn't know when to delete it. We ran out of storage.

[17:44] We just kept saving and saving and said, we need more storage, we need more storage. Because we kept asking, why can't we delete it? It's been,

[17:52] well, we gotta keep it for 10 years. Well,

[17:55] you're gonna spend thousands on storage because you can't delete.

[18:00] What would GDPR say about that?

[18:02] Debbie Reynolds: Right.

[18:02] Evan Benjamin: You know, so that's what. There's just so much tension. I don't understand it. I feel like I can fix it and then I can't fix it. Don't know what to do.

[18:11] Debbie Reynolds: Well,

[18:12] one thing I tell people, and I tell them a lot, data systems are created to retain data, not to.

[18:19] They're created to remember data, not to forget it. So when we're talking about like purpose limitation and privacy by design and deleting stuff or de risking in some way, you're really saying you have to take a step,

[18:35] an additional step that's probably different from what the maker of this tool you use wants you to do with that data.

[18:45] And so I think that creates like technical problem. But then also, as you said, a lot of companies don't want to delete stuff.

[18:53] And so this is. And for me, this is the unique thing about privacy laws. It's like, I can't think of any other law regulation,

[19:01] the saying,

[19:03] you know, at a certain point you need to decide where certain data needs to go away.

[19:08] It can't be like forever. Data and companies have for the longest time operated on the assumption that they can keep everything forever. And we know that keeping everything forever now in a privacy framework is really creating more risk for you and for the individual.

[19:28] What are your thoughts?

[19:29] Evan Benjamin: You want to hear something scary?

[19:31] Law firms,

[19:33] when they settle a case, they archive case and they keep it for a long time. And I used to ask them why, and these are some of the bigger law firms.

[19:41] And they said,

[19:42] in case the client comes back and reopens later. So can you imagine if you want a case or lose a case and you just. The party never wants to go back and do it.

[19:52] But the thinking was, what if they come and open it later? So we have to find space for that.

[19:59] Why is that allowed, Debbie, in the real world, like,

[20:03] I could be trained on one thing and go to a company and tell them what to do and it would mean nothing.

[20:11] It would mean nothing. And then I would be almost sad that why are they teaching certain things in the classroom and making you learn this on the exams and then you can't implement that in real life?

[20:23] I just don't understand it. And I want this to be enforceable, where people come to the compliance person first and the privacy person first and the legal person first and say, what should we do?

[20:34] And it's backwards.

[20:36] We find practices that are not being done and then they suffer harm and then they go to legal and then they go to compliance and said, what should we have done?

[20:45] Not what should we do right now, what should we have done? So I just. I don't understand. I'm hoping it's.

[20:53] I ask people in the eu, is that different? Are we doing it backwards?

[20:57] And the EU seems to care more about regulation and the order of things that they do. But I don't understand our innovation culture.

[21:07] And it only takes a few seconds. It only takes.

[21:10] I'm in every day. I have to make sure that clients are compliant. I use software to keep track of everything.

[21:16] If I spend an hour doing that,

[21:18] I can save you a lot of heartache, but no one's checking. Like, if no one's going to check that and just put that on the side and say, they don't even know what I'm doing, what do we do?

[21:29] You can't force someone to light regulation. But I think, Debbie, a lot of people are being scared. I think a lot of consultants are scaring people and saying,

[21:39] let's figure out your fines if you do this.

[21:42] Let's figure out what 4% of your annual turnover is. Is the answer to scare people, or the answer to,

[21:50] you can't say help anymore, because they have to be motivated to do this. So, again,

[21:55] that's not going to change how law firms behave. But I do want to. I will tell you, if you have a second,

[22:00] something even more scarier With AI people,

[22:03] that doesn't happen with electronic discovery. And this,

[22:07] I couldn't believe this. I'm trying to teach people how impactful this is going to be.

[22:13] You know when you people take a big foundation model like Gemini or OpenAI and they do what's called fine tuning and they fine tune it on very domain specific data.

[22:24] Right.

[22:26] You know what we found out?

[22:27] If you company took some customer service transcripts and they fine tuned it on that to train it to make it more specific. Do you know that updates, the fine tuning can change some of the parameters and actually update the model weights.

[22:42] But if you tell me to delete data and I go back and I delete the customer service transcript that I used,

[22:51] some of that information is still in the model weights.

[22:54] Debbie Reynolds: Right.

[22:55] Evan Benjamin: I didn't delete the parameter, I didn't delete the model weight.

[22:59] How was some dpo, how is some surveillance authority in the EU gonna know that I didn't delete, that it persisted?

[23:08] So if someone says the right to be forgotten, I have the right to erasure,

[23:14] what am I going to do?

[23:15] I can't erase that from the model.

[23:17] How do I explain something like that? Debbie, it's very hard to explain that to a CEO or compliance team.

[23:25] They just can't visualize that.

[23:26] Debbie Reynolds: Right. Because they're thinking of it as, okay, this is what I put in.

[23:31] And if I take out what I put in,

[23:33] then it's not there anymore. It's like, no, well, that, like the metadata is there, the inference is there.

[23:39] Evan Benjamin: Right.

[23:39] Debbie Reynolds: That gap is there. There's information in the model saying that it was there, there was something there that you tried to delete and it can actually reconstruct it. Right. So you're really not deleting.

[23:50] I did a thing about this,

[23:53] about this particular thing. There was actually a,

[23:56] a research study that had come out where they had trained a model. Something about, I don't know if it was like bananas or monkeys or something. They were like, okay, you love monkeys, you love bananas or something like that.

[24:06] Evan Benjamin: Right, right.

[24:06] Debbie Reynolds: And, and so they took all that data out and they did like this huge scrub of this data set or whatever and they actually reconstructed the model and they could not get that information out.

[24:18] Evan Benjamin: Oh, wow. See?

[24:21] Debbie Reynolds: Right, right. So exactly is as you say.

[24:24] So the information is still in there even though they removed all the trace elements. So it's like when you make a cake, you can't take the eggs out.

[24:34] Evan Benjamin: You can't. Yeah, you can't. It's in there. It's in there.

[24:37] Right. But why aren't people worried that this is going to be.

[24:40] You can't really do a data erasure or if someone asks that, you can't explain that to a user, you can't explain that to someone who was harmed and in Italy or something.

[24:50] That's my concern.

[24:52] How do we explain that to legal teams? Because this is technical.

[24:56] This is technical.

[24:58] And someone said, show me the weights. I can't,

[25:02] I can't. That's proprietary. I can't. But I can tell you it's happening because there are people who are actually developing really cool GPTs custom that will try to go and look at parameters and show you what was changed.

[25:22] They can read it, they can't change it because if OpenAI sets its own parameters. But if someone is actually working on a custom GPT that'll read that so that they can show legal teams and say, see, I can't change that.

[25:36] And once the attorney sees that, they'll say, I understand now what do we do?

[25:41] So we need something like that. We need these advanced tools.

[25:45] Debbie Reynolds: I think that's true. I know I had told people when people started using these LLMs and things like that, and they were talking about deletion, I said, but you can't delete stuff, right?

[25:57] Not in the way that you think. Right. Even if they took out the original information,

[26:01] the traces of it are still there. So.

[26:04] Evan Benjamin: Right.

[26:04] Debbie Reynolds: Oh my gosh. I want to talk a little bit about GDPR and breach of data. Right?

[26:11] Evan Benjamin: Okay.

[26:13] Debbie Reynolds: So a lot of times,

[26:16] I don't know, I want your thoughts about this. So when I think of privacy laws and regulation,

[26:22] there's a whole ocean of stuff that we don't catch in these regulations because a lot of them assume, first of all that you know, the company that has your data.

[26:33] Right. First of all.

[26:35] And second of all, I think it also assumes that you have some type of agency about what happens to this data. So like let's say the data broker business.

[26:47] This is one reason why it's been extremely hard for regulators to try to. Even though we're still trying, you know, we have, you know, like the delete act and different things in different states trying to do this.

[27:00] But the reason why data brokers have been sold difficult to deal with because a lot of,

[27:05] a lot of regulation is written as if you are a customer of a company and you give them your data. And there's this whole stewardship path that happens. But that's not all that happens with data.

[27:19] So that's what concerns me,

[27:20] especially around inferences. Right.

[27:23] Where you don't know these people.

[27:26] They're making decisions about you with stuff that you don't know about. Right. And things that you don't even have a right to see. So how can you be forgotten?

[27:34] How can you make a right to be forgotten? Correct. How can you make a right to be forgotten? Request. Request something that you don't know about.

[27:43] Evan Benjamin: Right.

[27:44] Debbie Reynolds: What are your thoughts?

[27:45] Evan Benjamin: Well, what's complicating this is cloud.

[27:48] And we're developing apps that use multiple clouds,

[27:52] even for the government. Let's say we develop an app in Microsoft Azure. I found out that a lot of people are using multiple clouds for redundancy. But you can do one service in Azure and one service in aws.

[28:05] And because of all this,

[28:07] you've just extended the reach of this to multiple processors. Right.

[28:11] And the problem is we have to file a DPA with every processor we use and with every sub processor. And that doesn't get done.

[28:19] So the processor,

[28:21] I'm using GDPR language now, the processor turns around and subcontracts out to a processor I don't know.

[28:28] And they turn around and do something.

[28:31] And at one point these processors are going to determine the means of processing and turn into a controller.

[28:38] And now they have the right to spread that data all over the place. And no one told me because I don't have a DPA with them. And I even did a data protection impact assessment that listed all the processors I know,

[28:51] but there were five I didn't know.

[28:54] Whose fault is that?

[28:56] GDPR says you have to keep a record of all your processing activities. Article 30.

[29:01] And even if I have a document that shows all that and you turned around and gave it to a sub processor and you didn't tell them that they were under the same rules that they were.

[29:11] A sub processor is under the same rules as a processor,

[29:14] but they don't know that.

[29:16] So this is where it's going to get weird, Debbie, because an agent is going to be acting as a processor and the agent is going to turn around and give it to another agent who becomes a sub processor.

[29:30] So everything you said about people,

[29:32] now the agent is replacing it. And agent is a controller, the agent is a processor. The agent is a sub processor. I don't have any DPAs with any of these agents.

[29:44] Now we're in deep water. What are we going to do?

[29:47] It's going to get worse,

[29:50] right?

[29:51] Debbie Reynolds: So another sneaky thing that's happening. I'm not sure if you've seen it, but I tell people all the time, look very closely when let's say you subscribe to a service or use any type of cloud thing and they send you notice saying that they updated their privacy.

[30:07] Privacy, you need to look at that really, really closely. Well, what some of these companies are doing, and I think they're going to start doing it way, way more once they,

[30:16] once they start implementing more age,

[30:19] is that some of those companies were saying, well, initially we were the controller,

[30:24] but now you're the controller.

[30:26] Evan Benjamin: Right.

[30:27] Debbie Reynolds: Because now we're gonna give you the access to the tools and then you're gonna have to wear the crown. Right. To do this stuff. And I think companies don't know what to do.

[30:38] Cause they think, oh, well, I subscribe to the service,

[30:41] you all do all this stuff.

[30:43] You're the controller. Cause you have all the data or you have all the tools that they're like, okay, I'm going to give you the tools and then you're going to be the controller.

[30:51] And even though these companies don't know how, what to do, I think it's going to be a mess.

[30:56] Evan Benjamin: Yeah, I mean, but that's what I'm saying. But they don't know that because they don't read, like you said, they don't read the updated terms of service or policy.

[31:05] Right. But you have to think why it's different. These Frontier Labs,

[31:10] they basically say you're on your own.

[31:14] Right. They indemnify themselves. So they say if you, if any harm comes, don't come to them.

[31:20] But we're finding out in courts they still have some liability. Right.

[31:24] So does anyone really talk about liability when they talk about controller, processor or provider deployer? Because if they have to know where liability isn't shared, like it depends on what, you know, what someone did, or they're still talking about, do we have strict liability or do we have shared liability?

[31:46] What's going to happen when you give all the authority over to agents,

[31:51] even if the terms of service and the privacy policy says here it's all in your hands, but you turn around and give that to an agent,

[31:59] what have you just done?

[32:01] Is the court going to say that you assigned all your rights or are they still going to say, debbie, you're still liable even though your agent,

[32:09] you gave that authority to your agent, but Debbie, you're still liable.

[32:14] How do we write that in? That's not written in the contract.

[32:19] So I really wish we'd talk more about, like as a attorneys, we're talking more about liability. I wish they, there were more seminars on privacy and liability or agents and Liability, like everything you talk about, just add the word liability.

[32:34] Because you just have to know you're going to share part of it. Even if you hand everything over to an agent,

[32:43] you're going to share part of it.

[32:44] And that's what I'm concerned about. People are going to say it's not going to happen to me. And there's a lot of SMBs. I'm letting you know now. There's a lot of small to medium.

[32:53] Who says it'll never happen to me and it's going to happen to anyone.

[32:58] My take is it's going to happen to anyone. You think the notifying bodies in the EU cares about. They give some exceptions for SMEs and SMC, but you violate, you're still liable, you can still pay fines, right?

[33:13] So I don't know, in your education,

[33:15] do you start off with a warning or do you build up to the warning when you talk to people about privacy?

[33:24] Debbie Reynolds: I,

[33:25] I'm not the type of person that leads with fines. So I don't try to say if you don't do this, you're gonna get fined. Especially in the U.S. i think we have a serious problem there because we don't have like a lot of regulation around AI, which that's a whole other thing I talk about.

[33:41] But my thing is I tell people like, do you wanna lose a dollar to make a dime? Basically?

[33:48] So I was like, do you wanna lose the trust of this customer or whatever to do this little other side thing, or do or not really provide the proper stewardship that's required?

[34:01] I don't know, is an interesting issue. So I try to tell companies, hey, this is the lay of the land.

[34:06] Best, worst case, middle ground. Here are your options.

[34:10] You can only play the cards that you're dealt, right? So it's like, here's the deck,

[34:15] you tell me what you want to want to do and it will go in that direction.

[34:18] Evan Benjamin: Right?

[34:20] I would like to just remind people, as we come to the end, I just want to remind people that in electronic discovery they always told us to do a data map and they always say,

[34:30] draw, draw a diagram of where the data is going so that we know where to go look. Like when we do collection and preservation and the other side wants to look for data, we have to show them where all the data was.

[34:43] Like all the shares, all the network drives, all the hard drives and everything.

[34:47] Well, guess what? Guess what I'm going to tell you, Debbie.

[34:50] For agents,

[34:52] you have to do an agent map.

[34:55] And you have to map everywhere the agent touches, plugins, APIs,

[35:01] every vendor it goes to. Because if you don't, you're going to lead to transfer violation, data transfer violation,

[35:11] and data minimization violation. So just like you need a data map, you need an agent map. Everything that you and I learned in electronic discovery for data applies to agents.

[35:24] That's my lesson today.

[35:26] Debbie Reynolds: That's a good lesson. And then I predict that metadata is going to be an even bigger thing,

[35:32] because in order to do that traceability,

[35:36] you will have to create more metadata in these systems to be able to read what can and can't be used for what purposes. That's my prediction.

[35:44] Evan Benjamin: Right. Oh, I can't wait. I'm gonna keep my eye. I'm gonna see.

[35:48] Six months from now, I'm gonna be calling you and say, debbie, you were right. You were right.

[35:53] Debbie Reynolds: Oh, my gosh.

[35:54] Well, if it was a world according to you, Evan, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be human behavior, technology, or regulation?

[36:05] Evan Benjamin: Okay. As a privacy technologist,

[36:11] I want to turn privacy into engineering.

[36:13] When I do compliance,

[36:15] I do what's called GRC and for AI, and people say, you can turn GRC into engineering.

[36:22] My wish is that we turn privacy into engineering.

[36:26] And my wish is that more people take the CIPT exam and they will learn more about privacy technology than they've ever learned in any AI governance.

[36:38] Plus,

[36:39] turn privacy into engineering and go learn some cipt. That's my wish.

[36:45] Debbie Reynolds: That's a good wish. That's a great wish.

[36:48] Well, thank you so much, Evan. This is fantastic. I'm sure we'll be in touch in the future, but this is great. Thank you for being on the show.

[36:56] Evan Benjamin: Thank you, Debbie. I enjoyed it. Thank you.

[36:59] Debbie Reynolds: All right, talk to you soon.

[37:00] Evan Benjamin: Okay, Debbie, thank you.

[37:01] Debbie Reynolds: Thank you.

[37:02] Evan Benjamin: Bye.