"The Data Diva" Talks Privacy Podcast

The Data Diva E63 - Ron Hedges and Debbie Reynolds

January 18, 2022 Season 2 Episode 63
"The Data Diva" Talks Privacy Podcast
The Data Diva E63 - Ron Hedges and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds talks to Ron Hedges, Senior Counsel at Dentons and retired United States Magistrate Judge for the District of New Jersey. We discuss our shared interest in emerging legal and technology issues, his passion for electronically stored information (ESI), technology and legal education, his interest in Data Privacy laws, the basics of privacy law in the United States, Data Privacy from a civil and criminal standpoint, how the growth of surveillance impacts Data Privacy, the difference between necessary and unnecessary data retention, his concerns about AI and the evolution of social media, and his hopes for Data Privacy in the future.


54:16

SUMMARY KEYWORDS

privacy, statute, data, information, litigation, deal, people, law, technology, surveillance, involving, protected, issues, breach, ai, talk, private, fourth amendment, organization, state

SPEAKERS

Debbie Reynolds, Ron Hedges


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy Podcast, where we discuss Data Privacy issues with industry leaders around the world, information that businesses need to know now. I have a special guest on the show today, Ron Hedges. He is Senior Counsel at Dentons law firm. He's also a retired United States Magistrate Judge for the District of New Jersey. Hello, Ron.


Ron Hedges  00:45

Morning. Thank you for having me on. Nice to be here.


Debbie Reynolds  00:49

Yeah, this is such a fun podcast for me. So you and I met because we have a friend in common, Gail Gottehrer, who's a lawyer in New York who has worked on emerging technologies. We've been friends for many years. She invited me to be a part of the New York State Bar Association Technology Committee, for which you are a co-chair. And that's how we met many years ago. But since that time, we've done a lot of different collaborations together. And I feel like we've done so many this year, I've almost lost track.


Ron Hedges  01:31

We have done a lot this year. And we have another one coming up in January on Artificial Intelligence.


Debbie Reynolds  01:36

We do, we do. So I just had to write a list of some of the stuff that we've done together. And it's not even a complete list; it's just stuff that I knew for sure that we did, and obviously we're going to do a lot more. The cool thing is, you and I tend to gravitate towards issues related to emerging technology and emerging legal issues. So that's why I think it's so fun to collaborate with you. I think you beat me to the punch on the Metaverse; you'd already done some things on that. And this year, we did some things on voice printing and post-quantum cryptography. We recently did one on data literacy for attorneys. And as prolific as we have been in our joint collaboration, you're even more prolific in other things that you're doing. So I always follow your feed and see places you're speaking or things that you're writing. I just want to do a short list of some of the stuff that we did together this year. We've done at least one or two things for the Practising Law Institute, the American Health Information Management Association (AHIMA), the Louisiana Health Information Management Association (LHIMA), as I said, the New York State Bar Association, and I think we did Lawline, or we're going to do Lawline.


Ron Hedges  03:16

I think we've done Lawline, but this one coming up is a new Lawline program; AI is what we're doing in January. And we have other PLI events and other events coming up in 2022. We have a lot going on. Hopefully, some of them are even going to be in person again. So we'll see what happens with that.


Debbie Reynolds  03:35

So we speak together, we write things together. I think we contributed to a report for the New York State Bar Association. And I think there was something else you wanted me to collaborate with you on; we'll talk about that after the show. But tell me, what drives your passion for education and being an advocate for not only technology but how it intersects with the law?


Ron Hedges  04:00

Well, let me first start by saying that whatever I say today is a personal opinion. It doesn't reflect my firm, and I'm not giving anyone legal advice. I'm just trying to inform and educate. But to go back to that question of yours, I started on the bench in 1986. I can't tell you the first matter I had involving electronic information, but between 1986 and when I left in 2007, obviously, the world changed in litigation as to lawyers knowing what ESI, electronically stored information, is, looking for it, producing it, and fighting about it if it's not produced, or if it's not produced in the right form, or if it's lost. As for teaching, I have always been interested in teaching. I taught at Georgetown Law the first class on electronic discovery; I also have taught courses at Seton Hall and at Rutgers on different topics. One of the fun ones I taught at Rutgers Law was on science fiction and the law. This I did maybe in 1990 or so. And, interestingly enough, one of the things we talked about in the class, where I had taken science fiction stories and asked how we were going to deal with this as we move forward, was electronic surveillance. So I've always been involved in that. I still do guest lecturing at different schools. I was at Seton Hall a couple of weeks ago, and I'm going to be there again next week. I've done things at other places as well. So I've enjoyed teaching, and I've just fallen into electronic information, civil and criminal, because when I was on the bench, I dealt with warrants as well as civil cases. And it's an interesting topic. The other fun thing about it: there's something new every day. That, I think, is what got me into technology. There's always a new technology to think about, and for me that's always fun, because I have to take some things I know and maybe extrapolate them or draw analogies to what we've done before.


Debbie Reynolds  06:22

Yeah, I agree with that. I didn't know that background; that's really interesting. That explains a lot about you and your manner. I love the fact that you speak a lot, but you're also a really good listener. You listen very well to people, you communicate really well, and you do it in a way that is inviting, so people feel brought into the conversation as opposed to being shut out. Even when you're talking about ESI, some people just say the acronym and keep moving; you say what it is, electronically stored information. So you're bringing people into the conversation. And I think that's really important, especially as we're talking to people in all walks of life, with all kinds of levels of understanding of technology and law.


Ron Hedges  07:11

I miss doing things in person. Number one, I like to travel, and obviously, we haven't been doing much of that in the last two years or so; hopefully, things are going to open up again, and we're going to be able to do live events. There are a couple scheduled for next year already, although I had a couple scheduled in December that went virtual or were put off just because of the pandemic. But I miss the travel. On top of everything else, I miss live, in-person programs, because you have a dynamic when you can talk to people. Frankly, I can walk around a room and ask somebody a question. I can't do that on Zoom, really, or any other platform. And, frankly, I'm sometimes disappointed; I might be speaking to an audience on Zoom of a couple of hundred people, and I get one question or two questions. I just think that's the nature of the platform and the fact that you're not in front of someone.


Debbie Reynolds  08:16

Right. Well, tell me about your interest in privacy. This is fun for us, because we exchange articles with one another all the time. When we see something, we're like, what, I can't believe that's happening, and we try to bring it into our conversations or into the collaborations that we do. I think it's really interesting. So tell me, what is it about privacy that got your interest?


Ron Hedges  08:45

I do a lot on ESI, or electronic information, being brought in in the criminal context. And I have put together, over the years, a compilation of case law and other resources on Fourth Amendment issues, Fifth Amendment issues, Sixth Amendment issues, evidence issues, and discovery issues in the criminal area for electronic information. That compilation is updated semi-annually, and it's hosted at the Massachusetts Attorney General's Office website. So that's available if anyone wants to see it. But that's fascinating to me, because we're dealing with whether there's probable cause to get a warrant, and that raises privacy issues. So that's one area that interests me, the criminal side, and then there's the civil side. Privacy is an offshoot of everything I did when I was on the bench. I was on a regular basis faced with issues about whether I should allow discovery into something that's, quote, private, close quote. Maybe it's medical history. Maybe it's financial information. Maybe it's tax returns. But those are all things that I dealt with on the bench, and that got me interested in privacy. Another area I'm interested in and writing about is whether or not materials can be exchanged in discovery under confidentiality orders, and whether things can be filed under seal. That's also privacy-related to some degree. And that all got me into the new technologies involving security and the like; we've done programs together on cybersecurity, and there are more coming up. Dealing with cybersecurity, obviously, involves privacy. I mean, privacy and cybersecurity are really two different things, in a sense, but that interests me, and privacy got me into privacy laws. We can talk about the GDPR all day. I don't talk that much about it, frankly, because I don't do that much on cross-border transfers between the US and the EU. But certainly, there are American companies that are impacted by the GDPR. We also have privacy laws in different states.
We have a sectoral approach on the federal level. And a lot of states still have that, too. So those are all the different reasons why I became interested in privacy issues. And then, as I said, cybersecurity issues today.


Debbie Reynolds  11:26

Exactly. Excellent. I want to talk a bit about this, and this is something that sort of fascinates me. When people hear us talk about privacy, they talk about it in such broad terms that you have to kind of flesh out what they mean by privacy. So, like you were saying, maybe in a criminal context, if someone's phone is taken, and they say, okay, this is my private phone, maybe there's data on there that's, you know, secret to them, but it's not necessarily considered legally private, right? There's a difference between secrecy, confidentiality, privacy, things like that. So just flesh those concepts out for me, because I think people get confused.


Ron Hedges  12:13

Let's talk on the criminal side first; privacy is rooted in the Fourth Amendment, which talks about the warrant requirement and the particularity requirement. So before you can seize things, if you're in law enforcement, you have to have a reason, and you have to know what you're looking for, more or less. That's the easiest description. The Supreme Court, in case law over the last 60 years or so, has developed this idea of subjective and objective expectations of privacy to decide whether the Fourth Amendment attaches to something. So that basically means: yeah, I have a cell phone. I think it's private because there's information on it that's private. But is society prepared to accept that that information is private, so that the Fourth Amendment applies to it? The answer to that, generally speaking, is yes. There is a leading Supreme Court decision called Riley v. California, where a guy is stopped, the government wants to look in the phone, and an officer looks in the phone without a warrant. It eventually winds up in the Supreme Court, and the Supreme Court said, look, this is private information. It's protected by the Fourth Amendment. If you want to get into it without a warrant, you have to have a circumstance or reason not to have a warrant. And that's called search incident to arrest. You get to search incident to arrest without a warrant because you're looking to avoid the destruction of evidence and to protect the officer. So the Supreme Court said, well, you know what, it's a cell phone; the officer is not endangered by it more than by any other object, other than a weapon or something like that. And as far as, excuse me, that was my phone. As far as the destruction of evidence.
Well, there are things called Faraday bags, and police officers can stick a phone in a Faraday bag, which is fascinating in itself, because when I go to programs now, I see companies offering for display and sale Faraday bags, or cages, whatever you want to call them, including one big enough to drive a car into. And you stop and think about cars these days: the last time I had someone look at this, there were at least 21 computer systems on a car that collect, retain, and sometimes transmit data. So if you want to protect everything in a vehicle, you drive it into a Faraday box, whatever you want to call it, until you get a warrant. So that's the criminal side. We could spend an hour talking about nuances of that. There's a Supreme Court decision that came out three years ago called Carpenter v. United States. The police were investigating a series of robberies in the Midwest, around your area, and they tracked the suspect's cell phone location for over 120 days. And the Supreme Court said, look, this is too much. If you track what someone does for that long, you really can find out about a person's life, and that can trigger the Fourth Amendment, although the Supreme Court said it has to be seven days or more of tracking. So we've got tests to deal with there. But interestingly, it was also a five-to-four decision, and I wonder whether, if that case came to the Supreme Court today, you would have gotten the same ruling. But leave that one aside; that's just political musing, or philosophical musing, about what the majority of the Supreme Court might do. So that's the criminal side. On the civil side, privacy, you go back to the things I talked about before. We have always said that tax returns are protected and medical information is protected. Medical information is protected because of HIPAA; tax information is protected, my recollection is, by a statute or regulation that says it's protected.
But if we're talking about what's private these days, on the business side, trade secrets and intellectual property fall within privacy. It's not personal privacy, but those are things we protect because we care about them; they have value. And then on the civil side, I hate to say we know it when we see it, but someone would come to me on the bench and say, Judge, look, here's some sensitive information about marital status, or this or that; I don't need the world to see it. And unless the other side really had a good argument, I'd say, okay, you can exchange this in discovery, but you do a confidentiality order. If you want to use it later on, there are ways to deal with that. And if you want to file it for some reason, then you've got to deal with the First Amendment and a right of access. Again, we don't need to go into that. But the interesting thing that's coming up now, beyond this concept of sectoral privacy I mentioned before, Debbie, and that's HIPAA for medical information and Gramm-Leach-Bliley, if I have the statute right, for financial information: the statutes that are popping up now, and California has one, Virginia has one, Colorado has one, are comprehensive privacy laws. They all define information that's deemed private. It can be biometric information, financial information, addresses, whatever. But you look at the statutes, and then you decide what can be protected. So there are a lot of approaches to privacy. I don't know that there's any one uniform way to look at it, Debbie.


Debbie Reynolds  18:19

I agree. And it's getting more complex because all these states, and even cities now, are passing their own statutes or regulations related to privacy. I was just on a call with someone this morning, and we were talking about the confusion about not only what's private but what the states are doing. Two things that I think the state regulations on privacy are doing: one, they are creating a system where notice isn't sufficient, so they're creating these consent requirements. And two, they're categorizing data sensitivity levels in some way, so some data carries bigger fines than other data because it's considered more sensitive. And then the terminology isn't the same in the statutes either. I don't think the New York SHIELD Act calls sensitive data "sensitive data"; I think it has tiers of data. So it is confusing, and I think it's only going to get more confusing.


Ron Hedges  19:24

Well, let's leave aside the possibility that the feds do something on a comprehensive basis, because I don't know whether that's going to get through Congress. The last time I looked, there were over 100 bills pending in Congress to deal with something, including AI, facial recognition technology, and the like. There are a couple of main themes I think all these statutes have in common. Number one, there's a definition, whatever the definition may be. Number two, there's a threshold for an entity to be subject to the statute. Now, you mentioned the SHIELD Act. Is it a privacy law? Yeah; it's more cyber privacy, and cybersecurity, I suppose, is the best description of it. In theory, any entity that has any information on any New York resident is subject to that statute, including law firms. So if I'm sitting in New Jersey today, and I had a law firm in New Jersey, I'm admitted in New York, and I'm working with a New York resident, I have information about her that's protected by that statute, and I'm at least in theory subject to that statute. Now, we can argue about whether that really works for one resident or whatever, but the statute seems to say that pretty clearly. So number one, we have this idea of a definition, whatever it may be, and it varies. Number two, we have applicability, and that can be, for example, in California or Colorado: how much business do you do there? How many residents' data do you have? And the like. Then there's some kind of notice that's common, that we're collecting the information or using the information or whatever. There's the consent provision you talked about. There's a school of thought that we should have consent to use, as opposed to opting out; so it's opt-in or opt-out. Most of what I've seen is opt-out. So you're not affirmatively saying you can use it; you're saying I don't want you to use it. And then there's some enforcement mechanism, and that varies all over the place too.
Illinois has a biometric statute that allows private causes of action without limitation. New York definitely does not, and California and Colorado have regulators to deal with things. California allows some type of private action, although the damages are limited, the recoveries limited, as opposed to Illinois, which has no cap on what can be awarded. So there are some common features. And the other one is notification when there's a breach; there's got to be some notice to someone. The Comptroller of the Currency, or someone, just came out with a regulation a day or two ago talking about notification of a breach. And behind all this is the possibility that something's going to be done at the federal level. I can just about guarantee, whatever it is, it's going to be minimal, because you've got to deal with a lot of interests around the country. And that's going to lead to questions about preemption: whether the federal law is going to effectively say, we're the only regulator in this space, and no one else can do anything. There are three different kinds of preemption; we don't need to go into that today. And then we also have the prospect of other states doing things. Washington State came close to a comprehensive law, and Florida came very close to a comprehensive law; those both failed because they couldn't work out whether or not there was going to be a private cause of action or what it was going to entail. I understand they're both being reintroduced or have been reintroduced. There are other states that are looking at this. I just saw a blog post yesterday about Massachusetts having one that's being proposed. So the question is, if you're dealing with all this stuff, which law applies? And if you're a business entity doing multi-state business, you may be faced with five states' laws that apply, as well as some federal law if you're dealing in a particular area.


Debbie Reynolds  23:51

Yeah, it's definitely complicated and can be very confusing. I would love to switch to talk about surveillance. This is a particular topic that you and I like to talk about. I guess the thing about surveillance is, in the past, in my view, like in movies, surveillance was someone peeking through the window with a camera or whatever. It was something that caught someone's attention, and then surveillance happened. But with technology now, we're constantly self-surveilled; we're being surveilled outside or inside, not for a particular purpose, but because we have technology that's recording, you know, like a phone recording our steps and the places that we go, or the cameras outside recording our comings and goings. What is your thought or concern in the legal space about the proliferation of surveillance and surveillance technology?


Ron Hedges  25:01

Well, let's think about that in two ways. Number one, what do we do to ourselves? The easiest example of that is a smartwatch or a Fitbit, whatever you want to call it, that's recording, essentially, body telemetry. It's recording your heartbeat or blood pressure or whatever it's doing. And it may be transmitting it, or it may be storing it somewhere until someone wants it. So we're kind of responsible for that, because we bought a device, and we hopefully know what the device can do and how it works. Then there's the cell phone, and cell phones transmit location information all the time. So that's something I suppose you can say we do voluntarily; we don't have to walk around with a cell phone, and we can leave a cell phone someplace, but I don't think anyone thinks about that much anymore, and I doubt many people consciously do that. And then there's surveillance done more broadly than what we do to ourselves: maybe there's a drone flying overhead, maybe there's a camera on a light pole near me, or whatever. So we're being surveilled in a lot of ways. Some of the surveillance we control, and some of it we don't. And that's issue number one. So if I were litigating some issue, let's say someone's arguing they're being illegally surveilled, the first thing I would want to know is exactly what surveillance are we talking about here? What did that individual agree to, or not agree to? And I'd also want to know what laws are going to apply. If I'm walking on a street, and I walk past a light pole, and there's a camera on it, I don't know what rights I have, to be very honest with you. I suppose, if that device is capturing facial recognition information, there may be issues, depending on the jurisdiction you're in, as to whether it can be captured or not. We could spend an hour just working through all that. But those are the immediate things I would think about.
Putting that aside, the next issue is the difference between whether the government does it or whether a private entity does it. Now, I read a law review article yesterday. It was written by Paul Ohm, I believe, who teaches out on the West Coast. There is a procedure under federal law where government agencies can request Internet service providers to preserve information. And his argument is that that's a seizure under the Fourth Amendment, because law enforcement is enlisting an entity, under the statute, to keep information that would otherwise have been disposed of. So there's something about surveillance there; you know, we get information from cell phones all the time. So that's another issue, and that brings in Fourth Amendment issues and statutory issues. There's just a lot there to think about, and you really need some kind of a flowchart when you're talking about surveillance. And then that, of course, brings in a lot of other things. You were talking about peeping toms; there are statutes that criminalize harassment, that criminalize hate speech, threats on the Internet, or threats through social media, whatever. Not much of that is federal law; most of the case law is on the state level, because that's where a lot of those prosecutions take place. And the question is whether or not the statute is overbroad, as invading First Amendment freedom of speech protections and the like. That's where we see a lot of case law. And then the other question, of course, is whether the conduct that is complained of falls within a harassment statute. You see this come up a lot in domestic relations issues, and, if not domestic relations issues, issues where children are being surveilled, and the like. But again, we have to start back with that.
Number one, we look to see what's being done. Number two, we look to see whether there's a statute that fits it. And then, after that, you get to worry about whether there's going to be a prosecution or the like. This also comes up a lot in divorce proceedings and the like, or harassment proceedings involving, say, I was dating someone, I'm not dating her anymore, but now I'm allegedly harassing her. Those are other places where you see all these issues pop up in surveillance, and they raise a lot of problems.


Debbie Reynolds  29:57

Yeah. Talking about the future now, what other issues can you foresee? You're very good at that; if you hear something or see something, you're very good at thinking, okay, this is what the implication of this could be in the future. So what thing right now is concerning you that's on the horizon related to privacy?


Ron Hedges  30:23

Well, the immediate thing that always concerns me is social media. It's not new; obviously, it's not a new technology. It's been around. But it's getting incredibly intrusive. Section 230 of the Communications Decency Act precludes liability for social media providers for various things. And that was done to make social media grow, back in the 1990s. We're past the stage of growth, and these are giant businesses now. And that invariably leads to arguments about whether or not they should be regulated. I mean, I kind of think no, just because I don't know what the regulation is going to look like and what it's going to resolve. But there's so much out on social media now; you can call it hate speech, you can call it right-wing, left-wing, whatever. I don't know what we're going to do with that. But I expect what we're going to see in the next few years is attempts at the federal level to reform Section 230. Some states have already passed laws, of, to me, dubious constitutionality, attempting to regulate social media. But leave that aside for another day; there are lawsuits popping up all the time by individuals who are saying some social media provider took down my content or blocked me or this or that. And the first big hurdle for those plaintiffs is there's no state action here, so you can't rely on any constitutional principles unless you can somehow argue that that provider is engaged in state action or assisting in state action. That leads you to other statutory bases to sue, state or federal, and then you've got this preemption question. So social media concerns me as a general proposition: its growth, and where we're going to go with it. And of course, there are the legal issues; criminal and civil issues pop up all the time involving social media use. That's number one. Number two.
Interestingly, I did a program yesterday where we talked about intangible assets. We're going to see the growth of more intangible assets, like the piece of art that really consists of electrons and sells for X million dollars. There are a lot of things there that we haven't seen much case law on yet. For example, when I die, my social media content: is that mine? Does it go to the provider? If it is mine, and I die, can it be passed on to my son? Can my son have access to it? What happens if my son doesn't have the key to get into my cryptocurrency? That's something different we're going to see more of in the future. What's the liability of a cryptocurrency provider, whatever you want to call it, if I'm the one that has the only key to it, because of blockchain technology, and I die, and my estate doesn't have access to it? I think we're going to see a lot of issues involving new technologies and access. The same thing in the area of artificial intelligence; we've seen a lot going on with artificial intelligence lately. In the program you and I are doing in January for Lawline, we're going to be talking about the discovery of AI and the admissibility of AI. Whenever there's a new technology, we're going to have to fit it into something that already exists as a construct. And just to remind everybody, you know, the Constitution is from the 1780s, subject to the 1865 civil rights amendments, and we have to take all these new technologies and figure out how they fit into something that was never anticipated back in the 1780s, when we wrote the Bill of Rights. So, well, you know, that's where we are.


Debbie Reynolds  34:43

Yeah, that's all true. Oh, my goodness. I want to talk about data retention. Data retention comes up in eDiscovery cases, probably even before a complaint: how much data you have, how much you keep. And then it creates privacy issues, because if you have data that you're retaining longer than you need it, it creates more risk for an organization. So I feel like that data retention question needs to be addressed in all those areas.


Ron Hedges  35:24

We need to separate what we're talking about now. So an organization collects data, it stores data, it uses data. When it's done with its use of the data, and let's assume it's not in the market of selling data or whatever, at some point, in theory, there's no need for the data anymore. All that data should presumably be covered by a records retention policy. And we know a records retention policy is a euphemism for a records destruction policy. So, number one, there may be policies in effect that talk about getting rid of data after a certain period of time; I'd hope there would be. Because the longer you keep data these days, the more you're at risk of a data breach, whether by an insider or an outsider who's trying to get your data. And data is valuable; we know that. So that's number one: records retention policies, and how the retention of data fits into them. Number two, we're seeing all these statutes popping up now that talk about minimization. And minimization means we don't use data for anything more than we need it for. So if we collect the information for sales purposes, we don't turn it over for marketing purposes, so it can be used for something else or given to a different part of a company or division. That's something that can be enforced by a regulator in an action if data is kept too long. But putting all that aside, and by the way, there are some statutes and regulations that require data to be retained. For example, if you're a job applicant, and the company that you're applying to is subject to the EEOC, it has obligations to keep employment data for a certain period of time. Put all that aside. Preservation is a legal concept, and it often arises before litigation is commenced, because, you know, litigation can be anticipated before the complaint is filed. Maybe I sent you a demand letter.
Or maybe we had a conference, and I put you on notice of an issue because you did something wrong. At some point in time, you reasonably anticipate litigation, and we could talk for a long time about reasonable anticipation. If I'm a plaintiff, I presumably anticipated litigation before the complaint was filed, because I had to decide to file a complaint and talk to a lawyer about it before it was filed. If you're a defendant, you might have gotten a preservation letter before the complaint was filed. And even if you didn't, when the complaint was filed and you were served with process, you had to preserve then. So number one, there's a point in time when a duty to preserve attaches. Then you have to think about what the duty extends to. The easiest description: anything relevant to the case. We can go into more detail about that, but if it's relevant to the case, or an issue in the case, the easiest thing to say is we have to preserve it. We can talk about why that is; we can talk about how long we preserve; but you have to keep it. And by the way, that does include information that can be subject to privacy laws. But remember, these are different concepts now. So maybe after the information is preserved, you're the plaintiff, I'm the defendant, and you tell me, I want all this information. I'm filing on behalf of 10 people, because we've all been denied promotions because of age. So I call you and say, look, I understand you're suing on behalf of 10 people, but that's protected information: age information, employment information, wages. I can't give it up. That doesn't mean it doesn't have to be produced. We talked about this before; maybe it's produced subject to a confidentiality order. If you want to use it in court, it may be sealed; that's a totally different concept. But at some point, the litigation ends.
When the litigation ends, all that information that I preserved because of the lawsuit should have been within some type of records retention policy in the first place, and it should go back into the policy. And maybe in the course of the litigation, or maybe a little later, the information became, say, two years old, the type of information that, but for the hold, should be gotten rid of under the retention policy. So someone in the organization should be saying: I've got this information back, I don't have to preserve it anymore. Do I have to retain it because of my records retention policy, or because of some independent regulation? And I think that's sometimes where we have this issue about minimization, Debbie. That data comes back, and unless you have some automated tool put together, or some type of mechanism to check where it fits in records retention, information is kept when it could be gotten rid of. So I would separate preservation from retention, whether retention is because of a statute or reg or an internal policy. And the two come together when the preservation obligation pulls stuff out of records retention. Basically, when the litigation is over, it goes back into records retention, and then you, the organization, have to think about, or have a process for, getting rid of it if it doesn't have to be kept anymore.
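[Editor's note: the preservation-versus-retention decision Ron walks through can be sketched as a simple decision procedure. This is purely illustrative; the categories, retention periods, and field names are assumptions, not any real policy or statute.]

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Record:
    created: date
    category: str          # e.g. "employment", "sales" (hypothetical categories)
    on_legal_hold: bool    # preserved for anticipated or ongoing litigation

# Hypothetical retention periods per category; a real policy would be
# driven by statutes, regulations, and business need.
RETENTION_YEARS = {"employment": 3, "sales": 2}

def may_dispose(record: Record, today: date) -> bool:
    """A record may be destroyed only if it is NOT under a legal hold
    AND its retention period under the policy has expired."""
    if record.on_legal_hold:
        # Preservation trumps retention: the hold keeps the record
        # regardless of what the retention schedule says.
        return False
    years = RETENTION_YEARS.get(record.category, 0)
    expiry = record.created + timedelta(days=365 * years)
    return today >= expiry
```

The key design point, matching Ron's separation of the two concepts: the legal-hold check comes first and overrides the schedule, and once the hold is lifted (the flag cleared), the record falls back into the ordinary retention logic and can be disposed of if its period has run.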


Debbie Reynolds  41:13

Right, right. So for things that, let's say, fall out of scope, where data no longer has to be preserved but it's kept anyway. And a lot of that is where there may not have been any legal obligation for companies to delete, so they could have kept it, you know, forever. But now we're seeing, especially with a lot of the data breaches, that the classes of data a lot of cybercriminals like to get are this data that's kind of lying around that people don't really care about that much, because it may not be as highly protected. It may not have a high business value at that point, but it carries a high business risk if that data gets breached.


Ron Hedges  42:02

Well, then you have to start thinking about cybersecurity. As I mentioned before, privacy is one thing and cybersecurity is something else; they meet because information that's private has to be protected, and that goes to cybersecurity. All these statutes and the case law say you have to be reasonable in what you do. There's not that much guidance about what reasonable means; we're going to be developing that, and organizations are going to come up with guidelines. We've seen that NIST has guidelines. I saw a new set of AI guidelines that just came out, and I emailed you earlier today that a UN agency working with the Department of Defense came up with AI guidelines. The only listing I know of in law, specifically as to cybersecurity or security measures, is the New York SHIELD Act, and that's because the SHIELD Act adopted cybersecurity standards. I can't remember what the initials are of the organization, but the New York legislature wrote into law the security standards that were come up with by a private organization and said, look, these are minimum things you can do to protect data. And interestingly, some of those were about protecting the data physically: locking the doors, controlling access to a room where it is, this or that. So that's something else I think we're going to see developing over time. But unless you've got an obligation to keep it, and you've got a reason to keep it, I don't know why an organization would keep the information, because it's just that much more that can be subject to a breach. And breaches lead to a lot of problems. There are obviously financial issues involved. There may be fines, and there may be sanctions. And there may be a loss of sales, because an individual consumer may say, I don't want to deal with these folks anymore; they don't protect my data. And it may hurt the company's reputation. So there's a lot.
There's a lot to think about when an organization decides to keep stuff longer than it needs it. And to me, saying, well, I don't know what we're going to use it for, but maybe in two years we'll think of a use, that doesn't cut it.


Debbie Reynolds  44:27

I agree with that. Let's talk about AI a bit. The thing that concerns me about AI and automated decision-making is that I have a lot of concerns. I'm concerned about bias in that area, for obvious reasons. I'm concerned about the harm that can happen to people, for which I think there may not be any adequate legal redress. You know, let's say someone can't get into college because the algorithm decides that they, or their parents, don't fit some type of classification or something like that. To me it also brings up access to justice issues. So if someone who is, for example, indigent is impacted negatively by an algorithm, how do they get redress? I mean, I don't know. I know regulation is important, but I feel like we have to be more proactive in how we deal with artificial intelligence because of the harm.


Ron Hedges  45:40

Well, let's separate what the government does from what the individual does. An individual is entitled to due process. And if the government does something to take away a benefit, or decides to confer a benefit or remove it, the government has to act rationally. There's a US Supreme Court decision from years ago that talks about that. So we have seen some challenges, for example, to unemployment benefits and welfare benefits of different types, challenged because an algorithm is being used and it's making choices that are arbitrary, or so the allegation goes. There are some cases on that, not that much. The leading case in the country on this is Loomis; it was a decision out of the Wisconsin Supreme Court. Loomis was being considered, I believe, for pretrial or post-conviction supervised release, and an algorithm decided he's not a good candidate. It was challenged because some of the criteria had to do with the neighborhood he was in and his color. And the Wisconsin Supreme Court effectively punted and said, look, there may be problems here, but the results of this AI, or whatever you want to call it, were just one factor that a judge has to take into consideration, along with all these other issues, when the judge makes a discretionary ruling. The California CCPA may address this. I know there's a provision in the statute that says the Information Commissioner, whatever his title is now in California, who's going to be enforcing this law with the Attorney General, is supposed to adopt regulations about giving access to automated decision-making and what the results were. I haven't seen regs yet on that. I don't know that they're out there; I really haven't looked. This is going to be developing law, unless it somehow fits into one of these privacy statutes, Debbie. There's a lot to go on here.
But I'd say that AI is another one of those fields I mentioned before, when you talked about new things to be concerned about. AI is going to be an adventure for a while now, because we have to think about circumstances where, and how, we do discovery; if you're challenging it in litigation, civil or criminal, how it gets admitted; and how transparent you have to be, or what you need to do, to have, quote, ethical AI. There's a set of standards that a European agency came out with a few years ago that I like; I usually talk about those. The standards I mentioned from DOD have their own ideas about what ethical AI might be. It's another developing area, and just another fun thing to talk about.


Debbie Reynolds  49:04

I think we're coming up with more webinar CLE ideas as we speak.


Ron Hedges  49:10

There's always a new technology, and there's always interest in it. You mentioned the metaverse before; we have a program coming up in January, I believe, on the metaverse. That's fascinating, and it's going to be interesting to see what that's going to lead to.


Debbie Reynolds  49:27

I agree completely. So if it were the world according to Ron, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether it be regulation, human behavior, or technology?


Ron Hedges  49:46

Well, I'd like people who breach things to stop doing it. But frankly, there's a lot of money in it. Willy Sutton, when asked why he robbed banks, said: that's where the money is. There's money in data, and unfortunately, we have a lot of state-sponsored hackers and the like. That's one thing I'd like to see dealt with; I don't know how that's ever going to happen. Here in the States, I'd like to see a single comprehensive privacy law. But for reasons I mentioned before, I honestly do not see that at the federal level. We may have some overarching things on the federal side; there's been a lot of talk about a uniform notice requirement, for example, for breaches or things like that. I don't think we're going to get something the equivalent of the Virginia statute, the Colorado statute, or the California laws at the federal level. I suppose the best we can hope for, Debbie, assuming there's not any kind of uniformity in the country, is that everyone agrees on a set of standards that are going to be deemed reasonable. So an entity doesn't have to do something for Colorado differently than it has to do for California or New York, or New Jersey, or wherever. And to be honest, as much as I preach about how good that would be, I don't know when, or if, we're going to see that agreement on what's reasonable. There's going to be a lot of litigation on this in the future, involving what governments do and what providers do: breach litigation, collection litigation, surveillance litigation, like you mentioned before. That's good for lawyers in the practice of law, and for the experts and consultants that the lawyers use. And we'll see what happens.


Debbie Reynolds

Yeah, that's a great answer. I agree with that, for sure. I also want to thank you.
You know, even though we collaborate on the New York State Bar Association technology stuff, you've made a great effort to reach out to me and involve me in some stuff you're working on. And I just want to thank you, because it means a lot to me, and I enjoyed chatting with you and exchanging articles and jumping on calls and doing stuff. So I really appreciate it.


Ron Hedges

One of the things we as a profession need to do is to reach out to minority populations more and get more people involved. And you're a member of two minority groups, female and African American, obviously. To the extent I can reach out to people of different backgrounds, we need to do it, because we need to bring different backgrounds in. You have different experiences than we do, and experiences lend themselves to developing good standards. I'm glad you're doing it with me. Thank you.


Debbie Reynolds  52:59

This is great. I'm so honored that you were able to do this podcast. It's funny that we're already planning our stuff for next year.


Ron Hedges  53:11

In the works, yeah.


Debbie Reynolds  53:13

So I would love for people to stay tuned for the stuff I'll be publicizing as well, some of the other things you and I are working on for next year. But again, thank you so much; this was fun. Thank you.


Ron Hedges  53:27

Well, thank you very much.


Debbie Reynolds  53:28

Thank you. Thank you.


Ron Hedges  53:31

Alright Deb, goodbye.