"The Data Diva" Talks Privacy Podcast

The Data Diva E195 - The Honorable Judge John M. Facciola and Debbie Reynolds

July 30, 2024 Season 4 Episode 195


Debbie Reynolds, "The Data Diva," talks to The Honorable Judge John M. Facciola, Federal Magistrate Judge for the United States District Court for the District of Columbia. We discuss Judge Facciola's extensive career and his pivotal role in shaping the intersection of technology and the law, particularly in eDiscovery and digital evidence.
Judge Facciola shares his unique journey, which began with his appointment as a judge in 1997 and an early case highlighting digital data's complexities in legal proceedings. This case, involving backup tapes from the Department of Justice, led him to pioneer the concept of proportionality in eDiscovery, utilizing principles from economics to balance the benefits and burdens of data production. His innovative thinking in this area garnered attention from the Sedona Conference, a significant milestone in his career.

The discussion also explores the evolution of technology in legal practice, from the early days of digital data management to the challenges posed by artificial intelligence and deepfakes. Judge Facciola emphasizes the importance of integrating advanced tools like continuous active learning to reduce the costs of eDiscovery, thereby improving access to justice. He shares his concerns about the high expenses associated with digital discovery, which can be prohibitive for many litigants, and highlights ongoing efforts to address these issues through technological advancements.

Judge Facciola and Debbie Reynolds also touch on the critical issue of data privacy in legal contexts. They discuss how privacy considerations are becoming increasingly relevant in discovery processes and the impact of regulations on corporate practices regarding employee data. The conversation underscores the need for balanced approaches to data management that respect privacy while fulfilling legal obligations.
Towards the end of the episode, Judge Facciola reflects on the broader implications of technological advancements on access to justice, particularly for those who may be priced out of the legal system. He advocates for innovative solutions to make legal processes more affordable and accessible, including AI's ability to handle simpler legal tasks without requiring extensive human intervention.

This episode offers a deep dive into the complexities of data privacy, legal technology, and the ongoing efforts to ensure that justice is accessible to all in the digital age. Judge Facciola's insights provide valuable perspectives on the challenges and opportunities at the intersection of law and technology, making this a must-listen for anyone interested in the future of legal practice and he shares his hope for Data Privacy in the future.


43:35

SUMMARY KEYWORDS

case, judge, privacy, information, judges, computer, work, file cabinet, lawyers, discovery, evidence, technology, created, thinking, search, data, justice, sedona, court, cost

SPEAKERS

Debbie Reynolds, Hon. Judge John M. Facciola


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, the Honorable Federal Magistrate Judge for the United States District Court for the District of Columbia, John Facciola. Welcome.


Hon. Judge John M. Facciola  00:41

Thank you very much. It's nice to be with you.


Debbie Reynolds  00:44

I'm happy to have you on the show. Well, we know each other in a couple of different ways. I collaborate a lot with Ron Hedges. He's a former Magistrate Judge for the State of New Jersey, and he and I do a lot of technology and legal things together. Ron had asked me to participate in a conference in Washington, DC. It was for the eDiscovery Federal working group, and it was hosted by the FDIC in Washington, and you, Ron, and I were on a two-hour panel on ethics about AI and technology. Because I collaborated with Ron so much, I said, hey, why don't we do something together? So I'm happy that you decided to be on the show, and thank you so much.


Hon. Judge John M. Facciola  01:25

My pleasure again.


Debbie Reynolds  01:28

Well, I want to talk a little bit about your journey and your career. I know you also teach at Georgetown right now, but I'm always interested in this interplay between legal and technology, and your path there.


Hon. Judge John M. Facciola  01:45

Well, my path was a weird one. I was appointed in 1997 to be a judge and was minding my own business when a case came in. It was an unusual case in that the person claiming sexual harassment made his complaint directly against the head of the agency, the director of the Bureau of Prisons, and said that after he resisted the director's homosexual advances, his career went in the opposite direction, from up to down. During discovery, we came across a situation involving some backup tapes. At that time, the Department of Justice was using a backup system involving tapes, and an argument broke out as to whether the government would have to produce them. Well, this was all terribly new in 1997, and I remember sitting in my chambers, looking at the file cabinet, and saying, hey, wait a minute. This is nothing like a file cabinet. First of all, this information is dynamic. Second, it was created whether or not the people who were involved in the conversation wanted it to be preserved at all. Also, there's so much more of it, and it's unstructured. If I wanted to go over to that file cabinet and find all the work I did in the XYZ case, there would be a folder in there saying XYZ; here, there was none. So, I kept thinking about it and decided that it was appropriate to think about this in the context of proportionality. More particularly, I used the words marginal utility, a concept I stole from Economics 101, which measures the likelihood I would do something when one weighed its benefits against its cost, which is, of course, the very standard articulated in Rule 26 of the civil rules, which emphasizes that discovery should be proportionate, weighing benefits against burdens. So, in that opinion, I ordered that the Department of Justice sample some of the emails, and from that, we would reach a conclusion as to whether an additional search would be necessary, or appropriate, I should say. They did that, and in fact, I found there was no reason to go any further.
So in the middle of all of that, the phone rang, and it was somebody from the Sedona Conference, and they said, Judge, we read your opinion, and we think you've invented a concept here of marginal utility. I said, no, Adam Smith invented that in 1776; I just applied it. But that began my adventure, my adventure in many respects, but one aspect of it was my close association with the Sedona Conference, where I met, among other people, Ron Hedges, my fellow Magistrate Judge, at that point in New Jersey. Sedona led me to work on a lot of things. One that I was particularly proud to have worked on was the Proclamation on Cooperation. Lawyers had somehow gotten it into their heads that zealous advocacy means being unreasonable at every possible opportunity. We simply wanted to say that there was absolutely no inconsistency between being zealous and being cooperative, if you could do so without harming your client's interests. More than that, there was the general work on the Sedona Principles, which have now become pretty standard guidance to judges on how to handle discovery disputes, and the Cooperation Proclamation is cited again and again. Richard Braman, God rest his soul, was the president, and Richard had the brilliant idea of getting judges to sign it as an articulation of what they expected of lawyers. I don't know what the final count is, or if there is a final count, but to my knowledge, over 100 judges have signed it. Now, as all that was going on, I was a Magistrate Judge. Among their many duties, Magistrate Judges review orders, for example, orders under the Stored Communications Act looking for cell site data, and search warrants, search warrants which, in many cases, seek the contents of a digital device, a computer. And I wrote several opinions grappling with how the Fourth Amendment, written in 1789, could apply to these technologies I was grappling with. So those were some of the things I did; the cases and the warrants kept coming at me.
I kept writing. I stayed associated with the Sedona Conference, and that led me to other things as well. I guess the greatest joy of it was the companionship and being on the same journey as my fellow judges. That was really thrilling because we felt we were really the vanguard of what was going on. All of that led to a lot of things: a lot of writings in that area, including parts of books, a lot of lectures, a lot of decisions. And my affiliation with Georgetown continued, going back to early in my career as an Assistant United States Attorney. Indeed, they just told me I now have 20 years of service to the university, of which I'm very proud. I worked with the Advanced eDiscovery Institute, which is the fall program, and then Maura Grossman, Tom O'Connor, and I got together and created the eDiscovery Training Academy. As its name indicates, this wasn't getting off the plane, giving the lecture, getting back on the plane. This was five full days of training, all day long, both from a technical viewpoint and from a legal viewpoint, culminating in the students doing a meet and confer before Judges like me and Judge Grimm and others who graciously volunteered to do it. So on the basis of all that, I created one of the courses I teach at Georgetown, which is called Information Technology and Modern Litigation, and I attempt to equip my students with what I think are the fundamentals they must know if they're going to engage in litigation in the world in which we live.
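The marginal-utility reasoning Judge Facciola describes, sampling a data source and weighing the expected benefit of further review against its cost, can be sketched in a few lines. This is a rough illustration only: the function names, the dollar figures, and the idea of a precomputed "responsive" flag are assumptions for the example, not anything from the case he describes (in practice, the relevance judgments would come from human reviewers).

```python
import random

def sample_prevalence(documents, sample_size, seed=7):
    """Review a random sample of documents and estimate what fraction
    are responsive. The 'responsive' flag stands in for a human
    reviewer's judgment in this sketch."""
    random.seed(seed)
    sample = random.sample(documents, sample_size)
    responsive = sum(1 for doc in sample if doc["responsive"])
    return responsive / sample_size

def further_search_warranted(prevalence, value_per_responsive_doc,
                             review_cost_per_doc, remaining_docs):
    """Crude Rule 26-style proportionality test: compare the expected
    benefit of reviewing the remaining documents against the expected
    cost of reviewing them."""
    expected_benefit = prevalence * remaining_docs * value_per_responsive_doc
    expected_cost = remaining_docs * review_cost_per_doc
    return expected_benefit > expected_cost
```

If the sampled prevalence comes back near zero, the expected benefit of restoring and reviewing further backup tapes collapses against the cost, which is the shape of the conclusion the sampling order reached.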


Debbie Reynolds  07:58

That's tremendous work, and I've read a lot of your work, and it is seminal work, especially around digital information. I want to dig a little bit deeper into something that you said; it triggered some memories from many years ago, when you were talking about file cabinets and paper. Paper documents were obviously more time-consuming to create, and when you thought about file cabinets, people obviously went through a thought process of what was important and what wasn't important. In the digital age we're in now, so much more data is being captured, so sorting through data and figuring out what's important becomes a lot more tricky. I think that was true at the beginning of the Internet, when people were putting digital records into computers and things like that. But I feel like with this next wave of AI, we're reassessing that once again, and I want your thoughts on that.


Hon. Judge John M. Facciola  08:52

Well, I couldn't agree with you more. The problem was, in a pre-digital universe, somebody made a decision whether to keep this or not, whether it was the yearly cleanup of my office or whatever else. Now, that process is infinitely more complicated because everything I've done has been captured, from a silly email to a friend saying I want to go to a baseball game, to one of my judicial opinions. Now, if I were a good boy and took my own advice, I would, at certain times, stop and separate what I needed from what I didn't need. But who has time to do that? The consequence of that, as I looked at this more globally, was that this is where the expense was. The money being spent on eDiscovery was hideous, hideous, and it was just getting a room full of people with banker's boxes and saying, one by one, whether this was relevant. From my perspective, and from the perspective of someone who's deeply concerned about access to justice, in the 90s this was going to bankrupt everybody. No one could possibly afford this unless they had the gold of Midas. So blessedly, during all of that, I became familiar with Information Science, mainly because of my friendship and being on so many projects with Maura Grossman and Gordon Cormack, their creation of ways to use Information Science in the search for discovery, and the notion of continuous active learning. So there came, thank God, this way of searching that could find what one was looking for, and because the machine did it, could reduce the cost and expense terrifically. So I became quite a proselytizer of that, urging the bar and the bench to realize that this remarkable tool was now available to us and had to be integrated into how we thought about these things. Now, Artificial Intelligence is the next stage of that. I have not seen all of the AI tools that are being made available by the traditional vendors in this area, but I know from my own work and my use of those devices how it simplifies the search of looking for things.
So now we've gone from this being revolutionary to being a matter of course in the Federal courts. I worked, for example, with IEEE (I always forget how many E's there are in that), which is the Institute of Electrical and Electronics Engineers. I wrote a preface to their protocol. They came up with an idea, which is: let's not reinvent the wheel every time we do this; let's see if we can reach a fundamental agreement on how we are going to do this search, and after we do that, we'll have an equal agreement that a certain way of doing it will validate it. That is, we will use the principles of statistics and mathematics to prove that, while we may not have gotten everything, we certainly have gotten enough to indicate that the search for anything else is not worth it. So that process, which at one point was revolutionary, is now so much a matter of course that we have lawyers trading protocols, and that's very exciting to me, very, very exciting, because now, instead of lawyers throwing out words about confidence levels and confidence intervals whose meaning they don't know, we have a methodology that has been created by engineers, and we can adapt it to what we are doing. So I hope that will become a popular way of doing this. Frankly, Debbie, the hardest sell is to get through to human beings that they're not perfect. Even worse, we now know they're not very good at this. I remember one day, I was teaching somewhere with a friend. We went eight hours, all day long, with a break for lunch, and at about 4:30, I had done all my presentations. I pulled out the studies done under NIST, the National Institute of Standards and Technology, and the information they yielded, all of which showed that human beings didn't do this well and machines could be made to do it very well indeed. I was just about to finish.
It was 4:30, I was running on fumes, and after all of this, including the math and the calculus and everything else, there was still one hand: yes, I still think a human being should do this. At which point I almost banged my head against the nearest wall. So you run into that; you just have to deal with it. This is not to say that I don't understand why the elimination of human beings from this is a bit terrifying; it requires us to think through how we are doing it, and how we test it and how we validate it is very central. I don't mean to be cynical and suggest that it's some sort of magic button, and we push it and everything's happy. We've got to keep our wits about us, but when we do, we have the hope of reducing the cost of this for all litigants, which is absolutely crucial.
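The statistical validation Judge Facciola refers to, showing that a search "has gotten enough" without reviewing everything, usually comes down to estimating recall from a human-reviewed random sample. A minimal sketch follows; the function name, the normal-approximation interval, and the data layout are illustrative assumptions for this example, not the IEEE protocol or any specific vendor's method:

```python
import math

def estimate_recall(judged_sample, retrieved_ids, z=1.96):
    """Estimate the recall of a machine search from a random sample
    that humans have judged for relevance.

    judged_sample: list of (doc_id, is_relevant) pairs from human review.
    retrieved_ids: set of doc ids the search actually produced.
    Returns (point_estimate, lower, upper) using a rough 95%
    normal-approximation interval; real protocols would use a sturdier
    interval (e.g. Wilson or exact binomial) for small samples.
    """
    relevant_ids = [doc_id for doc_id, relevant in judged_sample if relevant]
    if not relevant_ids:
        return None  # sample found nothing relevant; recall is undefined
    found = sum(1 for doc_id in relevant_ids if doc_id in retrieved_ids)
    n = len(relevant_ids)
    p = found / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)
```

If the lower bound of that interval clears whatever recall threshold the parties' protocol agreed on, the producing party has a principled, mathematical basis for saying that searching further is not worth it.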


Debbie Reynolds  14:47

I agree, and I remember when more documents were put into digital systems, the challenge was that it was just impossible to do it in a manual fashion. And I think, with the way that we're moving forward with technology now, the rapid advancement of it, the complexity of it, it is going to be impossible to do it without using some of these more advanced tools.


Hon. Judge John M. Facciola  15:11

Yeah. I mean, as we talked about at our seminar, it raises all kinds of new issues, but it is better to confront those issues than pretend they're not there or be ignorant of their existence. So this law firm, perfectly understandably, has an overall system for its work. Lawyer number one has an antitrust matter, and a tax issue has popped up. Now, by going into the firm's database, she may find some excellent materials that will help her. But the problem is, suppose that information was developed in the representation of someone else. We may have problems of privilege and of privacy and so forth. So we have to be conscious of all of these, but better to be conscious of them and to think about them than to not be aware of them. There's just no excuse not to know the technology; it's there.


Debbie Reynolds  16:09

Yeah, I agree. I want your thoughts on something that concerns me a lot, and that is the authentication of evidence, especially now that we're in the age of deepfakes and technology that looks real, where we don't understand the lineage, or can't really track, either the chain of custody or how certain data has come about. I just want your thoughts, on a high level, about what's happening with technology as it relates to the authentication of evidence.


Hon. Judge John M. Facciola  16:44

That's a wonderful point, and it is at our front doorstep. Dr. Maura Grossman and Judge Grimm have spent a lot of time thinking about this and have suggested that the present rules on authenticity may not provide sufficient protection. To that end, in October, they made a presentation to the Advisory Committee on the Federal Rules of Evidence, suggesting how the rules could be amended so that we would have a better structure to test this information. They go so far as to say we may reach a point where, even though a piece of evidence may be authentic, the Judge may still not want the jury to see it, because he may decide it is just too prejudicial. So we would have Susan Smith. Susan Smith was passed over for a promotion, and she now sues on the grounds that it was due to her gender; the defendants have shown up with an MP4 file of a recent Zoom meeting, and during that meeting, she appears to be drunk. Now, is it sufficient for her to say, I didn't do that? The defense says, oh yes, you did. Would we simply use the traditional principles and say, well, it's more likely than not that it is authentic, we will admit it and let the jury decide? Given the way these deepfakes are developing and how seemingly accurate they may appear, and given the profound influence that piece of evidence might have on the jury, Judge Grimm and Dr. Grossman have suggested the Judge should be permitted the express discretion not to admit it if he thinks he cannot negate that potential unfair prejudice by anything else he could do. Somewhat revolutionary. It's not the usual way we do it, but it is a matter of extreme concern to the judiciary. And it's interesting: yesterday, John Jorgensen, a forensic scientist and my friend (we are both amateur photographers; he's almost a professional, and we shoot with Leica cameras, German high-end cameras).
He sent me the brochure that accompanies the camera he just bought, which is brand new, and it is amazing; it has software inside that encodes the image so it cannot be tampered with, and the photographer can therefore always establish that it was the product of her work, not the product of AI. And that's in a camera now. So we're on the cusp of something, and we have to think very seriously about how this is going to play out when this stuff is produced and offered in evidence.


Debbie Reynolds  19:57

Wow, that's amazing. I want to talk a little bit about a case. You and I are on a mailing list that Ron Hedges runs, and he shares some really interesting things; I'm sure you saw this one. There was a recent case before a judge in Washington State where someone was trying to introduce video evidence in a murder trial, I think a cell phone video of an altercation that happened outside of a bar, and the Judge actually threw it out because, apparently, they had some technical people, who work maybe in film or some other industry like that, make some AI alterations to that video. As a result, the Judge decided not to admit it into evidence. That's along the lines of what you're saying, and I want your thoughts on that case.


Hon. Judge John M. Facciola  20:43

Well, I think in the traditional way of doing things, we could anticipate that at a pre-trial hearing, each side would produce experts, and we would have a sophisticated discussion as to whether the evidence should be admitted. Now, we may be in a situation where a deepfake is undetectable using traditional forensic means, and as we see on a daily basis, there is a war between the people creating this and the people who say we can detect it; every time one makes an advance, the other matches it. As recently as Sunday, the Washington Post had a story in the business section in which systems that claim to detect AI and deepfakes were independently evaluated, and they were found to fail quite significantly. So the Judges are just like me and my file cabinet. They're on the cusp of this and are going to have to figure it out. Of course, the major concern, if you start tinkering with the rules of evidence or civil procedure, is that this thing, as we have discussed, seems to be changing as we look at it. You go to all this trouble, and you promulgate a rule, and it's moot the day it comes out because the technology has run away. So it's a very difficult situation to be in. And as I say, as the technology advances, we will have to look very carefully at the claims of certain portions of the industry saying we can detect this as a deepfake.


Debbie Reynolds  22:26

Right, especially in high-stakes situations.


Hon. Judge John M. Facciola  22:29

Certainly, in high-stakes situations, yes.


Debbie Reynolds  22:32

I want your thoughts about access to justice and Artificial Intelligence. This is a topic that we talked about in our seminar in Washington, DC, and I know that you're very passionate about access to justice. The thing that concerns me about the complexity of computing is the expense of access to these advanced tools. I feel like we're entering a digital caste system where it's not just the haves and the have-nots; it's the knows and the know-nots, where some people have more information and more insights than others. I want your thoughts on that as it relates to access to justice.


Hon. Judge John M. Facciola  23:10

Some people would say, well, this is a lawsuit. She's the plaintiff; they're the defendant. They will go to trial and get it resolved. That is not what the American system is. We try less than 1% of all the cases filed; the rest go out on motion or, more commonly, are settled. So what we have to remember is that we may reach a point, perhaps we're already there, where even so-called simple litigation is so expensive that people will not go to court and will seek some alternative way of resolving their differences. That is already happening, whether it is in the baked-in arbitration clauses in the contracts you sign when you buy a computer or a watch or anything like that. That would be one way. But in another way, there are a lot of people, particularly in the academy, who think this is an opportunity for Artificial Intelligence to lessen the cost, maybe to the point where some relatively simple things can be done without lawyers' intervention. I would imagine, from what I have seen in most urban courts, a large percentage of divorces are now handled pro se. Now, that digs the Judge in very deeply, to deciding who will pick the kids up after soccer on Wednesday, and it's very demanding on the Judges. But if that's the way it's going... I've often said that in the 17 years I was a Judge, the worst thing I saw was watching the middle class disappear from the court. They were just priced out of it. Congress passed a statute that gave them certain rights, but enforcing those rights may simply have been too expensive. So that's a terrible problem. One of the scary things about eDiscovery is how much it costs. My colleague Craig Ball did something quite a few years ago now; he called it the EDna project. He took an about-to-be-divorced couple and gave five or six of the discovery vendors he knew the facts of the case. The couple did their banking on the computer.
They had email accounts, subscriptions, and so forth, and Craig asked the vendors if they could do the discovery and bring it home for less than $5,000. None of them could or did, and they were as concerned about that as anyone, because that's a market they would like to be able to exploit. But how do you bring the cost down? Another friend told me that in his work, in a big case, storage on the cloud costs a fortune, and it's almost indefinite; when can you end it? The appeal may take a year, and all of this. So that's another rather terrifying cost. I am anticipating that the next few years will see many organizations, both academic and non-academic, looking at what all of this has taught us about access to justice and whether there are alternatives. COVID also had an impact on that. They tell me that if you hold traffic court at night and permit people to come by Zoom, you triple attendance. You also triple the number of fines you collect. We've got to think about this world. Indeed, two days ago, when I was speaking, a gentleman came up to me, a JAG officer who's now specially assigned to do Artificial Intelligence work in the DOD, I believe, and he said, I want to ask you a question I've been asking every judge. What's that? I said. Will the courthouse disappear? I said, you know, that's a damn good question, and I can see why you're asking. So we've got some work to do, and it's something that concerns me deeply.


Debbie Reynolds  27:14

Yeah, I know you're very passionate there, and I really love the work that you do in that area because it needs to be spoken about. I want your thoughts about Data Privacy and how it's seeping into legal cases. I remember, back in the old days, when there was litigation, especially in a corporation, corporations could take employees' data, and they didn't have to ask, or get consent, or be concerned with some of these regulations that are now coming to bear. Also, some of these regulations are trying to make sure that companies are not retaining things longer than they have to. Obviously, they do have to retain data if they have a legal hold or anticipate litigation, but that's definitely changing the business landscape. I want your thoughts on how privacy is impacting legal practice.


Hon. Judge John M. Facciola  28:06

Yeah, well, in terms of the individual's privacy, it's not one of the factors to be weighed under Rule 26. Nevertheless, I've seen some excellent scholarship suggesting it should be. I've also seen some cases where Judges, not really caring that it isn't there, are weighing privacy in the discovery calculus. Why do you need every Facebook entry? That's not going to happen. Okay, we've got to figure out a way to get you what you need and still preserve the privacy of these people. They didn't lose everything; more to the point, they've communicated with other people who have privacy interests of their own. So the question of privacy is now on the floor in discovery and is going to be reckoned with as a ground for not producing something. In terms of the privacy of employees in what they create while they're in the boss's employ, whether they are in the office, which nobody is anymore, or at home on their computers, we are always haunted by the mixed use of the computer. I always tell my students there's something terribly wrong when the day of highest Internet use at work is the day after Thanksgiving, because everybody's shopping, and the next is the day after the NCAA brackets come out. I'm not particularly fond of Joe Blow from Kokomo, who sits there and bets on the boss's computer, but the courts are now working their way through this problem. That is, can the boss tell you, when you go to work for the boss, that everything you produce is ours, and we can monitor it, we can look at it, and so forth? There is a compelling argument that, under the principles of agency law, it seems to belong to the boss. But there are some courts, particularly one in a New Jersey case called Stengart, where a young lady, thinking the axe was about to fall, used her office computer to get into her personal email account, where she talked to her lawyer. The company came in and said that every time she opened that computer, there was a big notice.
It says, whatever you do on this computer is ours; you don't have any privacy right in it. But the New Jersey court nevertheless chastised the lawyers for looking at what was obviously privileged, despite their contention: why in the world did she put it on that computer? There are contrary authorities to that, so this is anything but clearly established. The argument of the corporation is: you're warned, you're told, for crying out loud, to take the phone out of your pocket. Don't use this stuff here, because that also gets us into the problem where we have to go through all of that stuff because you're doing the NCAA brackets or shopping after Thanksgiving, and all of the other issues. You would love to see legislation that talks about that, but apparently it's just a lot of noise. I used to represent Indian Tribes, and they would say of the white man's promises: often do we hear the thunder, seldom do we feel the rain. Well, when it comes to this body of law and we look to Congress, often do we hear the thunder, seldom do we feel the rain. There are regulations that talk about this, but the thing that drives me crazy is that in the same period of time, our European brothers and sisters produced a sophisticated and demanding regulation about all of this, after an endless number of conferences and so forth, and somehow got all the members of the EU to agree to it. It's a remarkable first start, and it's a wonderful chart on which we could now plot our own course. I don't know if anybody is going to do that, but it's hideously overdue. Having as many of these standards as there are States, or cities for that matter, doesn't make any sense at all.


Debbie Reynolds  32:10

It's so complex, and I think it's complex for businesses, and it's extremely hard for consumers to navigate what their rights are. You mentioned earlier that you had done some work around the Fourth Amendment, so I have a question about that. There is a bill, for example, being floated in Congress. I'm not sure if it's going to pass, but it's called the Fourth Amendment Is Not For Sale Act, and what it's trying to do is close the third-party doctrine loophole, where people's digital information, if it's not in their possession, has lesser protection than things that are in their physical possession, and that creates a privacy challenge. I just want your thoughts on the third-party doctrine and how that plays in.


Hon. Judge John M. Facciola  32:56

The third-party doctrine is based on two cases, one called Miller and the other, I think, called Smith, and in a nontechnological world, many years ago, the thesis was that what I share with a third party cannot possibly be protected as private. So if I took this information and shared it with my accountant, I couldn't claim privacy. That was contorted into the belief that when I sent an email, I was sharing it with the computer system that transmitted it. In a case called Carpenter, the Supreme Court rejected that way of thinking and said you had to get a warrant. So I was hoping, in light of that, that the third-party doctrine had died a very peaceful death. Apparently, it has not, and I can understand why someone would want to solidify the notion that using the computer to transmit something does not mean I am publishing it; I'm just using it as a means to get it from point A to point B. Now, that's only part of the many privacy issues that are out there, but I'd like to see that one quickly resolved, because I think the Supreme Court's opinion is quite clear. But you have probably read about geofencing. We just published an article in the Federal Courts Law Review, which is the law journal of the Federal Magistrate Judges Association, in which a Judge grappled with all of the problems that arise when you are grabbing all the phone numbers and the metadata that go through a portal because, in that haystack, you'll find the needle of the one phone number of the guy who robbed the bank three feet away. So the Judges have grappled with that. One of the things I was always very sensitive to, and wrote opinions about, was that if there is a seizure of a large amount of information, then after the filtration of it, so that we can determine what is relevant, that which is irrelevant should be destroyed. It should not go into another computer and become part of a massive collection of information.
Because if it isn't, the people who were using their phones at the time and got caught in the geofence will be surprised to learn that the Feds now have their phone number and their metadata showing who they are and who they may have called. So, as I say, this is significant. The other thing I used to say about my 17 years as a judge was that it is impossible, impossible, to exaggerate the increase in the technological capacity of law enforcement. What law enforcement can do now doesn't even vaguely resemble what they could do when I first became a prosecutor. Debbie, we used to do a lot of narcotics cases. I couldn't get a cop to use a camera if I stood on my head. You know, they're up on an observation post watching this guy deal drugs on the corner. My God, get a camera on him. And now they're at a level of sophistication that is absolutely incredible in terms of what they can do. So this also has to be on our agenda, because data is the heart and soul of so many criminal prosecutions. I was not surprised to see that the US Attorney in the Southern District of New York, who, interestingly enough, clerked for Judge Garland when the judge was on the circuit, had some money to hire people. Instead of hiring lawyers, he hired seven data scientists because of the FTX case, the case with the crypto and the kid with the wacky hair and the shorts. You can imagine what it was to put that case together, since who knows what's in the records. One of the students in my evidence class, which began last Wednesday, was a paraprofessional who worked in the Department of Justice in my old office, the US Attorney's Office in DC, putting together all the digital evidence in the January 6 cases. I met the assistant who did that. She's just remarkable. She gave up what else she was doing and has done that now for three years, and they've done such a beautiful job of bringing it all together. No surprise to me, they've gotten 700 convictions. 
So that's the way this is going. But in terms of your question, what is an unreasonable search and seizure? What can they seize? Does a computer permit them to be selective in what they get, and can they be required to be? I think, using search techniques, they can be. That was one of the things I said in my opinions. The Department of Justice disagrees with me, and that battle goes on.


Debbie Reynolds  33:42

So, in your geofence example, my concern over data collection has always been that we're creating a guilty-until-proven-innocent type of situation, and for someone who can't afford to fight their way out of a situation like this, it could be incredibly damaging.


Hon. Judge John M. Facciola  38:20

Then you get into this question, when you try to make sure the playing field is level: the government has sophisticated computer capabilities to produce this stuff, so how do we give defense counsel an equal ability to understand it, see how it's going to be used, and see what else it may contain? That raises some fascinating questions. One of them is increasing the capability, for example, of the public defenders all over the country, and efforts are being made by the Administrative Office of the US Courts to do that on the local level. We're going to have to have significant funding for that. One of my former colleagues said that one of the big time-consuming things to do now is to review the videos. Every cop has a camera on. What's there? What's useful? What isn't? What might be exculpatory? So it's a whole new series of things that judges have to do.


Debbie Reynolds  39:19

Oh, my goodness, if it were the world, according to Judge Facciola, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be technology, human behavior, or regulation?


Hon. Judge John M. Facciola  39:36

Besides teaching evidence, I also teach contracts, and I think it is a snare and a delusion to pretend that people read the terms of their service agreements. What we have now is people using this stuff when some ridiculously tiny percentage even read the terms, and those who do read them probably can't understand them. Yet that is the source of the argument made in court: well, you waived that right. So whether we do it in the form of an FTC regulation or, better, in the form of a statute, I think we have to figure out ways to balance people's understanding of what is happening, and their expectations, with what they are, in fact, being obliged to do. I would like to see one of these tech companies put the terms of service in something resembling English that can be read. I don't know, but I think the traditional way of thinking about this, that people are consenting, is silly; it's not happening. So, as part of how people interact with these now tremendous monopolies, monopolies never before seen on the earth in terms of their dominance of the market, we've got to try to figure out how they can be protected, what they need to do their jobs, and what the consumers should have. Again, the Europeans looked at this, and they did it. What are we waiting for? What's that great line? The search for the perfect always drives out the good, and here we go again. So better we try something and see how it works. We have a way of getting used to technological change when it's fed to us in a responsible way. That would be my great dream. On the criminal side, the words unreasonable searches and seizures in the Fourth Amendment, written in 1789, need a new fleshing out: again, part of this understanding of what law enforcement should be permitted to do with communications made on the Internet, how to search for them, under what circumstances they are going to be admissible, and when not. It's not for want of trying. 
Magistrate judges, for example, have frequently testified in support of legislation that would do this, and it's just very frustrating.


Debbie Reynolds  42:13

I share your dream and your frustration there, and I hope in the future we can get there. I think we should try something rather than just not doing anything.


Hon. Judge John M. Facciola  42:21

Well, we’re in a happy position. On a panel, a professor, a dean I think, at Penn State, was saying we can watch Europe screw up and take advantage of it in what we do. Well, okay, yeah, but I would like to be in a country where we go first, not wait around for somebody else to make the mistakes. Come on, man, we dominate technology. What possible excuse can we have to be last?


Debbie Reynolds  42:47

Very true. Well, thank you so much. This is tremendous. I'm sure the audience will love our conversation as much as I do, and thank you so much for sharing so much of your wisdom with us. 


Hon. Judge John M. Facciola  42:58

Yeah, well, it'll be belated, I suspect, when this is published, but maybe I can wish you, all of the wonderful people who are listening, and those who serve their country, our gratitude and our hopes for an enjoyable Memorial Day weekend.


Debbie Reynolds  43:14

Thank you so much. This is amazing. We'll talk soon.


Hon. Judge John M. Facciola  43:18

Take care and you too. Have a nice weekend.


Debbie Reynolds  43:20

Ok, thank you. Thank you so much.