"The Data Diva" Talks Privacy Podcast
The Debbie Reynolds "The Data Diva" Talks podcast features thought-provoking discussions with global leaders on data privacy challenges affecting businesses. This podcast delves into emerging technologies, international laws and regulations, data ethics, individual privacy rights, and future trends. With listeners in over 100 countries, we offer valuable insights for anyone interested in navigating the evolving data privacy landscape.
Did you know that "The Data Diva" Talks Privacy podcast has over 480,000 downloads, listeners in 121 countries and 2,407 cities, and is ranked globally in the top 2% of podcasts? Here are some of our podcast awards and statistics:
- #1 Data Privacy Podcast Worldwide 2024 (Privacy Plan)
- The 10 Best Data Privacy Podcasts In The Digital Space 2024 (bCast)
- Best Data Privacy Podcasts 2024 (Player FM)
- Best Data Privacy Podcasts Top Shows of 2024 (Goodpods)
- Best Privacy and Data Protection Podcasts of 2024 (Termageddon)
- Top 40 Data Security Podcasts You Must Follow 2024 (Feedspot)
- 12 Best Privacy Podcasts for 2023 (RadarFirst)
- 14 Best Privacy Podcasts To Listen To In This Digital Age 2023 (bCast)
- Top 10 Data Privacy Podcasts 2022 (DataTechvibe)
- 20 Best Data Rights Podcasts of 2021 (Threat Technology Magazine)
- 20 Best European Law Podcasts of 2021 (Welp Magazine)
- 20 Best Data Privacy Rights & Data Protection Podcast of 2021 (Welp Magazine)
- 20 Best Data Breach Podcasts of 2021 (Threat Technology Magazine)
- Top 5 Best Privacy Podcasts 2021 (Podchaser)
Business Audience Demographics
- 34% Data Privacy decision-makers (CXO)
- 24% Cybersecurity decision-makers (CXO)
- 19% Privacy Tech / emerging tech companies
- 17% Investor Groups (Private Equity, Venture Capital, etc.)
- 6% Media / Press / Regulators / Academics
Reach Statistics
- Podcast listeners in 121+ countries and 2641+ cities around the world
- Over 468,000+ downloads globally
- Top 5% of 3 million+ globally ranked podcasts of 2024 (ListenNotes)
- Top 50 Peak in Business and Management 2024 (Apple Podcasts)
- Top 5% in weekly podcast downloads 2024 (The Podcast Host)
- 3,038 - Average 30-day podcast downloads per episode
- 5,000 to 11,500 - Average monthly LinkedIn podcast post impressions
- 13,800+ Monthly Data Privacy Advantage Newsletter subscribers
Debbie Reynolds, "The Data Diva," has made a name for herself as a leading voice in the world of Data Privacy and Emerging Technology with a focus on industries such as AdTech, FinTech, EdTech, Biometrics, Internet of Things (IoT), Artificial Intelligence (AI), Smart Manufacturing, Smart Cities, Privacy Tech, Smartphones, and Mobile App development. With over 20 years of experience in Emerging Technologies, Debbie has established herself as a trusted advisor and thought leader, helping organizations navigate the complex landscape of Data Privacy and Data Protection. As the CEO and Chief Data Privacy Officer of Debbie Reynolds Consulting LLC, Debbie brings a unique combination of technical expertise, business acumen, and passionate advocacy to her work.
Visit our website to learn more: https://www.debbiereynoldsconsulting.com/
The Data Diva E224 - Mike Swift and Debbie Reynolds
Debbie Reynolds, "The Data Diva," talks to Mike Swift, the Chief Global Digital Risk Correspondent at MLex. Mike's extensive background in journalism and his focus on the intersection of technology and the law give him a compelling perspective on the conversation.
Mike discusses his career journey, from reporting on major tech companies like Google and Facebook at The San Jose Mercury News to covering digital risk, privacy, and antitrust issues at MLex. The conversation highlights the increasing overlap between privacy and antitrust. Mike offers insights into significant cases involving Google's ad tech practices and the ongoing debate between Apple and Google over app store control.
The episode's central theme is the evolving view of personal data as a consumer protection issue and a national security concern. Mike reflects on recent legislative efforts to restrict the flow of Americans' data to foreign adversaries, marking a notable shift in privacy discourse at the federal level.
Debbie and Mike explore the growing influence of data brokers, who often operate without direct relationships with consumers while amassing and selling vast amounts of personal data. They discuss the 23andMe breach and raise questions about the adequacy of credit monitoring as a remedy for biometric data leaks.
Artificial intelligence also takes center stage as the two unpack the debate over regulation and innovation. Mike shares insights on California's legislative efforts to regulate powerful AI systems, emphasizing the need to balance technological advancement with consumer protection and privacy safeguards.
The discussion highlights the U.S.'s lack of comprehensive federal privacy legislation, exposing consumers to risks while creating inconsistent protections across states. Mike underscores the need for stronger regulatory guardrails and advocates for recognizing privacy as a fundamental human right.
This episode offers a deep dive into the intersection of technology, law, and privacy. Mike provides valuable insights on the current and future landscape of data governance and shares his hope for Data Privacy in the future.
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:25] Now, I have a very special guest on the show, Mike Swift. He is the chief global digital risk correspondent for MLex. Welcome.
[00:36] Mike Swift: It's great to be here, Debbie. Thanks.
[00:38] Debbie Reynolds: Well, I'm really happy to have you on the show. As is the case with a lot of people who end up on the show, you've probably put some witty comment up or some post that I've seen, and I've reached out to you and wanted to have you on the show, so I'm really happy to have you here.
[00:55] Mike Swift: No, that's great. Yeah.
[00:57] I love writing about privacy.
[00:59] I think it's a really important topic for our time. I think it's increasingly a civil rights issue and our data is what describes us. Right. Our identity. So I love writing about it for that reason.
[01:12] Debbie Reynolds: Yeah. And I love the fact that your work is in digital risk, because I think privacy is all about digital risk. Right?
[01:22] Mike Swift: Yeah, it sure is.
[01:26] Debbie Reynolds: Tell me about your journey and how you came to your career as a digital risk correspondent.
[01:32] Mike Swift: Sure. So I have been a newspaper reporter. My journalistic background was in newspapers, most recently with The San Jose Mercury News, covering Google and a little company at that time called Facebook.
[01:47] Mark Zuckerberg was literally one of my neighbors in Palo Alto. And it was a very different time. And I joined MLex. MLex was a new startup at the time. Basically, it was founded by two former Bloomberg reporters who really wanted to provide very granular, high-quality coverage of the intersection between technology and the law.
[02:11] And that's essentially the mission of MLex, that we really try and focus on data privacy, cybersecurity, antitrust, and really writing about the fact that technology moves very quickly, the law does not move so quickly, and as a result, we have a lot of fender benders between those two things, and that's what we write about.
[02:35] So my journey as a journalist has really taken me from being a business reporter, doing computer-assisted reporting on population trends, race, and demographics, into technology. And now I really sort of straddle the fence between legal journalism and business journalism, which is a really interesting place to be.
[03:00] Debbie Reynolds: Yeah, you're in the thick of things.
[03:02] Mike Swift: Definitely.
[03:03] When people ask me what I do, I said, well, I write about Silicon Valley's legal problems and invariably people will say, well, you've got enough to keep you busy. And that's certainly the case.
[03:15] Debbie Reynolds: So you mentioned antitrust, and this piqued my interest. Right. Actually, I co-wrote an article that was in Bloomberg a couple of years ago, and I had a very spirited debate with an antitrust attorney at the time who didn't think that privacy and antitrust were related.
[03:32] And I said it is, but I want your thoughts.
[03:36] Mike Swift: Well, you were right, he was wrong. I mean it. You only have to look at... there's a case right now, a lawsuit that was filed in 2020 by the state of Texas and a number of other states against Google over its ad tech business.
[03:50] And that case has antitrust claims, but it also has privacy claims, you know, essentially that Google wasn't honest about how it collected data and how it used data in targeting advertising.
[04:04] And that case is going to come to trial, and it's going to be a very interesting sort of overlap between privacy and antitrust. But beyond that, you just have to look at the very important ruling that was issued in the antitrust case over Google Search a few months ago, where the judge talked a lot about how privacy was an issue in that case.
[04:29] And so I think over time you're seeing, you know, more overlap between the two because data is a currency.
[04:38] And if you want to monopolize a market, control the market, one way to do it is through personal data. And so I think these two are really inextricably linked.
[04:47] And that's something that we realize sort of every day at MLex as we're covering those two areas, that there is this huge overlap.
[04:57] Debbie Reynolds: Well, I'm glad I was proven to be right on that.
[05:02] You know, some companies, like the companies with app stores, and there are only two right now, sometimes I hear claims saying, well, we need to keep our app store closed because it's a privacy issue.
[05:16] Right. So that sometimes gets intertwined when people say, well, you need to open it up because we need to have more competition. They're like, well, we need to close that because that's a privacy risk.
[05:27] What do you think?
[05:28] Mike Swift: Yeah, I mean, those were exactly the arguments of both Apple and Google when they were both sued by Epic Games. I think Apple was more successful in that argument than Google has been.
[05:40] We had a very important ruling from a judge in San Francisco, where Judge Donato is ordering Google to basically, for three years at least, allow rivals to open app stores on Android.
[05:54] But it seems like Apple was more successful in saying there's a big privacy and security issue. If we open up our app store, we're able to very tightly control this.
[06:04] And a judge agreed with Apple on that. So it's kind of a tale of two companies there, that there was somewhat of a different outcome with the two.
[06:13] Debbie Reynolds: Yeah, it's very interesting. Definitely watching this really closely as well.
[06:18] Mike Swift: We are too.
[06:19] Debbie Reynolds: What's happening in privacy or technology in the world that's concerning you right now?
[06:25] Mike Swift: Well, I think one thing that's been really interesting has been how in the past, you know, in the 12 years I've been writing about privacy, I've really always written about it as sort of an individual consumer protection issue, where, you know, company A collects my data and uses it in a way that I didn't expect.
[06:44] And so therefore we're going to have a legal dispute over that. And I think there was really a significant turn in that. Really for the first time, personal data is being seen as a national security issue as well as a consumer protection issue.
[07:00] And there's been a lot of criticism of the US Congress that they haven't yet passed a comprehensive privacy law, and, you know, how the United States is now really isolated from the rest of the Western world in not having a comprehensive privacy law.
[07:16] But Congress did actually pass two laws that had to do with privacy that were actually also national security laws. And for the first time, they are really starting to restrict the flow of Americans' personal data outside the United States, specifically to adversary nations like China and North Korea and Iran.
[07:42] But that's a really significant switch and I think it's something that's going to continue going forward.
[07:48] Debbie Reynolds: Right. I actually did a video about that about the countries of interest and how that's going to be very tricky because the list, I guess the State Department maintains the country of interest list and it can change.
[08:06] So I think it'll probably be something that's tracked very closely, like with export controls probably, which I think is another fascinating area around data.
[08:19] Mike Swift: Definitely.
[08:20] Debbie Reynolds: So you mentioned the thing about Congress not yet passing a comprehensive privacy law. And I want your thoughts about, I guess, the gaps there. So in the US, a lot of our laws around privacy are very consumer-based.
[08:38] And then we have, I think the national security stuff is kind of in its own different world.
[08:43] Mike Swift: Yep.
[08:44] Debbie Reynolds: Right. And then I think the gap that we have is kind of the human part.
[08:50] So a lot of times in the US, if you're not consuming, you really can't exercise some of the rights that you have. And then also another problem that we're having is that a lot of the regulations, the consumer laws, are written as if you know the company that has your data.
[09:08] So you have a relationship with them, you gave them your data, they're supposed to handle it a certain way. But we know there's like an underbelly of this whole multi-billion-dollar industry where they get data that you don't know about and they do stuff with it.
[09:23] And then now you get like a letter in the mail saying, hey, I had your data, we had a breach. You're like, who are you?
[09:30] What's going on? So what are your thoughts about that?
[09:33] Mike Swift: I think that's a huge problem, and I think increasingly we're seeing regulators in the United States, like Lina Khan, the chair of the Federal Trade Commission, and Sam Levine, who's the head of consumer protection at the FTC, saying that the system that you're referencing, notice and choice, is broken, that it just doesn't work anymore.
[09:54] I mean, the argument they're making is that it's a fiction that Americans actually read privacy policies. And even if they did, as you were just describing, your data can be obtained by an entity, a data broker or somebody else that has no direct relationship with you, and then it can be breached and you have no control over that.
[10:16] You're completely powerless. And so one of the things we're really lacking in our federal laws at least, which the Europeans have, is the principle of data minimization: that companies, yeah, they should be able to collect your data for uses that help you, but there should be a limit on how much they can collect and how long they can keep it.
[10:40] And we do have some of those limits for children, but we don't for adults.
[10:44] And I think that we should have more control over our data.
[10:49] I'm a resident of the state of California, so I have more comprehensive privacy laws than people in many other parts of the country. And I don't think that's fair. I mean, I think everyone should have the same protections.
[11:01] We're all Americans. So that's the shortfall with the states regulating privacy rather than Congress.
[11:09] Debbie Reynolds: I agree. Even though California has always been the leader on a state level in terms of privacy. And it seems like, from some recent Supreme Court cases, they are definitely kicking those decisions down to the states.
[11:24] And so we're really looking at California very heavily. And a lot of states do follow California, even though I feel like some states are trying to have their own special sauce, so they make their laws different enough to be annoying, kind of just to be different.
[11:40] Mike Swift: Yeah.
[11:41] Debbie Reynolds: I want your thoughts a little bit about something I hear a lot in this conversation, especially in the AI world, and I want to ask you about artificial intelligence as well: the thought that somehow regulation is going to stifle innovation around privacy.
[11:59] So: oh, we have all these privacy laws, and if we have too many privacy laws, we aren't going to be able to innovate.
[12:05] Mike Swift: Right. Well, I mean, with artificial intelligence, we just had that debate here in California with a bill that was proposed by a state senator, and actually passed the legislature, that would regulate very powerful AI systems.
[12:20] And Governor Newsom vetoed the bill. And that was a case, we covered that pretty closely at MLex. We thought it was a really important debate, to capture both sides of it.
[12:33] And one element of that was that you had almost Hollywood versus Silicon Valley, you know, sort of these two major drivers of the California economy that were on different sides.
[12:44] You had the screen actors union that was saying we do need this law, we need limits. But you had some very powerful, prominent voices in Silicon Valley saying that you are going to stifle innovation if you have these very strict limits on powerful AI systems.
[13:03] This law, you know, would have forced AI developers to have like a kill switch, so if, you know, your AI starts behaving erratically or goes rogue, then it could basically be shut down.
[13:15] And there was a feeling that that would make it much tougher for startups to get funding from VCs. And it's sort of two sides of the coin, I guess.
[13:26] But Governor Newsom ultimately vetoed the bill.
[13:33] Debbie Reynolds: Well, I will say California, I think, is like the fifth largest economy in the world, if I'm not mistaken. And even though I don't live in California, I've visited many times. It's a very heavily regulated state.
[13:46] I just can't believe all the regulations you all have and somehow that hasn't stifled you all's progress in the world.
[13:53] Mike Swift: Yeah, well, I mean, another interesting antitrust debate right now is that in California it has always been illegal to enforce non-compete clauses, which makes it easier for people to jump from company to company.
[14:07] And the FTC has been trying to sort of extend that rule nationally. And there's a school of thought that that has been one reason why this state has been so vibrant economically and in terms of technology innovation, even though we do have a lot of regulation.
[14:27] And I can definitely vouch, as a California resident, we have a lot of regulation, and a lot of business regulation, and sometimes it's really difficult for small businesses because of that.
[14:37] But there are also... I think it's more complicated, is what I'm trying to say, that we also have some really beneficial qualities that make innovation very vibrant here. We have a very diverse population.
[14:52] We have people coming from all around the world bringing fresh ideas. And when you actually live in Silicon Valley, it's really exciting to see how universities and VCs and companies and startups work together.
[15:06] And it's a very difficult thing to replicate. But I think a lot of it is just sort of the culture of optimism that California has always had that makes it such a great place for innovation.
[15:18] Debbie Reynolds: I feel like there's always been this whole anti regulation bent in business. You know, not all regulations are bad. Right, Right. And for me, what I always like to say, and I use the automobile industry as an example, like stop signs and stop lights and lines on pavement didn't stop the automobile industry innovation.
[15:41] Right.
[15:42] But if we didn't have those things, the automobile industry could not have been what it is now. How could you get to work on the highway if you, you know, everyone's going in different directions.
[15:53] Right.
[15:54] Mike Swift: You know, I saw a really interesting poll that came out in the San Jose Mercury News where even people in Silicon Valley are starting to have a bit more negative view of the tech industry and not trusting the tech industry.
[16:07] And you know, with privacy, we haven't had much regulation. We haven't had national regulation. You can argue that we had very little regulation on the privacy side until the last few years.
[16:18] But I think there's a risk that that could boomerang on the tech industry, that if people feel like you were recounting the other day the experience that we've all had, you get that letter in the mail, oh, my data's been breached.
[16:34] And often it's with a company that you didn't even know they had your data.
[16:38] And it's a really bad feeling, at least for me, when I've gotten those letters.
[16:45] And so I think that the lack of regulation might hurt the tech industry going forward with privacy, and that it would be really good for Congress to get its act together and pass a national law
[16:58] that was thoughtful.
[16:59] Debbie Reynolds: Something. Artificial intelligence makes privacy more challenging because now you're handling data in different ways. So just like we say, it's complicated to assume that the person has your data. You give them the data and then you're kind of managing that relationship.
[17:17] But AI, it adds, like, another layer of complexity because now your data somehow can be in a model. It could be manipulated in different ways. There could be decisions made about you that you don't know about.
[17:32] There's a lot of kind of, you know, black box data handling and data management. And it's hard to grapple with that because it's such a evolving thing. It's hard to kind of pin it down.
[17:47] But I want your thoughts.
[17:48] Mike Swift: Yeah, it's so opaque, isn't it? And, you know, if you get turned down for a credit card or, you know, a landlord won't rent to you, you might not know that it's because some screening algorithm has said that you're not a good risk.
[18:04] So much of what AI does is it allows companies to discriminate, not necessarily in a legal sense, but they're trying to discriminate one customer from another. You know, who is the best person for us to do business with.
[18:19] And I think there's a huge risk to that. I mean, it can become illegal discrimination. We've already seen cases where Facebook had to settle with the Department of Justice because they were allowing people to, say, buy ads saying, I only want to advertise to people of a certain race or sexual orientation or whatever.
[18:37] And, you know, so AI can really enable illegal discrimination on a vast scale, and that's a big risk. And so we as consumers really need to have visibility about how these systems are making decisions about us, and we need to be empowered by that sort of transparency.
[18:57] Because I just think so much of how these systems operate is so opaque. You know, it's so difficult. It's hard enough for us to read a privacy policy. It's not really fair to expect consumers to do that.
[19:10] But as you said, within an AI system, it just makes it even more difficult for us to really understand how these decisions are being made about us.
[19:18] So I agree. It's a big problem.
[19:21] Debbie Reynolds: Well, since you do reporting on business and legal, I wanted your thoughts. What's kind of the temperature in the business and legal area around privacy?
[19:31] Mike Swift: Well, I, you know, I think companies have become much more sophisticated about it. I think, you know, when Europe passed the General Data Protection Regulation, any company that wanted to do business in Europe, which is a huge market, you know, needed to comply with that.
[19:46] And I think industry has become so much more sophisticated about data protection. And so I think the big companies, you know, they do try and be respectful of privacy.
[19:59] Obviously, there's a lot of litigation out there that we cover about how companies may have fallen short on privacy. Google is now facing some significant litigation over how it collected data through its Chrome browser.
[20:17] But I think one area that needs to be looked at more closely is really data brokers, because so often they don't have a direct relationship with consumers, but they're able to collect so much data and combine it with other databases and make inferences about people.
[20:34] But, you know, it's so difficult for consumers to really have any visibility. I think most people aren't even conscious that there's this giant industry out there in the commerce of their data.
[20:45] And, you know, there needs to be more transparency in that area. I think, particularly with data brokers.
[20:52] Debbie Reynolds: I agree with that wholeheartedly. Especially with this new data breach that came out, National Public Data, the data broker company that was breached. And they say, oh, well, we think almost everyone in America, your data was breached.
[21:05] And they're like, okay, well, call a credit agency. It's like, is that the right answer when everyone has had a data breach? All of us individually have to fight our way into a credit agency to try to find out what's happening.
[21:19] Mike Swift: Yeah, we do. And it's so unfair. I mean, my data, my Social Security number, has been breached so many times. Often it's by, like, my health insurer, a few times.
[21:28] And so I had to put a freeze on all my credit reports. And I can't do anything that requires any sort of borrowing without then going to all three of them and unfreezing my account and then refreezing it, you know, 24 hours later.
[21:44] Because my data is out there and it can never be brought back. You know, it's, it's out there. So, yeah, I mean, we're all having to do much more work as consumers to be knowledgeable and to take steps to protect ourselves.
[21:58] Like two-factor authentication. And it's very frustrating sometimes. Like I was trying to log into my bank the other day and they were blocking me. I had to actually phone them up and go through a long process of authentication.
[22:12] And you know, on one hand I'm glad that they're being careful, you know, to keep my identity from being stolen. But on the other, it's really a pain in the neck.
[22:21] It's a real headache for consumers. So it's a tough situation.
[22:27] Debbie Reynolds: Isn't Google in an antitrust case right now?
[22:31] Mike Swift: Several of them.
[22:32] Debbie Reynolds: Yeah, several of them. The first thing I thought from a consumer perspective was like, oh, God, do I have to do more clicks now? Like, you know, instead of having services bundled together, now I have to go out and search. You know, how much more work is it going to put on me as a consumer?
[22:49] That's the first thing I thought about, that's for sure.
[22:52] Mike Swift: Yep.
[22:54] Debbie Reynolds: There's one breach I wanted to talk to you about, and what are your thoughts? I find it very curious. It's the 23andMe breach, and I'm sure you guys have covered this pretty heavily.
[23:05] Mike Swift: We are. We're covering that very closely. Yep.
[23:08] Debbie Reynolds: So I heard, and this kind of made me upset when I heard it, that I think they had a settlement in that case for, I want to say, $30 million. I'm not 100% sure, I'd have to check.
[23:21] And they would give people three years of credit monitoring. Like, I think if your biometrics are breached, is credit monitoring really the right thing, the right remedy?
[23:34] Mike Swift: Yeah. I never had to provide a DNA sample when I was applying for a credit card. Right. So that seems like a disconnect. And the other thing is credit monitoring. We've all had our information breached so many times that there's really no value to that anymore.
[23:48] Right. I mean, because I've had credit monitoring offered to me at least 10 times, I'm sure, because my data's been breached so many times. But, yeah, I don't think that settlement has been finalized yet.
[24:04] I believe that it's still being reviewed by a federal judge. I think 23andMe and the plaintiffs have proposed it to the judge, but the judge has not yet signed off.
[24:18] So you, as a consumer, still have the chance to object to that and file an objection with the court.
[24:24] I can't recall exactly which judge it is, but I know it's here in the Northern District of California, where so much of the privacy litigation happens. So consumers do have that option still with this case.
[24:37] Debbie Reynolds: Well, I was never a consumer, and I was always kind of suspicious of them for a good reason, mostly just because I feel like
[24:46] there's too much data being asked for, too much information, and not enough stringent security or protection around that data. And to me, especially with DNA and biometrics. I know we talked about national security, but I think a lot of this, our talk, needs to shift to safety.
[25:09] Who's going to be doing what with this data? You know, this creates all types of domestic problems with people. Right. Whether that be stalking someone, tracking someone's car or their vehicle.
[25:22] Just the fact that there's so much data out there to be captured by maybe someone who's a bad actor, for various purposes. I think, you know, we don't talk a lot.
[25:33] We don't talk enough, I think, about privacy as a safety issue, but I want your thoughts there.
[25:38] Mike Swift: Oh, I couldn't agree more. I mean, we're seeing a lot of that. But I think it's like that with any technology.
[25:44] You know, ultimately, anything that can be used for good, it can also be used for bad. Almost anything. And, you know, one interesting case that we've been covering is litigation around Apple's AirTags.
[25:56] You know, those little things that you put in your luggage to find your luggage if it gets sent to the wrong place. But domestic abusers are using them, basically, to stalk women who they've had relationships with, like hiding one in the wheel well of a car.
[26:15] And the allegation in this lawsuit is that Apple has not done enough to help people, you know, to know that these tags have been planted on them.
[26:27] So they can protect themselves if an abuser tries to do that. It's a really fascinating case. You know, Apple's made some strong arguments, like: we're aware of this.
[26:37] We've taken a lot of specific steps to prevent this technology from being used for these bad purposes. But, you know, I think you're right that safety has become such a bigger issue.
[26:52] And, you know, I think there almost needs to be more education of kids in school, that obviously this technology has totally changed our lives, what's possible with smartphones and trackers and everything over the last 20 years.
[27:08] But people really need to have an education of how it also can put them at risk. And I'm super careful, like you said, with biometric information. I try not to share any of that.
[27:22] I even worry about letting Apple unlock my iPhone with my face. But so far as I can tell, they've been really careful with that data. But Clearview AI is another example.
[27:35] That company has a huge database with all our faces in it. And that technology makes a lot of mistakes, and there's a lot of ways it can hurt people.
[27:47] And people need to be more educated, I guess, is what I'm trying to say.
[27:51] Debbie Reynolds: You mentioned Clearview AI, and we talked about 23andMe. There are issues with those two companies that I'm concerned about, and I want your thoughts. One, I think people like me are concerned that 23andMe will get sold to another company that's going to take advantage of the DNA or biometric information that they have.
[28:13] I think people are really concerned about that. Like, who are these people? Are they trustworthy? Right.
[28:18] And then with Clearview AI, they've had a situation recently, in one of their Illinois settlements, where part of the settlement was that they had to give the plaintiffs a share of the company or something, because they didn't have the money to pay for it.
[28:37] But these are all wacky things.
[28:40] I want your thoughts about this.
[28:43] Mike Swift: Yeah, I just did a story about that, actually. You described it just right: Clearview AI, to settle this lawsuit over the Illinois Biometric Information Privacy Act, has offered to share a portion, almost a quarter, of its IPO value with members of the settlement class.
[29:03] The interesting thing is we're now starting to see some objections filed in the case. And that's a story I wrote about, where a bunch of people are saying, this puts me in a really weird position, because I'm now having to root for the company that stole my privacy to do well in an IPO, a public stock offering.
[29:23] And that doesn't make any sense at all. So it'll be interesting. We don't know how the judge is going to react to that. I think there'll be a hearing pretty soon.
[29:32] It's Judge Sharon Johnson Coleman in the Northern District of Illinois who has that case. So it'll be really interesting to see how she reacts to some of these objections that have been filed, and whether she'll go along with the settlement that Clearview has proposed.
[29:49] Debbie Reynolds: You know, that was quite novel. Very interesting. Well, as you know, they're involved in litigation all over the world, so they're paying big legal fees, obviously, as well as fines or penalties in different countries.
[30:03] Mike Swift: Yeah. Well, one interesting little twist that we've written about is how both Google and Meta were sued by the state of Texas for their use of facial recognition technology. And part of the defense for both companies has been: well, look, state of Texas, you're hiring Clearview AI to provide facial recognition services; the Texas Rangers and a bunch of law enforcement agencies in the state of Texas are using Clearview AI.
[30:34] And so the defense that Google and Meta have been making is that, well, you guys can't use Clearview AI and then say that it's illegal for us to use facial recognition.
In the case of Meta, they settled, so that argument never got tested. But Google is making the same argument in litigation that may go to trial, so we'll have to watch that one.
[30:55] But it's kind of an interesting little sidelight if you're a privacy wonk, and sort of interesting how Clearview AI comes up in other cases.
[31:04] Debbie Reynolds: Yeah, these are all hot. Is there any particular hot case we haven't talked about that you're watching?
[31:11] Mike Swift: Oh my gosh, there's so many. You know, one of the big ones we're really watching is how TikTok is now being sued over kids' privacy issues.
[31:22] The state of Texas sued TikTok.
[31:25] California, New York, and 11 other states and the District of Columbia sued TikTok, basically saying that they use data to essentially addict young people to their platform.
[31:37] And so that's a super hot one. I mean, there's a bunch of cases also brought by the state attorneys general against Meta. So that's a huge area of focus for us.
[31:48] I think maybe one of the most interesting cases, just for me, is the fight between the FTC, the Federal Trade Commission, and Meta over whether the FTC is going to be able to reopen its settlement with Meta and block them from using the personal information of kids under 18 for any sort of commercial purpose.
[32:08] I mean, that would be a huge sanction on Meta.
[32:12] It remains to be seen whether the FTC is going to be able to do that.
[32:15] There are two separate court cases in Washington that we're following there, but that one's going to be heating up soon. So there's no shortage of things to write about in privacy.
[32:26] It's just every year gets more exciting and more impactful, it seems. So I'm really lucky that I get to cover all this stuff.
[32:38] Debbie Reynolds: Well, your reporting is stellar and I love the things that you write and you keep us all engaged and informed about what's happening with all this stuff. So it's definitely interesting.
[32:48] Mike Swift: That's very kind of you. Thank you so much for saying that.
[32:52] Debbie Reynolds: Well, if we're in the world according to you, Mike, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be legal regulation, technology, or human behavior?
[33:07] Mike Swift: Well, that's a good question.
[33:11] I'm a native-born Californian, and I live here in Silicon Valley. It's amazing what these companies have achieved, and there are incredible people who work for these companies, too. When I was more on the business side, I got to meet the people who were literally in charge of the search engine at Google.
[33:29] And they were just incredible people, amazing scientists. So I think these companies have a great power to do good, but I think we're now at the point where we just need much stronger guardrails.
[33:45] So my wish would be that Congress really gets its act together and realizes that this is a more important issue than partisan bickering, and that they owe a duty to the American people to give us privacy rules.
[34:02] These should be basic human rights: that we control our data, that we have the final say over how our data is collected and used. And so I would like our federal government to really see privacy as a fundamental human right that we all deserve.
[34:20] And so that would be my wish.
[34:22] Debbie Reynolds: Well, I think we're kindred spirits in that regard. I think that's the gap that we really need to fill in the US, and I would love to see that even constitutionally.
[34:33] I think that'd be great.
[34:34] Mike Swift: It would be great. It's in the California Constitution.
[34:37] Debbie Reynolds: It is. It's 1974.
[34:39] Mike Swift: Yep, that's right. So I guess it will never be in the federal Constitution, probably not in my lifetime anyway. But I would like to at least see it in law.
[34:49] That would be a great thing.
[34:51] Debbie Reynolds: It would be. Absolutely. Well, thank you so much. This is amazing. And, people, please follow Mike and his reporting. And MLex, you guys are doing a great job of providing us details about what's happening, and also these intersections of law, privacy, and tech; sometimes you get one or the other.
[35:13] So being able to mix it up, I think makes your reporting that much richer.
[35:18] Mike Swift: Well, thank you. And we're launching a new service on artificial intelligence, so we will be writing a lot more about not only the privacy issues around AI, but also copyright issues, like whether it's fair use to take your copyrighted information and use it to train AI systems.
[35:37] So we're very excited about covering AI going forward.
[35:42] Debbie Reynolds: Excellent. Well, thank you so much.
[35:44] And we'll be in touch. I'm looking forward to us being able to collaborate in the future.
[35:49] Mike Swift: Thank you so much. This has been great.
[35:52] Debbie Reynolds: All right, have a good day.