"The Data Diva" Talks Privacy Podcast

The Data Diva E103 - Cameron Kerry and Debbie Reynolds

October 25, 2022 Season 2 Episode 103
"The Data Diva" Talks Privacy Podcast
The Data Diva E103 - Cameron Kerry and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, “The Data Diva,” talks to Cameron Kerry, Former General Counsel and Secretary of the U.S. Department of Commerce in the Obama Administration and Ann R. & Andrew H. Tisch Distinguished Visiting Fellow at The Brookings Institution's Center for Technology & Innovation. At Brookings and the MIT Media Lab, Cameron Kerry is applying his experience as a government thought leader on technology and public policy as a speaker, writer, and researcher on current issues in these areas. His work focuses especially on privacy and information security and the application of privacy principles to fast-changing global business and technology. We discuss our meeting at the Berkeley Forum and the now mainstream nature of privacy, his unique path across the public, private, and academic spheres, why privacy is so important to him, how surveillance no longer requires human presence, recent developments regarding the Cloud Act and data transfers between the US, the EU, and the UK, two streams of thought on surveillance and data transfers, the ADPPA (American Data Privacy and Protection Act), California privacy vs. the ADPPA, US Chamber of Commerce objections to the ADPPA, his privacy concerns about current technologies, human vs. consumer rights, and his hope for Data Privacy in the future.


This episode is a MUST listen for anyone who wants to know what is happening with Data Privacy in the U.S. at the state, Federal, and international levels.




49:11

SUMMARY KEYWORDS

privacy, people, data, surveillance, government, law, united states, california, protections, debbie, deal, brookings institution, privacy laws, protect, privacy policy, information, state, collect, preempt, microseconds

SPEAKERS

Debbie Reynolds, Cameron Kerry


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.


Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy Podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show, Cameron Kerry. He's the former General Counsel and Secretary of the US Department of Commerce in the Obama administration. He's an Andrew H. Tisch Distinguished Visiting Fellow at the Brookings Institution Center for Technology and Innovation and at the MIT Media Lab. He's a speaker, writer, and thought leader on privacy, security, and technology policy. Welcome.


Cameron Kerry  01:00

Thank you, Debbie. It's great to be joining you today.


Debbie Reynolds  01:03

Yeah, well, we had the pleasure of meeting; actually, I have followed your career for many years, so it was a pleasure for me to meet you. You and I spoke on a panel at the Berkeley Forum recently, and we hit it off, and we thought it would be great for you to be on the show, especially because there are a lot of things in the oven before the midterm elections in the US. We have a lot to talk about. So thank you for being on the show.


Cameron Kerry  01:34

Well, thank you, Debbie. I'm flattered by your interest. But yeah, there is a lot going on. I think it's a reflection of just how mainstream privacy has become as an issue in government and public policy over the last 10 to 15 years.


Debbie Reynolds  01:53

I think you're in a unique position. In fact, I don't know anyone else in the US who has really had the type of path that you've had. You have deep roots in public office, government and private practice, academia, thought leadership, and think tanks. So I think you have a unique perspective on privacy that a lot of people don't have. And I feel like, for many years, there was just not a lot happening on privacy. Now there's just so much excitement and so many things happening. So tell me a little bit about why privacy is such an important issue to you, your trajectory, your career, and how you got here.


Cameron Kerry  02:43

I will. Thanks, Debbie. Yeah, I've thought about that question as I write and as I engage in public advocacy and outreach on this; I certainly ask myself, all right, why is privacy important? Why should we care? Why do we care about this issue? And I think, for me, it comes from a lot of things. I grew up as a child of the 60s and 70s, when we had the surveillance of Dr. Martin Luther King. We had the surveillance of the anti-war movement; that was something that I was engaged in, and my brother John Kerry was a national leader in that and was the subject of surveillance. I remember being at one of the important anti-war events in Washington, with hundreds of thousands of people there, where he spoke, and being in a park with him afterward. He went off for a walk with his wife, and I saw what were just a bunch of obvious undercover policemen or agents going off after them. So I certainly grew up conscious of the risks of government surveillance. When I went to law school, it was a subject that I was interested in. And my practice in law always involved communications, cable TV, broadcast, and telecommunications, which are technology-driven fields, so I was conscious of the opportunities to collect data. And those are areas that have some of the earliest privacy laws. When I joined the Obama administration in 2009, I went with a sense that we needed to address privacy and security as part of sustaining the digital economy. I was aware of how much data is out there and how much that was increasing with digital technology and then the advent of mobile phones. I think none of us quite grasped then, and I think even today we don't fully grasp, how much that data has exploded. So we set out to build trust in the digital economy. That is very much unfinished business today. I hope to see that happen.


Debbie Reynolds  06:35

And to your point about some obvious agents, you know, following your brother, John Kerry, now you don't need humans following you; you have digital devices to track you everywhere, right?


Cameron Kerry  06:50

Well, absolutely. You know, a lot of our law is based on 18th-century concepts, so the courts talk about what the constable can do. It's funny; there are Supreme Court opinions that still use that language. I mean, who knows what a constable is today, but that's the language the opinions use. Well, the constable could not engage in 24-by-7 surveillance. And so we see the Supreme Court saying, yeah, putting a location tracker on a car goes beyond the kind of surveillance you could do tailing somebody, so that takes a warrant, and the same thing with getting location data. So, you know, we are adapting, but we have a lot more to do.


Debbie Reynolds  08:00

I agree with that. I want your thoughts. There's a lot brewing, or has been brewing over the years, with the Cloud Act. And we recently saw two things happening. One is that the US and UK data access agreement just went into force. And then we are right on the cusp of having the new Trans-Atlantic Data Privacy Framework between the EU and the US finalized. What are your thoughts about that whole brew? What's happening around data transfer internationally between the US and the UK and the EU?


Cameron Kerry  08:45

Well, Debbie, it's a big question that covers a lot of ground. But in general, in this day and age, where we are linked by an Internet that operates at light speed and transfers data around the world, the ability to do that is essential to the value of that communications network. And it's not just value for the providers; it provides value, through those network connections, to all of us. It's part of what brings the world together in many different ways: economically, socially, and politically. And also in adverse ways; I mean, we see that in the speed with which disinformation moves. But we need to preserve the benefits of that connectivity and deal with some of the risks. One of the important relationships is still our dealings with the European Union. That is the biggest trading relationship in the world and accounts for almost half of the global economy. Both sides really depend on those data flows and communications. But the Europeans have their privacy laws that provide strong protections, and they want to make sure that the protections are comparable, or what they call essentially equivalent, on the American side. And the European Court of Justice has twice now struck down arrangements to enable that, mostly because of the surveillance that was revealed by Edward Snowden. There is a perception, I think, that that surveillance is more all-pervasive and all-knowing than it actually is. For all that the US government can do, the US has placed some strong limits around how we collect foreign intelligence, as well as the due process protections that we have under our Constitution against the government collecting information. Those don't extend under our Constitution to people outside the United States. But President Obama in 2014 issued a presidential directive that extends protections to people outside the United States comparable to those that we as American citizens have, and puts in place some procedures to safeguard data, make sure that there's oversight, and make sure that people can bring complaints. But the court said that didn't go far enough, wasn't independent enough, didn't ensure judicial redress. So the new executive order, I expect, will go further in terms of the protections for people outside the US and will set up a body with judicial powers that is independent and can protect the privacy rights of people outside the US as we protect those inside the US. I'm hopeful that will stand up to what I'm sure will be another round of litigation in the EU. I think the issues of government access you raised with the Cloud Act are related but a different issue. It all comes from that digital connectivity. Increasingly, governments that are looking for evidence in criminal cases need to go someplace else, including outside national borders, to get that evidence. That evidence needs to be governed by rules and due process, like we have if you're getting it in the United States. The Cloud Act sets up a mechanism to go and get evidence outside the United States but provides reciprocal protections with other countries. So the US-UK agreement essentially says, okay, the US can have a process to get evidence that has judicial review and standards, like the probable cause standard we have in the US, and the UK can do the same with the same kinds of procedural guarantees.
And, you know, this is a way of sort of leveling up the standard across the US and like-minded countries.


Debbie Reynolds  15:51

All right. Obviously, there's been a lot of talk about this over the years, and a lot of concern. I think there are two streams of thought here. One is that companies just want to do business and transact, and they don't really want to be involved in anything having to do with surveillance or have the government bust in, or whatever. So that's part of the frustration. But then there's also the need to have some type of mechanism that deals with law enforcement situations where data may be needed. So I think these agreements, in my view, not only clarify how those processes should work but hopefully will also help people who are doing commerce feel a bit more comfortable with those data transfers.


Cameron Kerry  16:45

I think that's right. You know, I think when it comes to dealing with the government, businesses just want to know what the rules are and to try to comply with those rules. So I think having a common set of rules among the United States, Europe, the United Kingdom, and other democratic countries will help that.


Debbie Reynolds  17:22

Let's talk a little bit about the ADPPA, the American Data Privacy and Protection Act, which was introduced in June of 2022. I know this is a topic you've written a lot about; I've read a lot of your work on it. I think we have a slight difference of opinion, and I guess we can talk about that; that would be a fun debate. So tell me a little bit about the ADPPA. We know we need legislation at the Federal level, right? But there's a fight and a squabble happening right now. And I think our audience, not only national but international, would like to know: why can't we get this thing done? What's your thought here?


Cameron Kerry  18:15

Well, it's unfinished business for me, Debbie. I led the Obama administration's work on privacy policy and a proposal called the Consumer Privacy Bill of Rights, which included national legislation and went out under Barack Obama's signature; that was the first time a President of the United States had called for national comprehensive privacy legislation. Ten years later, we still don't have that, because it wasn't enough of a political priority for people on Capitol Hill, and certainly not enough to overcome some of the opposition that was out there among businesses that didn't want people messing with their business models. But 10 years later, after Edward Snowden and the demonstration of just how much data is available out there, after a succession of data breaches, including high-profile ones like Sony and Equifax, and then, of course, the Cambridge Analytica revelations, which really brought it home because so many people are on Facebook, and this wasn't just the government, this was the commercial sector. All of those things, I think, just injected steroids into this debate. And we've had, I think, over the last four years, a pretty serious debate. You can be cynical sometimes about Congress, but they've actually done some serious work on this. People were laughing back when they called Mark Zuckerberg in at some of the questions that members of Congress were asking; they've learned a lot since then and gotten pretty sophisticated about the privacy issues. And that's reflected in the ADPPA. The most important thing that it does is to put some boundaries around the collection, use, and sharing of data. And that is the thing that has been lacking to this point. It's been lacking in most other privacy legislation, both the proposals in Congress and the state bills that are out there. For years, we've operated on a system of notice and consent. We all deal with this, right? You get those checkboxes or things that inform you about cookies, and you say, yeah, I consent. But those don't really do anything to control the information that companies collect. Just because somebody has a privacy policy doesn't mean they're actually protecting your privacy. And right now, under that system, they get to set the rules. The privacy policy defines what they're going to collect, how they're going to use it, and who they're going to share it with. Well, companies write those policies, and nobody reads them. They are very broad. I spend a lot of time with people who are privacy geeks, like me, who will say, well, I don't read those policies; I don't understand them when I read them; they're meaningless to me. So how the hell is an average non-technical person supposed to deal with that? What we need is a system where people can trust that their information is going to be used in ways that are consistent with their privacy interests, with their other interests, and with their expectations in their dealings with a company, and not just distributed out there in the wild. So having a law that puts some boundaries in place, that says you can use data for purposes that are reasonable and proportionate in order to provide a service or a product, you can use it for certain other specified purposes, like security, like customer service, but you can't sell sensitive information, like health information, like sexual preference, other things that we treat very, very carefully, without somebody's consent.
That would limit data brokerage and would protect location information that's so important to reproductive freedom; all of that information is sensitive. Right now, in the system that we have, it gets spread around; there are apps and data brokers that collect that information and put your phone number together with strings of location data and other information about you. Cambridge Analytica said it had about 5,000 data points on 200 million Americans. We need to shrink that down, and the ADPPA is the first bill that would really have an impact on those data practices.


Debbie Reynolds  25:11

So let's talk about California. And I think this is where our differences may lie. California is a State, right? One of many States. They're very progressive, though, in privacy and have been for a long time. We know that Nancy Pelosi, the Speaker of the House, and the California delegation were very much against the ADPPA because they didn't want a lot of changes happening in what they're doing. And California, for me, is not just any State. As you know, it's the most populous State in the US; 13 of every 100 people in the US are Californians. They have had privacy as a fundamental human right in their Constitution for over 50 years. A lot of the stuff that we take for granted, like privacy policies in general, they were the pioneer in. I can see how California has done its work, and they're very protective of what they're doing, and I understand why they oppose this bill. So I don't think it's that they don't feel the US needs a Federal privacy law; they just want to make sure that a Federal privacy law is a floor, not a ceiling. So what are your thoughts on that?


Cameron Kerry  26:40

Well, two things. First, I think that the ADPPA is a much stronger bill than California's CCPA, first and foremost for the reasons that I just described: California is still rooted in that notice and choice model. The fundamental thing that the CCPA is about is the ability to opt out of targeted advertising and out of having your data sold, but what data is collected and how it is shared, subject to that opt-out, is defined by the privacy policy. So you've still got the companies making the rules. That's the first thing, and it's the most fundamental thing when it comes to protecting people's privacy. So I think it is worth it, in order to have that kind of protection for everybody in the United States, and to have stronger protection for people in California than they have today, to have a national law that sets a single national standard and preempts State laws. We are dealing here with the Internet; if anything in our commercial world was ever interstate commerce, it is that, and Congress regulates that, and people ought to be able to have a single standard, a single set of expectations, regardless of where they are or where they travel to. So I think there are a lot of advantages to consumers in having that kind of single standard. There are other ways that the ADPPA is stronger than California's law. It extends civil rights protections to the use of personal information in ways that discriminate on the basis of race, sex, and other protected categories; that is new ground in privacy. It includes protections requiring large companies to do algorithmic impact assessments and having every company do some kind of assessment. It also covers sectors that are not covered by the California law, including extending these baseline privacy protections to small businesses; California carves out small businesses altogether and carves out sectors like health care. So there are a lot of stronger protections, and it's an advantage to have those protections for people in California and for everybody, every man, woman, and child, across the United States.


Debbie Reynolds  30:34

You're very passionate about this. Here's an interesting thing: the US Chamber of Commerce is against the ADPPA. Their reasoning, though, is very unique, and I agree with it a little bit. One of their reasons for being against it is that it doesn't preempt laws like the Illinois Biometric Information Privacy Act, which we know is a law of general applicability. So what are your thoughts when people say things like that in terms of preemption?


Cameron Kerry  31:09

I think there's some sensitive line-drawing being done here. With colleagues at the Brookings Institution a couple of years ago, we outlined a sort of grand bargain around both preemption of State laws and private rights of action. That has, I think, very much been the model that people followed in the ADPPA and other bills. Part of that was approaching preemption not as a black-or-white, either-or question, where either you leave all State laws in place or you preempt all State privacy laws; that's kind of where the debate started out, with the Chamber of Commerce and businesses saying, no, no, no, you've just got to preempt. Well, that would wipe out 100 years of State privacy laws and some of the things that you're talking about in California and other places. So instead, you take a jigsaw to it, and you leave a number of laws in place. Look, if I were making the decision, I probably would include the Illinois Biometric Information Privacy Act in what is preempted, but our proposal also allowed for States and municipalities to deal with biometric issues like surveillance cameras in public places, so you're regulating the streets and other things. We do need to leave some of that play for State laws. Part of that trade-off, too, was allowing people to bring private lawsuits, but having some filters that ensure that those cases have merit, so you don't get nuisance cases coming in that end up benefiting class action lawyers without really benefiting the public, while creating expenses for businesses.


Debbie Reynolds  34:17

What's happening right now in the world with technology that's concerning you the most around privacy?


Cameron Kerry  34:27

You know, Debbie, I think what concerns me the most is what's called ad tech, the platforms that support online advertising. Yeah, I believe that there is value in advertising, and I've been blown away by it. Political activists engaged in campaigns appreciate the importance of being able to reach your audience, and that is true in the commercial sector as well. Being able to promote products and services is important to the economy. But what we are seeing is incredibly intrusive. The problem is that the way online advertising works is through a system of what's called real-time bidding, where literally, in a matter of microseconds, if you are online on a site, you become the subject, the target, of ads that are sold based on whatever characteristics they're able to identify about you. A lot of that is supposedly not identifying to specific people; there are ways of tracking that don't do that. But a lot of that information is so leaky. Some information gets exchanged in those microsecond transactions of selling the ad, and then people are able to collect it and aggregate it with other databases, and all of a sudden, whatever that search was, whatever product or ad you may have clicked on, it ends up with data brokers, correlated with a lot of other information about you. That becomes connected, and essentially that's part of how our data is just getting spread out in the wild, and the reason that we need those limits on collection, use, and sharing. But advertising is incredibly complicated. I've spent a lot of time talking to people in ad tech, getting briefed on developments in ad tech systems, and I find it bewildering. So it's an area where, I think, to realize the benefits of advertising but also protect people's privacy, and protect that data from being used in ways that are against our interests as individuals, we need a better system. I actually think this is a place where the ADPPA could do better by having the Federal Trade Commission do rulemaking in this area to help define the boundaries: what's good targeting, what's bad targeting? You don't want the stuff that's going to reveal somebody's sexual identity in most circumstances, but sometimes that may be all right; if you're trying to sell cosmetics for dark skin, you do want to be able to reach people of color, and that's probably a benign use. So we need to have the right balance here. And I think an agency that can delve into ad tech and explore these norms, with some boundaries established by Congress, is the right way to go.
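To make the real-time bidding mechanics described above concrete, here is a minimal, hypothetical Python sketch of an RTB-style auction. The field names, bidder logic, and prices are invented for illustration and do not reflect any real exchange's API (such as OpenRTB); the point it illustrates is the one made above: every bidder that receives the bid request sees the user's signals, whether or not it wins the ad, and can retain and aggregate them afterward.

```python
# Illustrative sketch only: a toy real-time bidding (RTB) auction.
# All names and values are hypothetical, not a real ad exchange's API.

from dataclasses import dataclass
import random

@dataclass
class BidRequest:
    ad_slot_id: str        # which slot on the page is for sale
    site_context: str      # what kind of page the user is viewing
    coarse_location: str   # approximate location inferred for the user
    device_id: str         # pseudonymous identifier tied to the device

class Bidder:
    def __init__(self, name: str):
        self.name = name
        self.observed_requests = []  # signals retained even when the bidder loses

    def bid(self, request: BidRequest) -> float:
        # Every bidder logs the request before deciding what to pay.
        self.observed_requests.append(request)
        return round(random.uniform(0.10, 2.00), 2)  # bid in dollars

def run_auction(request: BidRequest, bidders: list[Bidder]) -> tuple[str, float]:
    # Broadcast the request to all bidders and pick the highest bid.
    bids = {b.name: b.bid(request) for b in bidders}
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

if __name__ == "__main__":
    bidders = [Bidder("dsp_a"), Bidder("dsp_b"), Bidder("dsp_c")]
    request = BidRequest("slot-17", "health-information-site", "Chicago metro", "device-8f3a")
    winner, price = run_auction(request, bidders)
    print(f"{winner} won the slot at ${price}")
    # The losing bidders still hold a copy of the user's signals:
    for b in bidders:
        print(b.name, "saw", len(b.observed_requests), "request(s)")
```

In a real exchange this round trip happens in a fraction of a second and involves far more participants, which is why the "leakiness" Kerry describes is hard to contain once the request has been broadcast.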


Debbie Reynolds  39:51

What are your thoughts on human versus consumer privacy? A lot of what we talk about in the US is around consumer privacy, not really human privacy. Do you think that we can bridge that gap? Especially since the FTC, even though it has a lot of power within the US, doesn't regulate all industries and things like that. I think what people want is definitely consumer rights, but also more human rights. So do you think we'll ever be able to bridge that gap in the US?


Cameron Kerry  40:27

You know, I think we're getting there, Debbie. Look, I agree with you. I think privacy is a fundamental human right. It touches on things that are basic to us as individuals: our identities, our autonomy, and our relationships. Privacy is something that enables all that; it enables us to formulate our thoughts about politics, love, whatever. And that's what we should be protecting. When I talked about some of the work we did at the Brookings Institution on privacy legislation and the way forward, we actually took some of the proposals, pieced them together, cut and pasted, and adjusted things into a model bill, and we called it the Information Privacy Act, for exactly a lot of the reasons that you are talking about; it talked about individuals, not consumers. The ADPPA actually does some of that, in the sense that it talks about individuals and extends coverage to some sectors that have been excluded. Some of it is unavoidable because of the way our system works. I mean, Congress regulates interstate commerce, and that's the world the Federal Trade Commission operates in, so you can't avoid the consumer context entirely. And certainly, much of the data collection that goes on occurs in that context. But we still have to keep an eye on what the objectives are here.


Debbie Reynolds  42:52

So if it were the world according to Cameron, and we did everything you said, what would be your wish for privacy anywhere in the world, whether it be law, technology, or human stuff? What are your thoughts?


Cameron Kerry  43:06

Look, my number one wish is to finish the business that I started on when I joined the Obama administration in 2009, which was to have comprehensive, strong privacy legislation in the United States. And I think that would have an impact on the world. As it is today, there are 151 countries, I think it is, that have comprehensive privacy protection laws, including China; there, it's mostly about protection from companies, not from the government, which can collect anything and everything it wants, and certainly does. But the United States has become the outlier. We should be leading in this area; we have a long history of privacy protection. In a real sense, privacy as a legal concept was invented in the United States. We have the Bill of Rights, which protects papers and effects, your thought, your religion, your expression, and we've protected that from the beginning of our country. The right to privacy as a legal concept was originated by Louis Brandeis in what's probably the most famous law review article ever, in 1890, and there's a lot of law that's built up around that. The Europeans like to claim credit for the first comprehensive data protection law, in the state of Hesse in Germany in 1970. Well, in 1970, we adopted the Fair Credit Reporting Act, the first national law, and many of the fair information practices that are a foundation of privacy today, including a foundation for European privacy law, come out of work that was done in the US government in the 1970s. So when I say privacy was made in the USA, there is background there. But we've kind of abdicated our leadership over the past 30 years. So if we can step up, I think we regain some moral credibility that can help on many fronts. And I think what happens with data flows and surveillance coming out of the US-EU agreement advances international norms on government transparency, limits on surveillance, and rights of people who are subject to surveillance. There's an increasing amount of discussion about that around the world. There's discussion underway about how to deal with artificial intelligence and how we should regulate that. I think it is hard for the United States to engage in all of those discussions without having comprehensive protections in the commercial sector. So our credibility is at stake. If we could get that done, we can help to advance the discussion around the world on privacy.


Debbie Reynolds  47:51

I agree with that; a resounding agreement from me.


Cameron Kerry  47:55

We are in violent agreement again today, just like we were in violent agreement in Berkeley.


Debbie Reynolds  48:03

Yeah, definitely. Well, that's wonderful. Thank you so much. I think this will be very instructive for the audience, and I'm thrilled; it's such an honor to have you on the show. I would love for us to find ways we can collaborate in the future; that would be great. Well, thank you so much, Cameron. Thank you.


Cameron Kerry  48:23

All right. Good. All right. Good to see you. Okay.