"The Data Diva" Talks Privacy Podcast

The Data Diva E74 - Dr. Nicol Turner Lee and Debbie Reynolds

April 05, 2022 Season 2 Episode 74
"The Data Diva" Talks Privacy Podcast
The Data Diva E74 - Dr. Nicol Turner Lee and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds “The Data Diva” talks to Dr. Nicol Turner Lee, Senior Fellow, Center for Technology Innovation, The Brookings Institution. We discuss the issue of the Digital Divide; her upcoming book, “The Digital Invisible: How the Internet is Creating the New Underclass”; how her career path led to her work at the Brookings Institution; the lack of digital access and examples of its real-world impacts; using applied sociology to highlight challenges in digital life; how poverty can affect privacy; addressing AI harms and delayed redress; unknown factors of digital life and Data Privacy which can affect consumers; the need for US Federal privacy legislation; emerging digital strategies; and her hope for Data Privacy in the future.

36:26

SUMMARY KEYWORDS

people, privacy, data, technology, thought, happening, Internet, digital divide, AI, Chicago, computational models, Brookings, privacy legislation, book, agree, work, Debbie, person, digital, consumers

SPEAKERS

Debbie Reynolds, Dr. Nicol Turner Lee


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.


Hello, my name is Debbie Reynolds. This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a super-duper special guest on the show. I'm so excited to have her here. Dr. Nicol Turner Lee is a Senior Fellow in the Brookings Governance Studies Department, the Director of the Center for Technology Innovation, and Co-Editor of TechTank. Dr. Turner Lee researches public policy designed to enable equitable access to technology across the US and to harness its power to create change in communities around the world. Her research also explores global and domestic broadband deployment and regulatory and Internet governance issues. She is also an expert on the intersection of race, wealth, and technology within the context of civic engagement, criminal justice, and economic development. She's also the author of a forthcoming book, The Digital Invisible: How the Internet is Creating the New Underclass. Welcome.

Dr. Nicol Turner Lee  01:34

Thank you so much, Debbie. You can shorten that if you need to. You can actually just say that I'm KJ and Chloe's mama.

Debbie Reynolds  01:43

I think it's important for people to know what you're doing. You have such a unique role, and I've been such a fan of yours and of the stuff that you do. For me, I think people don't talk enough about the digital divide. That's something I always come back to again and again in your work. I saw you testify in front of Congress recently about broadband issues. I'll see you at different things. I see you're writing a book, and I saw your book.

Dr. Nicol Turner Lee  02:12

It's almost done.

Debbie Reynolds  02:17

Yeah, it was great. The Digital Invisible: How the Internet is Creating the New Underclass?

Dr. Nicol Turner Lee  02:23

Yes.

Debbie Reynolds  02:24

We have to talk about this. So first of all, tell me, how did you come to your career? And why is this your passion?

Dr. Nicol Turner Lee  02:34

So it's so interesting. And again, thank you for having me; I always appreciate the opportunity to be surrounded by black girl magic. And the opportunity to speak to a black woman who is actually delving into these areas is always an honor and a very humbling experience. So how did I get into this? It's funny, because the book that I'm writing is actually about the digital divide in the United States, but it's also a memoir, because I've had 30 years of working in the space. First, as a digital activist working in the city of Chicago, running computer labs, you know, those spaces with eight to 10 PCs and a printer. And then I had the opportunity to come to Washington, DC, and be an advocate. And I got into this space primarily as a volunteer; I wasn't looking for it. I was finishing my Ph.D. in Sociology while I was in the city of Chicago. I was a student of urban sociology, so very much attuned to the structural circumstances that created many parts of Chicago that were racially segregated and poor. And I was volunteering at a small computer lab in an affordable housing building. That's how it all started. While I was working on my thesis, I was, by night, the tutor person, you know, tutoring young people and helping people apply for jobs online. And I realized back in the early 2000s that there was something happening here. You fast forward to today and what I've experienced professionally. As a person who, just so your listeners understand, I did eventually get my Ph.D. It took me 10 years, after working in grassroots activities and social service organizations, etc. But part of what I decided to do professionally was to become an applied sociologist. And that's really taken me to where policymakers can hear, you know, what we're talking about when it comes to digital inequity. So I call myself a professional scholar who works at the intersection of race, technology, and social justice. And that has really been undergirded, Debbie, by my on-the-ground experiences with real people who look like us and other vulnerable populations trying to navigate a new Digital Highway.

Debbie Reynolds  04:55

That is phenomenal and fascinating. I'm a Chicagoan as well, so two black Chicago girls, I love it.

Dr. Nicol Turner Lee  05:01

Well, let me correct that. I'm from New York originally, and you didn't hear the accent? But I actually was in Chicago for about 17 years, so I do know the South Side. When I first met President Obama, earlier in my career, at some event in Washington, I told him I was in Chicago. He was like, what part of Chicago are you in, the South Side? No, I was at Northwestern; I'm on the North Side. And he was like, not cool, because I was actually going to school on the North Side. So I just had to put that out there.

Debbie Reynolds  05:33

Oh, my goodness. Oh, my goodness. Yeah, I lived in DC for 10 years. I'm pretty familiar with those circles. Pretty interesting, pretty interesting. So I saw some other talks you had done about COVID and privacy, and I want to bring something up to you. It reminded me of a statistic I saw related to your book, which you talked about: that of the 7 billion people on Earth, less than half of those people are on the Internet. And this is something I have to tell people all the time. For example, and I want your thoughts on it, when people were talking about, let's get a COVID app, they were like, okay, we have a COVID app, that'll solve COVID. And I'm like, well, less than half of the people on Earth have smartphones, so how is this going to solve it? So I just want to talk to you about that part of the digital divide. People assume that people have more access than they have and move to the next thing, which is a problem.

Dr. Nicol Turner Lee  06:43

Yeah, I love that question. So I've been in this space, as you've heard, for like three decades, and as a result of that, I've had the opportunity to come to Washington, DC with, I think, an authentic view of the Internet from the grassroots level. And then, I'm a person of color, and I represent some of the lived experiences of the people that I talk about in my book, with the exception of those from rural areas, because I grew up in large metropolitan communities and cities. My point is that we've commonly thought of the digital divide as very binary; this is what I write about. We've seen it as a battle between the haves and the have-nots: who has Internet, who doesn't. Who has a device? Who does not? And really, at the end of the day, it is less binary today than it's ever been before. And I'll tell you why. It's no longer just about who has a device, who has access to broadband service, or who has digital literacy training. The Internet has become sort of the onboarding of our new economy. And what the pandemic did is it actually showed that the people who are disconnected, much like the people we found in the pandemic who are immunocompromised or medically underserved, are people who sit on the other side of economic, social, and geographic isolation. I see the digital divide as being less about the hardscaping and, you know, the issues that we have to concern ourselves with regarding just fiber. I think that's important. But I honestly think that if we're going to solve the digital divide, we also have to solve poverty; we have to solve rural isolation and geographic isolation in our urban communities. The divide is a symptom of these types of systemic inequalities, so in thinking about things like vaccination passports and the extent to which we utilize technology for telehealth, you have to address the fact that we already have fractured, fragmented systems in place. So giving people the opportunity to be in a modernized economy, particularly around what we saw with the vaccination, is great. But guess what? Half of the people could not get online to schedule an appointment or actually figure out the symptoms that they were experiencing, because they weren't connected to the Internet. And so we have to always keep in mind that globally, we have a digital divide. But here in the United States, we failed millions of people simply because we're on the Metro in DC, we see a whole lot of people with their phones out in their hands, heads shaking and bobbing, and we think everybody's connected. And I think the last two years really showed that people like me had been right all these years when it comes to the lack of digital connectivity.

Debbie Reynolds  09:22

All right, I agree with you there. Thank you for those great points. One thing I totally love about you, when I hear you speak and read the things you write, is that it's real talk, right? The things that you talk about, you do it in a way that's very inviting and brings people into the conversation, a conversation that includes them. I think a lot of people, when they think of think tanks, are thinking about someone in a tweed jacket with leather patches on the elbows, speaking in ivory tower terms. But even in some of your testimony to Congress, it's like, okay, here are my thoughts, and here are some practical ways that we can address the problem. I think that's what's needed to be able to move the ball forward. What are your thoughts?

Dr. Nicol Turner Lee  10:12

Well, thank you. I mean, thank you for that. I mean, I'm not a lawyer. I'm a sociologist. And people are like, what? I mean, come on, for those who are listening, sociology was always considered one of those majors that you took to kind of get through college. And I'm the person who's showing you that you can actually do a lot with a sociology degree, particularly one that applies sociology to the circumstances we face. For me, it's like giving honor to the people who allowed me to be in their lives these last 30 years, and I write about those people in my book. I mean, you've seen stuff that I've written. I decided, as a policymaker, after doing a lot of white papers with stats and policy bullets, that we in the Beltway are out of touch. The experiences that people have, and the storytelling that comes with those public policy experiences, really have to enlighten us to move forward; we are in a vicious cycle in Washington. And you know, I was an advocate; I sat with an advocacy organization for a good portion of my career, going before Congress to sort of win a policy position. And you know, it's a hard job, and it's exhausting. But what it does is it keeps you on either the side of the winner or the loser. And sometimes that's not healthy or helpful. But sometimes it actually brings people back to the reality of why certain policies are not working or applicable to certain populations. So for me, that's kind of what's driving my life right now. Everybody in my household is sort of fed up with me, because I am that person who gets up in the middle of the night or watches the news and gets really concerned about the plight of people of color and vulnerable populations, like those in the LGBTQ community, or the disabled, or older populations. And the reason my book talks about the new underclass is that I'm concerned about the farmers I met, the ones who can't compete with large farming industries, who actually need broadband just to order supplies. So yeah, that's who I am. You know, I'm not changing; I'm too old to change my ways. So you've got to take it or leave it.

Debbie Reynolds  12:17

I would love your thoughts on privacy. And I want to tell you my thoughts, because this is one thing that drew me to you and your work about the digital divide. I feel like we're almost creating a permanent new caste system, not just the haves and the have-nots, but who has access to data, who has access to insights and who doesn't, and who can protect their privacy. For example, Apple rolled out its App Tracking Transparency feature, which gives people who use Apple products more privacy around advertising, right? But not everybody can afford an Apple product. So I think it's good that companies like Apple are looking at that. But I feel like there's such a wide gap in who has the agency to be able to protect their privacy at all. What are your thoughts?

Dr. Nicol Turner Lee  13:16

So I love the way that you frame that, because, you know, I've been in the privacy space, believe it or not, for about 10 to 15 years. I first met Cam Kerry, who is one of the fellows at the Center for Technology Innovation, at an Aspen Institute session; at that time, he had just started with the Obama administration and was talking about the rules of the road when it came to privacy. That's how long I've been involved with this discussion. And I think, from a large-scale perspective around privacy, obviously what we're talking about with Apple, I think it's a continuing effort to do privacy by design, ensuring that in the Internet architecture of the app or the hardware, we're considering privacy-enhancing strategies and techniques, as well as a technical cadence that allows us to ensure that there's some privacy built into that product or service. I think the second thing we've seen in the privacy debate is sort of this consumer agency that you've been talking about, right? Unfortunately, your privacy comes with the fact that you have to have the means and the education to control it. But we do see, in applications like, you know, Facebook, or now Meta, and other online platforms, that people are able to control their location settings, as well as what's collected on them. One of the greatest contributions of the General Data Protection Regulation out of Europe was that we all get that message about cookies. And we have to consent, even if all we know is that we're not talking about the Cookie Monster but about the cookies in that app; we all have to do something, so it gives us some agency over that. And then I would say third, and I think this is one of the reasons why we do not have privacy legislation to this day, the debate is sort of heating up now because the technology itself has advanced beyond the very rudimentary functions of the Internet. A decade ago, talking about Data Privacy, we were talking about privacy-enhancing technologies. Maybe five years ago, we started talking about human agency when it came to online platforms. Today, we have this added risk of machine learning algorithms, a body of work that I actually lead at the Brookings Institution. And we also have the fact that we are now so enabled by online applications and broadband that, in many respects, it all blends into our physical realities, right? We don't know when the Internet starts and when it stops. We don't think, hey, am I giving up my privacy when I'm ordering, you know, DoorDash, or getting in my Uber? We know that these technology conveniences work. But we're not necessarily always thinking about the implications in the long run and the extent to which our information can get into the hands of bad actors. Or we can actually just hand it over to those actors because we want a product or a service. So this barter economy that rests upon our photos, our addresses, our Social Security numbers, attributes of our face, all of that now is part of this new data system that makes privacy management and enforcement much more difficult, Debbie. And so, I think we've come a long way in thinking about that. And, you know, I think to your point, we have to be careful, because what are we going to have? Like you said, and I love the way you put it, a digital caste, where those who can actually afford to protect their privacy will, versus those who cannot?
Or are we going to see, you know, these walled gardens around private spaces, where people are able to negotiate what they want to be seen, what they don't want to be seen, or, you know, mined? So it's interesting going forward, particularly since we have no Federal privacy legislation right now. So the sky's the limit with regard to what companies can actually collect on us.

Debbie Reynolds  17:07

Yeah. Fascinating. What is happening in technology right now, as it relates to privacy, that concerns you most? What's on the horizon? What makes you say, wow, I don't really like that, or, I'm concerned about this?

Dr. Nicol Turner Lee  17:23

I mean, look, I think that as the debate has not been resolved at the Congressional level, we have seen a patchwork of state laws that are trying to address some of the hard problems that come with this unfettered access to our personal information. I also think, as a result of us not getting a handle on this, we're now seeing many of the exploitations that vulnerable populations experience offline migrate online. That is where we have seen a lot of the debates and discussions around disparate impact and differential treatment in data sets, right, where some data sets will err on the side of creditworthiness, whereas others will err on the side of credit rejection. And Latanya Sweeney at Harvard has suggested that some of this has little to do with whether or not you have good credit; it may have more to do with the sound of your last name, your address, or other proxies that may lead lenders to actually not lend to you. My colleague at Brookings, Aaron Klein, suggests that now some banks may look at the type of hardware you're using to access the Internet, and Mac users have a higher tendency to be deemed more creditworthy than those of us, like myself, who are still on PCs. Listen, when you start adding that together, for me, one of the most pressing concerns of not having solid privacy legislation is not having a civil rights regime that is able to withstand these types of harms. And so, in a lot of the work that I do at Brookings, we have a couple of papers coming out: one on racial equity and antitrust, and one on racial equity and privacy, covering policing and other surveillance technologies. Listen, we've got a lot of work to do. I mean, the public accommodations law that my parents grew up with, and that I was the beneficiary of, is not the same type of accommodations law that we see on the Internet. Think about it, if you don't mind entertaining me for just a moment. Airbnb was able to show the faces and photos of potential guests, and some hosts decided not to rent to people of color. We've seen instances where certain scraped data in emotion AI, for example, used as a pre-screening tool for employment, is able to gather biometric information about you, and guess what? You could be flagged as a non-fit, and we see that with black and Latino male applicants. When you start to bring all that together, it creates, I think, a harder rationale to really reel in privacy harms. And that's why I think, instead of sounding so pessimistic: first, we need to pass Congressional legislation. But more importantly, we need to make civil rights part of the duty of care. We expect anyone who is handling personal data to do it in a way that is compliant with civil rights laws, period, done, mic dropped. And I think we're still debating that for some reason, as if there's no correlation between what people know about you and the extent to which they can manipulate or exploit you.

Debbie Reynolds  20:39

Yeah. I know that you work on AI; you mentioned the papers that you have. I'm obviously very concerned about AI because of its ability to turbocharge the biases and the harm that can happen to people. And for me, if that harm happens as a result of these automated systems, there could be no adequate redress, right? You can't wait to go to court or whatever, and you may not be able to afford to go to court to be able to sort those things out. But one thing that I'm hopeful about with AI is that a lot of the bias and discrimination that we knew was there is now being coded into technology, where someone can look at it and say, this doesn't look right.

Dr. Nicol Turner Lee  21:34

Well, yeah, I mean, I think that's what's actually going on right now, right? It's sort of the coded bias that is happening. The formation of AI computational models is, you know, scary, because you cannot go to the Supreme Court and allege voter suppression like we saw in 2016, when foreign operatives used the Internet to manipulate folks into staying home or convinced them that their polling stations had changed. It's hard to argue that you were denied an appraisal that came through an automated predictive decision when you don't know what actually constitutes that model. And that's why I think it's very important for us to do a couple of things. I actually write about it; I have a chapter coming out in an Oxford University Press book on AI governance, arguing that we need laws. And you know, right now, Congress is conducting a hearing on the Hill on keeping big tech accountable. A lot of that has to do with accountability laws for algorithmic models and the extent to which we're doing real third-party auditing. But I think another thing I'd like to add to the debate, which is in my article, is that we also need some type of incentives in place for companies to do the right thing, as well as government to do the right thing, when they actually deploy these models. I call it the Energy Star rating system because, like many of your viewers and listeners, I essentially go into a big box store and look for that big yellow label that tells me that that appliance is going to perform to the standards to which it was built. We don't have that for AI. And one of the things I've been trying to argue is, why don't we have trustworthy AI signals for consumers, so they know what they're getting? I hate to tell people who are listening, but your Credit Karma score is not going to get you a car loan; it's going to be the score that comes out of Equifax, Experian, and TransUnion, which have all been determinants of verifiable data when it comes to credit. But a lot of people don't know, Debbie, that a lot of these decisions happen on algorithms. And so if it were me, and I had a wand that I could wave, obviously, I would like to see more of a Good Housekeeping seal, like this Energy Star rating approach, where people understand that there's been some due diligence, particularly in sensitive use cases. But I'd also like to see us as consumers having some input into what these models are saying about us and being able to have those models disclosed when they're applied. Because I really think that lack of algorithmic literacy, in addition to the lack of literacy around how much data is actually collected about us, is really not good for our economic and social governance.

Debbie Reynolds  24:25

Yeah, I agree with that. I agree with that. I just want to throw out an example of something I read about a while back, and it very much concerns me because it's around risk modeling and the way they do algorithms. There are companies that use Google Maps to look at the areas where people live and adjust people's rates, for example, auto insurance rates, based on how they interpret the data about the area someone lives in, right? So your insurance may not be based on what you have done personally, maybe your driving; it may be because, let's say, your neighbor's garbage cans were turned over or something when they looked at Google Maps, and they think, okay, we need to make this person a higher risk and charge them more for insurance. So I'm concerned about those things, where the process is not visible to the person, and they're really not being judged fairly, individually or in groups.

Dr. Nicol Turner Lee  25:40

You know, I agree with you. Because I think, again, a lot of these computational models are not, how do I put it, built on validated and verified datasets. A lot of these computational models are not necessarily like the models we have trusted, subjected to some type of interrogation when it comes to their execution or deployment and real-world consequences. I mean, let's face it, part of what has made the technology ecosystem so fascinating is permissionless innovation, this rush to actually create and ideate and get ideas out there that, you know, usher in conveniences. I mean, think about it, when I was growing up, we would never have had a LinkedIn; you had to literally type your resume, right? And look at the newspaper and apply for a job, right? It wasn't like you could go online, and some type of AI would say, hey, you're really good at public relations, and match you with jobs. I mean, come on, who thought of that, right? But AI, in all of its conveniences, has come with these risks. In the EU, for example, they actually designate certain algorithms as high risk. And I think we need to think about that here in the United States, the same way we thought about these things when we started to see this narrow path for certain communities with homeownership or the ownership of cars. I mean, as a sociologist, the research is clear: women go in to buy a car, they get higher rates; black people go into certain car dealers, they get kicked out. Come on now. I mean, it doesn't get any clearer than that, and these things can happen on the Internet with even greater precision of discrimination. And so, in my view, I think it's important for us to have conversations around these sensitive use cases. But more importantly, since we know that these biases are normative, we need to figure out: what is the right policy intervention? Where should the industry be managing the reputational risk? And how do we give consumers a way to sort of appeal what is happening to them, or what they think is happening to them, when it comes to disparate impact or differential treatment?

Debbie Reynolds  27:50

I don't know. So much work to do, huh?

Dr. Nicol Turner Lee  27:53

No, right. I feel like the algorithmic judge. Listen, if I can go back to your earlier question, though, if we actually had privacy legislation, that would be a start. I've become convinced over the last few months that the reason we're having these conversations about algorithms is because we haven't had the other conversation, about privacy. When someone can take your unit of analysis and splice that in so many ways, oh, okay, that's a black woman with a red sweater, a black woman with black hair, a black woman who wears glasses, a black woman who doesn't wear glasses, a black woman who likes red lipstick versus pink lipstick, a black woman who wears braids, all of these variables are now available to different people and different companies, and even the government. And so I think it's really important for us to have some type of privacy framework that gives us at least some gap stops, where we can, you know, sort of limit the collection or be more transparent with consumers about what is being collected on them. You know, the Pew Research Center essentially finds that most people do value their privacy. But I think that conversation has very much changed. And it's going to be more interesting to watch how COVID data is going to impact people. I mean, I was telling somebody the other day that I've never had COVID; I'm one of those people. And I told the person, I wonder, if I were to get it, and my data went into private companies like CVS and Walgreens, the extent to which somehow my insurance company would have it at some point. And that may determine my premiums. So I think we have to watch for these factors.

Debbie Reynolds  29:40

Oh, my goodness. Oh, my goodness. This is so much. Thank you so much.

Dr. Nicol Turner Lee  29:44

I know I sound like a preacher on a Sunday morning, don't I?

Debbie Reynolds  29:49

No, your preaching is needed. These conversations have to happen. I think we're at a place where something has to happen. We can't just sort of float along on the water at this point; we have to really have some type of strategy. Right now, I think we have a lot of tactics, but we need some sort of overarching strategy to build more privacy and more trust, right, in terms of how people use data.

Dr. Nicol Turner Lee  30:20

No, and I agree with you; I think we do need more trust frameworks. And going back to your earlier point, I do think we need, you know, just more frameworks around how we define privacy. Where do we start? Where do we stop? And I think we have to get the political will to move forward with this. They have sat on this way too long. Now, I know that you're the privacy diva, so I know you already know what's been happening, right? We can't agree on the private right of action, and we can't agree on whether we should have certain consent models in place. Listen, one of the things that I've been trying to advocate is that if we have a civil rights standard, for example, perhaps we can come up with some narrowly tailored use cases for the private right of action. Nobody wants to be sued because they violated a civil right. And I think, you know, that could potentially bring some more compromise to the table.

Debbie Reynolds  31:12

My thought is, you know, these regulations and laws take a long time to get passed, yes. So I know we're trying to do almost like a Hail Mary; we're just trying to throw everything in, trying to get as much in as possible. But for me, I feel like, for example, all 50 states have a data breach notification law. If we could have one Federal data breach notification law that at least harmonizes the definitions of a data breach and what is personal data, that would be at least a building block that we can build on from there.

Dr. Nicol Turner Lee  31:54

I mean, I thought we were going to have something closer after Europe came and shamed us and said that they had the GDPR. Then I thought, when China came out with their privacy law, that we were going to think differently about it. I'm not sure why we can't pass this, except, to your point, the longer we wait, the more we try to figure out how to solve everything. And I think, for any policymaker listening, we can't. And that's the challenge of technology. That's why the digital divide has been harder to close. That's why it's been harder to reel in platform regulation. That's why we're now seeing these above-the-line consequences when it comes to the use of technology. And that's why we haven't passed privacy legislation, honestly. We need to figure out and really come to grips with the fact that the train has left the station. And now it's going faster than we can actually reel it in.

Debbie Reynolds  32:43

I agree. Dr. Nicol, if it were the world according to you, what would be your wish for privacy?

Dr. Nicol Turner Lee  32:52

Let's see, if it was the world according to me, what would be my wish for privacy? Well, I'll break it down into levels; I'll give a personal and a professional answer. I mean, obviously, in the world according to me, we'd have comprehensive privacy legislation, so that we can have conversations about this knowing, as you said with data breach notification, what the common language is and what we're trying to solve. So we'll get Congress to move, because I am fearful of what the future holds when it comes to the data that was collected during this pandemic. As a parent, what does privacy mean to me? It means that we have to come up with some structure to avoid the possibility that our young people live in an openly surveilled world where everything that they do is open to commentary or scrutiny, and that they understand that the things that they do today may be archived against them when they try to get a job, or try to get health insurance, or try to engage in political movements. I really wish that we figure out, in this privacy debate, how to bring more awareness about this new Internet that we have. Listen, I'm like everybody else. As much as I know about the policies and how these systems work, I still use them. Because I've actually gotten to the point where I'm starting to understand that I have no choice. When I try to call somebody to get help on a customer service line, they redirect me back online to, you know, get the response. I am finding that the analog space is quickly dying. And so the world, according to Nicol, would be to place the interests of our children first as we try to solve this issue.

Debbie Reynolds  34:45

Wow, that's brilliant. I love that. Thank you so much. It's been such a treat to chat with you today, and keep up the great work. I'll be watching all the stuff you're doing, and I can't wait for your book to come out.

Dr. Nicol Turner Lee  34:57

I know you're going to have me back, okay? I am not private about the fact that this book is coming out in the fall or winter. It will be out in October or November, and I'm expecting an invitation to come back. But you know, thank you. Keep doing what you do. I'm a big fan of yours too. And I think together we can make this work. Okay.

Debbie Reynolds  35:16

I agree. I agree. Thank you so much.

Dr. Nicol Turner Lee  35:19

Thank you, Debbie. I appreciate you. All right, thank you.