"The Data Diva" Talks Privacy Podcast

The Data Diva E57 - Olivia Holder and Debbie Reynolds

December 07, 2021 Season 2 Episode 57
"The Data Diva" Talks Privacy Podcast
The Data Diva E57 - Olivia Holder and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds “The Data Diva” talks to Olivia Holder, Senior Privacy Counsel, GitHub. We discuss privacy in the public sphere, children’s privacy and the importance of the UK Age Appropriate Design Code, data localization trends in data privacy regulation globally, local financial incentives as part of data localization efforts, the challenge of code re-use in app development, IoT and the Metaverse, Facebook’s move to delete biometric data and global trends in facial recognition, interoperability and the use of APIs, her future concerns about privacy, how privacy professionals can obtain buy-in to get resources and cooperation, privacy professionals as diplomats, developers’ difficulties incorporating privacy, the increase in privacy publicity, US Federal and local privacy legislation, private right of action and pre-emption as barriers to omnibus privacy legislation, the global influence of GDPR, and her hopes for Data Privacy in the future.



50:27
SUMMARY KEYWORDS
privacy, data, Metaverse, people, area, app, developers, laws, localization, business, companies, happening, technology, state, facial recognition, important, talk, point, IoT, developing
SPEAKERS
Debbie Reynolds, Olivia Holder

Debbie Reynolds  00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva Talks Privacy" podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. I have on my show a very special guest from Seattle, Olivia Holder, who is Senior Privacy Counsel at GitHub. Welcome.

Olivia Holder  00:37
Thanks, Debbie. It's great to be here.

Debbie Reynolds  00:39
Well, I saw your profile on LinkedIn, and I was like, oh, wow, we need to know each other. I don't know why we don't know each other already. I really like to talk to people. So let me back up. In my career, I've worked with people in Fortune 500 companies for years, and like tons of GCs and people in-house. So to me, when I get a chance to talk with someone who's in-house at a tech company, it's just extra special. So I would just love to pick your brain. And as you said before we recorded, we can geek out about privacy.

Olivia Holder  01:18
Sounds really fun.

Debbie Reynolds  01:20
Now, tell me about your journey into privacy. What made you pivot that way? What kind of caught your interest about it? 

Olivia Holder  01:27
So yeah, sure. I started out in law school with an interest in intellectual property. And after doing some graduate assistantship work in the area and some internships, I realized that I was more interested in technology generally and how legal issues affect technology. I did some work with a startup and an incubation program at Georgia Tech, and that was my first exposure to the tech world, the business world. In comparison to a traditional legal environment, it's much more fast-paced, which was exciting to me. And then, from there, I kind of fell into a privacy internship at an Internet service provider, and that ultimately turned into a full-time role. That was about six years ago, and I've just been in privacy ever since. Like I mentioned, based on my interest in law and technology, privacy is right at the intersection of that, and it's just a really interesting, fast-paced, growing area.

Debbie Reynolds  02:34
Yeah, I think it's a really exciting time to be in privacy right now. What has probably most surprised you in recent years about Privacy and Technology? I feel like with privacy, especially before the GDPR came out, there were just little groups of us chatting amongst each other. No one really cared about it. And I feel like with GDPR and, you know, a lot of the data breaches, things are in the news now. It's just more out there. So what are your thoughts about that?

Olivia Holder  03:13
So I think you're definitely right about the explosion of privacy issues. I remember starting out in my first role and working on awareness presentations for leadership, and I would find really the one privacy article that came out in the last few months. But now it's like every day there's something new. At the same time, the sophistication of understanding of privacy concepts doesn't necessarily follow for people who aren't privacy professionals. So I wouldn't say I'm surprised by recent changes, but one of the areas where I see things exploding, for example, is children's privacy law. The UK fairly recently introduced age-appropriate design guidance for online services directed at children, including mixed-audience services. We have a lot of US state privacy laws cropping up around education privacy and children's privacy. So I think helping to reeducate stakeholders on the basics as the basics change is one of the challenges. It's really important as a privacy professional to help keep everyone informed.

Debbie Reynolds  04:44
Yeah, well, actually, children's privacy is a topic I know a lot about. Obviously, I'm kind of a geek about this stuff anyway, but I also have clients that develop tools that children use, so I've been looking at these laws very closely. I think one of the interesting things is that when you think of things like CCPA, or different privacy laws around the country in the US, some people are like, oh, well, it doesn't apply to me because I don't make this much money or whatever. It doesn't work that way. If you have a kid's data that you share, yeah, it applies to you. Basically, almost everything applies to you. So you don't get a get-out-of-jail-free card because of that. It is a really sensitive issue, and I'd love to chat about this Age Appropriate Design Code in the UK, because I think it is a great guide, not just for the UK, but for anyone who's developing apps for children. What are your thoughts?

Olivia Holder  05:54
I think there are really two core concepts out of that guidance that stick out to me. Number one, it's not meant to be an age-gate for the Internet, as I think they described it. But at the same time, where we have services directed at children, they're looking to make the most private setting the default. And I think that's a really interesting challenge for any tech company that is offering a website that a person can access. Like you said, a lot of times you'll work with clients or customers where they say, well, this doesn't apply to me for one reason or another, but actually it does apply to a much wider user base. So I think it's really a tricky position for anybody hosting a website that can be accessed globally. And I know we're focusing on children's privacy laws right now, but another emerging area is Data Localization requirements in international laws. Most recently, I've been looking into Russian Privacy Laws, which are not necessarily super new but are similar to these children's issues: where you have data localization requirements, it's, like I was saying, kind of difficult in this world of technology where anyone can access websites from anywhere in the world.

Debbie Reynolds  07:30
I love what you're reading; this stuff is so interesting. So let's talk about Data Localization. We know that countries, especially in Asia, have had Data Localization built into their laws for a very long time, but we're seeing that come up in a lot more jurisdictions. Like, I've known for decades that India has had Data Localization related to financial data. And then, you know, Korea, North and South, have a lot of things around Data Localization. What are your thoughts? Are you seeing that grow? We don't seem to be having the Data Localization conversation in the US, but it definitely comes up in other countries.

Olivia Holder  08:28
That's a really good point. I don't know if that says the US is behind the times in that respect or not. When I think about Data Localization, I think it's really this idea that personal data is being treated like real estate or like a physical object, and that's not exactly a direct comparison. So I think the sentiment behind countries wanting to have control over protecting their citizens' data is definitely valid, but I just don't know how much Data Localization would actually achieve that goal. Especially when you think about some of the countries that do have those localization laws, it's often more of a security- or surveillance-based motivation behind them. So maybe it's not so great for privacy at the end of the day for the actual people, and it just allows those governments to have more control over the data.

Debbie Reynolds  09:41
Yeah, and I agree with that, by the way. One facet of Data Localization that I've noticed is that they want people who are in that country to have those jobs. So part of that is, okay, just like the PIPL law: if you have a Data Privacy Officer, or someone who understands where your data is and what they're doing with it, they want them in China; they don't want them in some other location. So I feel like part of that is an economic driver in addition to control.

Olivia Holder  10:25
That's a great point. Yeah, that's definitely another angle: that motivation to bring more of the tech industry back to those countries, or maybe to those countries for the first time, potentially. But when I think about that as a motivation, I think, yes, it is providing more jobs, but it's not like these companies are headquartered and innovating in those locations. What will motivate that, I think, will be different sorts of policy. I'm kind of spitballing this, but, for example, tax incentives for new businesses to incorporate in specific locations?

Debbie Reynolds  11:11
Yeah, definitely, definitely. And, you know, we're seeing a lot of financial incentives in places like Singapore; they want businesses to use the cloud and innovate and do stuff in tech. So for people who have those skills, they're trying to fund them and make sure that they can have more of a groundswell of local talent in those areas. Let's talk about apps; we were chatting a bit about this before the recording. Since you're at GitHub, I have to ask about this. We definitely saw it with COVID, and I think part of this is because of the way the US is right now. In other countries, when a country decided it wanted to develop an app, it was a country-level effort, right? People thought that the US would have a Federal app or something, and that just never happened. So every state has to create its own app, and a lot of states didn't create apps. I think one of the more popular ones, or the one that seems most robust at the moment, is New York's Excelsior app, where you get a QR code to get into restaurants and different things. But the other thing that we saw with COVID is a lot of non-governmental companies developing apps, like contact tracing, or trying to do contact tracing, or just gathering information about individuals. So not involved with the government, not involved with health care, but these apps just popped up. And what some places are finding, and this happens a lot, is that when you take code and reuse it for some other purpose, it may have other code in there that you weren't thinking about that does different things. There was an example of an app, a COVID app, that was previously a pizza app or something. So it would literally be ordering pizza or just doing weird things in the background. So I think, from a privacy perspective, when people are creating apps, especially around personally identifiable data, it is really important that the developers really look at that code. Even though they want to repurpose some of it, you have to really look at what it is actually doing, because maybe when it was a pizza app, no one cared, right? But this is now an app that is collecting personally identifiable information. And then I'll also throw in the FTC Act here: just because a company is not covered by HIPAA doesn't mean that it doesn't have responsibility for personal health information.
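To make the code-reuse concern concrete, here is a minimal sketch, not any real audit tool, of one way a developer might flag leftover network calls in code they are about to repurpose; the module list and the "pizza app" snippet are invented for illustration:

```python
import ast

# Modules whose use suggests the code reaches out over the network.
NETWORK_MODULES = {"requests", "urllib", "http", "socket"}

def flag_network_calls(source: str) -> list:
    """Return (line, call) pairs where reused code appears to hit the network."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Attribute) and isinstance(node.value, ast.Name):
            if node.value.id in NETWORK_MODULES:
                findings.append((node.lineno, f"{node.value.id}.{node.attr}"))
    return findings

# Leftover behavior from the original app, easy to miss in a quick skim.
reused_source = '''
import requests

def submit_checkin(data):
    requests.post("https://pizza.example.com/orders", json=data)  # stale endpoint
'''

for line, call in flag_network_calls(reused_source):
    print(f"line {line}: {call} -- review before shipping this in a health app")
```

A static scan like this only surfaces candidates for human review; it supports the "look at what the code is actually doing" step rather than replacing it.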

Olivia Holder  14:36
Yeah, those are all really great points. The developer space is super interesting to me; obviously, I'm working at GitHub, and when I was in my previous role at Microsoft, I was also supporting developer tools. A lot of times in product design, we talk about identifying edge cases and solving for those, and in this COVID app example that you gave, the edge cases are virtually endless. When you think about a single developer, potentially working on GitHub or elsewhere, who's just trying to do some good, they might grab the pizza app code like you're describing. I wouldn't say it's the fault of the developer, or the developer doing a bad job; it's that, more and more, developers really have to become privacy experts, and there's just not a lot of good guidance in that space yet. So, for example, some of the things that I have advised on, like publishing open source code or creating a new app: when you're leveraging third-party code, sometimes it's extremely difficult to understand, number one, is the code that you're leveraging collecting telemetry? It probably is. And when I'm talking about telemetry, I'm talking about Usage Data that's coming back to the developer or the enterprise that built the app. And then, number two, what exactly is that data? What are the specific events, and what sort of information do they contain? That can also be really difficult to figure out. I remember a recent example of something that I worked on; they might not even have a privacy statement or a privacy policy, and when you read that information, very often it's related to a website's Terms of Service rather than the code itself. So in this recent issue, I was actually working with one of the engineers who helped to build the project, and he shared some way-down-in-the-weeds documentation that was buried, with line items of "this is the specific event; this is what it's used for." It can just get really tricky, especially where developers can use free text to define their own fields. They might repurpose a field, and unknowingly that could send, like in the COVID example, PII in clear text where it's not really necessary to do that.
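The free-text field problem Olivia describes can be made concrete with a small, hypothetical sketch; the event schema, field names, and PII patterns below are invented for illustration and are not any real product's telemetry pipeline:

```python
import re

# Hypothetical allowlist: only these fields may leave the device at all.
ALLOWED_FIELDS = {"event_name", "app_version", "os", "duration_ms"}

# Crude patterns for obvious PII that should never ride along in free text.
PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),  # email addresses
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),    # US SSN-shaped strings
]

def sanitize_event(event: dict) -> dict:
    """Drop fields outside the allowlist and redact PII-looking free text."""
    clean = {}
    for key, value in event.items():
        if key not in ALLOWED_FIELDS:
            continue  # a repurposed debug field is silently dropped
        if isinstance(value, str):
            for pattern in PII_PATTERNS:
                value = pattern.sub("[REDACTED]", value)
        clean[key] = value
    return clean

# A developer repurposed "event_name" for debugging and added a custom field.
raw_event = {
    "event_name": "checkin by jane.doe@example.com",  # PII in free text
    "app_version": "1.4.2",
    "duration_ms": 840,
    "debug_notes": "user location: 47.6, -122.3",     # not on the allowlist
}
print(sanitize_event(raw_event))
# {'event_name': 'checkin by [REDACTED]', 'app_version': '1.4.2', 'duration_ms': 840}
```

The design choice here is that an allowlist fails closed: a repurposed or debug field never leaves the device by default, which is the safer failure mode for exactly the clear-text PII scenario described above.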

Debbie Reynolds  17:35
Wow. Yeah, I know we talked about the COVID app example, but the thing that concerns me, and I want to know your thoughts on it, is people developing software that is used in IoT devices. An IoT device, to me, is basically a computer without a screen. So if you're using one, you don't really know what's happening or what's going on inside that device. And because IoT and the Metaverse are getting all this big play, that's going to be a huge issue going forward, especially because consumers don't necessarily know what to do; we need help, basically. So what are your thoughts about IoT, the Internet of Things, and the Metaverse in general?

Olivia Holder  18:47
Well, I really like the point that you made, which is that consumers need help. A lot of the time, when you're talking about product design in technology, and probably in other industries as well, you talk about creating the minimum viable product to get it out to customers. There's this sentiment that we'll put out this minimum viable product, and then our customers will tell us what additional features they want or what should be changed. But when you're talking about privacy concerns, especially in IoT, consumers don't know what privacy settings are needed. Businesses barely know what privacy settings are needed. Earlier, I was talking about those various edge-case scenarios. Thinking about IoT, like voice-activated devices, it's the little things that might not be caught that really require a lot of stakeholders working together to think through all of those issues. Things will definitely be missed even with all of the best intentions. I'm also thinking about the Metaverse, and obviously Facebook's change to the parent company Meta, and the recent announcement that they're deleting the facial recognition data from the Facebook technology. I recently read an article pointing out that the Metaverse, the new avatar-type area that they're focusing on, will still be leveraging all of these technologies and similar ones. So it's kind of like transferring the privacy risk from one area to the other, and really, all of those same problems are going to be present in a much more complicated space, because it's so new.

Debbie Reynolds  20:59
Yeah, you make a great point. The Metaverse is going to make things more complicated. My opinion is that Mark Zuckerberg is wrong for the right reasons about the Metaverse. Yes, the Metaverse will be big. But the Metaverse, in my opinion, is going to be about the devices that capture the data, the sensors that do that stuff. We're barely dipping our big toes in the Metaverse currently. If you have a smart speaker, if you have a smartphone, if you have an AirTag, all of that is pre-Metaverse, Metaverse prep for the things that are coming next. So it's going to be interesting to see how that plays out. But I guess I'm concerned because, when you look at all the news about the cyber breaches and things like that, people are still falling for the same old tricks, and we thought we were at a better place right now than we are. People are still falling for that "give me your password" thing; I can't believe that people still do that. So going into an area of computing that will be a lot more complicated concerns me greatly.

Olivia Holder  22:36
Yeah, I can't even say that I don't use them, but I personally try to minimize my use of IoT. I'm definitely attached to Google Maps, but as far as voice-activated type controls, I don't use those, and for similar reasons to what you're describing. I think the intent to create good and useful tools is beneficial, obviously not necessarily a bad thing, but it really takes a lot of effort to thoroughly analyze these areas and come up with a good approach. And I think that's just not really how the world works; the tech world is trying to push out new products and services as fast as it can.

Debbie Reynolds  23:35
Yeah, absolutely. Absolutely. So, were you shocked when the news came that Facebook was going to delete these images?

Olivia Holder  23:44
Ah, it was surprising, but I think it was definitely a good move. Earlier, we were talking about some of the children's legal issues, and children especially, but there are other areas where opt-in consent is required. Yes, I think Facebook mentioned in the blog post that they have had opt-in consent for this data use since 2019, but at the same time, actually being able to obtain truly informed consent is very difficult. So I think they reduced a lot of legal risk and did a good thing by making that decision.

Debbie Reynolds  24:27
Yeah, well, the thing that interests me: I've been watching that case play out over the years, right? I literally thought so many times it was going to settle, so the fact that they went all the way shocked me. But because Facebook is taking this move to delete those images, it makes me wonder what's going to happen with these Clearview AI cases and the Data Scraping cases that are going on, because this is a similar issue.

Olivia Holder  25:01
Yeah, I think Data Scraping, in general, is a really interesting issue. For example, with services like GitHub, where we're trying to incentivize open-source developer collaboration on projects, and companies that have similar types of goals, that's paired with something called an API, an application programming interface, where developers can query information that's held in these services. And, you know, that was part of the Facebook Cambridge Analytica issue; they do have policies for their APIs, and developers were able to bypass those policies because they were kind of more words on a page as opposed to actual technical controls over such usage. So I think, moving forward, companies are going to have more of a responsibility to make sure that API usage is appropriate.
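As a rough sketch of the difference between policy "words on a page" and actual technical controls over API usage, here is a minimal, hypothetical example; the token store, scope names, and limits are invented for illustration and are not GitHub's or any real API's implementation:

```python
import time
from collections import defaultdict

# Hypothetical token store: each token carries only the scopes a user granted.
TOKENS = {
    "token-abc": {"scopes": {"read:public_repos"}, "hourly_limit": 100},
}

_request_log = defaultdict(list)  # timestamps of recent requests per token

def authorize(token: str, required_scope: str) -> None:
    """Enforce scopes and rate limits in code, not just in a written policy."""
    record = TOKENS.get(token)
    if record is None:
        raise PermissionError("unknown token")
    if required_scope not in record["scopes"]:
        raise PermissionError(f"token lacks scope {required_scope!r}")
    now = time.time()
    recent = [t for t in _request_log[token] if now - t < 3600]
    if len(recent) >= record["hourly_limit"]:
        raise PermissionError("rate limit exceeded")
    recent.append(now)
    _request_log[token] = recent

def get_user_emails(token: str) -> list:
    # A scraper holding only read:public_repos is stopped here, no matter
    # what the written API policy says it promised not to do.
    authorize(token, "read:user_emails")
    return []  # data would be returned only to properly scoped callers
```

The point of the design is that the server, not the caller's goodwill, decides what a token can see and how fast it can ask.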

Debbie Reynolds  26:10
Right, yeah, because so much of the future is about trying to create some level of interoperability. So the open-source movement is very important there, and APIs will play a huge role. I think in the past, it was like, okay, let's use this new process or this new tool, and you basically grab all the data you have and put it into this new bucket. So being able to say, okay, let's keep the data where it is, but let's create these connectors so that we can share and have data traverse all these different areas.

Olivia Holder  26:50
Mm-hmm. Yeah, that reminds me of a book that I read last year by Shoshana Zuboff. I'm not sure if you've had a chance to take a look at it, but it's called Surveillance Capitalism. It talks about how "if it's free, you are the product" is a bit of a misnomer; really, humans are the means of production, and the product is these huge markets of scale built off of our usage data, traded from company to company where consumers might not be aware. And I think that appropriately controlling things like API usage will help to limit those types of dark patterns, essentially.

Debbie Reynolds  27:44
What do you see coming up in the future that concerns you most about privacy?

Olivia Holder  27:51
That's a good question. I think what concerns me most is that privacy controls will not be able to keep pace with emerging technologies. For an organization to do a good job at privacy, it takes investing in privacy professionals, in the business building and scaling privacy teams, not just in the legal department, not just in the technology or product department, but in all areas. And that takes work. It takes privacy professionals and legal professionals convincing CEOs to make those investments. I just hope that can be done quickly enough to keep pace with some of these emerging problems. When you think about laws that are coming out today around things like responsible AI, or some of the emerging privacy laws we've been talking about, they aren't necessarily contemplating a lot of the things we've discussed today, like what Telemetry Data is being collected by developers building apps; they're thinking about specific use cases. So it's really important for entities not to treat privacy as simply a matter of compliance, where we can reference this case and know that this is the clear black-and-white line where our behavior needs to sit. It's more important, as an ethical obligation, for entities to do well at privacy. And so it just concerns me that that might not be happening everywhere.

Debbie Reynolds  30:00
Right. Yeah, I had a gentleman on my show, Chris Roberts, very smart; I like him so much. And he made a good point that, especially as we're moving into areas with some form of AI and the Metaverse and all this new data being collected, the harm can be astronomical, and there isn't adequate redress for that. In some ways, you don't get a digital do-over. Let's say some facial recognition technology mistakes you for a criminal or something; maybe you'll get out of jail, maybe you won't, I don't know. So I think looking at how this data is being collected and how it's being used is really important. And trying to prevent the harm, and the understanding of privacy, has to be a proactive thing. As you said, you made a great point: there will not be precedents for things that will happen in the future. So it's not like you can look back at the past, because the future is not going to be like the past.

Olivia Holder  31:11
Yeah, that's definitely true. And like I was saying earlier, I think it takes stakeholders from a lot of different backgrounds to actually achieve good ends in a space. Earlier, I mentioned an engineer who showed me some documentation that was buried in technical guidance in order to figure out what Data Collection was occurring at the time. It takes the privacy experts pairing with the industry area experts or the product area experts to actually be able to identify these cases where a violation might be occurring.

Debbie Reynolds  32:00
I'd love to talk to you about, well, maybe two things; we'll see how you want to answer this. I'm always interested in privacy professionals who are in executive roles: what are your tricks of the trade to get that buy-in from different stakeholders, different levels, different jobs within the organization? And then, how have you been successful in getting the investment buy-in from your organization?

Olivia Holder  32:32
Sure. I really like that question. A lot of times, I think privacy discussions focus on the problems rather than the solutions, so this is a good opportunity. A couple of things that I like to bring up are, number one, just continuously socializing the idea that we're not just talking about personally identifiable information; we are talking about personal data, which is a very wide set of content. And that goes back to increasing training and awareness regularly as things change. Another concept that I like to socialize is Data Minimization, from the perspective of it being valuable not just for privacy. A lot of times when I get into Data Minimization discussions, and I am sure that you have heard the same, it's: we just want all the data right now; we don't really know why, but we'll figure it out later. I often like to come back and say, well, don't you want to be efficient with your Data Collection? Why do you want a bunch of data that might not be useful, wasting space? And that tends to go over well. A lot of times, things like that can help you tack on to other efforts that are going on in the business. There's always some sort of cost reduction effort going on, and if it's focused on something like data storage, you can lean on those other areas of the business as leverage for doing the right thing for privacy. Another great hook, obviously, is leveraging the security organizations in the business; generally, the goals are, if not similar, definitely overlapping, so I definitely like to grab those other stakeholders to build the case. Another thing that I like to do is draw away a little bit from the privacy jargon and just take it from the customer or client perspective, whoever the end-user customer is: we want to be transparent about what we're doing; we have to put them on notice and get consent for that usage, and just start from that perspective.
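The Data Minimization pitch above can be shown in a minimal sketch; the record type and field names are invented for illustration. The idea is to declare up front what a feature actually needs, so "all the data, we'll figure it out later" is rejected at the door rather than stored:

```python
from dataclasses import dataclass

# Hypothetical signup record: declare only the fields the feature needs.
@dataclass(frozen=True)
class SignupRecord:
    email: str         # needed to create the account
    display_name: str  # needed to render a profile

def minimize(raw_form: dict) -> SignupRecord:
    """Keep only the declared fields; everything else is never stored."""
    return SignupRecord(
        email=raw_form["email"],
        display_name=raw_form["display_name"],
    )

# The form may arrive with extras "we'll figure out later"; they are dropped.
submitted = {
    "email": "user@example.com",
    "display_name": "exampleuser",
    "birthdate": "1990-01-01",  # not needed, so never kept
    "phone": "555-0100",        # not needed, so never kept
}
print(minimize(submitted))
# SignupRecord(email='user@example.com', display_name='exampleuser')
```

An explicit schema like this also supports the cost argument Olivia makes: fields that are never stored cost nothing to secure, retain, or delete later.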

Debbie Reynolds  35:20
Wow, you dropped some knowledge on us today, and some serious executive talk. That's really important, because I think a lot of people I know in the privacy area get frustrated; they're like, well, I want the company to do this because it's the right thing to do. But you have to make it make sense within the business framework as well, right? Hearts and minds will only go so far; you've got to put some numbers to it. You have to find a way to make it make sense for that person to be able to sign on and advocate for it.

Olivia Holder  36:01
Yeah, earlier we were talking about privacy in the news and how it's in the news a lot more. I think that's also always a great hook, because you have an example: look what happened to Facebook; we don't want to go in that direction. Also, of course, having examples of monetary settlements or fines to bolster arguments is good. But that does go back to another discussion we were having, where privacy really does go beyond compliance, because we don't have the clear-cut yes or no; we're often in the gray space. And so it's important to instill in executives and stakeholders the idea that we are responsible and have an ethical obligation to do things right. For example, when we're talking about the developer space, helping those developers who are using tools and functionality to do it in the right way.

Debbie Reynolds  37:05
You know, I work a lot with developers, as well as advise companies that are working on emerging technologies; I do a lot of that. And the thing that happens a lot, I don't know if it surprises me, is that when I'm working with developers, they're incredibly passionate about what they do. They're really excited about what they've done. And then, if there's something they're developing that crosses that line and you say, well, we really can't do that, it kind of takes the wind out of their sails. But it's not a knock on the person, right? They're trying to do their best job, and they want to impress people. But having those conversations early on, not only are they good to have, they are vital to have, because what I've seen is companies come up against barriers to adoption. It's like, okay, we have this cool tool that does all this stuff, and then when the privacy folks look at it, it's a hard stop at that point. So having those discussions early is really important.

Olivia Holder  38:12
Yeah, definitely. I'm actually working on a privacy by design policy right now and have been reading a little more into that in Jason Cronk's Strategic Privacy by Design book. Something that really stuck out to me there was the idea that privacy is not a trade-off against business value or functionality. And that's a big hurdle to overcome. So, like you're saying, by getting privacy involved early on, you help to steer that design in the right direction. But it can take a lot of convincing.

Debbie Reynolds  39:01
Yeah, it definitely does, and it never stops, that's for sure. For many years, I had a Google Alert for privacy, and before the GDPR came out, nothing; there was hardly an article, even once the law passed in 2016. It was very little, like a little trickle. Now, every day, there are like eight or nine articles about privacy that come out. It's pretty interesting.

Olivia Holder  39:31
Yeah, it's definitely become a buzzword. And that helps us out as privacy professionals, to have a lot of practical examples of where things need to change.

Debbie Reynolds  39:44
Yeah. What are your thoughts about legislation? You see what's happening in the US, where all the different states are doing their own thing, even cities, right? Certain cities, for example, have banned things like facial recognition, and we're seeing the state-level laws coming about. People are always frustrated about not having a Federal-level consumer privacy law the way the EU has, or China has, or even Australia, places like that. What are your thoughts about what will be happening? What's your crystal ball for the future of legislation in the US, at either the Federal or State level?

Olivia Holder  40:38
I mean, in the future, there will be a Federal Privacy Law that covers, I would say hopefully, not just consumers, but really US citizens, period. So not just in the context of a business doing things, but really any entity doing things with personal data. I think it's one of the few fairly bipartisan issues right now, so the outlook is good for something happening in the near future. But I wouldn't be surprised if it's something that takes five or ten years to actually come to fruition. I know in the state of Washington, where I am, there was a lot of tension around including facial recognition technology in bills, and around taking facial recognition out of the picture and pausing on that usage. Like you were saying, some cities are blocking uses of facial recognition as well. It is a good place to start, because there are some core principles emerging, whether it's from GDPR, and we see some of them in CCPA, although it is more consumer-focused, that would be, dare I say, fairly simple to get set at the Federal level.

Debbie Reynolds  42:07
Yeah, I don't know, and we could probably have a whole session just talking about this. The sticking points seem to be a Private Right of Action, which businesses don't really want, and then preemption, which states don't want. The states want to be able to have these laws and not have them preempted by something Federal. So to me, at least for a first try, right, it takes years and years and years to develop comprehensive legislation like that, so it's not going to be a Hail Mary or something that just happens in a snap. But to be able to agree on just definitions of things: let's agree on a Federal level, what is personal data? What is sensitive data? It's different from state to state, and it's hard to even describe, because, for example, the New York SHIELD Act doesn't call it sensitive data; they have tiers of data, right? It's just very confusing. So being able to harmonize on that, to me, would be a huge win, I think.

Olivia Holder  43:30
That's a really great point; starting narrowly would be beneficial. You mentioned two things, preemption, and what was the second one?

Debbie Reynolds  43:44
Oh, Private Right of Action.

Olivia Holder  43:45
Private Right of Action. That was what I wanted to chat about a little bit. I think that's a really interesting problem space, especially when you talk about actual harm versus potential harm. I just think about some letters I've received in the mail, like, oh, you can be part of a class action, whether it's Equifax or a data breach that one of my old health care providers had, and at the end of the day, it's: you can receive $3 if you fill out this form. I think it's good that there is some sort of remedy, but the remedies available in these kinds of scenarios don't really transfer much value back to the people whose privacy was harmed, except that it probably won't happen again in such an egregious way from the company that had the consent decree put in place, or settled for a large amount of money, or whatever the scenario ends up being. I don't know what the right solution is, but I would love to see a better model for recourse for people whose privacy is injured. The preemption thing, I definitely understand where you're coming from there, but I think a great approach would be to have a Federal law and enable states to enact stricter laws if they want to.

Debbie Reynolds  45:24
Yeah, that's basically what the EU has with GDPR. Obviously, the GDPR covers all of the EU member states, but those states also have their own laws that may have special things in them. So, to me, that's a hurdle we just need to think about in a different way. States in the US have their own type of sovereignty, so they should still be able to do things that make sense for their state, and not have it be a situation where you're wiping out everything in that space. But I think the big concern is that the movement in the US on privacy legislation is happening really rapidly on the state level. So at some point it's going to be so out of hand that there'll be hardly any way to harmonize, I think.

Olivia Holder  46:22
Yeah. And I love your idea: if we could all just agree on the definition of personal data, that would be a great start.

Debbie Reynolds  46:31
Yeah, oh my God, we totally need to do that. We totally need to do that. So, if it were the world according to Olivia, and we did everything that you said, what would be your wish for privacy, whether it be law, technology, human stuff?

Olivia Holder  46:51
I would just say opt-in consent. And by opt-in consent, I mean, whatever industry it is, whether it's using a technology product or going to your doctor's office and filling out your paperwork, that entities really do provide whoever that client, end-user, or customer is a fully informed account of what they're actually going to collect and why, and give a meaningful choice to that person to opt in or out of such usage.

Debbie Reynolds  47:34
Yeah, that sounds so simple; why can't we do that?

Olivia Holder  47:39
It's a lofty goal. And a lot of the time, like the doctor's office example I gave, you really do have to provide that information to get seen by a doctor and get good treatment. So there's this issue of what is actually required versus desired, and then what downstream usage is appropriate. It's just a lot of moving parts. And like a lot of what we talked about today, when we have emerging technologies that let you literally live inside of a virtual world, we just haven't foreseen all of the problems. Really, you can look toward science fiction, Minority Report, things like that, to see where things might be going and try to stop it.

Debbie Reynolds  48:36
Yeah, I love technology, and I love to talk about the future. And it's interesting to me that almost all movies about this area, about technology, are dystopian. They're all really about the scary things that are going to happen, so it's like the evil robot movie always comes out. I don't know, maybe someone will make a positive movie about the future?

Olivia Holder  49:02
That's a great point. Yeah, it's easy to just kind of focus on the negative that could come out of these kinds of privacy issues. But at the same time, like I mentioned earlier, you know, using Google Maps to get somewhere you haven't been before, or, you know, being able to order your groceries on Instacart. Things like that just make life a lot easier, and it is an improvement.

Debbie Reynolds  49:32
Yeah, I agree. I agree with that. Well, thank you so much, Olivia. It was so much fun to chat with you about all these issues. I'm sure we'll chat more in the future.

Olivia Holder  49:45
You're welcome. Thanks for having me, Debbie. All right. Bye-bye.