"The Data Diva" Talks Privacy Podcast

The Data Diva E64 - Shoshana Rosenberg and Debbie Reynolds

January 25, 2022 Season 2 Episode 64
"The Data Diva" Talks Privacy Podcast
The Data Diva E64 - Shoshana Rosenberg and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds “The Data Diva” talks to Shoshana Rosenberg, Founder and CEO of SafePorter, Data Privacy tools for Diversity and Inclusion. We discuss her focus on diversity and inclusion involving data, privacy implications of apps and data, her dual challenge as a privacy and diversity advocate, sensitive data and privacy legislation, the difference between types of data, the need for ongoing education and knowledge of privacy rights, the ease of getting data, privacy regulations on the State level in the US, State laws that are hastily updated, transparency and privacy obligations with data, a reasonable person test for privacy, cultural differences relating to privacy, how problems begin with the collection and retention of data, and her hopes for Data Privacy in the future.



SUMMARY KEYWORDS

privacy, people, data, organizations, transparency, individuals, law, information, consent, question, understand, tied, world, app, inclusion, diversity, states, agree, work, retention

SPEAKERS

Debbie Reynolds, Shoshana Rosenberg


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. This is “The Data Diva” Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. Today I have a special guest on the show, Shoshana Rosenberg. She is a long-time attorney, a privacy advocate, and also the CEO and founder of SafePorter. Hello.


Shoshana Rosenberg  00:41

Hi, Debbie. So good to see you and speak to you again.


Debbie Reynolds  00:45

So we met on LinkedIn; I think we comment on each other's posts and stuff like that. You had reached out to me, and we were chatting. And we decided this year that we're going to do some type of collaboration, which we still will do. But I thought it would be fun for us to do this podcast together. I saw that you had done another podcast with Pedro Pavon; I think we have both done one for him. He had sent an email out, and I saw your name, and I thought, oh, let me take this opportunity to chat with Shoshana and find out what she's up to. So I'm glad you agreed to do this podcast.


Shoshana Rosenberg  01:26

Oh, I'm honored to be here. I am so glad that you're putting out the information that you do. You keep it very, very straightforward. And everyone in the industry is always glad to listen in. So I hope I can add value, but you always do.


Debbie Reynolds  01:40

Oh, sweet. Yeah, I love what you're doing; you do something very unique. So your focus, and I would love for you to tell your backstory as well, but I want to make sure people understand that your focus is in diversity and inclusion and the data around that. I don't know anyone who does exactly what you do. So tell me a bit about your privacy journey, SafePorter, and the privacy issues that come up with diversity and inclusion data.


Shoshana Rosenberg  02:13

Sure, I'd love to. So for four years now, SafePorter has been the privacy by design solution for diversity data intake and inclusion feedback tracking. And what's really important there, the backstory of it, is simple enough. I've been in privacy for about 14 years, international privacy law. And there was a point several years ago where it became very clear that organizations around the world and, as you might imagine, in Europe, were trying to ask their staff, or their members or their students, about very sensitive categories of personal data that do relate to diversity and help an organization understand its representation. But that actually put individuals and, frankly, the organization itself at a much higher level of data risk, and privacy risk for the individuals, which was my focus. So SafePorter is a suite of tools that addresses that and ensures that identity doesn't factor into where that data is stored, so that the information is stored without the context of your school information, or your bank information if you're applying for a loan and they're asking these questions, or your job application, or your actual employment information. It's in isolation from all of that; it's not tied to anyone's name or email. And we have a patent-pending process that ensures both that this is handled properly and that the disassociation is final. And we also have privacy controls that ensure that recency or scarcity don't become identifiable unto themselves. So you're not going to see percentages change right where you've just hired someone. So we've worked very hard with some brilliant privacy engineers to check and double-check how we're going about things, so that we can help our clients see who is at their organization and, critically, get the inclusion feedback that they need in a truly anonymized and anonymous fashion.
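
A rough sketch of the kind of control described here, assuming a simple tally of self-reported categories: small-count suppression holds back a category's percentage until the group is large enough that no single recent hire can be inferred from a change in the numbers. The threshold and names are illustrative assumptions, not SafePorter's actual patent-pending process, which is not public.

# Illustrative only: small-count suppression over self-reported categories.
K_THRESHOLD = 5  # hypothetical minimum group size before a category is reported

def report_percentages(tallies):
    """Return percentages per category, suppressing groups smaller than K_THRESHOLD."""
    total = sum(tallies.values())
    report = {}
    for category, count in tallies.items():
        if count < K_THRESHOLD:
            report[category] = "suppressed (group too small to report safely)"
        else:
            report[category] = f"{100 * count / total:.1f}%"
    return report

# One newly hired neurodivergent employee does not show up as a visible change,
# because the count stays below the reporting threshold.
print(report_percentages({"neurodivergent": 1, "not disclosed": 40}))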


Debbie Reynolds  04:23

Wow, that's interesting. So let's break this down a bit. I know a little bit about this because, you know, I work with organizations that create apps for students, right, college kids and kids that are younger than that. And some of these app makers want to know the race or nationality of users, right. So that information, you know, in the US, in some places they want to have people volunteer that, and the school may want to know that for different reasons and stuff like that. But you and I know that some of these categories of information may, under certain privacy regulations around the world, be considered sensitive data, right, especially when it's combined with other things. So I think that's something that a lot of people don't really think about, or they think, okay, how harmful can this information be if it's combined? But, you know, we're seeing that be addressed internationally, where it's being considered, in some places, as sensitive data.


Shoshana Rosenberg  05:37

Well, and the other thing is that, frankly, you're actually trying to understand the true diversity of your organization, which doesn't just mean, do you have token numbers of something, right? But do you have a balance? Do you have a full spectrum of different types of people, and, you know, we're talking abilities, we're talking neurodiversity, we're talking all sorts of components, because you want to be able to understand who you are as an organization and grow. And what happens there is that you want to ask questions where, quite frankly, most of us would not like the information to reside with you as the employer or the school. Right? And most of us wouldn't necessarily want it next to our names or emails, because this is something that we can give over to facilitate representation, but it doesn't mean that I wish for you to be able to tie it back to me. Yes, you should know that there's an Irish Jewish comic book collector, that's me, over 40, right? Yes, you should know that you have this person somewhere in the mix. But it shouldn't be something that you can tie back to me. And so that's the notion: there are questions around sexuality, gender identity, right, that we want to be able to understand as organizations. Do we have representation across these different groups and different backgrounds? Are people happy here? And do they feel that they have other people at this organization that they can relate to? Right? These are questions we want to ask. But having that data alongside any contextual information creates a really unreasonable risk. It's a risk for bias, and it's a risk for some sort of differential treatment that maybe is meant to be the opposite of bias, right, but it ends up becoming something that can prove not only a privacy liability but an interference, where it was just intended to allow the company to try to get better from a diversity and representation standpoint. And the reason that I always harken back, and you know this, to the inclusion aspect is that our tools are bundled with the inclusion feedback component, which not all of our clients are ready to turn on, right; there's a high level of accountability there. You don't always really want to see what's going on unless you're just at the beginning of your journey or you're very advanced. But it's important to remember that without that, diversity is just a number. If you've got a lot of unhappy people who are going to leave you in a year, and you're not trying to figure out whether or not they feel included, then you have that problem. So both of those things being separate from the data about the individual's context is really critical. So when you're talking about what they're doing with apps and things like that, it really is crucial to look at ways to not keep it all together.


Debbie Reynolds  08:37

No, I think you may have a dual challenge in your role. So one, as a privacy advocate: some organizations have been traditionally reactive to things, so they may think, okay, nothing's happened, the trains are running on time, why do I need privacy? And then you talk with them about diversity and inclusion and the privacy of that data, and they may not even know how those relate. So tell me how you tell that story to organizations when they don't understand, first of all, the importance of privacy and how privacy impacts diversity and inclusion.


Shoshana Rosenberg  09:18

Thank you. That's actually a really, really critical question. SafePorter is a B Corp pending; we're in the final confirmation stage right now. It was slowed down by COVID. The reason that we set out to be a B Corp is that I view it as a utility. And you don't generally call people up and say, do you need electricity? Because when they figure out that they need it, they go looking for you. And I don't want to be part of an organization that is slapping its name all over the place and trying to pull people towards something and explain to them why they need it. The truth is, two kinds of organizations come to us quite readily and frequently. That is, very small organizations, like nonprofits internationally, that are realizing that they want this data, and they're doing things in such a careful way that they realize that they don't want the data risk for themselves or the individuals; and enormous multinationals who have a privacy person or staff and a diversity person or staff, and I sort of envision the meeting in the hall: you know, I need the data; you can't have the data. And once that happens, very often it's privacy who reaches out to me; sometimes it's diversity and inclusion. Our staff has fielded a lot of inquiries from organizations of those two types. It's the middle organizations where they feel that the trains are running on time, and they have both programs, but they haven't quite hit that realization that maybe they shouldn't be taking on as much data as they are, or they want to take it on in a region, right, in the EU or the EEA, where they can't do it easily or readily, or in a way that's welcome to the staff or students.


Debbie Reynolds  11:17

There's a difference between privacy, confidentiality, secrecy, and privilege, you know; those things aren't the same. I feel like people get those things confused, especially as it relates to privacy and regulation. And then this categorization of data, which is happening more and more, especially in the US with the states, just gets very confusing. So I have to spend quite a bit of time telling people, okay, that's not sensitive data. That's fine. That may be personal data; it may be something that you may not want other people to see. But it doesn't have that protection of law or regulation.


Shoshana Rosenberg  12:04

This brings us back to something that you and I had actually spoken about another time as well, which is, where do we start the education around this? Right? How early can we start to talk to students, in schools, about their privacy, about engaging with the digital world for various reasons, certainly about their privacy, and then make sure that through the schools, citizens have the education around their rights and their concerns and their considerations, so we can have a common language? We are empowered to do so because, and I know that you've raised the metaverse before, right, the notion is, as all of this, the Internet, you know, kids call it, well, I don't know, I think people might call it the interwebs, right, as this universe that has become a part of our lives, whether it's the brand Meta or the metaverse itself, in so many iterations, becomes a richer, more clearly defined universe in which people are spending huge amounts of time, energy, and data, I think we really have to start teaching, not just our colleagues and the businesses that we help. I think we have to push down and start doing some pretty mass education. Don't you think?


Debbie Reynolds  13:32

Oh yeah, I mean, I guess it parallels cybersecurity in that way. I feel like education needs to be continual and ongoing. But then also, people need to understand what their rights are. And I think that's how I became interested in privacy. My mother was reading a book in 1997 called The Right to Privacy, and she was fascinated by this concept. And her interest piqued my interest. So I read the book, and I was shocked. I think, in the US, we think, okay, we're so free, you know, the land of the free, home of the brave, life, liberty, and the pursuit of happiness. And once I realized that privacy wasn't a fundamental right, or it wasn't expressed in detailed terms, I'm like, what? Because, to me, privacy is like freedom, you know? So it's like, what rights do I have? So that really sparked my interest in privacy, and it still does, because I see articles like, what happened, what does this company do with data? It just shocks me on a day-to-day basis. What are your thoughts?


Shoshana Rosenberg  14:57

Well, it's funny, because there are so many books like that. I don't know if you know, but the Future of Privacy Forum has a really good book club that actually has a lot of great books that I've continued to push myself with and to think alongside scholars in our field. So that made me happy, to think of you coming at it that way, which I think is a very honest way to come by it. I think what's most interesting to me about the notion of the right to privacy is that it's probably the only right, with the exception of joining the military, that we so quickly barter away, right? So people say, well, yeah, you can have my data if I can do this thing that I want to do and not pay. You can't think of that many other things whereby you give away your rights, right? Like, when you go to your job, you give away your time, and you follow their rules, but your rights remain in place. But privacy is something where, because of the speed with which all of the technology has grown up around us, and the shiny objects that we wish to see or share or touch or somehow interact with, the sort of theft of privacy, even where you have some transparency, is something that people are willing to let go of. So it's interesting; I guess, I'm sorry, I'm being a bit circular here. You were surprised and engaged at the notion of the sort of delta between whether we should have a right to privacy and the way that US law covers it. And what I think is interesting there, and this is Daniel Solove, right, he writes about the privacy paradox brilliantly, is that people don't know that they care. Right? So yes, okay, great, I should have all the rights that I want. But why do I want them? Is it because I can bargain them away for this other thing? Or should I look at it from a bigger picture and understand both what I want to be the case for everyone as a standard and what I'm willing to disclose? And it's a very, very interesting thing. So what do you actually think about that bartering and the value people place on their privacy? Is there a way that we can work to not just educate people on it but improve it? How do we show the return on investment for individuals to hold their information close when their banks are being, you know, breached every third day, so they're always getting notifications that their information was exposed? How do you tell them it's still important?


Debbie Reynolds  17:42

Well, that's a great question. I don't know. I feel like, especially, you know, marketers, and not just cybercriminals, I wouldn't say marketers are cybercriminals, but people who want information, people who want data about you, and who are smart at being able to get that information, a lot of it goes to, how can I give this person immediate gratification in exchange for them giving me something whose value they don't know today, but that is going to be valuable in the future? So yeah, I think, because you work with a lot of people, this is always a sticking point between the US and Europe and our conceptions of privacy. For them, privacy is an innate, fundamental human right, cradle to grave, for an individual, where here it's like, sometimes you have it, sometimes you don't; depending on what you're doing, or what data you're using, privacy may kick in. And in a way, in the US it's tied to consuming. Because privacy regulation is tied to the consumption of something, when you're not consuming, what privacy rights do you have? Or people may feel like, okay, well, I will give someone my fingerprints for a $10 coupon on the thing that I want to buy today. So it's like, let me satisfy my immediate gratification in exchange for future harm that is unseen. I think that's what it is.


Shoshana Rosenberg  19:41

I agree. I also think that the tether to consumers in the strongest law we have, right, the CCPA, was intentional, or it came about organically, but at the same time, the reason I think it took hold was because of that tying into the economy, right? Whereas things get very political when you talk about human rights in the US, and you talk about the human right to privacy. So I'm hoping that, as we see this proliferation of state laws, you're noticing that it's not everywhere, right? It is really CCPA that has focused that direction, California always being admirably very different and its own thing. People used to complain, you know, they'd say, oh, is GDPR really the gold standard? And personally, I think yes; I think that was forged in fire. There was a lot of hard work that went in there. But is CCPA the best thing that we could do? I think that the work that has been done by CCPA has grown the US up overall, right? Because you don't know who owns property in California just because they work for you, you know; we have been moved ahead by virtue of this. But I'm still seeing, you're seeing, these other draft laws that are coming into place, and they're not running alongside the consumer aspect. So hopefully that's an anomaly. But I do agree, you're right: we're taking information that will be vulnerable for a lot longer than the individual imagines and used to catch them, even if it's as a consumer, in a lot of other ways. Now, I will say, as a lot of good people at Meta, and there are a lot of great privacy people at Meta now, will tell you, individuals like to have personalized marketing, right? So there is that sort of, you know, it's almost its own philosophy class at this point; I think you could get a pretty good philosophy class going on the full arc of human existence within the Internet and the way we engage with the world. What should you, in theory, want in terms of privacy? What do you, as an individual, like that you get out of sacrificing your privacy? And how can we, in this industry, help direct you toward a balancing test that lets you flag and raise your hand and say, yes, but not this, or yes, this, but only for these purposes? I do think that there's more research and consideration that can go in there as we move further into these more complex systems.


Debbie Reynolds  22:27

Right, right. So let's talk about the states. I think you're a great person to talk about the US states and what's happening with privacy regulation on the state level. You know, to me, the US patchwork of laws is probably the most complicated of any country I could think of, because it changes so fast. The states can pass these laws pretty fast, faster than we could do something on the federal level. And a lot of these laws are different enough to be in their own lane, right? They're trying to address the same issue but doing it in different ways, almost bespoke. Right. So I think that some states had envy of the CCPA in California, because they're like, California is getting all this attention internationally, so let me do some law and do something a little bit different from California to make me stand out. And, you know, to me, that's what we don't want. We want more consensus. I would be happy if we even had just a national standard for what is personal data and what is sensitive data; that would help tremendously, not only in privacy regulation but also with data breach as well. So we're seeing a kind of confusion there. But I just want to toss this out here and let you give me your thoughts on it. As I think about what's happening in the US with the states, what I'm seeing is two things that the states seem to be doing that are similar on the whole. One is, they are moving us from a notice regime, where all companies have to say, hey, I'm taking your data, I'm going to do XYZ, and you get a pamphlet in the mail or you get an email, and they're creating these consent mechanisms or consent requirements for certain types of data. So that's something that we didn't have, and I see that in the state laws. And then the other thing is this categorization of data, you know, almost like New York, with tiers of data: certain data has certain additional requirements or certain additional fines based on the misuse or abuse of that information. So what are your thoughts about what it is that the states are doing, together, with this kind of moving privacy forward in the US?


Shoshana Rosenberg  25:13

So it's a really great question. And I have to say, all I could think when you began, and we're talking about consent, is, again, the parallel with GDPR and with Europe, and how you're viewing and coming to things. So consent in the US, for me, what is very clear, right, is that we are a very contract-based universe. And so the way we come about consent is not so much "privacy is your default, and if you are to relinquish any aspect of it, you must yourself make it an active choice." It is more, okay, we know people don't read those notices, they don't read their email, so we at least need to get them to say that they have, and then it's back on them. Right. So I don't think that the changes are coming from the same place as those same results, right, that are born out of the European platform and standards. But I do think that they are helpful, in this two-sided aspect of saying, companies have to tell you, but you have to be paying attention. Whereas, again, with the UK and cookies, you know, they're not wrong: there's a point where people just get fatigued, and they're just going to consent. So that is also an issue, and, sorry to come back to it, but with the transparency, you and I had talked another time about whether there is a way we could do things in a very simple way, with symbols or something, the way that you do with food at a restaurant now. You say, oh, that's gluten-free, I know that symbol, right? You say, oh, they're going to use this for biometric data, I know that symbol, I'll opt out. Is there a way to make it more shorthand, is my question, because it's got to be more efficient. The contract aspect of people actually consenting in the US, I think, was an inevitability, because otherwise there's sort of no way to prove that the information was effectively exchanged and registered, and even the proof there is limited. In terms of the other piece, the different tiers of information, I agree with you that there needs to be standard language. It is funny to think: when I was a kid, I was thinking, oh, well, you know, I'm going to go to law school someday, because, for whatever reason, I had it in my head from a very young age. And "the party of the first part," right, there was this very elaborate formal language that kept the law at even more of a remove from individuals than just basic convoluted terminology keeps it now. Now we're in a legal design phase, where people are starting to understand that, just like with privacy-related issues, transparency, clarity, and an intentionally simplified agreement is actually a very valuable thing in this world. And I guess all I'm saying is, where the states are entangling their own language and their own priorities in new bills to sort of carve out little pieces that are stamped with that state's imprimatur, that's not actually helpful for the individuals. And the only benefit that I see is that it yet again removes the law from the individual in such a way that privacy professionals like yourself and I can find ways to, you know, help elucidate it. Right. So this is not a good move, moving toward trickier laws and standards, even as you talk about data breaches, right?
If you have a data breach and notification requirements, I don't know about everybody else, but I have a huge spreadsheet that I had to make up that sort of lets us know what happens where for any organization: in case of a breach, who do you notify? And it's never the same. So at that point, we've created inefficiency and inconsistency and hoops to jump through, and I'm not sure it's serving anyone.
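
As a thought experiment, the "symbols" shorthand could look something like a machine-readable consent record: one named flag per purpose, granted or refused at a glance, with everything not granted refused by default. The purpose names below are invented for illustration; no existing standard is implied.

from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    # Purposes the person has explicitly granted; everything else is refused.
    granted: set = field(default_factory=set)

    def grant(self, purpose):
        self.granted.add(purpose)

    def allows(self, purpose):
        # Default-deny: anything not explicitly granted is refused.
        return purpose in self.granted

consent = ConsentRecord()
consent.grant("site_function")         # "yes, this..."
consent.grant("order_history")
print(consent.allows("biometric_id"))  # "...but not this" -> False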


Debbie Reynolds  29:13

Yeah, I agree; it is so confusing. And then the thing that just really gets on my last nerve, that tap dances on my nerves, is when a state has a law, so let's say they passed a data breach law a couple of years ago, and instead of passing a new law about privacy, they'll just update that old law. So on the books, it looks the same, right? It has the same number and everything, but the text has changed in some way, or there's something else going on. And I think it just creates that much more confusion for anyone trying to follow it, because you think, okay, well, like you have your spreadsheet: okay, I have all the states, and I know all the references and all this stuff. And then they go in and say, okay, well, we're going to add this new language to this old law. So you have to go back through all of those laws and update what the changes are.


Shoshana Rosenberg  30:12

This is a problem in government contracting, too, right? You get a mod change, but it's the same thing, and they're not calling it out. I will say, California, the California legislature, handled that incredibly well as they were making changes to CCPA and to CPRA; those aspects are being very well done in other places too. And I think you've made a very sound point there, right? Whether we're calling it that or not, we're looking at transparency for consumers. Let's also look at clarity and transparency in changes around these laws as they get updated to accommodate new considerations and concerns, like COVID. Right? It's important to be able to highlight or leave track changes somewhere in the wings, so that people who are working so hard to stay abreast of an entire tapestry of laws across the country can do so most effectively for the people that they serve. I think that's a great, great point.


Debbie Reynolds  31:10

Yeah, I agree. So let's talk a little bit about transparency. I guess there's a tension here, and I'm sure you run into this a lot when you're working with organizations. We keep preaching to people that they need to be transparent with their customers, or the people that they have data about, regarding what they do with that data. But then we're also saying, you need to keep the information private, right, or you don't really protect the privacy of the individual. So how do you reconcile that for people? How do you explain the difference, or their obligations, around how to be transparent but also how to protect data and keep it private when you're handling it?


Shoshana Rosenberg  31:58

Well, it's rare that a company doesn't have some understanding of it, but I think there's an interesting alignment there. So the transparency part: I'm really emphatic about what you would want, like what you would expect someone to do with your information, right, and sort of what you would expect to be told based on reading something. So if you come to our site, or any sort of site, and it says, we're going to take your information, and we're going to use it just while you're on the site, and then we're going to get rid of it, then you would expect that two days later, if they had a data breach, they would not have your information, right? You're going to expect that the terms and the conditions of what they're taking, and how they're using it, and how long they're keeping it, will be clear. And I think that really, if it's transparency, then if you're going to do grocery shopping online, for instance, then, much like you said, you know, why do people make changes and not call them out? To me, if I'm grocery shopping online, I am not necessarily expecting that this is going to communicate with Facebook about products that I love, sorry, Meta, or any other organization, I wasn't trying to call someone out, but that this is going to communicate elsewhere what I'm doing, right? Maybe they're going to keep my data. But I think that transparency is making sure people have a clear delineation of anything they wouldn't expect and the ability to opt out or opt in. So if it's not something that you would actually, as a user, normally expect, I don't think you're being transparent unless you have conspicuously called it out, because that is what we should be doing. Right? The lowest common denominator is not a person who doesn't understand anything; it might be a 17-year-old or a 76-year-old who's just not as familiar. And you have to go down to that base. For me, transparency is about trying to really work from a design thinking standpoint to anticipate the language that's needed to make clear the things that would not be intuitively understood or anticipated. And the keeping things private, right? I think it's actually a really cool alignment to say: we want to be transparent with you, but we are going to make sure that there's no transparency about you to the rest of the world, or to any systems outside those that you've acquiesced to be tied to. So maybe that's where they meet up.


Debbie Reynolds  34:36

Maybe we need a reasonable person test in privacy. You know, what is reasonable? So if you go and buy a gallon of milk at a grocery store, does this obligate you to, you know, name your first child after the grocery store? I'm just coming up with kind of crazy stuff, right? Because we know that a lot of times these agreements between the consumer and the company that wants to collect their data are naturally asymmetrical. But it's extremely asymmetrical in a situation where maybe the company is collecting way more data than they should and doesn't really have a good reason. So I'll give you an example. There was a company many years ago that had an app that would take your face and age you, make you look older, or something like that. It was like the new thing. But if you read the policy, or rather the Terms of Service, for what they were doing, they basically said, if you use our app, you give us kind of a worldwide license, forever, for your image. So, you know, you use the app for five minutes, and your picture can be on a billboard in Tokyo in 10 years. That doesn't seem like an even exchange.


Shoshana Rosenberg  36:06

No, I think that's different, too. There's the reasonable person test as to how to determine what is needed for transparency. And then, what do you do with a reasonable person test in a country that will allow us to contract for almost anything? Right? So if you have the transparency, then a reasonable person might very well still agree; I don't know if it's a reasonable person so much as someone who prioritizes it, whether they're reasonable or not; someone who doesn't prioritize that ownership of visual data would have a hard time with that test. I think that's a really interesting question and suggestion. I don't think we can say, because there are so many; curiously, I think a great example is personalized marketing for the people who like it. Right? The more sources it pulls from, the more refined it is, and then, whether or not they're buying, they can, you know, sort of enjoy a world that's tailored to them. And that's a real thing that needs a lot of data to feed it, and for the people who want it, it really benefits them. So the reasonable person test, I think, has to be about, what would you do? I guess that's my question: where do we think something is unreasonable? Yes, we agree that we would have to highlight it from a transparency standpoint and say, are you sure that you understand this? But then should the law in some way say, should we be talking about fitness for purpose? That's a weird way to use that term, but fitness for the purpose of the data collected, like the appropriateness. And I just wonder, because I don't think we're going to get any traction trying to restrict the way that people can allow their data to be used. So I'm curious, if you're talking about a reasonable person, how would we apply it?


Debbie Reynolds  37:56

That's a good question. Yeah, because, I mean, some people may agree to things that we may think are unreasonable. I don't know; to me, it may be different culturally, depending on where people are and what their experiences are. But I wish there was, and there isn't right now, a ceiling on how asymmetrical a relationship could be in terms of a consumer giving their data to another party. I don't think that using an app for five minutes should mean the exchange is that I give them my image, and the rights to it, forever.


Shoshana Rosenberg  38:42

That's maybe the cleanest way to do it, now that you mention it. The problem is, it will be very hard to enforce, especially with secondary processors, right, subprocessors and various things. But if people just had a choice to say, you can keep my data for a day, a year, or a decade; if they could control that, right, and say, anything tied to this application that I give you permission for, all follow-on information, you are allowed to have for this amount of time; you're not wrong that that would be a very valuable control. But then, of course, we'd have to have somebody who would actually follow behind and enforce and audit that, which is going to be your big problem, and then the legislation really has to cash in. So maybe, if we're clamoring for legislation to do one thing, maybe it's a proportionality test, right, around the retention of data that ties into a certain level of consent. Sort of, I want you to keep my data for a minimum amount of time, right, like big-picture cookies: you can only use what you need to function on the site, or you can keep it for a longer but still reasonable time. And then maybe legislation can speak to what is reasonable. Just a thought.
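
A minimal sketch of what that user-chosen retention could look like in practice, assuming each record carries the period its subject picked and a routine purge job deletes anything past that lifetime. The tier names and periods are illustrative assumptions, not any law's or product's actual scheme.

from datetime import datetime, timedelta, timezone

# Hypothetical retention tiers a person could consent to.
RETENTION_CHOICES = {
    "day": timedelta(days=1),
    "year": timedelta(days=365),
    "decade": timedelta(days=3650),
}

def purge_expired(records):
    """Keep only records still inside the retention window their subject chose."""
    now = datetime.now(timezone.utc)
    return [
        r for r in records
        if now - r["collected_at"] <= RETENTION_CHOICES[r["retention_choice"]]
    ]

records = [
    {"subject": "a", "retention_choice": "day",
     "collected_at": datetime.now(timezone.utc) - timedelta(days=2)},  # expired
    {"subject": "b", "retention_choice": "year",
     "collected_at": datetime.now(timezone.utc) - timedelta(days=2)},  # kept
]
print([r["subject"] for r in purge_expired(records)])  # -> ['b']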


Debbie Reynolds  40:08

That's brilliant. I think we're going to have to talk about...


Shoshana Rosenberg  40:12

Our collaboration.


Debbie Reynolds  40:16

Because, as you know, the collection and retention of data, that's where companies get in trouble. So if you're not collecting, or you're not retaining, you literally lower your risk considerably. But people are like, you know, digital packrats; they want to keep everything, businesses especially. So being able to limit that retention, and maybe put a cap on it, like you said. Right now it's, how long is the data tied to the purpose that you need it for? And I think that's harder for organizations, because it means that they have to continually create triggers for themselves internally for when that happens. If you say, okay, this type of data gets deleted every six months, or every month, or every year, or every three years, maybe that would be an easier way for people to comply.


Shoshana Rosenberg  41:13

Well, the great part, though, about privacy by design and privacy engineering, and, frankly, the imprint that GDPR has had, is that we do understand that there have to be data lanes from which you can withdraw somebody completely. So it's not that hard to imagine. I mean, I realize that we're doing sort of our own ad hoc philosophy class here, but it's not that hard to imagine that a person could say, I opt in for minimum retention, or I opt in for six-year or, you know, indefinite retention, whatever they choose, and then their data would follow through all of the stops that it's going to make, but on that lane, so it could be withdrawn just as it would be for a data subject access request. I mean, it's not likely, but it's certainly technically feasible and very, very interesting, because right now the consent is premised upon a notice that says, we keep your data for six years, and they get no choices.
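
A rough sketch of that data-lane idea, assuming every downstream copy of a person's data is keyed to one opaque token, so a withdrawal can follow the lane through every stop. The store names and fields are invented for illustration.

import uuid

# Each "stop" on the lane is a system holding data keyed by the same token.
lane = {
    "intake": {},     # raw responses
    "analytics": {},  # derived records per token
    "backups": {},    # retained copies
}

def onboard(record):
    token = uuid.uuid4().hex        # opaque; no name or email involved
    for stop in lane.values():
        stop[token] = dict(record)  # same key at every stop on the lane
    return token

def withdraw(token):
    """Remove a subject from every stop, as for a data subject access request."""
    for stop in lane.values():
        stop.pop(token, None)

t = onboard({"retention_choice": "minimum", "answers": ["..."]})
withdraw(t)
print(all(t not in stop for stop in lane.values()))  # -> True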


Debbie Reynolds  42:10

That's true. Wow. We have hit on something here. We may have to expand upon this at a later date.


Shoshana Rosenberg  42:17

We're going to get pinged. We're going to find out that we didn't invent the wheel, that somebody's already got a huge dissertation on this. To whoever it is that has already been working on this for 10 years, I personally want to apologize for trying to chase down this path without your knowledge alongside us.


Debbie Reynolds  42:34

Oh my goodness, well, we're going to find that out. We're going to find out whoever it is we need to contact.


Shoshana Rosenberg  42:38

Gauntlet is thrown.


Debbie Reynolds  42:39

Right. Glove thrown down. So if it were the world according to Shoshana, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether it be technology, human stuff, or regulation?


Shoshana Rosenberg  43:05

So you opened up two things. I have to go with consistency and clarity: a universal standard, with universal terms and understandings of basic concepts that are taught and understood wherever the Internet may roam. That would be the first one. And then, I have to say, you touched on something else pretty critical. I'm sure you've heard of David Ryan Polgar and All Tech Is Human, right? The notion is that, in parallel to that accessibility of information about privacy to individuals, and the clarity, there is the responsibility of the builders of these systems and these apps and these tools that we use, and the organizations who are building them, to align with and support both the individuals themselves and their privacy and their ability to exercise their rights. So some sort of nice meeting of the minds there. Wow, you've thought about this question. That's great. That's great. Well, thank you so much. This is so fun. Oh, Debbie, thank you so much. It's great to see you.


Debbie Reynolds  44:22

Yeah, this is great. So we definitely have to collaborate and figure out what we want to do for the new year. This is amazing, and I'm glad that you were able to do it. We'll chat soon for sure.


Shoshana Rosenberg  44:36

Of course. Take good care, and thank you again for your time. It's always wonderful to hear you and to be pushed ahead by the thoughts and the questions you're asking. Have a gorgeous day.