"The Data Diva" Talks Privacy Podcast

The Data Diva E122 - Joe Toscano and Debbie Reynolds

March 07, 2023 Season 3 Episode 122
"The Data Diva" Talks Privacy Podcast
The Data Diva E122 - Joe Toscano and Debbie Reynolds
Show Notes Transcript


In episode 122 of “The Data Diva” Talks Privacy Podcast, Debbie Reynolds talks to Joe Toscano, Founder & CEO of Datagrade and Featured Expert in “The Social Dilemma”. We discuss his journey to realizing the importance of privacy, his appearance in the movie “The Social Dilemma”, the lack of coordination in privacy, how algorithms and AI are supplanting existing social mechanisms, why the future is headed toward fiefdoms, how individuals move efforts forward, what direction he sees US privacy moving in, how California leads the way, why regulation alone can never provide privacy, and his hope for Data Privacy in the future.


(44 minutes) Audio and full transcript here: https://www.debbiereynoldsconsulting.com/podcast/e122-joe-toscano


Subscribe to “The Data Diva” Talks Privacy Podcast, now available on all major podcast directories, including Apple Podcasts, Spotify, Stitcher, iHeart Radio, and more. Hosted by Data Diva Media, Debbie Reynolds Consulting, LLC


#dataprotection #dataprivacy #datadiva #privacy #cybersecurity #people #algorithms #conversations #tiktok #compliance #law #companies #issue #socialmedia #movie




44:36

SUMMARY KEYWORDS

people, privacy, data, algorithms, talk, conversations, built, impact, nebraska, world, compliance, big, social, law, companies, issue, agree, movie, business, emails

SPEAKERS

Debbie Reynolds, Joe Toscano


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know now. I have a special guest on the show. His name is Joe Toscano. He is the CEO of Datagrade. He is also an author and international keynote speaker, and he was a featured expert in "The Social Dilemma". So welcome.


Joe Toscano  00:48

Oh, thank you for having me. I have made it. Everybody listening, I made it. I'm on "The Data Diva" podcast. This is so cool.


Debbie Reynolds  00:57

The funny thing is when we connected on LinkedIn, we ended up talking. And it's always great when you have those conversations where you feel like you could literally have recorded what we talked about.


Joe Toscano  01:11

I know, that was great. We clicked immediately when I started talking to you. So I got super excited when you offered to bring me on the show. I was like, of course, this is what I need.


Debbie Reynolds  01:22

Yes, yes. Well, it's amazing that we were able to connect, and I'm happy to definitely have you on the show. But I want you to talk to me about how did you get here? Why is Data Privacy important to you? And what's been your trajectory with data that got you here?


Joe Toscano  01:41

Yeah, well, let me take you pretty far back actually, like childhood level. So it sounds weird now looking back, but the craziest thing is when I was a kid, my mom actually went through a terrible sexual harassment case. So I learned at a very young age a lot about the value of privacy. And actually, if you ask my fourth- and fifth-grade teachers, they would tell you Joe wanted to be a lawyer. Well, number one, I'm glad I kind of grew out of that over time; what I ended up getting into was technology. I built a whole career around learning to code, doing data science, and all this kind of stuff. And then as I got deeper into data, both as a data scientist and as an engineer, and then when I got out to Mountain View, I was consulting for Google at that point, I built a career spanning engineering and data science and design all in one. I really started to see some of the troubles with the data operations and how that could increase the dangers to society if it was mistreated. You know, this was the mid-2010s; all these social media platforms and digital tools were really blowing up. And the thing that pushed me to speak out was, number one, I felt that there weren't that many people who were in the position I was in, and I'm not saying I was in some classified special position, but in the sense that I had the knowledge of how to look at this stuff, I had a position in the industry that allowed me to look at it, and I was also young enough and foolish enough to make the assumption that I could make a change. So I left. You know, I didn't need to provide for a family, I didn't have to put a roof over anyone's head or feed kids or anything, and I could take that leap.

So the way I got to where I am now is really speaking a lot and teaching people. And then, when I let it just build and build and build, I wrote a book, which was really to help expand access to this information. You know, I'm from Omaha, Nebraska, originally. And every time I flew home from San Francisco to go teach at a university or see my family, it was like I flew 15 years back in time. So I wrote this book, and it just blew up my work, because I guess the world was craving that at the time; the world is craving knowledge that they can understand and talk to their family about, peer-to-peer type information, rather than what's traditionally put out, which is books for experts, you know, technical manuals. And that built up into, as you mentioned, being featured in "The Social Dilemma" and all the impact I've been able to make. I really could never have imagined it would be where it is today when I was traveling around out of my co-op, just trying to get this thing off the ground. You had news people telling me I should go home to my parents' basement and put a tinfoil hat on, and sometimes I feel like I should have. But I made it, and we're here today, and now I'm in a very privileged position to be able to speak about these issues at a larger scale, in an effort that I'm super passionate about.


Debbie Reynolds  04:55

Yeah, that's amazing. I would like to talk about "The Social Dilemma" for a minute. So when this came out, I recommended it to almost anyone I could possibly think of, saying, hey, you've got to watch this; go watch this. I have seen some people, and I just want your thoughts on this, I have a thought as well, who sort of felt that it wasn't technical enough. And to me, I thought it was perfect. Because, you know, I feel like we aren't going to solve these issues by having ivory-tower conversations about data. So having something that is more accessible to everyone is what we are lacking, and we need more of it. What are your thoughts?


Joe Toscano  05:43

I mean, I agree with both of you. I think that it was definitely lacking technical scope. We can all sit here and say it was; I agree. But I also agree with you that we need something that's not so technical. You know, I would have loved to have all the details about how the technologies are operating, or different ways of looking at privacy; there are so many different ways we could talk about this. But then what happens from a narrative perspective, from an actual impact perspective, is you lose your audience, right? We had 90 minutes to tell a story. A lot of documentaries go the route of, hey, here's a bunch of people who are experts; let's let them talk about it, let's put them on camera and just film some talking heads. When they were working on it, I said, hey, look, guys, I go back home, I go to the middle of the country, I go to places where people have no idea, maybe haven't even used the Internet, you know, not where I'm from in Omaha, but definitely farther out in Nebraska. There are places where the Internet's still not really connected; they barely use it at all. Number one, you need to shorten the content, choose a topic that's pretty niche, focus on two or three things, and then build a story around it; put a lot of art in there, work really hard to turn it into a graphical thing that people can wrap their heads around, because otherwise it's just going to go over their heads. You know, that was the root problem. And so, yeah, I agree with both sides, right? We had to make it accessible. There are also a lot of people who have told me we didn't provide enough solutions, and I empathize with them too. I agree; I wish we had more time to put in more solutions. I wish we could have talked in more detail. But the reality is, again, you only have so much time, you only have so much attention. And in regard to solutions, it was so nascent, right? That movie came out in 2020, but to film and edit a movie like that, it was worked on since like 2017, 2018. To film something like that, you have to think, what can we say that's going to be relevant in six months? Or a year, or five years even? We're not talking about something that stands for centuries. Architects, for example, they build a building, and it might stand for a century. In technology, we're talking about work that emerges now and is evolving every six to 12 months, or faster in some cases, right? And so, yeah, it was very hard. And I think what "The Social Dilemma" did well is raise awareness. It also built a platform, right? The whole point is this actually was a piece of art; if you think of cultural artifacts, this is something that others can build upon. And that's what we want, you know; that's what these podcasts are for. That's what follow-up interviews on national television are for. That's what other movies that are coming out about more specific or more technical things are for. But what we did is we helped build a bridge. And if you start to use that bridge, good on you; if you sit and complain about it, and talk about its limitations, and that's all you focus on, then that's your right, but we're not getting any farther forward that way.


Debbie Reynolds  08:54

Yeah. I feel like we're lacking some type of mainstream cohesion around this topic, right? You see, you know, a book here, an article there, maybe something comes up in the news that spikes the conversation, but I think it needs to be more of a free-flowing conversation, and we need to be looking at it at all levels. The one thing that I liked about the movie, which they portrayed and which I think especially teens and kids don't really understand, is the kind of psychological manipulation that happens, right? Like, the guy was very distraught about his girlfriend, they broke up, and all this algorithm wants is to get his attention, right? So they're showing him pictures of his ex-girlfriend. That's very mentally tormenting, right? And especially a teenager or a child, they don't really understand what's happening, but they're having these feelings and emotions, sometimes negative, as a result of this. So I think the attention economy is a problem if the thing that gets people's eyes and gets people's attention can actually harm them in the future.


Joe Toscano  10:17

Oh, I 100% agree. I think it also did a great job of connecting those dots, right? That was another thing I mentioned: the narrative behind it. A lot of people call it a documentary, but it was a docu-drama, right? It was partly film, partly a movie where they built out a storyline, so that you'd see some information and then we'd connect it in a real-world example, and you're sitting there and you're like, that's my family, that's my sister, that's my brother; it connected, regardless of who was on screen. It was just a narrative that worked. And the other thing, too, to build on that, going back to the second question, I really do wish we could have talked a little bit more about the details there. But let me put it here, because we have more technical people listening, okay. I got a lot of pushback that we talked too much about the attention economy and the addictions, and that that is really just something to get people's attention. And I hear that argument, right, I hear you; there are deeper details over there, more nefarious details. But let's talk about it here, because I brought this to the attorneys general, right? They're trying to figure this out for the antitrust case I brought to them, and they had all the snippets from "The Social Dilemma"; I recorded one point and said, here's the real root of the attention economy. It's not about addictions, right? It is actually about data. And it just so happens that the addictions that have been built are a byproduct of the business model. I don't think anybody actually went out to try to harm society; I don't really believe that. You, of course, have bad eggs in society, but I don't believe it was really built like that. It was built because it's a venture capital system, right? That's underlying all of this, looking for the most efficient ways to allocate capital that's going to return the most money. And what they found was that having people connected to the device more often created the asset they actually need, which is data. That's what they really want. And the byproduct was addictions; the byproduct was loneliness, suicide, depression. We're now dealing with that. I don't think a bad business is one that makes mistakes and harms people. I think a bad business is one that harms people, is shown empirical research that it is doing this, and then defies that and says, it doesn't matter, we're doing it anyway. You know, I wanted to give them a grace period to change; we're just seeing that they haven't really. Now I do think they're getting to the point where we have to say, this is bad business, laws need to come into place, and bigger conversations need to be had amongst the tech audience. But even just there, with what we just spoke about, the details of the business model I mentioned at a very high level, a lot of people watching the movie, a lot of the general public, would have gotten lost in the first couple of sentences.


Debbie Reynolds  13:05

Yeah. Totally.


Joe Toscano  13:07

It is very abstract. And we do need qualified experts to sit in and represent people and help protect it; you know, it's a freedom, it's a basic right. And I think, you know, the work that we're doing, I'm not going to talk about the software at length, but the work we're doing at Datagrade is all about that. It's about making this more accessible, because like a movie rating or a restaurant grade, or grades for steel, diamonds, whatever it may be, we need some kind of rating system for this industry that the average consumer can look at and use to assess risk, right? We can never tell people, do this or do that. It's also really hard to say, this is absolutely good, and this is absolutely bad. But you can define it by risk, you can start to think through the things that make it accessible, and then, again, we just start to bridge that gap deeper and deeper with each step along the way.


Debbie Reynolds  14:00

That's great. Oh, my goodness, I agree with that. I want to talk with you a little bit, since you touched on algorithms and AI, about this move I see from kind of a social network to more of an AI-driven network. So I'm thinking about TikTok, right? A lot of the social media companies like Facebook are trying to, you know, do these new business models and stuff like that, but we've seen this huge growth of TikTok, and TikTok doesn't use the same kind of social mechanism. A lot of these social programs are like, okay, let me see who Joe is, who he knows, who's around him, and kind of give him information about those people. Where now you have something like TikTok that's basically like, we're going to decide what you see, right? And you're going to consume all this stuff. So what are your thoughts about where we're going with that?


Joe Toscano  15:12

Well, let's pull it back a little bit. I think the initial contrast there was social media versus, like, algorithmic social media. And I want to start there because I do think that was a critical inflection point. You know, I am 33; I was born in 1989 in Nebraska. So it's more like, if I was in New York, it's like I was born in 1979, because we're so far behind. But I've seen a lot of different technologies over the course of my life, and I've seen it all mature. And the biggest thing that I saw in that shift was the algorithm that got put behind discoverability, right, which really started to suggest to folks that we now live in an automated world. We didn't have a lot of these issues before the algorithms kicked in, right? For my book, and I was told by a good handful of people this was too political, I did some work: I talked to some FBI agents, I talked to some people in government and policing, and I looked up the mass shootings we were having. And if you look into it, and I don't have all the numbers exact off the top of my head anymore, but they're written down somewhere if you want to look at them, I can pull them up for you, it was something like a 200 or 250% increase in the number of shootings per year from 2007, when the iPhone came out, until 2018, when I was looking at it. If you extrapolate it out from the 2012 to 2014 area, when the algorithms really started to overtake, it actually grew, right; it grew exponentially. So we had exponential growth from the onset of phones, or an addiction in our pockets; sorry, smartphones, not just phones. And then we had an even larger leap once we got the algorithms. And now my theory is that we're starting to see the effects of compounding interest on our minds. Before, you know, when I was a kid, I got bullied a lot. I was a fat kid, I was a late bloomer, I got all the things that came my way. And the thing is, I could get away from it. I could go home, I could lock my door, and I could get away from all this stuff. Kids, and even adults who are now feeling this, they can't get away from it nowadays, and it's not just one bully at school; it could be thousands of people all at once, at any given moment. You know, so algorithms have dramatically affected the spread, the access, and the utility. And honestly, we have to admit that as much as there are problems, there's also a lot of utility, right? Something that didn't come out from "The Social Dilemma" or a lot of other groups is that social media can be beneficial for you in a small dosage each day. It does have benefits. But yeah, I mean, the algorithms really changed the game. And this is an interesting time for you to interview me and ask me about TikTok, because I just got back on the platform. Now, my goal is not to become TikTok famous; my goal is to help evacuate people. But that being said, I got back on, and people are like, oh, why are you getting back on? Because, you know, originally I was on it. I got on TikTok when it was Musical.ly, five or six years ago; I saw the original version. I downloaded it, I looked at it, and I said, man, I'm 26, there are way too many kids on here, and I'm taking this off my phone. And I did. But I'm back on it, because we need to have these conversations now, wherever everyone is, you know.
And the thing that I've noticed, the craziest part about it, the difference between Instagram and TikTok, I really do believe, is what you're saying: it just pushes stuff, right? Instagram, yeah, you scroll all the time, but TikTok, there's no break period. There's no empty space. There's no, like, I can pause. It's just going, and I can't stop it. And it does get more dramatic with each flip, right? It really does go down the rabbit hole very fast. Sure, Instagram's recommendations do the same when I'm flipping through and liking things, but this is a whole new buzzsaw. And it's pretty scary, you know; it's pretty terrifying where this is headed in regards to the impact on our communities. And the fact that, as far as it's been reported, there are different versions in China than there are in other parts of the world, I mean, that's concerning. It feels a little bit like Cold War information warfare. But, you know, it's a hard one to judge right now. I think the long timeline of society will tell us where it goes, obviously. But my gut is that, you know, we're at stage two. I think with literacy, awareness, and action around these issues, from 2013, when Snowden released everything, till now, we've really been in a period of growing awareness, right? And "The Social Dilemma" did a great job of showing that there's a tipping point, right? A lot of interest boomed all around the world, a lot of awareness was raised. Now we're moving into the stage of literacy, and then we'll hit a stage of action. And my gut is that, generationally, we will shed TikTok. I don't know where we'll go; I don't know what will happen instead. But, you know, I do think Facebook's on its way out, Facebook Blue at least, maybe the whole Meta company, naturally, organically; nothing we have to do about it, other than bad decisions on the CEO's part. TikTok, they'll probably be around for a while. And I mean, I'm not in social media enough to know what I think is next, or how it can be repaired.


Debbie Reynolds  20:57

Yeah, right.


Joe Toscano  20:58

Maybe we get some baseline laws where children can't have a smartphone until a certain age, kind of like you can't smoke cigarettes; you can still have a feature phone. That's for your safety. I know that's more or less what China said: hey, kids can only get certain content. I think it's a similar parallel. We don't want to compare ourselves to China, and that won't go over with the right wing of society very well, but they are protecting children there, and I think that's important. Yeah, I don't know. I don't know the answer to that. And I don't love TikTok, but like I said, I'm there, and I'm going to start to create some messages. And really, what I'm going to end up doing is recording the interface all the time saying, this is bad, don't do that. I'm going to get stuck on TikTok trying to evacuate people.


Debbie Reynolds  21:44

Yeah, I get you, I get you. I feel like, you know, the stuff that Facebook is doing with social, and trying to get into a new business, I have thoughts about that, as well as Twitter sort of being a town square. I feel like the future is not going to be social that way, and I don't feel like the future is going to be the town square in terms of how we deal with media. It seems like we're going to these little fiefdoms, these little silos.


Joe Toscano  22:20

I agree, and it scares the crap out of me, to be honest.


Debbie Reynolds  22:24

Yes. It's very concerning.


Joe Toscano  22:26

I mean, that was a big reason why I left as well, right? Not only did I see the problems with the way data was handled and the way data was used, but also, for people who don't think about it, who don't have a background in media, or haven't really thought about how we distribute information: I think a big reason why the United States was in its golden years had a lot to do with the fact that we had two or three media outlets at that time. You know, nobody wants to admit it; they want to point at China or Russia and talk about state-controlled media, but back then we had the American broadcast and the national broadcast; we had government-run media, more or less. It branched off, so it's changed, don't get me wrong, but what that allowed, as much as you may not want to say it out loud, was that everybody received the same, or generally the same, information. So whether you were right or left, you could talk about the issue. We could debate it; we could have a democracy. Now, like you said, it's just fiefdoms. It's almost becoming anarchy. And my fear is, if you can't have a trustworthy, at least national-level, media outlet, you can't run a democracy. I mean, I don't know how it doesn't get a lot harder. It's concerning to me. So, you know, that was my thing: we have to figure out how to bridge it. I don't care what happens, I don't care if I pour all my money into this; if we don't have a democracy in the United States, I don't know if I'm going to enjoy living here. That's basically the promise I made myself when I left, right? And luckily, I've had a lot of people come on board; I'm very happy with everybody I've met and worked with, and I've had the privilege to start to make this thing important to people.


Debbie Reynolds  24:21

Very good. I'm glad you're on the case. You know, I'm very pleased to see that there are a lot of grassroots efforts in the US, with people like you really pushing this, trying to have your voice out there. Like you said, meeting people where they are, giving them some advice, giving them some choices, right? Because, like you said, it doesn't work to say, don't do this, don't do that. Be like, look, this is what the deal is, this is what's happening; you have to choose for yourself. I feel like that transparency really isn't there. And I would love to be able to socialize these concepts in a way that it's something anyone can talk about, because everyone has a stake. It shouldn't be just talked about at conferences, or just in universities or in companies, because we're talking about the data of humans, right? So it's about finding a way to have businesses use technology in a way that's advantageous, but not create this extraordinary harm as a result.


Joe Toscano  25:34

Right, yeah. And, you know, for me too, you're talking about democratizing this information; I have a specific niche in that I can go back to the Midwest, I can access people in the Midwest. I saw that when I left, right: we're going to have these pockets of people who trust each other, and that's how this is going to bridge. But privacy especially, data protection, is stuff that is bipartisan; you can take it to any part of the country, and even if people don't fully understand it, they want it or they're interested in it. And also, for me, like I said, going to the Midwest, I said, I'm going to clean up, be a businessman, and I'm going to go back and help have these conversations back home. Because whether I want to live full-time in Nebraska or not, I have a lot of people I love back home, a lot of people who simply don't have the knowledge of what's going on; they don't know how to translate it, and what they don't know is creating this very unstable world. People don't know what to send their kids to school for, and they don't know how to get a job. And really, that's, I think, why we have a lot of these problems. Because in Nebraska and beyond, in these areas that are not Stanford and not coastal elite cities, in the middle of America, they just want the comfort to, you know, have a roof over their head, have health and life insurance and the things they need, have food for their family, have peace. In Nebraska, right, they call it the good life. It's not an exciting life; it's not some fantastic, over-the-top, crazy life. It's a very good life: it's stable, it's good health, it's good education, the basics. And a lot of people just want that. So I think if we can have this conversation with the other side of society, because, let's be honest, a lot of technology comes out of the coasts, out of the Stanfords, the more liberal and highly educated areas. If we can translate it, make it accessible, get people involved, and then join around this issue of protecting our communities, protecting our kids, protecting our national security, I think it's a bipartisan issue that can really bring this country together if it's done in the right way.


Debbie Reynolds  27:45

Yeah. I agree. I agree. So on that issue, where do you feel like we in the US are going on privacy, whether that be from a business perspective, an individual perspective, or in legislation?


Joe Toscano  28:07

That's a great question, and it calls for a crystal ball, because I feel like things are changing so fast here, and proposals are being made left and right. Things are just happening. In some places you're like, I didn't expect that for a long time; other places you're like, I've been waiting for a long time. So, you know, we had the federal proposal this year. I don't know that I see that going anywhere anytime soon.


Debbie Reynolds  28:33

No. So I had Cameron Kerry on the show, and he was thinking, I don't know if he said it on the show, but we were chatting, and I said, you know, when the ADPPA came out, I'm like, this is not going to pass, right? I was the naysayer, and everyone was all rah-rah. And so it didn't pass right before the midterms.


Joe Toscano  28:57

I think what's going to happen is that we're going to get legislation from a bunch of different places in a bunch of different ways. So, you know, I appreciate you mentioning that my work has been a lot more grassroots, because it also speaks to my unique position, which is that I've done a lot of work with smaller and mid-sized companies. Obviously, I did consult for a very large organization, and I've had a lot of big impact going to talk to large organizations or workshop with them, et cetera. But I also have a lot of experience with small and mid-sized companies, and what I'm hearing is a few things. Number one, these laws in California, Colorado, Connecticut, and Utah coming out, paired with, let's just be honest, the GDPR, which is causing global impact: you have a lot of mid to large-scale companies that are now required to follow those compliance rules. And it's impacting small and mid-sized companies too, not because they're legally required, but because now they want to go work with those companies. Their delivery service wants to integrate into Hyatt, or, you know, there's some smaller marketing organization that wants to integrate into Salesforce or something like that, and they have to, more or less, go through all the same certification, the same testing and compliance and everything to integrate, right, or acquisitions or anything like that, to where it is changing the entire landscape. I think that as it gets delivered piecemeal across the United States, we'll continue to see that grow. And thresholds, you know, we've seen proposals throughout the Midwest where thresholds are lower, not 50 or 25 million in revenue, but actually as low as 10 million in revenue, right? That's almost like a mom-and-pop shop.


Debbie Reynolds  30:35

Yeah, absolutely. Yeah.


Joe Toscano  30:36

And so it's getting really low. I think it becomes like tax brackets, where a small organization has fewer responsibilities but still has responsibilities, and larger organizations have much greater ones. As for some of the things that are going to contribute to that moving forward: obviously, we've got the conversation around Roe, which opens up stuff that we never wanted to talk about, but it actually has been good for us to talk about, I think, because it enabled this conversation to where now we have a direct link. This is what we've been talking about as privacy professionals for 5, 10, or 20 years, now coming to light. It's forcing that, and I think we see it come out in the polls, right?


Debbie Reynolds  31:15

Yeah. Right. Yeah.


Joe Toscano  31:16

We're also seeing, or this is what I'm seeing, the unionization of workforces. I think what we've been doing over the last 10, 20 years in regards to locking down people's mental space, their minds, surveilling them at home, checking their emails, that kind of stuff, is reminiscent of times back in the early Industrial Revolution, when we locked people into the buildings, right? We forced them to work long hours; we did all these things to their physical bodies in the Industrial Revolution, when we were doing physical labor, that we're now replicating in a behavioral way. What we're going to see is demand for workplace privacy. And these employee unions may have some really large levers to pull on that legislation, either state by state or federally. And then the other thing, obviously, is children's privacy; you're starting to see some movements in the states about it. I know from my conversations that this is a no-brainer, right?


Debbie Reynolds  32:30

Yeah, I think you're right. So California is going to have its own COPPA, which is going to raise the age up from 13 to 18, right? And they have other provisions; I think I did a video about that recently. But that's going to change a lot around the country, because we see California being very influential. And like you said, I've seen companies that aren't in California and aren't beholden to those laws themselves; they're often third parties to companies that say, we want you to be aligned with us. So when that happens in California, I think it's going to create this whole new wave of stuff. And then, in the US, employees have not been accustomed to having that type of access to the data that employers have about them. So I think that's definitely going to start a wave. Especially when you think about things like workplace surveillance and bossware, where, you know, it was fine when organizations could do it in a clandestine manner, but now they have to disclose something that they do. So I think it's going to be a big sea change with those two things.


Joe Toscano  33:50

In regards to employee surveillance, people absolutely should have these rights; you know, we need to have these rights if we don't already, right? When I was out consulting for Google, even back then, this is five going on six years ago, we had rumblings that Google was surveilling emails, reading the emails of all their employees and contractors, et cetera, trying to figure out who was looking for a new job, probably among other things, and then letting people go before they actually got an offer, pushing people out the door who showed any signs that they were potentially going to leave, you know, so they had more loyal employees. Now, obviously, until we have any kind of validation of it, that's a conspiracy theory, so we're not going to sit here and spread conspiracies. But knowing what I know now, it makes sense that something like that would have been in place. And there were no protections; there was nothing against the law about what they would have been doing at that point in time. It's just business. And there are validated cases of this happening across different organizations as well, so it's not that far out; it's not like I made up some spaced-out conspiracy theory there. But, yeah, I mean, it's very realistic, and people should have that right. It's their email. Obviously, you don't want to do it on your work computer, but let's be honest, some people don't have the money to get a personal computer; they use their work computer because that's the only access they have. So I think you've got to have those rights, to a certain level, right? I think it's reasonable to measure certain behaviors or certain metrics to qualify that the employee is performing up to the standards that you expect from them. But there's also a limit, to say, hey, this is private; let it be.


Debbie Reynolds  35:30

I guess the thing that concerns me is people always say all we need is regulation. I wouldn't say we don't need regulation, but I think we need a lot more than that. Regulation only covers part of it, right? A lot of it is how people feel about privacy, what they're willing to do to get that privacy, whether that's maybe changing services, and having businesses make it a priority, right? Because when you erode that trust, you get poor data, or the quality of data a person wants to give you is much less, right? So you have people signing up for services with phony, fictitious names.


Joe Toscano  36:23

I put the White House when they ask me for my address, every time. 1600 Pennsylvania Avenue, every time.


Debbie Reynolds  36:30

I was one of those kids that, whenever the magazine cards came in the mail, I'd fill them out and put the wrong name in there, just to see, you know. I was always curious about what people would do with the information that I gave, right?


Joe Toscano  36:46

Yeah, I mean, it's a real thing. And, you know, when we were in paper forms, it was a problem. But now, especially like you're saying, people have more awareness of it, and there are services that are helping now, right? Like, I use Firefox Relay all the time, which is a service that will mask my email and block spam or promotional stuff. If I ever want to not receive emails from some person or some people, and I don't trust that their marketing system opts me out, which, let's be honest, is a very real thing: they have an unsubscribe button, it doesn't work, and it's probably intentional, right? How many times have you had that? So I use Firefox Relay because it allows me to just turn off that email, and then they have no access to me. So what are we talking about here with these companies that are crying about how much these laws are going to impact their ad revenues and stuff? Like, you won't have an advertising industry if all the information is fake, right? What's the value? The value of data is that it allows you to make things more accurate and more targeted. If it's all fake, then you're just going to be sending mail to some trash cans, if they're not already.


Debbie Reynolds  37:56

Yeah, totally. That's true. Well, I think that was part of the issue: many of the companies that were doing this advertising could say, oh, we sent all these assets to X, and here are your metrics, but it really wasn't reaching the people they wanted to reach. So it didn't have a huge impact, truly. I think it's really interesting. But yeah, if it were the world according to you, Joe, and we did everything that you said, what would be your wish for privacy anywhere in the world? Law, technology, human stuff, what are your thoughts?


Joe Toscano  38:39

That is a big one. Let's start here. I believe, as I've said in my TED Talk, that we should have the right to own and control our data. That's a much bigger conversation than we can have here, but I think we should have those rights. In an ideal world, my data is tagged, and when it gets into the system, it's tracked one to one, or however I want, which allows me to get paid for that work. My privacy is respected, and maybe that's the ability to have controls. But actually, in the future, I think there are going to be fiduciaries, like we have financial advisors and life insurance people, people who are going to advise us on how to set those controls, because the reality is, not everybody has the time or attention or literacy to take control even if they have it. So I don't think this paradigm of notice and consent, or choice, is really where we're going to get stuck; I think it's just the beginning. It's the first test. I also, like you said, see a world where privacy is not just about compliance. You know, I've given a couple of talks about this, called Privacy 2.0, the era of post-compliance privacy, where privacy is a human factors thing, right? I tell people it's the UX of security; it is constantly changing, and it's always going to change for the rest of our lives. Because privacy is not a binary piece of remediation or some tactical thing that you can just implement in code. Privacy has a lot to do with culture, time periods, the technologies available, and a lot of context. So I see this world, and again, a little bit of self-promotion, but this is the work we're doing with Datagrade: we're building an assessment. It's not about making you comply with any one jurisdiction, but actually with a control framework of ESG. Because, let's be honest, when you're talking to a privacy consultant, they're going to tell you they'll make you compliant with all the laws, but what they're really going to do is build a control framework that has X number of laws built into it, with a focus on finding the most restrictive and making those your bellwether pieces of legislation. So what we can do with ESG is, now we have a G where we can include all things governance; we can definitely make you compliant, we're just not a compliance company, right? The S becomes all the things that you and I have spoken about today, from dark patterns to children's privacy, to the impact on the elderly and people who may never have the literacy to engage with these things but are getting pushed deeper and deeper into them; all the social things that maybe either aren't put into law or shouldn't be put into law, because sometimes things shouldn't fully be encoded in law. But they are the social responsibilities that, when a business obliges, they gain better, more loyal customers, and a more resilient organization is what they really become. So that's the S. And then the E, obviously the most emerging part, is the impact on the environment. This could be, you know, a little extrapolation here, but data minimization, real good data protection practices, just basic data minimization: if you reduce the amount of your data by 20, 30%, you reduce the amount of cloud space or servers you need, and you reduce the infrastructure costs, but you also reduce the energy use, right?
So we're exploring all these concepts of how we take the law as it is and where we see it going, where we think it's going, and put it into a framework that's ESG, which can work across jurisdictions, which can go above and beyond compliance, and can create a framework that I think is the future, like you're saying. Because privacy, as we've talked about, is not just a law; privacy is a personal issue, and it needs to be addressed as such. So if I can say where I want to see the future, it's a future that considers the human factors of privacy, where a lot of our people are literate enough to take action and understand, and we have controls in place to protect us and have a civil, democratic society.


Debbie Reynolds  42:54

Yeah, I love that vision. I love that. So I would love for us to definitely keep in touch and see how things are going. Let's see how our vision comes true.


Joe Toscano  43:07

Let's see, let's see. I think we're going to make some impact together, too. So let's work on this change and rewrite the narrative, because we're just at the beginning. This is going to be great.


Debbie Reynolds  43:18

Well, thank you so much for being on the show. It's been wonderful to have you here. I really appreciate it.


Joe Toscano  43:24

Thank you for having me. I was so excited, and I'm definitely going to spread this everywhere. I'm so happy to get on "The Data Diva" podcast. I made it.


Debbie Reynolds  43:34

This is amazing. Amazing. Yeah, we'll talk soon for sure.


Joe Toscano  43:39

Talk to you later Debbie.


Debbie Reynolds  43:40

Okay. Bye bye.