"The Data Diva" Talks Privacy Podcast
The Debbie Reynolds "The Data Diva" Talks Privacy podcast features thought-provoking discussions with global leaders on the data privacy challenges affecting businesses. The podcast delves into emerging technologies, international laws and regulations, data ethics, individual privacy rights, and future trends. With listeners in over 100 countries, we offer valuable insights for anyone interested in navigating the evolving data privacy landscape.
Did you know that "The Data Diva" Talks Privacy podcast has over 480,000 downloads, listeners in 121 countries and 2,641 cities, and is ranked globally in the top 2% of podcasts? Here are some of our awards and statistics:
- #1 Data Privacy Podcast Worldwide 2024 (Privacy Plan)
- The 10 Best Data Privacy Podcasts In The Digital Space 2024 (bCast)
- Best Data Privacy Podcasts 2024 (Player FM)
- Best Data Privacy Podcasts Top Shows of 2024 (Goodpods)
- Best Privacy and Data Protection Podcasts of 2024 (Termageddon)
- Top 40 Data Security Podcasts You Must Follow 2024 (Feedspot)
- 12 Best Privacy Podcasts for 2023 (RadarFirst)
- 14 Best Privacy Podcasts To Listen To In This Digital Age 2023 (bCast)
- Top 10 Data Privacy Podcasts 2022 (DataTechvibe)
- 20 Best Data Rights Podcasts of 2021 (Threat Technology Magazine)
- 20 Best European Law Podcasts of 2021 (Welp Magazine)
- 20 Best Data Privacy Rights & Data Protection Podcasts of 2021 (Welp Magazine)
- 20 Best Data Breach Podcasts of 2021 (Threat Technology Magazine)
- Top 5 Best Privacy Podcasts 2021 (Podchaser)
Business Audience Demographics
- 34% Data Privacy decision-makers (CXO)
- 24% Cybersecurity decision-makers (CXO)
- 19% Privacy Tech / Emerging Tech companies
- 17% Investor Groups (Private Equity, Venture Capital, etc.)
- 6% Media / Press / Regulators / Academics
Reach Statistics
- Podcast listeners in 121+ countries and 2,641+ cities around the world
- Over 480,000 downloads globally
- Top 5% of 3 million+ globally ranked podcasts of 2024 (ListenNotes)
- Top 50 peak in Business and Management 2024 (Apple Podcasts)
- Top 5% in weekly podcast downloads 2024 (The Podcast Host)
- 3,038 average 30-day downloads per episode
- 5,000 to 11,500 average monthly LinkedIn podcast post impressions
- 13,800+ monthly Data Privacy Advantage Newsletter subscribers
Debbie Reynolds, "The Data Diva," has made a name for herself as a leading voice in the world of Data Privacy and Emerging Technology with a focus on industries such as AdTech, FinTech, EdTech, Biometrics, Internet of Things (IoT), Artificial Intelligence (AI), Smart Manufacturing, Smart Cities, Privacy Tech, Smartphones, and Mobile App development. With over 20 years of experience in Emerging Technologies, Debbie has established herself as a trusted advisor and thought leader, helping organizations navigate the complex landscape of Data Privacy and Data Protection. As the CEO and Chief Data Privacy Officer of Debbie Reynolds Consulting LLC, Debbie brings a unique combination of technical expertise, business acumen, and passionate advocacy to her work.
Visit our website to learn more: https://www.debbiereynoldsconsulting.com/
"The Data Diva" Talks Privacy Podcast
The Data Diva E199 - John Cavanaugh and Debbie Reynolds
Debbie Reynolds, "The Data Diva," talks with John Cavanaugh, Executive Director and Privacy Evangelist of The Plunk Foundation. We discuss his unique journey into the privacy sector, beginning in his college days. He emphasizes the importance of grassroots privacy, a crucial aspect often overlooked in the profession. The conversation explores the intersection of technology, education, and student well-being, discussing the challenges of developing a platform that balances academic support with financial constraints. We also express concerns about AI's potential to exploit vulnerable populations, highlighting the alarming ease with which bad actors can access and misuse personal information.
We delve into the growing worries about data privacy and AI technologies. We express alarm over recent news that GPT-4o can access users' entire devices, discussing the potential misuse of data and the importance of responsible AI development. The rise of emotional AI and its implications for privacy and ethics are also discussed, raising ethical questions about the integration of AI into daily life and emphasizing the need for fair and ethical AI practices.
The episode concludes with a preview of the upcoming MidwestCon event in Cincinnati. John discusses his role in the event and expresses his wish for clear opt-in privacy regulations and transparent data usage by organizations. He highlights the importance of allowing individuals to change their consent at any time. We are enthusiastic about the event and committed to promoting privacy awareness, and John shares his hopes for the future of Data Privacy.
Many thanks to "The Data Diva" Talks Privacy Podcast "Privacy Champion" MineOS for sponsoring this episode and supporting the podcast.
With constantly evolving regulatory frameworks and AI systems set to introduce monumental complications, data governance has become an even more difficult challenge. That’s why you need MineOS. The platform helps you control and manage your enterprise data by providing a continuous Single Source of Data Truth. Get yours today with a free personalized demo of MineOS, the industry’s top no-code privacy & data ops solution.
To find out more about MineOS visit their website at https://www.mineos.ai/
30:54
SUMMARY KEYWORDS
privacy, data, ai, people, talk, plunk, ways, cincinnati, love, conversation, feel, build, technology, university, information, foundation, students, chat bots, work, device
SPEAKERS
Debbie Reynolds, John Cavanaugh
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, John Cavanaugh. He is the Executive Director and Privacy Evangelist for the Plunk Foundation. Welcome.
John Cavanaugh 00:37
Hi, Debbie, thank you. I'm so happy to be here, and it's so good to talk to you.
Debbie Reynolds 00:43
Excellent. Well, we met on LinkedIn, and I'm really fascinated by the things that you post and the things that you're doing with your foundation, but I would love for you to tell me about your trajectory in privacy and how you came to be the Executive Director and Privacy Evangelist at the Plunk Foundation.
John Cavanaugh 01:02
That's a great question, Debbie. When I've talked to people in privacy, we all have a unique path. It's very interesting. Not everybody is thinking at six years old, I want to be in privacy; it's something that we develop a deep passion for, and I love the community that you have built on LinkedIn with lots of other leaders, too. My journey started in college, at 19 years old, when I worked with a few friends to build a website. The website wasn't very good. It was slow, it was clunky, but the whole point of it was for students to come together, especially at my university, and talk about what it was like to meet with an upperclassman, or what it would be like to go to med school, as an example, and various ways to connect with people who were on your same trajectory in your college experience. It was very academic-focused, and within a few months, we grew to 10,000 people, which was a quarter of my university, the University of Cincinnati, and it really took off. Now, I was a biology major and a Spanish major, nothing to do with this whatsoever, but it really took on a life of its own, and my friends and I started going to various Midwest universities and helping grow this endeavor. We ended up building an app. I worked with a team of 12, some offshore, some local. We built a really cool app, and we were working with universities to sell software as a service that would help students find their people: people who were in their dorms, people who shared their majors, people who had clubs in common, and do it in a really simple, easy-to-use way, because a lot of universities had clunky ways of doing it. Our goal was to help with retention rates and also help students feel like they fit in. As we got to this point, we had angel investing, and we were writing contracts with universities and getting ready to set the stage. At this point in technology, you need to burn millions of dollars to get market share; that's the way tech has traditionally been done. We needed to keep up with the likes of GroupMe, Discord, and Facebook Groups at that time, and what we found was that every investor we talked to was happy about our growth and what we were able to do, but they were upset that we were not collecting data on the students. They found that this would be really important, going forward, to make their money back as quickly as possible, and I didn't really know what that meant. I was very naive, and I felt like, oh, I don't know what all this entails. Then I found that, essentially, the nuts and bolts came down to students typically having poor financial skills, because it's their first time being away from their parents. Now they have a lump sum of money from their college loans, which is really money at interest, so they're buying pizza and beer at interest, as an example. The other thing I learned is that college is really stressful. Of course, I learned that through my own experience, but people tend to buy more when they're under stress, and so the idea was to use the sentiment of the people utilizing what we'd built, and their pathways, to understand who the anxious and stressed students were and how we could push more ads to them. To me, it was absolutely mind-blowing that anybody would even think of doing this.
But as I started to research, it's the norm in tech, and that's when I thought, whoa, how is modern tech sustainable in this way, where we are the product at the end of the day? That got me really thinking about what we can do to build ethical technology in this day and age. Now, we were offered millions of dollars to continue, with investors who were offering us money, right? But everybody on my team, luckily, had a similar mindset. We just felt like taking that money wasn't worth it, so we decided to decline and give back what money we could to our original angel investors. Then we spent about a year researching how we could build technology that is privacy-centric, that still has a place in society, and that can be really useful for people without them having to worry about where their data goes, and so we built the nonprofit. Plunk is an acronym: it stands for Peaceful, Loving, Uplifting, Nurturing, and Kind. We started in April 2022.
Debbie Reynolds 05:38
That is unbelievable. Oh, my goodness. I think people don't really understand why the Internet is free, and a lot of it is because it's built on a foundation of advertising and marketing. But I'm glad you and your crew decided that, hey, we want to do things differently, we want to do things more ethically, even though it may not be very lucrative. A lot of people didn't turn down the money, and we see them on the news all the time, but I think what you're doing is really the way of the future. I don't think there's ever going to be a time when advertising isn't there, but I think the days of doing it in surreptitious and manipulative ways are coming closer to an end. What are your thoughts?
John Cavanaugh 06:26
Yes, I would agree with you. Now, at the Plunk Foundation, what we've mostly focused on is how surveillance, privacy invasions, and online safety risks in general affect vulnerable populations. There are plenty of organizations that speak out against surveillance capitalism in those areas, but where we felt a lot of the conversation and dialogue was missing is when it comes to vulnerable, marginalized communities: for example, the research shows teens, women, especially those in compromising domestic violence situations, and people of color. Avondale, for example, is a community I work with in Cincinnati, where we really help people understand why privacy is important, which they get. A lot of people get this right away, and then we share just a few steps that people can take to make a change. Now, to come back to your question, I believe the advertising model is changing. I mean, by some miracle, the US has even introduced a Federal privacy law; even though there are plenty of critiques to be made of it, the fact that it has come this far is pretty remarkable for those of us in the privacy space. So, yeah, I think data brokers and corporate communities are looking for other ways of achieving marketing in a more ethical manner, which is the goal. We want society to be a better place for everybody, and we want people to know what they're getting when they sign up for something.
Debbie Reynolds 08:00
Yeah, I agree with that. I love what you're doing. How about your thoughts on this? Renee Cummings is a futurist, a criminologist, and a Data Privacy and AI expert, and one thing that she talks about is data trauma. It came to mind when you said that you work to educate underrepresented communities. I think a lot of times when you hear the news about data, people talk about how data is so awesome: data-driven this, big data that, all this stuff. But tell me about the ways that data can hurt people.
John Cavanaugh 08:38
Yeah. So there are fairly obvious ways, and then there are more subtle ways. Some of the obvious ways, which I'm sure we've all heard about recently, involve tracking information, such as Snapchat locations and Apple AirTags used to track the locations of cars; a lot of newer cars have GPS built into them, and that has the potential to cause calamities in situations where someone can track people, have control over their whereabouts, and even have access to their devices. One thing I'm learning, at least locally in Cincinnati, is that during the intake process for shelters, in domestic violence situations as an example, fundamental questions that could be very helpful are missing, such as: is the location on your phone turned on? Have you recently posted on Instagram or Facebook with your location attached? So those are harms that are fairly obvious, even if you may not always think about them in context. Then there are other, more insidious types of data harm, such as relatively insecure data that can be exposed through hacking, as an example. We see this a lot with hospitals, and we've seen it a lot with schools. Actually, maybe a month and a half ago in Minneapolis, we had one of the largest school hacks involving children; over 100,000 children had their health data taken. These are ways that can embarrass, expose, and damage the reputations of people as well, and a lot of that data can be sold on the dark web for various purposes. Now, what we're seeing with AI is that you can connect a lot of this data. One area is human trafficking, which can become a really big problem because you no longer need a lot of coordination. Most human trafficking is actually a person in the neighborhood or next door who keeps a person there, and a lot of this can be done through AI for $20 a month, running multiple chatbots and identifying vulnerable people very easily. If you're able to buy information about people very cheaply on the Dark Web, under five bucks, and integrate it into AI, you can very easily find vulnerable populations that meet your criteria, and so bad actors are coming up with very insidious methods. The Plunk Foundation and my team are looking at where this leads in the future, as we're in an absolute Wild West at the moment, so there is a lot we need to be prepared for, and we really need to help educate parents.
Debbie Reynolds 11:20
Yeah, I agree, and thank you for that. I hear some people say, hey, what's wrong with data? Data is awesome. It's like, well, if you have an awesome relationship with data, that's great, but data can definitely hurt you in ways that people don't even realize or understand. Tell me what's happening in the world right now in privacy that concerns you, something you see where you think, oh, wow, I don't like that, or I'm really concerned about that.
John Cavanaugh 11:47
Yeah, the first thing that comes to my mind is the huge news that just came out a few days ago about GPT-4o, which has access to your entire device, including video, with video scanning at all times, if you're using these types of features. The way I saw this is that, yeah, they announced features, but the features were not too impressive, and a lot of it seems to me to be just a data collection sinkhole to gather as much training data as possible, because they get a better product if they have more data, and really a stepping stone toward their 5.0 model, as an example. When I saw that, I hadn't seen many people talking about it, and I plan on writing about it, because it was the first thing that came to mind.
Debbie Reynolds 12:35
Wow, I didn't know that myself. Also, with the multimodal abilities in these technologies, they're pulling more data and asking for more data. The thing that really struck me, which I haven't yet dug deeper into, is that they're trying to do more emotional AI, where you give it a picture and it somehow tries to discern your emotional state or sentiment. To me, I'm okay with sentiment. If something like Grammarly says, hey, your email sounds angry, or says, oh, it sounds awesome, that's really the type of sentiment I'm talking about. But something looking at your facial features and saying, hey, I think John looks angry? It's like, well, you don't know; maybe that's John's face all the time. A bot can't really tell that someone is angry, and you don't always know what they're going to do with that data. Emotional AI isn't new, but I'm concerned about how that information will be used in the future.
John Cavanaugh 13:31
Yeah, definitely, and I know the Algorithmic Justice League has focused a lot on AI and video, especially with people of color, and on systems just totally failing to recognize gender very well, or various other obvious traits. So that's one problem. Sam Altman, I think right before the release, posted about Her, the movie where Joaquin Phoenix falls in love with a machine, very much likening it to the idea that the actual device itself will have emotions and you will build a relationship with the AI. That's a very interesting philosophical question: what is that relationship? Knowing that a lot of people say that, of course, AI can't reason the way humans do, this is not a logical relationship in terms of what an actual relationship is. So how do we deal with these things as they become increasingly popular and part of our lives? I've also noticed it's hard to avoid at work: when you call people, they have their AI chatbots and their note-taking apps, so this is something that's here to stay. The real questions I'm looking at are, how is this equitable? How can we build it in a more responsible way? There are various ethics-based conversations, and really the fundamental one is, how is AI not stealing, or how is it not theft to begin with? But that's a deep conversation to have another time with you, offline or something.
Debbie Reynolds 15:04
I want your thoughts about two things I hear people say. One is that privacy is dead, and the other is, I have nothing to hide, so I don't really care about privacy. What would you say if you heard someone say those two things?
John Cavanaugh 15:18
Yeah, "I have nothing to hide" is a classic. My favorite way to go about this is to say, okay, show me your phone. Can I have your phone? Unlock it for me. Let me have access. People immediately recoil from that. The other thing I like to ask is, then why do you close the door when you go to the bathroom? Everybody knows what you're doing anyway. It's because there's a real instinctual connection between privacy, self-respect, and autonomy, and those, along with independence, are things that are really essential to the human experience. I'm sorry, I missed the other question that you mentioned.
Debbie Reynolds 15:50
Oh, privacy is dead.
John Cavanaugh 15:53
Oh yeah, Zuckerberg's famous announcement, in 2014 I think, which he again apologized for later. There are some parts to agree upon, though I don't agree with the sentiment. For a lot of people, it feels very overwhelming when they learn about the invasion of privacy that's been happening, and I can relate to that, because it's so daunting. You don't know where to begin, and then you feel like, okay, if I have this little corner of privacy in my life, is it really that helpful? Is my data already absorbed? Is it already too late? I've been on Facebook for 100 years, those types of things. So I can definitely understand the sentiment. What I like to do is help people understand efficacy; that's a huge part. You'll see this in climate change and various other areas, and I've talked to various professors in communications, people who have really studied this deeply. You can go from thinking that privacy is dead to various paths where you feel like you have autonomy, and giving people that autonomy is absolutely essential for changing the narrative. If there are tiny steps somebody can take, steps that take two minutes or the flip of a light switch, and they don't feel as alone with these problems because there are grassroots services and people there for them, then things become manageable. It's similar to climate change: hey, if I recycle, how is this really going to help? But if you work with people who are developing green solutions in local government, and looking at carbon taxes and various complicated things I'm not an expert in, that's where people start to have hope and build community. I always think that privacy is going to get worse before it gets better, but I have hope. When I look at the EU, when I look at the right to be forgotten, which is an amazing part of the GDPR, when I look at warriors like Mr. Westin, who started the privacy movement, and when I look at all the privacy people here together, we haven't given up on this. That's because there is a really good fight to be had, there are ways of protecting people's privacy, and we've become a lot more mature in how we look at it compared to 60-plus years ago. I think the blanket statement that privacy is dead is not a healthy way to look at how we want to improve our society and the technology that we're building. Privacy engineers and privacy people are the ones who can make technology better without stifling innovation, because private technology is innovative.
Debbie Reynolds 18:41
Yeah, I agree. I love your point of view there. I had a really interesting thing happen to me recently that I was a little shocked by. Because we are in privacy, I know most people don't care about privacy as much as we do. We talk about it, we think about it; this is the work that we do. But I was speaking to an older relative, who's in her 70s, about my work on privacy, especially things like privacy for cars, and I was a little bit shocked by something she told me. She said, my insurance company asked me if I would install some gadget on my car so that they could track me, and then they would make my insurance $300 a year cheaper, and she said, no thanks, I'll pay the $300. You don't need to know where I am all the time. I thought that was a really striking statement from someone who obviously doesn't follow privacy as much as I do, but she felt that it was not a fair exchange. So yeah, what do you think?
John Cavanaugh 19:46
Yeah, I love that. Especially in Avondale, we've given talks to seniors, many people who are 60, 70, 80 years old. They lived in a world where it was very, very weird for a company or a hospital system to ask for your Social Security number, and privacy was extremely important. Suburban sprawl was really popular then; people wanted their own plot of land with their own blinds, and they grew trees so the neighbors wouldn't see what was going on. So I do think that when I talk to that generation about privacy, they may not understand the technology piece, but they are so clear on the fundamentals of why privacy is important, and that gives me a lot of hope. Talking to my sister, who's eight years younger than me, is a lot more of an uphill conversation; even though she understands the technology, she's on her phone more than 24 hours a day, somehow. So it's really interesting to see with non-technical people who aren't addicted to their phones or using a computer every day. Now, there are some in their 50s, 60s, and 70s who are, but a lot of them understand the basic tenets of privacy intuitively and consider it something extremely important to them.
Debbie Reynolds 21:04
I want your thoughts about how AI is changing the privacy landscape. I think it's changing it a lot, but I want your thoughts there.
John Cavanaugh 21:13
Oh, yeah, the hot topic. I mean, AI is nothing without data, so by definition, if anything has to do with data, we need to look at privacy. There are so many things here, but where I see this is mainly in two areas. One is open-domain information and the scale at which AI collects it; it is something we are not prepared for at all. Open-source and public information on the internet, and what we thought people could do with it, has changed: with AI it is completely scraped and put together, which is a very strange, very different, scalable issue that many people didn't think about at the beginning of the internet, and now the public information that's out there about you can be perceived differently. So there's a whole conversation to be had around that. The other piece is the more fundamental privacy practices we're looking at, and I believe a lot depends on how data is gathered. If we look at a company and the data they're collecting, let's say they're supposed to retain data for a certain amount of time, but when using AI, they can absorb that information. A lot of their algorithms are this Black Box that we talk about: essentially, data comes in, and we're not able to predict exactly how an answer comes out of it. Now, your sensitive information may have gone into that Black Box and shaped the AI itself, so there's an imprint, a lasting piece, and I think that's a very interesting privacy problem. When we look at privacy and how AI comes together with sharing information with third parties, and how our world is getting smaller and smaller through each scraping pass that happens with AI, we really have to shift the privacy lens. What I mean specifically is that, in a typical world, we look at privacy and say, okay, what is the privacy notice or policy, and how are we applying it? What notices are we giving to the people, what are they agreeing to, how long do we keep that data, what do we use it for specifically, how do we retain this data over time, and then, after retention, how do we delete this data? AI has mixed all of those together. So what does data retention look like in the midst of AI? We have to have a conversation about that. What does it mean to purge data in the midst of AI? We have to look at what that means. Then we also have to look at the machines themselves. When I've talked to mathematicians at the University of Dayton, which is about 40 minutes north, we were discussing whether it is possible to end-to-end encrypt AI machines. What they found through the mathematics is that there is some math we haven't figured out yet, so there is no way, for now, to have truly end-to-end encrypted AI. There are also issues we're looking at here with the Plunk Foundation and some people at the University of Cincinnati, such as Federated AI, which allows your data to stay on your device; the algorithm itself changes locally, and only the update gets sent back up to the mothership, as an example. But by its nature, you can't have zero-trust AI, even if it's Federated. So we're looking at, okay, if we really want privacy-centric tools, or if we're using Proton Drive and Proton Mail and various things like that, what is their next move for privacy-centric AI? We're going to figure it out.
I mean, there are probably ways to figure this out, but there are a lot more challenges that we're seeing in how to truly protect privacy. I know that was a very multi-pronged answer, but those are the various ways I look at AI and privacy.
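To make the Federated AI pattern John describes concrete: raw data stays on each device, only model updates travel to a central server, and the server averages them into a new global model. Below is a minimal, hypothetical sketch of that federated-averaging loop; all names are illustrative, and it assumes a simple linear model rather than any particular framework such as TensorFlow Federated or Flower.

```python
# Minimal sketch of federated averaging (illustrative names, not a real framework).
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """On-device training step: the raw data (X, y) never leaves the device."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w  # only the updated weights are shared, not the data

def federated_round(global_weights, devices):
    """Server step: average the weight updates sent back by each device."""
    updates = [local_update(global_weights, X, y) for X, y in devices]
    # Caveat: the server still observes every update, so this is not zero-trust.
    return np.mean(updates, axis=0)

# Toy demo: three "devices", each holding private data the server never sees.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, devices)
print(w)  # converges toward [2.0, -1.0] with all raw data staying "on device"
```

Even in this toy version, the server sees each device's weight update, which illustrates the caveat John raises: federation alone does not give you zero-trust AI.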
Debbie Reynolds 25:18
Yeah, it definitely makes things more difficult. It creates more risk for data, I think, because it's harder to manage it in the ways we traditionally thought about data: data is in buckets, data is static, we go to the bucket and pull data out of it, and we ask who has access to the bucket. Now we're basically scrambling it like eggs, and then how do you take the eggs out of the cake? That's what we're trying to figure out.
John Cavanaugh 25:43
That's very true. Another piece is that our data used to be siloed with various parameters, and a lot of companies had automation that pulled data from each of those silos. But some people are utilizing AI in their back-end systems in a way that essentially mows down all those data silos into one pool of all of your data, which is incredibly risky. If there's a cyber attack, the attacker gets everything, instead of siloed data that's neatly separated along compartment lines. Absolutely.
Debbie Reynolds 26:18
Wow. I'm glad that AI is heightening the privacy conversation, because I feel like the urgency that people have about AI is what we should have had about privacy. Now that we see companies getting deeper into AI, it's becoming evident, as I tell people, especially around governance: if your data governance was not very good, AI is not going to help you as a company. The companies that have that governance will do better, and they'll go farther with their investments and whatever they're trying to do with AI. Everybody else, well, you have to go back, like being flunked in school; you have to go back to grade one because you didn't do what you were supposed to do to get to grade 12, or something like that. So I want you to tell me a little bit about your September 24th event that you have going on in Ohio.
John Cavanaugh 27:09
We're having something called MidwestCon. It's the fourth annual conference, and the whole purpose of it is to talk about innovation and social impact, and a lot of it is policy-related. The organizer is Rob Richardson, and we're looking at technology and innovation in terms of how they become responsible and ultimately a social good. I'm excited because you'll be there, Mary Marwood will be there, and Angeline will be there as well, so the four of us will be there, and I've convinced the organizers that we should absolutely talk about privacy and why it's important. It's really cool to see the Cincinnati community step up and embrace privacy as an important concern, and I'm very, very excited to have you join us and come meet people in Cincinnati.
Debbie Reynolds 28:06
Yeah, it'd be fun. Road trip to Cincinnati. That'd be great. So if it were the world according to you, John, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior or technology?
John Cavanaugh 28:24
Yeah, that's a good question. First and foremost, I tend to be a privacy absolutist, but I think the most practical approach would be very clear opt-in, in plain language. If you go to a website and sign up, you get three bullet points: this is how we use your data, and you can decide to use that website or choose not to. Now, there could be great legislation on deciding how invasive an organization can be, but just having that as a general principle would be amazing. Then, on the other side, for the organizations that handle the data, it would be really nice if they just used the data for exactly the purposes that were stated, and we wouldn't have to deal with lawsuits or anything like that, or with people having, as you mentioned, data trauma. I think those two fundamental principles would be awesome. Lastly, as the GDPR and other laws provide, you should have the ability to change your mind in the meantime, so you can always say, never mind, take me out, I'm done. I think those three things would make a really great world.
Debbie Reynolds 29:35
I love that. Instead of opt-out: take me out, I'm done, I'm through. I would love that.
John Cavanaugh 29:42
Exactly.
Debbie Reynolds 29:44
Well, thank you so much for being on the show. This was great. I love chatting with you, and I really, really love what you're doing with the Plunk Foundation. There aren't enough grassroots efforts in privacy, I feel, because it's expensive; you've still got to pay the bills and everything. So I'm glad to see that people are contributing and trying to help make sure that you get the word out. You're doing a great job.
John Cavanaugh 30:07
Thank you so much, and I love how you have promoted privacy throughout the years, even before many people cared about it. Your podcast has been a really great source of inspiration; the privacy world can sometimes be a little lonely, and it's wonderful to hear other people's thoughts and ideas. So, thank you for building this incredible classroom.
Debbie Reynolds 30:25
Thank you. Well, thank you for contributing to it. Your voice is really vital, and your story is amazing, so I love it. I'm happy that you're ready to share it.
John Cavanaugh 30:35
Thank you so much.
Debbie Reynolds 30:37
Thank you. Thank you. We'll talk soon. See you in Cincinnati.
John Cavanaugh 30:40
Yep, that sounds great. See you in Cincinnati.