"The Data Diva" Talks Privacy Podcast

The Data Diva E189 - Cindy Warner and Debbie Reynolds

Season 4 Episode 189


Debbie Reynolds, “The Data Diva,” talks to Cindy Warner, Founder and CEO of 360ofme. We discuss the evolving landscape of data privacy and the importance of ethical data usage. Cindy shares her unconventional journey from aspiring to a medical career to becoming a tech industry leader. She recounts her work with ERP solutions and her pivotal experience at Salesforce, highlighting data's power in driving insights for organizations.

The conversation delves into the evolution of 360ofme, emphasizing the shift towards consumer consent and privacy in marketing. We discuss the need to reawaken ethical practices in data usage, highlighting the challenges in building a trusted AI corpus, ensuring data is bias-free, and the high costs associated with data preparation and testing. Cindy explains the significance of verified identity, consent, and context in data sharing and how 360ofme makes revoking consent easier than traditional platforms.

The episode introduces Privacy Policy Co-pilot and Enterprise Privacy Pulse, tools designed to assess an enterprise's privacy maturity and compliance. Cindy talks about companies releasing products without considering data privacy and the ethical implications of such actions. We also raise concerns about data security and storage, discussing the vulnerabilities of retaining extensive image data and the impracticality of such practices.

The discussion highlights the need for a human-centric approach in technology, the role of regulators, and the impact of prioritizing profit over ethical considerations. The episode also covers the effectiveness of the Biometric Information Privacy Act, citing Facebook's $650 million settlement, and addresses concerns about biometric data usage in public spaces. Overall, the episode emphasizes the importance of privacy by design and ethical considerations in product development, aiming for a more responsible, consumer-trust-focused technology industry, and closes with Cindy's hope for the future of Data Privacy.

From 360ofme: At 360ofme, we're thrilled to announce the upcoming launch of our new Companion Products: Privacy Policy Co-pilot and Enterprise Privacy Pulse. Privacy Policy Co-pilot is an AI-driven tool that analyzes and grades your privacy policies, providing actionable improvement suggestions to boost customer trust. Enterprise Privacy Pulse lets organizations complete a self-assessment to evaluate their privacy practices and receive personalized insights for enhancement. Currently in beta, we invite you to sign up and be among the first 100 registrants to enjoy a 25% discount. Email info@360ofme.com to take advantage of this offer.


35:41

SUMMARY KEYWORDS

data, privacy, consent, companies, biometrics, people, call, enterprise, verified, create, technology, ai, revoke, consumer, nefarious, transaction, harm, industry, give, day

SPEAKERS

Debbie Reynolds, Cindy Warner


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world for information that businesses need to know now. I have a very special guest on the show with me today, Cindy Warner; she is the Founder and CEO of 360ofme. Welcome.


Cindy Warner  00:37

Yeah, thank you so much. It's great to be here.


Debbie Reynolds  00:39

I really appreciate you being on the show. You and your team reached out to me, and you and I had a chat, and we just got along really well. It was like, hey, we need to do a podcast together. I felt like it was one of those conversations where we could have probably recorded what we said, and that would have been a podcast. But I'm so happy to have you here today. We are very much aligned, I think, on privacy and technology, and I'd really love for you to share your journey in technology and why you created 360ofme.


Cindy Warner  01:10

Yeah, I'd love to; thanks so much. So, my journey into technology didn't start with a focus on technology. It started with wanting to become a doctor, realizing that I couldn't stand the sight of blood, and having to pivot. A lot of people talk about the pivots in their careers. Mine was early; it was at 18 years old, when I went to pre-med at UCSD, that I realized I couldn't stand the sight of blood and said, oh gosh, what am I going to do now? From the standpoint of an industry that was doing very well, technology was, and so I said, hey, let me see if I would be a good technologist and if I enjoy building technologies. That's how I got into technology. It wasn't planned; it wasn't something that I even aspired to do. It came from a pivot early, early on in my life.

I started my career doing implementations of ERP solutions, so think Oracle, SAP. As I continued on, what became abundantly clear is that data was really the differentiator for all those solutions; good processes, of course, but the right data made for a solution that delivered good insights for an organization. I worked in the back office on ERP systems for some time, then moved into the front office and into CRM, and that's where I really got the indoctrination into the power of data, or I should say, the impediment of data in an organization and a large-scale enterprise. I made my way through my career to Salesforce during its early days. I call it pre-$1 billion, and considering how big they are now, that was quite some time ago. One day at lunch, I was talking to Marc Benioff, and I said, I don't understand why in the world it's so hard for people to give up the data so that marketing organizations can find them and they can buy what they want from those companies. It seemed like an exercise in Where's Waldo, and I was kind of like, this is crazy. I endeavored at that point to really rethink the process of enterprises trying to find consumers and have what I considered at the time a trusted relationship, so that enterprises could sell to a consumer what a consumer wanted to buy, instead of casting a big net, fishing, and paying a lot of money, which was the process going on at that time. That was the genesis of 360: trying to have that 360 loop of enterprises that wanted to sell to consumers, and consumers who wanted to consent for an enterprise to sell them what they wanted to buy.

Fast forward through the years after Salesforce. As the use of data on the Internet became nefarious and started violating people's privacy, people were now having problems with their identity and what have you. I thought, well, we could combine privacy with my marketing approach of the enterprise getting consented data from a consumer to sell them something, and add a layer of privacy to that. That's the encapsulated version of how we ended up with a design for 360ofme, but it was always so that enterprises could market and stay connected to a consumer, on a consumer's terms.


Debbie Reynolds  04:37

Yeah. That's fascinating. Thank you for giving that summation of what you guys are doing. There are two things I want to talk about that I feel like a lot of people don't truly understand. First of all, the Internet is free now because of marketing. That's one thing I think people don't really understand; the way that the Internet is now, it was made for that. But on the other side of it, the Internet was made for sharing data, not really securing data. We have these two things happening at the same time; we have people saying, hey, I want more control, I want more agency over what happens to me, because, as you've seen, a lot of companies have done bad things with people's data, and I think people are waking up to that. Tell me a little bit about your thoughts on those two things.


Cindy Warner  05:29

Yeah, sure. Well, of course, in its early days, the Internet was a great information source; it replaced the dictionaries that we all used in our homes when we were growing up, and it became a good R&D tool, which was fabulous. But the interaction on the Internet for commerce was really what I would call the 2.0 version: being able to connect, connect easily, and connect for free. That was a brilliant thing for commerce. If you look at the World Economic Forum, when they looked early on at commerce via the Internet, it just opened a gazillion new ways to connect with somebody, do it in this very rapid medium called the Internet, and do it for free. As you said, the problem is that when we started looking at the entirety of the data chain that was necessary for that interaction, we got the nefarious actors who jumped into the deep end of the pond and decided that data was just too valuable to only use for that transaction. They could use it for other things, and that's where I think we headed south. Instead of transactional data being used for a transaction, earmarked for that transaction, and then calling it good, with somebody trusting that you were asking for that data for that transaction and calling it a day, that data just continued to proliferate, gets sold, monetized, and what have you. That's where I think we went astray, and that's where we are today with Data Privacy. We went astray from data being used for very rapid transactions on the Internet into nefarious uses of that data that we never agreed to. I never agreed to my data being used in the way it's being used today. Now we're reining that in; we're coming back around the other side and saying, okay, you bad actors used it for the wrong reasons. Let's go back to using it for a transaction. Let's use it for its intended purposes. So I look at this as a reawakening of maybe Internet 1.0, where we had good actors doing good things, not nefarious actions. That's where I think we're coming back to.


Debbie Reynolds  07:43

I agree with that. I want to talk a little bit about artificial intelligence. Artificial intelligence, I think, creates more privacy risks for consumers, and I would love for you to talk about that. But then we also have the regulation side: consumers are waking up to these data uses that they don't like, and regulators are trying to create more regulations to give people more transparency and control. Then we have AI going buck wild. So tell me your thoughts about all that.


Cindy Warner  08:16

Yeah, well, first of all, I marvel at AI for a whole bunch of reasons. But first and foremost, I was at IBM at the time that Watson was released, and the big problem with Watson (and now they've got watsonx) was creating what was called a trusted corpus: that big body of approved, consented data with enough in its network that bias could not be created. Building that corpus was really expensive and time-consuming. When I was at one of the big tech companies, we did a huge pilot for them on customer service using Watson. But building the corpus was hard, because we had to go out and get consent; we had to go out and say to somebody, we'd like to use this data, and we're going to ingest it into this neural network. By the way, it's not a flat file, so you don't just go in and erase line 127. It's in a neural network, which is multi-dimensional, so it has multiple sides to it; you don't just pull it out like a string. Getting that data, ingesting it, using it, and testing to ensure it doesn't create bias or other unintended consequences was really expensive.

The outcomes, as an example for what we were using it for, identifying and routing a customer service call, were really tiny. It didn't give us a ton of productivity. It gave us about a 2% productivity uplift, where 2% of the calls went off to some other place where they could be answered by self-service instead of an agent, so we reduced the cost to serve. But again, the outcomes and the ROI were poor, and part of the reason is the initial cost of preparing that data and making sure it was tested and didn't have bias. That, today, I think, is what everybody misses. This is not new; AI is not new. What we're overlooking is that the upfront cost of creating a proper neural network, a corpus of the right data that doesn't have bias and that is consented data, is very expensive. When you look at ChatGPT, or what have you, they used unconsented data; they used data that was readily available and accessible, and by the way, they didn't really test it for biases and the other unintended consequences. The things we have to put in the mix here are consented data and really great testing to make sure there aren't unintended consequences that come out of that data. So I would tell you, I don't think anything fundamentally new has changed. I think there's just this giddiness about what this can do, without a real open mind about the cost of doing it right.


Debbie Reynolds  11:10

I think that's true. When I work with people, or talk with people like you about AI, which I love to do, people do think it's a new thing. I think the new thing is that it broke out into public parlance, with people being able to get their hands on it and do stuff with it, as opposed to it being something more purpose-built, like an expert system, where those things have a narrow purpose and a narrow set of data that you put into them. Now we're just throwing everything into AI and hoping that we get some magic special thing at the end. It just doesn't happen that way.


Cindy Warner  11:52

I can tell you about ChatGPT. Just out of curiosity, I tried to build out a CV with ChatGPT, and I put in some tidbits, data points: I put in my name, I put in where I lived, I gave my address, and I told it what I do for a living, Founder and CEO of 360ofme. I've got to tell you, by the time we built my CV out, there was nothing on that CV that was me. Truly, I mean, it had me with a PhD in something and this and that, what have you, right? The CV that was created was actually flattering, but it was GARBAGE. I thought I gave it enough data points for it to go out and find me, but when you think about it, I would have had to build out a big corpus of Cindy Warner for it to be able to ingest and find all of the data of Cindy Warner and create a CV that was honest and true. So when you get into things like deep fakes, when you get into not bounding all that data and making sure it's you, you get into some big problems. The resume it created was flattering but horrible.


Debbie Reynolds  13:04

Yeah, that's true. That's true. Oh, my goodness, let's get a little bit more into 360ofme. So just tell us a bit more about the platform and how companies come to it and what problems you saw.


Cindy Warner  13:18

Yeah, sure. So the problems that we're solving with 360ofme are really threefold, and the first incarnation of our platform just released in late March or early April. There are really three things we're solving.

Number one, and most foundational, is verified identity. One of the things that is really important in being able to use somebody's data is making sure the person you're talking to is who they say they are; you even see on LinkedIn now, verify your profile. Whether it's CLEAR that you're using to verify your profile, or something else to verify that it's you, the amount of fraud today is so pervasive that we need to know who this person is. We need to know that I'm really talking to Debbie, or that it's really Bank of America on the other end of the phone asking me for my account information. Otherwise, I'm gonna get ripped off. So verified identity is one big facet of our platform. That can even help with other things, like who's in the car. A carmaker, an emergency management company, a fire department, whatever: when they come to a crash site, they don't know who's in the car. Do we know, when you start up your car, whether you're a verified person who should be driving that car? No. So we have this thing called vehicle theft. Verified identity is a really awesome feature of our platform because it says that who you think that is, is who that is, and they have rights and privileges to do something: drive a car, sign into an account, or give you consent to use some data, because it's really them.

Second is consent. Once we verify it's you, then you can consent for your data to be used and shared for a specific transaction, as and when you want. You decide, yep, I want my data to be shared for this transaction; I don't want them to know my personal or sensitive information; whatever information they need to be able to give me that service, I want to share with them on my terms. Think GPS: the most common data point that people want to share is location, to be able to do wayfinding, to find the nearest whatever, the nearest CVS or the nearest McDonald's. So wayfinding is a big piece of sharing your GPS data. Once it's verified that's who I am, I, Cindy Warner, can consent to share whatever data I want, right down to a field level. We actually provide a feature called Data Minimization: it's not an entire thread, it's not an entire record; it can be a single field of data that is shared so that you can get the services you would like to get.

Then the third piece is context, which is data minimization or data augmentation to make that rich experience. We can augment your data and give it to the enterprise; we could say you're female and your age group is this, so that the enterprise knows more about you, again on your terms, and can provide you a more personalized and rich service. The verified identity, the consent, and then the context: those are the three things we released in this very first release that give data to an enterprise so it can provide the service you were looking for.
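To make the field-level consent idea concrete, here is a minimal sketch in Python of how a consent grant scoped to a verified identity, a purpose, and specific fields might look. This is not 360ofme's actual API; every name here is hypothetical, purely to illustrate data minimization at the field level.

```python
from dataclasses import dataclass

# Hypothetical sketch only: illustrates field-level data minimization,
# not 360ofme's real data model or API.

@dataclass
class ConsentGrant:
    subject_id: str       # the verified identity of the consumer
    enterprise: str       # who may receive the data
    purpose: str          # the specific transaction or service
    allowed_fields: set   # data minimization: named fields only
    revoked: bool = False

    def share(self, profile: dict) -> dict:
        """Release only the consented fields; nothing at all if revoked."""
        if self.revoked:
            return {}
        return {k: v for k, v in profile.items() if k in self.allowed_fields}

profile = {"name": "Cindy", "gps": (33.12, -117.31), "age_group": "50s"}
grant = ConsentGrant("user-123", "WayfindingCo", "find nearest store", {"gps"})
print(grant.share(profile))   # {'gps': (33.12, -117.31)} and nothing else
```

The point of the sketch is that the grant, not the enterprise, decides which fields ever leave the consumer's side.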


Debbie Reynolds  16:53

Very good. I want to know a little bit more about how people revoke consent.


Cindy Warner  17:00

Oh, lovely. Well, in today's world, to revoke consent, there are some platforms out there where you can do a mass scan to see where your data is being used. The problem is, once you do the mass scan, trying to revoke that consent is not lovely. You go through literally one by one by one. I did a scan, I don't remember what service I used, but there were well over 1,000 places where my data was being used, and I decided, well, I don't want them using my data. But that became a very laborious process of writing one by one to each email address to say I'd like to revoke my consent. I stopped after the first 20 of them.

In our system, once you provide consent, revoking consent is literally simple. Let's say that I've given consent to General Motors to use the data off my car so that it can give me a maintenance report every month. Well, if all of a sudden I find out that General Motors sent my data to an insurance company, and my insurance rates went up because the insurance company got car data from my automaker, then with literally the slide of a button, a radio button, you can revoke consent to GM. No writing to them, no nothing. But even better, you could say from the get-go, I only want to give consent, let's just say for my GPS, for this transaction. A use case we're working on today is stadiums and event management solutions, where if you're going to an event, maybe you're there for three days, maybe you're at CES in Las Vegas. For those three days, you want CES to be able to give you updates on changes: where maybe a booth has moved, where maybe a session has moved, or maybe even new things that have been added. Hey, don't forget, there's free lunch today, there's a free happy hour, or whatever. While you're at CES, you want them to give you updates and communicate with you. But as soon as CES is over, revoke consent. So with our solution, you could say it's for a specific period of time, you could say it's for a transaction, or you could say it's always. But the moment you're done with it, revoking is literally the slide of a radio button, and you're done. Consent is done.
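A rough sketch, again with hypothetical names rather than the real platform, of the two behaviors Cindy describes: consent that is time-boxed so it expires on its own, and revocation that is a single state change instead of a one-by-one email campaign.

```python
from datetime import datetime, timezone

# Hypothetical sketch: time-boxed consent that lapses on its own and can be
# revoked with one state change (the "slide of a radio button").

class TimeBoxedConsent:
    def __init__(self, enterprise: str, purpose: str,
                 start: datetime, end: datetime):
        self.enterprise, self.purpose = enterprise, purpose
        self.start, self.end = start, end
        self.revoked = False

    def revoke(self) -> None:
        self.revoked = True   # one action; no letters to 1,000 data holders

    def is_active(self, now: datetime | None = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return not self.revoked and self.start <= now <= self.end

# Consent for the three days of an event; it expires by itself afterwards.
ces = TimeBoxedConsent("CES", "event updates",
                       datetime(2026, 1, 6, tzinfo=timezone.utc),
                       datetime(2026, 1, 9, tzinfo=timezone.utc))
print(ces.is_active(datetime(2026, 1, 7, tzinfo=timezone.utc)))   # True
print(ces.is_active(datetime(2026, 1, 20, tzinfo=timezone.utc)))  # False: expired
ces.revoke()                                                      # or end it early
```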


Debbie Reynolds  19:30

Wow. I'm thinking about so many different use cases for this, and I think that's really important.


Cindy Warner  19:35

Well, if you think about the event management one, or even the stadiums we're working with right now, how cool is that? I walk in, they identify that I'm within the geofence, and I'm willing, me, Cindy, I'm willing to interact, because I got a ticket and I want to be there, and I want to get all of the information about where to find the closest beer, the closest restroom, or whatever. But as soon as that event is over, or I walk out of the geofence, adios; I don't want you telling me things about the hockey team. Because I went to an event, I don't want a lifetime with you; I don't want to be attached to the hockey team for a lifetime. I just went to an event. So it's really cool for them to get the consent to interact and create a very unique experience for you while you're in their domain. But as soon as you're out of their domain, you're not getting spammed anymore; you're done.


Debbie Reynolds  20:25

I love it. I love it. I can definitely see some use cases around fleet management. A lot of companies have problems with that, where they can track people while they're working, but when they stop working, they're still tracking them, which becomes a privacy problem. Or someone has an app where, okay, I want people to drop me off here, but then I want them to stop tracking me, and it was tracking everything I do. So just from my experience, companies have a hard time revoking consent because they never had to before; their approach has always been, keep everything, right?


Cindy Warner  21:01

Well, and the cool thing is, because the consent is initiated by the enterprise, the enterprise says, oh, Cindy Warner bought a ticket to our hockey game; let's ask her for her consent. When they ask for consent, if they don't timebox that consent to the event, you can respond back to them and say, I'll give you consent for this date and this time. But most enterprises, to be compliant, especially with the new Data Privacy laws, will say, hi, Cindy, we know you have a ticket to the hockey game; the hockey game runs from six; we think you'd probably be there a half hour in advance and a half hour after. So, would you give us consent to track you from 5:30 until 8:30? You can either say yes, and the consent is provided, or you can modify that and say, no, no, no, I'll give you consent for only this hour that I'm there. But what you do know is that as soon as that consent is no longer given, you're good to go. They're not tracking you anymore.
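Sketching that enterprise-initiated flow, again hypothetically and reusing the TimeBoxedConsent class from the earlier sketch: the enterprise proposes a window, and the consumer can accept it, narrow it, or decline before any consent exists at all.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch of an enterprise-initiated consent request that the
# consumer can accept, narrow to a smaller window, or decline outright.
# Builds on the TimeBoxedConsent class from the previous sketch.

@dataclass
class ConsentRequest:
    enterprise: str
    purpose: str
    proposed_start: datetime   # e.g., 5:30, a half hour before the game
    proposed_end: datetime     # e.g., 8:30, a half hour after

def respond(req: ConsentRequest, accept: bool,
            start: datetime | None = None,
            end: datetime | None = None):
    """Decline (no consent is created), accept as proposed, or narrow it."""
    if not accept:
        return None
    return TimeBoxedConsent(req.enterprise, req.purpose,
                            start or req.proposed_start,
                            end or req.proposed_end)

# "Track me from 5:30 to 8:30?" -> "No, only the hour I'm in my seat."
req = ConsentRequest("Arena", "in-game updates",
                     datetime(2026, 3, 1, 17, 30), datetime(2026, 3, 1, 20, 30))
grant = respond(req, accept=True,
                start=datetime(2026, 3, 1, 18, 0),
                end=datetime(2026, 3, 1, 19, 0))
```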


Debbie Reynolds  22:04

Yeah, I think you're solving a very important problem. One is that enterprises really don't understand how to get this data in an easy way, and they have problems finding people who want their services. The other problem, which is huge, is that a lot of these laws put too much work on the consumer. Even for people like me and you, it's hard to manage this. So think about people who don't care as much about these issues as we do and are just trying to live life without doing this stuff. So I think it's very important.


Cindy Warner  22:39

Yeah, I would tell you that interacting on 360ofme, between the enterprise and the consumer, is a really simple process. I mean, it's literally super simple. You get a request from the enterprise to provide consent, they give you the login, here's the request, you say yes or no, or you modify the request, and off you go. Then, to my point before, if you ever want to, you go into your little dashboard on 360 and you can see everybody you've given consent to, even the ones from the past. So monitoring and managing that, as a consumer, is super simple: turn it on, turn it off. Super simple.


Debbie Reynolds  23:22

Well, I want to make sure we chat a bit about something that piqued my interest: you have two companion products, Privacy Policy Co-pilot and Enterprise Privacy Pulse. Tell us a little bit about those two.


Cindy Warner  23:36

Yeah, absolutely. One of the things that's obvious to us when we work with enterprises, and I've worked with enterprises pretty much my whole career as a career consultant at large consultancies, the likes of PwC, is that when we look at the privacy journey of an enterprise, it's unlikely that any two are going to be the same. We believed that where one enterprise starts could be very different from where another starts; they may be very mature, they may be very immature. So the companion products, the Co-pilot and what I call our assessment tool, do two things.

First of all, the assessment tool gives you, the enterprise, feedback on your level of maturity. Do you really look at privacy the way the future of privacy is going to have to be looked at? Where do you have opportunities for improvement, where you're not going to be compliant, where you really have to step up the game? That could be in policies, it could be in process, or it could be the overall privacy policies that you're publishing.

The other tool, the Co-pilot, is really cool because if you have a current privacy policy, we can assess that policy and say whether it meets the grade. In other words, maybe it's not clear enough; maybe there's an area where it says if you don't give us the consent, we're not going to give you the service, which is becoming illegal; maybe it doesn't tell you where your data is going to be used and how it's going to be used. We assess the policy itself and say, yikes, you're getting a failing grade here, and there's work to do. So those are really what we would call door opener products to help somebody identify where they are on their journey, and for us to then say, okay, here's what you need to do on this journey to start revising your policies and your processes to become compliant. They're really great door openers to help somebody understand their level of maturity and where they are.
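As a toy illustration of the grading idea, and emphatically not the actual Privacy Policy Co-pilot (which is AI-driven), here is how a criteria-based check of a policy's text might work; the criteria and keyword matches below are invented for the example.

```python
# Toy illustration of criteria-based policy grading; these criteria and
# keyword checks are invented examples, not the Co-pilot's real logic.

CRITERIA = {
    "says where data will be used":    lambda t: "we use your data" in t,
    "says who data is shared with":    lambda t: "third parties" in t,
    "no consent-or-no-service clause": lambda t: "required to receive the service" not in t,
    "explains how to revoke consent":  lambda t: "revoke" in t or "withdraw" in t,
}

def grade(policy_text: str):
    """Return a 0-100 score plus the failed criteria: the 'work to do'."""
    text = policy_text.lower()
    failed = [name for name, check in CRITERIA.items() if not check(text)]
    score = 100 * (len(CRITERIA) - len(failed)) // len(CRITERIA)
    return score, failed

score, gaps = grade("We share data with third parties. "
                    "Consent is required to receive the service.")
print(score, gaps)   # a failing grade, plus the specific areas to fix
```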


Debbie Reynolds  25:38

That's really important. I think, too, when companies go through digital transformation, as we're seeing people rapidly trying to adopt things like AI, what they don't realize is that you may have been very mature before, but now you're going into these new tools, and that totally reduces your level of maturity, because you may not know how you need to change and adjust to these tools. What do you think?


Cindy Warner  26:02

Yeah, 100%. Again, you can't assume that any two companies are at the same place in their journey. If you just take automotive, which is one industry we do a lot of work in, candidly, all the OEMs are at different places. There are some that are a little in the dark ages, and there are those that have really taken this to heart and moved the ball down the field a lot. So it's important, as we sometimes say, to meet them where they are: not make assumptions that everybody's in the dark ages, and not make assumptions that everybody's compliant either, but meet them where they are. That's why we developed these two tools, for that exact purpose: to make sure we meet somebody where they are, and then we define from there, as we call it, the art of what's possible. Once we define the art of what's possible, then, and you can tell I'm a consultant, we create a transformation roadmap so that somebody can understand what they need to do to transform to Privacy 3.0.


Debbie Reynolds  27:01

So if it were the world according to you, Cindy, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior or technology?


Cindy Warner  27:17

Yeah, I was having this conversation yesterday with somebody. Obviously, I'm a few decades into this industry, but when I started, we never worried about ethical issues; the word ethical never came up. By the way, that's one of the reasons I loved working in this industry: I felt it was ethical. I felt it was a place where we wanted to make a difference in the world through technology, be it in healthcare or anyplace you look; we wanted to use technology for good. What I would tell you is that capitalism, which is not a dirty word to me, crept in between wanting to do good with technology, and the word ethical seems to have gone out the window. We don't care so much now in this industry about whether it's ethical, as long as it makes us a lot of money.

I think if I were queen for a day, I'd say we've got to get back to being that ethical industry that can do good with technology, and we need to think, first and foremost, about unintended consequences: using people's data, not getting consent, infusing bias. Any of those things need to be front and center in how we think, to make sure that we are doing no harm. I'd love to see the technology industry, all the big boys and girls in it, go to work every day with a T-shirt on that said, do no harm. That's what I'd love to see: for us to get back to a do no harm mentality in this industry. I still believe there's a lot of money to be made, a lot of capital that can be made in this industry, while still doing no harm. So that would be my wish, in all ways, for this industry.

I'm a little disappointed because, at this stage in my career, I work in an industry where, every day, you see the FTC chasing these big companies around the block all day long. Why? Because they do bad things. We don't want to call the FTC bad; the reason they're chasing those companies all day long is because the companies are doing bad things. So I would love to see the technology industry get back to do no harm. That would be my wish for what would happen in this industry.


Debbie Reynolds  29:37

I agree. We definitely need more human centricity, and I think that's the way the world is going. So companies that are fighting against that are going to be swimming upstream very shortly.


Cindy Warner  29:49

Amen. I hear you, and I long for that. And then again, of course, unfortunately, we do have regulators that are on to these big boys and girls that are doing the harm, and they've got a laser on their backs. It's unfortunate, though; I hate to see us have to have a highly regulated environment to do the right thing. That doesn't feel right. But I also know that they're sending signals to these companies to say, if you're gonna do wrong, we're gonna come get you, in the hopes that they do a pivot and do no harm. I think sending that message may get people to rethink their whole approach, and that's my utopic outcome: that people will just revert and go, okay, guys, we've got to stop doing this harm.


Debbie Reynolds  30:31

Yeah, one example of that, to me, has been the most successful law so far in this area, and that's the Biometric Information Privacy Act in Illinois. When Facebook, they're Meta now, but they were Facebook at the time, was using people's face prints on their platform without people's consent, people didn't know what they were doing with them. It was a very simple law, like four pages printed out, but for some reason, these companies can't follow it. As a result of that settlement, I think Facebook or Meta settled that case for $650 million, and relatedly, they decided that they would stop that feature nationally. To me, that's the best that you can hope for, because at some point they said, well, you know what, this is really not worth it. And it's just one State; I would have all 50 States pass laws like this, and they'd be cutting $650 million checks in each State. So to me, that was a big win. I want to see more companies decide that they have to do something, so that these types of actions are not worth it. Right now, they are worth it, unfortunately.


Cindy Warner  31:43

I'll tell you straight away, I was talking about the events in the stadiums and stuff, and you want to talk about somebody rolling out biometrics? Do not be surprised here, but Ballmer owns Intuit Dome in California, and there are supposedly 1,300 points of biometrics in there, everything that you can imagine. You want to buy a beer, you want to get into the VIP lounge, whatever you want to do, it's all biometrics. We still don't know, and nobody's published, the privacy policy for all of those biometrics. As I said, 1,300 points of biometrics; by God, if I were going in there, I'd need to know where that data is going, because that's a whole lot of personal data that could be used for the wrong reasons, fed into models, and used to create deep fakes of you, me, and everybody else. I don't want my likeness in 12 different biometric solutions because I went to a basketball game; I don't want my likeness copied and then, all of a sudden, I'm defrauded left, right, and center because I went to a basketball game. We don't want that. At the end of the day, when I found out about this, I was like, well, I'm going to need to know what they're doing with all those biometrics. I love the experience side of it; in other words, I love how easy it could be on the experience side. But conversely, biometrics aren't perfect. I don't know if you've used CLEAR; I was a CLEAR subscriber for a while, and I have to tell you, not once, and this is the doggone truth, not once did I go up to a kiosk in an airport and have it identify me. Not once. They had to go into the system. I'm like, am I an alien? Do I have a weird face? I know I've got a big nose because I'm Italian, but what happened here? Something's wrong. It didn't know my thumb, it didn't know my forefinger, it didn't know my eyeballs. I'm like, okay, so are we really sure these things can tell this is you? So, the biometric thing: I've been on a bandwagon about biometrics for a bit of time now, but I'm telling you, I don't want that stuff getting out there and creating deep fakes of me or anything else.


Debbie Reynolds  33:50

Not only that, but for you as a consumer, there's no adequate redress for that; getting credit monitoring isn't going to help you if someone stole your likeness.


Cindy Warner  34:01

No, no, you're doomed. You are doomed. I mean, good luck with that. You are doomed. It's gonna take you a long time. I mean, we're talking maybe years to try and rebuild between credit and everything else if somebody steals your likeness.


Debbie Reynolds  34:15

Yeah. Wow. Wow. So we're going into this brave new world. This is very interesting. Well, thank you so much for being such a warrior on privacy, and I love what you're doing with 360. I feel like that's the way that companies will need to go. We are definitely early; some companies are not getting the message yet that this is the way they need to go. So that's great.


Cindy Warner  34:37

Yeah, thank you so much. I really appreciate it. We really want to help enterprises bridge that trust gap with consumers and keep that commerce flowing. We want the success that comes with commerce to happen; we just want it on the consumer's terms, in a consented, verified relationship, and I don't think that's really asking too much. I think a company that believes it should be ethical should engage in this. So thank you so much. This has been a super cool conversation. You are my hero, for sure. I'm just thrilled to have the opportunity to do this with you.


Debbie Reynolds  35:11

Yeah, it's been great. Thank you so much. This is amazing. I'm sure that the audience will find this as enlightening as we have, and yeah, I'm sure we're going to be talking a lot more about Recall in the future. All right.


Cindy Warner  35:24

All right. Well, thank you.


Debbie Reynolds  35:25

All right. Thank you so much.


Cindy Warner  35:27

Take good care. See you now. Bye bye.