"The Data Diva" Talks Privacy Podcast

The Data Diva E86 - Erik Rind and Debbie Reynolds

June 28, 2022 Season 2 Episode 86
"The Data Diva" Talks Privacy Podcast
The Data Diva E86 - Erik Rind and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds “The Data Diva” talks to Erik Rind, Founder & CEO at ImagineBC. We discuss his career journey from human capital management to privacy and marketing technologies, his personal and professional privacy concerns, user choice and data agency, the monetization of data to benefit the individual, the asymmetry of benefit from data between marketers and individuals, consumer rights versus human rights to data, regulation outpaced by technology, privacy and monetary freedom, data scraping, the dangers of biometrics and facial recognition, the harm of data abuses without adequate redress, the imperfection of data systems, and his hope for Data Privacy in the future.


44:30

SUMMARY KEYWORDS

data, people, monetization, privacy, app, blockchain, technology, consent, government, companies, money, world, human capital management, stop, facial recognition, happening, information, years, sell, thinking

SPEAKERS

Debbie Reynolds, Erik Rind


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.


Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know right now. I have a special guest on the show, Erik Rind. He is the founder and CEO of ImagineBC, and I'm sure he's going to explain to all of us what that is and what that means. Erik and I had a chance to chat a couple of weeks ago before we did this recording, so I have a feel for what he does, and I would love to get his insights about privacy. But before we get into the privacy talk, why don't you talk a little bit about yourself, your trajectory, and your career? And what started your interest in privacy?


Erik Rind  01:11

Oh, sure. It's kind of an interesting story, I guess. I'm a gray hair, not your typical entrepreneur, given how old I am; I'm now a grandfather of three. I've spent 28 years pretty much in what's called the human capital management space of technology. It's about as boring as you can imagine: HR, payroll, and benefits. You can't pick a more boring area of technology to spend most of your career in. But about three years ago, I was at a conference related to that industry, and I met a fellow, and we spent the entire conference discussing blockchain technology. It doesn't seem that long ago, but the Cambridge Analytica story hadn't broken yet; nobody knew anything about Data Privacy, and nothing that's in the news these days was happening back then. Crypto was known by some people, certainly the ones who were in it and would eventually make a lot of money in it. But blockchain technology itself, even I didn't know what it was, and I'm a technologist. Over the course of this two-day conference, he was basically telling me that this is a game-changing technology. Being an old guy, I thought he was out of his mind; when you're old, you poo-poo everything to begin with. But he was an intelligent guy, and I felt I really should learn it. So I started to do my own studies, and when I was done, I realized, yes, blockchain was a game-changing technology. But I'm still in the human capital management business. So how do you use blockchain in the human capital management business? For me, the answer became pretty clear pretty fast. If you think of an HR, payroll, and benefits system, we have all the data: your name, Social Security number, your bank account, HIPAA-protected information, everything there is about you. We have it to be able to process payroll, HR, and benefits. And from a company point of view and a technology point of view, that means we have a single point of failure. If a bad actor gets through our firewall and steals all that data, we're not an ADP or a Paychex; we can't pay our way out of that problem. So being a small boutique company, that's the worry. I was thinking, well, blockchain technology is distributed ownership of information. What if we took all that private data and gave it back to the people, because frankly, we don't need it that often, and when we needed it, we'd ask them for their permission to use it? If you want your W2, let us have your name and address so we can send your W2. So I started down that path with a prototype. That's really how I segued from human capital management into what is now ImagineBC, because we were about six months into developing this prototype when we realized what we were really talking about was people taking back control of their data. They need to be able to make decisions over what value they should get from it and whether they want it used by third parties. And three years ago, that wasn't a conversation you wanted to have related to HR and payroll. So we asked ourselves, if you're going to get people to listen to the importance of controlling their own information and making decisions over it, what would get them excited about it? And we decided, well, making money from it would get them excited. That's what ImagineBC is about.
ImagineBC is about helping individuals take back control of their information and make decisions over it. When they make those decisions, we help them monetize that information, and also their intellectual property, which is still just a different type of data. So, 28 years in the HR space, and now I'm in the Data Privacy and data monetization space.
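
A minimal sketch of the consent-gated flow Erik describes, where the individual holds the record and the system has to ask permission per use. This is an illustration only; the class and method names are invented here, and ImagineBC's real design is blockchain-based rather than an in-memory object:

```python
from dataclasses import dataclass, field

@dataclass
class PersonalVault:
    """The individual's own copy of their data (illustrative names only)."""
    owner: str
    records: dict = field(default_factory=dict)   # e.g. {"address": "..."}
    grants: set = field(default_factory=set)      # fields the owner approved

    def approve(self, field_name: str) -> None:
        """Owner explicitly consents to share one field."""
        self.grants.add(field_name)

    def fetch(self, field_name: str):
        """A requester (e.g. a payroll system) gets a field only with consent."""
        if field_name not in self.grants:
            raise PermissionError(f"'{field_name}' was never approved by {self.owner}")
        return self.records[field_name]

# The W2 example: the payroll company asks for an address to mail the form.
vault = PersonalVault("erik", {"address": "123 Main St", "ssn": "xxx-xx-xxxx"})
vault.approve("address")          # the user taps "allow" for this request
print(vault.fetch("address"))     # OK: consented
# vault.fetch("ssn")              # would raise PermissionError: never consented
```

The point of the shape is the one Erik makes about single points of failure: the company holds nothing it hasn't just asked for.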


Debbie Reynolds  05:01

Wow, that's a great story. I think a lot of people don't think about HR because HR is not the sexy thing, right? When you look at movies like Mission Impossible, Tom Cruise hanging from the ceiling, no one's thinking he's looking for HR data, right? But you're right; you're now filling a gap.


Erik Rind  05:23

So I have a lot of stories, but here's a good one. During the course of my work in HR and payroll, we did a lot of work with the Federal government; located here in DC, it's a natural client for us. Because of that, at one point in my life, I had my fingerprints taken. And about 10 years ago, I got a letter from the Office of Personnel Management saying, Dear Mr. Rind, we're writing to inform you that we've been hacked and your fingerprints have been stolen. Should you ever be accused of a crime in which your fingerprints are being used as evidence against you, please produce this letter. So I have kind of a Get Out of Jail Free card. But when it gets into the concepts of Data Privacy, data control, and data monetization, I'm an advocate of Data Privacy, but frankly, I'm never getting my data back; it's out there. Where I am in my life right now, I'm more interested in the short term in dealing with control and monetization. But privacy is huge, and the government needs to be a partner in that, on a go-forward basis. For my grandchildren who are being born now, it'd be wonderful if their identities belonged just to them and their parents, and when they turned 18, they could prove their identity through a technology like blockchain. Why is my Social Security number sitting in an unsafe system at the Social Security Administration? It just makes no sense to me anymore.


Debbie Reynolds  06:59

Right? Wow, I never heard that. Yeah, I guess if the cops show up at your door, you can say, wait, I have that letter...


Erik Rind  07:04

You can bet that letter is stored in a safe deposit box.


Debbie Reynolds  07:09

Oh, my goodness, oh, my goodness. So tell me, you’re a parent, you're a grandparent, a business owner, you're a human living in the US.  What is happening with data now that is concerning you the most around privacy, either personal or professional?


Erik Rind  07:30

It's interesting, because one of the catalysts for me to make the switch from my profitable HR company and start investing all that profit into ImagineBC was exactly that, looking forward. I like to say that as you advance through life, at first you're just your own person; you close your eyes at night and think through your future, and it's just you. Then you get married, and you close your eyes at night and think through your future, and it's you and your spouse. Then eventually kids come into the equation, and you close your eyes at night and think through what their future is going to be like. Well, now I have grandkids, and I close my eyes at night and think through their future, and it scares the living heck out of me. The world they're going to grow up in is controlled by big tech right now, and that's data; big tech is fueled by Google and Facebook, but also by AI- and ML-driven robotics, which are driven by data. It's a worrisome future. I started ImagineBC because I finally said to myself, I can sit around and bitch about this, or I can try to do something about it. It doesn't mean I'll succeed, but at least I'm investing money and striving to create technology for what I hope is a brighter future, where everybody participates in the value of their data, not just a few hands.


Debbie Reynolds  08:52

Right, right. Wow. Yeah, I agree with that. So let's talk about where we are now and where we're going, because this is the reason why data monetization is a conversation that has to be had, right? I feel like some people have tried to dodge this conversation, but it's inevitable, in my view. Here's what we had in the past and what we're having now. We have corporations that can take crumbs of data that you would never even think anyone cared about, stitch them together, package them up, and sell them to other people without your knowledge or consent, right? What we're seeing now is countries, and even tech companies, trying to create situations where there's more transparency in that type of data exchange and consent built into it. That doesn't work for everything, but it's at least a step forward. And what's happening, as some big companies lose a lot of money from, for example, Apple's App Tracking Transparency, is that those data companies are being starved of the really rich insights they had before. Basically, what Apple is saying is, as Apple, I have a first-party relationship with these people; they trust me, so they give me data. Why should I let them piggyback off the trust that I built with this person? So Apple is saying, build a first-party relationship with this individual yourself, and you can have the same rich insights, right? But we know that can't happen, because these companies that are taking your data are like ABC Corp; you don't know those people, and they're not doing anything for you. So why would you give them data? You don't know them, and you don't know the benefit. So then those companies have to create financial incentives to get you to give them your data. I'm seeing tons of sweepstakes and giveaways and stuff like that; they'll do anything they can to get that information. But the issue is, what's the process by which this data is being monetized and used for other people's benefit? And once people find out how much their data is worth, they're going to want to know: why can't I benefit from my own data? This data came from me; other people are taking it without my knowledge or consent, and they're benefiting from it to the tune of trillions of dollars. So why can't I, first of all, know what my data is worth? And then why can't I profit off of the data that's being used about me? What are your thoughts?


Erik Rind  11:51

Oh, you're preaching to the choir there. So it comes down to this, right? What Apple's doing is great. I have some issues with Apple, but in general, related to Data Privacy, what they're doing is great, because at least the concept of consent is there. But it gets down to this: what is the value of my consent? All it says right now is that company XYZ is going to use your data; if you use this app, they're going to use your data. Are you okay with that? Now, it comes down to how badly I want to use that app or not, right? If I really want to use that app, then I'm going to say yes, but the value of my data is probably greater than my use of that app. And that's where the value of consent gets very interesting. I always think of Waze, right? I try to protect my data probably better than most people, because I'm very aware of it. But something like Waze, that thing tracks me. Now I only have it on when I'm using the app, so Apple is helping me there. But they know everything about me. What's the value of my consent? Is the fact that they help me through traffic from A to B worth just the use of the app, when Google is making money from my data, knowing that I passed by that store and stopped at this location? The value I get back is very low. And that's the interesting question. I believe it's more than just consent; there's a value attached to that consent. I also believe it's becoming an absolute necessity when you bring in that little thing I call robotics, because I believe there's going to be a mass displacement of jobs. That means we're going to have a lot of unemployed people; the government cannot afford to pay all those people, nor should it. So it's incumbent on new technology companies to find new ways to have people make money, and one of the richest veins of that is your personal data. We know how valuable it is; why shouldn't you receive value every time it's used?


Debbie Reynolds  14:03

Right. And also, there's a very asymmetrical nature to this value exchange. You hit the nail on the head when you talk about the value of your consent. I think a lot of this is because of the free-product model. If something is free, there's really no ceiling to what that value exchange can be. So I have a huge problem with this; it's like contract law. I'll give you an example. There was an app a while back that had a feature that would take your photograph and age you in some way, right? So you'd see what you'd look like 10 years older or whatever, and people thought it was really fun. But if you read the privacy policy and the terms and conditions, they basically said, if you use this app, we have the worldwide exclusive right to use your photograph forever. And to me, that's very asymmetrical, right? It's an app that you play with for 10 minutes, and then you give away the rights to your image forever. Who's to say your picture isn't going to end up on a billboard in Japan? It's yours. So what are your thoughts about that?


Erik Rind  15:23

Another good example of that same concept is 23andMe, right? I've never done it; I don't care about my ancestry. I pretty much know where I came from: Eastern Europe. But all these people provided their DNA information to 23andMe with the idea that 23andMe would tell them their ancestry, and like you said, they didn't read the fine print. 23andMe took all that DNA data and sold it to GlaxoSmithKline for $400 million on a non-exclusive basis. I did the math at the time of that transaction; based on the number of users 23andMe had, it turned out to be about $1,000 a person. And what do they owe you? You paid some amount of money to find out what you are, and then they made another $1,000 from you. Weird.


Debbie Reynolds  16:21

Right? Exactly. And then on the other side of that, you're talking about the monetization side for commercial use, but that data is also sold to law enforcement. So they're like, hey, we think you committed this crime because of your DNA sample. This is just bananas.


Erik Rind  16:43

This is where government needs to be a partner in the idea of Data Privacy and data control. My personal belief is that government should not be buying data about people; legislators should stop that. First of all, it's a bad use of our tax dollars, right? Government is nothing but the taxpayer, so it's a bad use of our tax dollars; surely there's something better to do with that money than buy people's data. And you should never use it for law enforcement. There's nothing good that can come from that.


Debbie Reynolds  17:14

What are your thoughts on this? I have lots of friends in different countries, especially Europe, and they're superheated about this data monetization thing. Here's the thing they get really upset about. In Europe, privacy is a fundamental human right, right? So they have a right to not share their data. Here in the US, we don't have that right; our rights kick in when we're consuming, so if you're not consuming, you don't have the same protection. But part of their argument is: this is my data, it's me as an individual, and I don't want to put a price on my data. Which I get, but someone has already put a price on it, and they're selling it, and you can't stop it. So what would you say to someone in Europe who feels this way about data monetization?


Erik Rind  18:05

To be honest, I think they're being naive, right? You can't live in a silo. To be in the connected technology world we live in today, you're going to eventually have to share some data about yourself. So the ability to say I don't want to share anything is crazy; you're going to have to share something, unless you want to move to Montana and live off the land. But if you want to live in the tech world that we live in, you're going to have to share your data. As for the argument over monetization, and you're right, Europeans are very touchy on this subject, it's debated right now because we're not yet feeling the pain that, as I said before, we're going to feel when robotics replaces jobs. When you need people to have money, then everybody should be compensated for their data. I'm not thinking about it for today; I'm thinking about it for tomorrow, and tomorrow is coming. You can't stop it; there's just too much money being invested in robotics. The Chinese right now are serving all the food at the Olympics with robots. The pandemic threw a spotlight on this. Let me ask you: in a pandemic world, would you feel more comfortable with a human being cleaning your room or a robot cleaning your room? I can't catch COVID from a robot. So what I might not have felt comfortable about before COVID, I'm going to feel a lot more comfortable about after COVID, and that's going to accelerate. Now, if I have robots cleaning my rooms, what do we do with the 3 million people in the SEIU? When we have automated trucks, what do we do with the 3 million Teamster drivers? That's 6 million people. What do we do with them? They're not going to go work for Facebook; they're not going to go work for Google. But what do they have that has value? They have data, and they should be paid for it. So yeah, I get it. It doesn't seem right now, but boy, it's going to seem right in the not too distant future. And it's going to be a necessity.


Debbie Reynolds  20:24

So you touched on something, the other underbelly of monetization that people are concerned about, which is this: Warren Buffett doesn't have to sell his data, okay? He doesn't have to earn money selling his data. But someone who has a lower income and needs money may be more prone to want to sell their information, right? So I think digitization is creating a caste system. And part of that caste system, to me, in privacy is that, unfortunately, people who can afford to consume products that protect their data get that protection, and people who can't afford it don't. So basically, what we're looking at is people who need to earn money, maybe in different ways, not having the same protection that maybe you or I would have, and maybe that becomes an impetus for them to sell their data more. For example, Apple rolled out App Tracking Transparency, which is great. As an Apple user, that's great. But if you can't afford to own an iPhone, you can't take advantage of that. And companies see that; they want you to not buy an iPhone, they want you to have some other phone, or they want you to get a smart TV or a smart speaker in your house, because they can earn more money from you. What are your thoughts?


Erik Rind

You're absolutely right, and you hit the nail on the head, which is that the people who argue against the monetization of data are people who don't need the money. But my argument is that a lot more people are going to need the money in the future. There are a lot of people who need it now, and there are a lot who are going to need it in the future. Now, if you yourself don't need it, I still think you should monetize it, take the monetary value, put it back into a pool, and get it to the people who need it more than you do. But I'll tell you who doesn't need it: Google and Facebook. That's too few hands holding the value. As you said earlier in the program, trillions of dollars, right? A few hands are in control of trillions of dollars of value. It's just wrong. And I think regulation hasn't been able to keep up; regulation can never keep up with technology, so technology will always be ahead. But now we're seeing exponential growth of technology, so instead of regulation being a few years behind, or decades behind, it's going to be light years behind what's happening in technology.


Debbie Reynolds

So think about it this way. If someone takes something from your house, you'd say that's theft, right? That's wrong, and you wouldn't want them to do that. But somehow, we don't make the same correlation to data. Someone takes something about you, and they're like, well, we think it's okay because we didn't physically go into your house; we just digitally went into your space, or we saw you walking down the street, and we put that data together and sold it to someone else. It's treated like a victimless crime, so to speak. What do you think?


Erik Rind  23:56

I think it was like two years ago; it might have been the Postmaster General back then, or maybe it was a Congressperson. They said something to this effect: what if I told you that tomorrow the United States mail system was free? Everybody could send all the mail they want for free. The only issue is that we get to open up the mail and read it. You'd be aghast at the concept of opening somebody's mail; it's illegal. Okay, well, I just described Gmail. Something we should all be aghast about, billions of people allow to happen every day, and they don't feel the pain. But the thing is, we're starting to feel the pain. We didn't feel the pain for 15 years. It started with Cambridge Analytica, and we're starting to feel the pain. We're seeing that this type of data in too few hands can be used for very nefarious reasons, and it's also accumulating wealth into too few hands. It's just wrong. And that's where government comes in. I've had a chance to speak to a number of US Senators' offices, Senators Gillibrand, Booker, Warren, and even Josh Hawley on the Republican side; it's a bipartisan issue that both sides are looking at, and I applaud them. I get a little concerned if they try to take the step of deciding what the value of our data is; that's what the market should decide. What the government should make sure of is that data flows freely and that I, the individual, have control over it. Let me give you an example. I'm a small app developer, right? I'm a small company, and part of my app is the ability to help people find a job. I'd love to be able to grab your resume off of LinkedIn; you already have all your data on LinkedIn, and it's your data, your resume, your career. But if I try to grab that information programmatically, LinkedIn's robots stop me cold. What's even worse is that many times, even with consent, I can't get the data; even when the user logs in, they stop me from getting it. That portability of data is where the government definitely needs to step in and say, these big corporations collected all this data for free, and now they're building moats around it, and they won't share. And that's your data they're holding. So even though you might have seen value from app X over here, they don't make it easy for you to get your data from A over to X. And we know how people hate hard; they're not going to use X's app because it's too hard. That's anti-competitive.


Debbie Reynolds  26:53

Right. And then the flip side of that is we have companies like Clearview AI that have the money, the time, and the resources to do data scraping. There are a lot of data scraping cases going on right now. But then they're using that scraped data to create databases that are going to be used to search for criminal activity. So you don't have to be a criminal to end up in one of those databases.


Erik Rind  27:20

Yeah. Yeah, that's Minority Report. Totally. You're guilty of a crime before it happens because you thought about it. That's scary stuff. When it crosses over from just commerce into the judicial system and truly personal rights, it gets very, very scary.


Debbie Reynolds  27:43

Yeah, I want to jump in on this issue. It's in the news; it's very timely. And that was the IRS: they wanted to require facial recognition for anyone with an IRS.gov account for certain things they would do on those accounts. I want your thoughts about this. Here's what happened, and as I've been reading up on it, I didn't realize the depth of it. Apparently, there's this company, I think it's called ID.me, that was doing these facial recognition data captures and faceprint databases with the IRS. At first, they were doing it on a smaller basis, for certain people. Then I think they announced at some point that they wanted to do it for everybody, and that's when the pitchforks came out, and the IRS decided to stop doing it. This company had done the same thing for other states, so they're being shut down in those states too. But that didn't happen until over 70 million people had gone through this process, right? And a lot of these people are low-income people, people who were getting those $600 tax credits or whatever. Again, back to my analogy, Warren Buffett probably didn't need to go through facial recognition for his account. So what exactly are we doing with this data? Now they're having a thing where a person has to request that their data be deleted from these systems, and my view is: just delete the data. People shouldn't have to ask.


Erik Rind  29:24

Yeah, facial recognition in general, I'm against it, except maybe in casinos, right? They were the first ones to use it, and that's a fairly small little ecosystem; if they want to keep the card counters out, God bless them. But the government should never be using it. I'm not sure the technology is there yet. I think there's enough evidence to say there's certainly bias in it, and I'm not sure you can remove the bias from it. So now you get down to, like I talked about before with the value of consent, the social cost. Is the gain from using this worth what could go wrong? And the answer is clearly no, it's not. What can go wrong is far, far worse than the gain. Stop it. Just never do it. Even if the technology were there, I don't care; you don't need it. You simply don't need it. And maybe that's because I'm old. You don't need this stuff.


Debbie Reynolds  30:27

Yeah, I don't think it's right. I think you can achieve your goals without it. The problem I have with facial recognition, and the reason I'm out trying to be involved with the companies that are doing this, is that it's not going to stop, and the harm can be catastrophic.


Erik Rind  30:43

That's the social cost I'm talking about, right? The value you're getting from using it isn't worth the catastrophic harm it could cause.


Debbie Reynolds  30:54

Right? Exactly. Exactly. First of all, I'm not a fan of using imperfect technology in high-stakes situations. You shouldn't be arresting people because you think they look like an image on a camera, you know what I mean? It'd be different with corroborating information, if it were one part of your investigation, but it shouldn't be the whole of it.


Erik Rind  31:18

It'll never be so good that it's foolproof. It's almost like DNA data, right? Now, I will argue that DNA data being used to prove the innocence of people who spent too much of their lives in prison is a wonderful use of technology, but there the science is pretty much settled. I don't think we're anywhere near the confidence factor with facial recognition that we have with DNA data, yet we're trying to use it similarly.


Debbie Reynolds  31:50

That's right.


Erik Rind  31:51

No, stop. Right, stop.


Debbie Reynolds  31:56

It's almost like a fallacy where people think that computers are perfect. We're not perfect as humans, and we build imperfect things, but then we pretend they're perfect. They're not, right?


Erik Rind  32:07

When I was first coming to grips with how AI and robotics work, right, it's a complex thing, but it becomes simple when you think of it as a child, as a baby's brain. How does a baby learn the difference between a cat and a dog? Well, if every time you pointed at a cat you told your child it was a dog, guess what they would call a cat? They'd call it a dog. That's bias built into it. It's people teaching the AI these things; thankfully, they don't teach themselves yet, that gets into The Matrix. But as long as there are human beings teaching them, there's going to be bias. I don't know of a human who doesn't have some bias. And it's one thing if you're trying to keep somebody out of a casino; it's another thing if you're using it in the real world, in taxes, in the judicial system. The social cost is way too high. Keeping somebody out of a casino, the social cost is very low. Putting somebody behind bars on a missed facial recognition match, the social cost is way too high. And they're all trained the same way; there are people looking at images, teaching these things: dog, cat, dog.
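
The dog/cat point in miniature, with made-up toy data: a model can only echo the labels its teachers provide, so a biased teacher yields a biased model.

```python
# A toy version of the cat/dog example: the "teacher" labels every cat
# as a dog, and the model faithfully reproduces that bias.
from collections import Counter

# Biased training data: cats are consistently mislabeled as dogs.
training = [
    (frozenset({"whiskers", "meows"}), "dog"),   # actually a cat
    (frozenset({"whiskers", "meows"}), "dog"),   # actually a cat
    (frozenset({"barks", "fetches"}), "dog"),
]

def predict(features: frozenset) -> str:
    """Majority vote among training examples with the same features."""
    votes = Counter(label for f, label in training if f == features)
    return votes.most_common(1)[0][0] if votes else "unknown"

print(predict(frozenset({"whiskers", "meows"})))  # -> "dog": bias in, bias out
```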


Debbie Reynolds  33:28

Exactly, exactly. So tell me a little bit about how you approach data monetization in your company.


Erik Rind  33:40

Yeah. So for ImagineBC, we believe we're a little piece of the monetization world. We're the piece that says, if you're going to target me for an ad, and I'm going to watch that ad, I should receive value for watching it. And I harken back to my dad. If you know the show Mad Men, that show on AMC, my dad was a mad man: he was an advertising executive on Madison Avenue in the 70s. He actually bought and sold ad placements, figuring out where to put commercials on television and on billboards and radio. And the best data he had back then was a thing called the Nielsen box; there were something like 1,400 Nielsen boxes for 120 million people in the country. Well, we're all Nielsen boxes now. We're all walking, talking Nielsen boxes; our data is constantly being scraped. And just like the Nielsen people were compensated, we should be compensated. So that's what ImagineBC is. You choose how much data you want to share about yourself. We take that data and look for a market for it, people who would like to advertise to you. When we have a match, we'll put that ad in front of you, and you'll choose whether you want to watch it or not; you'll know how much you'll make for watching it. And if you do choose to watch, you'll be compensated. Then what we hope is you'll take the money you've earned from watching that ad and do one of three things with it in our app: you can use it to buy content from us; you can donate it to a charity, for the Warren Buffetts of the world who don't need the money, hey, watch the ad and donate to Toys for Tots, there's a child who wants a toy; or you can put it into your bank account and go buy groceries or pay an electric bill with it. So we say, take the value of your data. Whether you're watching an ad, filling out a survey, or responding to an offer, those come through us; we find a market for you, and if you choose to participate, you get compensated. So we're a market maker.
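
A minimal sketch of that match-and-compensate loop. The data, names, and amounts are invented for illustration; it only shows the shape Erik describes (opt-in attributes, matching against just those attributes, an up-front payout, and three uses for the earnings), not ImagineBC's actual system:

```python
# Illustrative sketch only: opt-in data, ad matching, user-visible payout.
shared_profile = {"age_range": "25-34", "interests": {"cooking", "travel"}}

ad_campaign = {
    "advertiser": "ExampleCo",          # hypothetical advertiser
    "wants": ("interests", "cooking"),  # target criterion
    "payout": 0.25,                     # dollars paid to the viewer per view
}

def matches(profile: dict, campaign: dict) -> bool:
    """Match only against attributes the user chose to share."""
    key, value = campaign["wants"]
    target = profile.get(key)
    return value == target or (isinstance(target, set) and value in target)

balance = 0.0
if matches(shared_profile, ad_campaign):
    user_accepts = True                 # user sees the payout and opts in
    if user_accepts:
        balance += ad_campaign["payout"]

# Per Erik, the balance can then buy content, go to charity, or be cashed out.
spend = {"buy_content": 0.0, "donate": 0.0, "withdraw": balance}
print(f"earned ${balance:.2f}; withdrawing ${spend['withdraw']:.2f}")
```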


Debbie Reynolds  35:54

I love that. I think it's so cool that your dad did that, and then you came up with this idea. It's brilliant, actually; it's really cool.


Erik Rind  36:00

It's kind of strange. Growing up, I said to my father I would never go into advertising. So I spent 28 years in the HR world, and now here I am, back in the advertising world. I don't actually create the ads. But I truly believe: why should Google and Facebook make the money from these ads? When it's our data, we should get fair value from it; we should get part of that. And the other part, as I said, is that we hope you'll use the money to buy content. That's important too, because there's a big distinction in our app. If you remember the Netflix documentary The Social Dilemma, I had to watch it twice to realize what creeped me out about it. I knew something really creeped me out, and by the time I finished the second viewing, I realized what it was. Again, it's because I'm 60. When I was 20, there was no such thing as Facebook or Google, but I was already an adult, a 20-year-old adult. And if you compare my 60-year-old adult self to my 20-year-old adult self, I'm a very different person. As a 20-year-old, I was a right-leaning Republican; now I'm a left-leaning Democrat. I never ate anything but meat and potatoes as a 20-year-old; now I'll try just about anything. I've grown as a human, because I didn't have Google and Facebook feeding me the one thing they thought defined me. And that's what The Social Dilemma was saying: because we want to capture you to feed you ads, we're going to give you what we think you want to see, over and over again. They would never have let me grow past being a 20-year-old; they would have kept feeding me meat and potatoes and Republican garbage. Now, in our world, yes, we have advertising, and yes, we have content, but the two never meet. On the content side, we want you to explore and grow. On the advertising side, we'll target, because we want to give you something of value; then you make the decision to buy content or not. But that content is open, which allows you to grow as a human being. Big, big distinction. Very similar platforms, big differences from a social aspect.


Debbie Reynolds  38:09

Oh, totally, I agree with that. The problem with that is, like you say, 40 years later, Erik, the AI decides. Let's say you go to a restaurant, and they say, well, all we have is steak, because we have something in the database that says you like steak, so we're not going to serve you anything else.


Erik Rind  38:32

Right. Right, the menu will adjust; it will only have the most expensive steak dishes on it. Yeah, you're absolutely right. And I'll be walking down a food aisle...


Debbie Reynolds  38:48

You can't go down this aisle in the grocery store, because you don't like…


Erik Rind  38:53

The Google and Facebook models don't let you grow as a human being. That is a really big social issue, because we don't stop growing as people, but they are retarding us, in the original sense of slowing, no negative connotation to that word intended. They're stopping our growth as people because they're pigeonholing us. And I proved this to my sister. My sister is a crazy, total anti-Trumper. And I told her, you're going out of your mind because you have to understand: the right isn't seeing what you're seeing; they don't ever see what you're seeing. She goes, what are you talking about? I said, look, type in this query. We both got onto Google, she typed in the query, and I typed in the same query, and we got completely different results. I said, see, they decide what to show you, and those people you think are seeing what you're seeing? They're seeing what Google wants them to see. Scary stuff.


Debbie Reynolds  39:54

Oh, it is. The analogy I give people is that when you're on the Internet, people think it's like a library where you can walk in and go to whatever section you want. But really, you're in a section of the library with books that were chosen for you. It gives you the impression that you're seeing the whole library, but you're only seeing the surface.


Erik Rind  40:14

Yes, that's perfect. That's a great way to think of it. Exactly. And maybe it's because I was a history major in college, an odd background for a tech guy. But it was drummed into my head: you have to look at primary sources, right? You have to look at those little footnotes. I read those things. Even when I look something up on Wikipedia, I check all the footnotes before I believe what I read there. That's how I was trained in college. How many people do you think do that? They just take it for granted, without understanding that anybody could have written that thing on Wikipedia.


Debbie Reynolds  40:51

Absolutely. And then, like you said, we all assume that we see the same things. And that's not the truth.


Erik Rind  40:57

That's the scariest part. You and your counterpart on the other side don't understand that you're not seeing the same information. So you're yelling at one another, how stupid are you? How stupid are you? And you're going, what do you mean? I'm reading it right here. When really, you're both reading what's been spoon-fed to you. I love your library analogy, though; it's awesome.


Debbie Reynolds  41:23

So if it were the world according to Erik, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether it's regulation, technology, anything?


Erik Rind  41:33

In the long term, government has to be a positive partner in this. I believe the government has to get behind something like a distributed network, a blockchain-type technology. When I'm born, my Social Security number goes onto my blockchain, which my parents control until I become an official adult, at which point it's mine. I probably won't need a driver's license, because we probably won't have cars, but every piece of government-issued identification, my insurance cards, everything that says who I am, is mine. So now, if I have to prove my age, I just put my phone down, and a red or green light goes on based on that blockchain identification. Nobody has to see the information behind it. When somebody cards you, they're looking for your age, but they're also seeing your address, because it's on your driver's license. Why? They just need to make sure that you're really 21 years old, and you prove it through your blockchain. So that's my wish for government, and it's going to happen; it'll happen towards the end of my lifetime, 15 or 20 years out. The technology will be there to do it, and it's a necessity. We really need to be in control of our identities.
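
A toy sketch of that red-light/green-light check. The HMAC "signature", key handling, and field names here are stand-ins for illustration only; a real system would use an issuer's asymmetric signature or a zero-knowledge proof so the verifier learns nothing but the yes/no answer:

```python
import hmac, hashlib, json
from datetime import date

ISSUER_KEY = b"government-signing-key"   # toy stand-in for a real issuer keypair

def issue_credential(birthdate: str) -> dict:
    """Issuer signs the claim once; the holder keeps it in their wallet."""
    claim = {"birthdate": birthdate}
    sig = hmac.new(ISSUER_KEY, json.dumps(claim).encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def over_21(credential: dict, today: date) -> bool:
    """Holder-side check: only this boolean ever leaves the phone."""
    expected = hmac.new(ISSUER_KEY, json.dumps(credential["claim"]).encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, credential["sig"]):
        return False                      # tampered or forged credential
    y, m, d = map(int, credential["claim"]["birthdate"].split("-"))
    return (today.year - y, today.month - m, today.day - d) >= (21, 0, 0)

cred = issue_credential("1962-05-14")     # illustrative birthdate
print("green light" if over_21(cred, date.today()) else "red light")
```

The bartender sees green or red; the address, and even the birthdate itself, never leave the phone in this shape.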


Debbie Reynolds  42:59

Yeah, that's amazing. I love that idea. I've always advocated that people should be like a bank of their own data, and then they choose what to share. At that point, they have visibility into what they're sharing, and they can revoke their consent to share. So I think that's just the way it's going to have to be in order to achieve this type of agency.


Erik Rind  43:24

Right, and you only share what you need to share. So in the case where I want to buy a beer and I need to be 21, you only need to see my age. You don't need to see where I live.


Debbie Reynolds  43:34

Right. Exactly, exactly. That's so cool. Well, thank you so much for doing this. This is a great conversation. I'm sure people will really love it.


Erik Rind  43:43

I appreciate you having me on. It's always fun to talk about, and it's good to speak to somebody like you who lives this every day. You get to see all these different opinions.


Debbie Reynolds  43:54

Yes, fantastic. Thank you so much. I really appreciate you.