"The Data Diva" Talks Privacy Podcast

The Data Diva E72 - Robin Meyer and Debbie Reynolds

March 22, 2022 Debbie Reynolds Season 2 Episode 72
"The Data Diva" Talks Privacy Podcast
The Data Diva E72 - Robin Meyer and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds “The Data Diva” talks to Robin Meyer, General Counsel at TokenEx. We discuss her passion for privacy and why she emphasizes it for General Counsels, what tokenization is and how TokenEx leverages this technology, the advantage of tokenization for data classification, how tokenization can protect sensitive data, her current concerns about privacy and technology, concerns about biometric access technology and data redundancy, differences in Data Privacy between the US and EU, the wide variety of Data Privacy policies in US companies and laws that are mainly consumer-based, getting buy-in and cooperation on Data Privacy, how Data Privacy is becoming critical to profitability, concerns over adverse or invasive conclusions by AI systems, an examination of potential future AI and Metaverse harms, and her hope for Data Privacy in the future.




45:26

SUMMARY KEYWORDS

data, tokenized, privacy, people, tokenization, companies, token, technology, business, customers, public sector, collect, contracts, information, organizations, protect, payment, thinking, piece, works

SPEAKERS

Debbie Reynolds, Robin Michelle Meyer


Debbie Reynolds

This is a special program note. Since the time that we recorded this podcast, ID.me and the IRS have decided to stop their facial recognition program, in terms of rolling it out to anyone using certain features on the irs.gov website. We do have a lively discussion that is still very relevant as it relates to data retention, Data Privacy, data protection, and also contracts. So enjoy the show. 


Debbie Reynolds

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.


Debbie Reynolds 00:00

Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with the information that businesses need to know now. I have a special guest; we were just chatting a little before the show. Robin Meyer, she's the General Counsel at TokenEx. Welcome.


Robin Michelle Meyer  01:11

Thank you, Debbie. I'm really glad to be here. And thank you for having me.


Debbie Reynolds  01:16

Yeah, I'm happy to have you on the show. So you and I have been connected on LinkedIn for quite some time; we've chatted back and forth about different things. You were a really early fan of the show. Actually, I think you and I chatted at one point about Susan Brown, who was Episode Two, and now we're way, way up; it's been well over a year since we've been doing this. But I've always been fascinated with you and your career and the things you're doing. You put out interesting content, we have great chats, and you're very passionate about privacy. And I like to talk with attorneys who work at tech companies, because I feel like you have to really, really dig deep. Obviously, all General Counsels at tech companies have to understand the business, right? But there's just such a deeper aspect there, especially as it relates to privacy. And I think TokenEx is doing some really cool things. So why don't you start out by telling me about your privacy journey, and then we'll talk a bit about TokenEx and what you do there?


Robin Michelle Meyer  02:33

Sure, sure. So I started straight out of law school with a General Counsel job in-house for a very lucrative car dealership, you can just imagine, straight to consumers, but I walked into, you know, staff that cut and pasted contracts to make the terms whatever they wanted them to be. And then I had a really great season with my kids at home for a while, then went back to a boutique transactions firm that taught me incredible skills I've carried forward. But after that, I was hired as the first dedicated IT lawyer for our state. I was in the public sector, embedded in data center operations. So I was there when they hired the first CISO; I sat next to the CIO and was embedded in the everyday operations of everything it took to run the state's IT systems. That brought thinking about data front and center, and it has stayed with me ever since. One way to summarize it: I came across a Harvard Law Review article by Anita Allen some years back, and I actually have it written on my door, and I had it written on a whiteboard at my previous job. She said we cannot know if we're doing the right thing if we do not know what we're doing and whom we're doing it to. And I think that has really resonated. So, when I looked to transition from the public sector to the private sector, it was really important to me to go to a tech company, one that I could respect. I was going from the buyer side to the seller side, and I wanted to go to one that had a privacy emphasis. So I saw the TokenEx position, read through it, and thought, oh my gosh, that's me. And then within 24 hours, I had three people who didn't know each other all reach out to me and say, we don't know all those things you talk about, but I think this is you. So, here I am. Except one funny part of all that is that I didn't know what data tokenization was. And that's what TokenEx does. 
So, I had a tech background; I was a programmer even before law school. But there I was, not knowing what data tokenization was. Once I got here, I watched hours of our CEO on YouTube explaining it; he is gifted at explaining what data tokenization is to somebody who has not heard of it. And then I saw a post from someone in marketing here that talked about privacy by design and how important it was. I thought, okay, this is the place for me. So here I am, and I love it.


Debbie Reynolds  05:46

Cool. Well, let's explain to people what tokenization is, exactly. So, at a high level, the reason I like companies that deal in this area, tokenization, is that it takes the emphasis away from the traditional ways people thought about protecting data. In the past, protecting data was like, okay, the data's in a castle, and we're going to lock the castle door, build a moat, and the walls are going to be tall and thick, and no one's going to get in. But I feel like data tokenization says, wait a minute, the real asset value is in the data, right? And then, you know, assume the castle has been breached already. So what are you going to do to protect that data? And part of why I really like the ideas around tokenization is that some people, coming from traditional ways of protecting data, say, okay, we're going to protect this data in a way that actually makes it hard for people to use it. Tokenization is like, okay, let's banish the castle analogy, let's actually look at the data, let's find a way to protect it, and also do it in a way that the people who need access to it can use it, without making it easy for, you know, an unauthorized person to use this data. So tell me your thoughts.


Robin Michelle Meyer  07:32

Right. Well, a little explanation about TokenEx. TokenEx is an abbreviated name for token exchange, and we're a business-to-business solution. I often get asked, because of the word token, if we're in the crypto world. What's interesting about that is that the Bitcoin white paper and the birth of TokenEx are about three years apart. But we are not in the crypto world; that's not what we mean by tokens. So how data tokenization works is that, basically, you take a sensitive piece of data through an algorithm, and it transforms the data, leaving a piece of it available for the business to use. The other pieces are substituted with nonsensitive characters. One question I had when I came to TokenEx was, where does the data go, then? And I saw an internal presentation where they took somebody's name just as an example. So I could take Debbie, put it through the algorithm, and what comes out, it's like the movie Transformers: it transforms. There's something that stands in the stead of that data. It might have something showing that would be helpful for the business, but the rest of Debbie's name isn't there; the name Debbie isn't there. It's literally transformed. So when it's in the customer's environment, if they had a breach, the clear data's not there. That's the beauty of it. There are some other really, really cool things about it, too; what you talked about with the business utility is huge. I think you may be referring to how encryption worked previously: it's either encrypted or it's not encrypted. This is a blend. Like we said, it still allows the business to use the data. The other piece is that it's maintenance-free from the client's perspective; they don't need to manage or rotate encryption keys like they would have before. 
Take a payment token, which was the very first use of it; it's morphed since then and continues to morph. As a payment solution, the client has a token, they send that to TokenEx, which de-tokens the clear data when they need it; otherwise, they just have the token in their environment. Another cool thing about tokenizing data is that it can be tokenized for a customer across a lot of different channels or sources. When you think about a business, they are collecting data in more than one way. They may have a mobile app and be collecting it there. Perhaps there's a payment terminal where someone is actually in person with their credit card; let's say I'm at a hotel, and I have a credit card, and they say, okay, slip it in there. That's one way. But the same exact hotel may have me online, putting my card in. Now I'm not with them; some call that Card Not Present. So I'm making, for instance, a payment there; that's a different way to collect it. Or, like I said, it could have been on their phone app. Or I might call them; now we've got a call center taking in data. And there are other ways they might take in data. So one really cool thing is that with tokenization, you don't have to stay in just one way that the data is collected. You can go all the way across, so that it's tokenized at every inbound point, if that makes sense.
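As an aside for readers: the vaulted substitution Robin describes can be sketched at a very high level in Python. This is a toy illustration only; TokenEx's actual algorithm is proprietary, and the class and method names here are invented for the example:

```python
import secrets

class ToyVault:
    """Toy vaulted tokenizer: swaps a card number for a same-format token,
    keeping the last four digits usable by the business. Illustration only."""

    def __init__(self):
        self._vault = {}  # token -> original clear value, held only by the vault

    def tokenize(self, pan: str, keep_last: int = 4) -> str:
        while True:
            # Substitute the leading digits with random, nonsensitive digits.
            head = "".join(secrets.choice("0123456789")
                           for _ in range(len(pan) - keep_last))
            token = head + pan[-keep_last:]  # same length, last digits preserved
            if token != pan and token not in self._vault:
                break
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault holder can recover the clear data; a breached
        # client environment holds nothing but tokens.
        return self._vault[token]
```

The client stores and passes around only the token; when a payment needs the clear number, the token goes back to the vault service to be de-tokened, mirroring the flow described above.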


Debbie Reynolds  11:49

Oh, cool. Yeah, that's a huge challenge, right? Because a lot of organizations have all these different data silos. So having technology that can help them mitigate the risk of this data being duplicated, possibly in a lot of places, while actually tracking the data throughout its lifecycle in the organization, I think that's really important.


Robin Michelle Meyer  12:13

Yeah, and a shout-out to customers who do use tokenization. When you hear about a breach, and you hear them say, oh, but no Social Security numbers were taken, or no credit card information was taken, it's likely because it was tokenized. Conversely, when you hear them say payment information was taken, or Social Security numbers were taken, those were not tokenized. And I always think you need to connect those dots.


Debbie Reynolds  12:13

Oh, yeah, exactly. You need to do away with the castle analogy and actually protect the actual data, right? So you can hold on to that; that's really cool. Also, there's one big challenge that I think tokenization addresses, which is a huge pain point, and to me this goes back to data governance. First of all, all companies have data governance, whether they want it or not; it may not be very good, or not very mature, or maybe overly mature. But one thing that tokenization helps with, and maybe you can explain how your technology works around this, is that companies struggle to classify data, right? It's such a hard thing, and you literally can't do it manually. And before, classification was mostly: does this have a high business value or not? Now, especially in a privacy context, you have to go a bit deeper. What data do you have? Do you have data of individuals? Do you have sensitive data? So I feel like this technology is helping organizations do a type of classification of data that they couldn't do manually. What are your thoughts?


Robin Michelle Meyer  14:10

I don't know if I would call it a data classification technology. I think there are definitely solutions out there today that weren't available even a few years ago that go straight at that. I think the privacy part that's really important here comes in with data collection and data sharing. TokenEx steps in front of the data, so it can tokenize it before it ever comes into the customer. So in the customer's environment, yes, if there's a theft, that piece of it is not there. But the other part is that you can control who has access to it, and that's where it starts to get varied in how it's actually used. Again, with encryption versus non-encryption, it's in or it's out, right? You can see it or you can't. With tokenization, you can also make it so that, for instance, one department might see one piece, I might see a different piece, and someone with a need might see the whole of it; it can be customized. It can also be tokenized going out: when data is shared from one of our customers, it can be tokenized and shared in a way that preserves privacy. There's actually something coming on the roadmap that I really want to talk about that your questions are making me think of; I don't know if we want to jump over and talk about it right now, but it's super exciting on this point. There are a few things coming on the roadmap at TokenEx for 2022, but the one your question was making me think of is a new tokenization technology that is going to have an incredible amount of flexibility in the way data is protected. Today, you have a piece of data, say a Social Security number, and you have a token; you have maybe a credit card number, and you have a token; and our customers are linking those tokens together. And then there's some other information that isn't tokenized that might still be personal. 
The new technology, though, is going to be able to tokenize what I would refer to more as a whole record. So you could put those items in the same record along with some other information and tokenize the whole record, and then allow different pieces of the business to come into that record and see different parts of it, but not more than they need. We all know businesses shouldn't over-collect; but internally, they shouldn't over-see, either. So it's really going to be very flexible with that. I think that is going to have huge implications for industries like insurance or health care, where you've got a lot of personal data in different fields sitting right next to each other. Instead of having to have a single token for each one of those fields, you'll have a record, and it'll have a lot of personal information in it, more personal information than you could perhaps get tokenized with the technology we've had for some years now. So that's going to be very exciting. And there's also something called blob tokenization; that's an industry term, and if I get any technical question about it, I have to use a lifeline. That will be able to tokenize scans of, say, a driver's license or a birth certificate. So if you have a PDF with personal information, it will be able to tokenize that, which is not a current use case. So there are a lot of pieces coming that are really expanding this technology to be even more privacy-forward than it is today, with, I'd say, larger privacy implications than it has today, in a way that gets away from just being encrypted or not encrypted.
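For readers, here is a rough sketch of what record-level tokenization with least-privilege views might look like. The roles, field names, and API below are hypothetical illustrations, not TokenEx's actual product:

```python
import secrets

class RecordVault:
    """Toy sketch: one token stands in for a whole record, and each
    business role sees only the fields it needs. Hypothetical API."""

    # Which fields each made-up role is allowed to see.
    ROLE_FIELDS = {
        "billing": {"card_last4", "zip"},
        "clinical": {"diagnosis", "dob"},
        "audit": {"card_last4", "zip", "diagnosis", "dob", "ssn"},
    }

    def __init__(self):
        self._vault = {}  # token -> full clear record

    def tokenize_record(self, record: dict) -> str:
        token = secrets.token_hex(8)  # a single token for the whole record
        self._vault[token] = dict(record)
        return token

    def view(self, token: str, role: str) -> dict:
        # Businesses shouldn't over-collect; internally, they shouldn't
        # over-see. Return only the fields this role is permitted to view.
        allowed = self.ROLE_FIELDS[role]
        return {k: v for k, v in self._vault[token].items() if k in allowed}
```

A billing user asking for the record would get back only the card tail and ZIP code, while the sensitive fields stay inside the vault.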


Debbie Reynolds  18:49

So basically, what we're talking about at a high level is cryptography, right? Encryption is a type of cryptography. You're doing something where you're protecting data, and you obviously have your own proprietary methods for the way you protect it. I think sometimes people use the word encryption to talk about all types of ways to protect data, when it's just one form of cryptography.


Robin Michelle Meyer  19:22

Right. Internally, we don't use the term cryptography; we talk about our algorithm. So technically, perhaps; I'm not a super techy person on that front. But we do have a proprietary algorithm that the data goes through, yes. And another cool thing about TokenEx, another thing that made me want to come here, frankly, is that when I started doing research, I found out they had a patent that allows tokenization in the payment space with any processor. One thing that happens a lot in the payment world is that you get tied in with a processor, and then you can't use any other, and it limits business operations. But we're not a, quote, payment processor, so we can be used with any of them, which is nice.


Debbie Reynolds  20:33

I guess I'm just going to throw out a use case. Let's say a customer came to you and said, okay, I have all this data, and I want to be able to protect anything that we define as sensitive data. How would TokenEx play in that space, if someone asked that question?


Robin Michelle Meyer  20:55

Sure. Today, it would need to be structured or semi-structured data. That future technology I was talking about earlier will be different; that's why it's going to be such a leap forward. At that point, it won't have to be structured or semi-structured. But customers typically know what they're trying to protect when they come to TokenEx; it could be a legal requirement, and tokenizing data addresses a lot of audit requirements in pretty much every lane. So a lot of our customers come to us knowing what they have that they consider to be vulnerable. Now, if it were unstructured data today, just words on and on and on, that would be a little bit different. But a lot of those fields in a customer's database have a certain format to them, and anything that's got a format works, because it's a format-preserving method. I don't know if that really goes to your question, but hopefully it does.
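To illustrate what "format-preserving" means for a structured field, here is a toy substitution (again, an invented example, not the real proprietary algorithm): digits map to random digits, letters to random letters, and separators stay put, so downstream systems that validate the field's shape keep working.

```python
import secrets
import string

def format_preserving_token(value: str, keep_last: int = 4) -> str:
    """Toy format-preserving substitution: keeps the field's shape
    (length, separators, digit/letter positions) and the trailing
    characters a business might need. Illustration only."""
    out = []
    for i, ch in enumerate(value):
        if i >= len(value) - keep_last:
            out.append(ch)  # preserve the tail for business use
        elif ch.isdigit():
            out.append(secrets.choice(string.digits))
        elif ch.isalpha():
            out.append(secrets.choice(string.ascii_uppercase))
        else:
            out.append(ch)  # dashes, spaces, etc. stay in place
    return "".join(out)
```

An SSN-shaped value like `123-45-6789` tokenizes to another SSN-shaped string that still ends in `6789`, with the dashes in the same positions.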


Debbie Reynolds  22:11

Absolutely, yes. Very cool. So tell me, what's happening in privacy and technology right now that's very much concerning you?


Robin Michelle Meyer  22:23

Oh, I would say the thing that's got my goat this week is the IRS's use of facial recognition.


Debbie Reynolds  22:36

I agree.


Robin Michelle Meyer  22:39

I am done. There are so many things about that that concern me; it's hard to know where to start. One thing is that it's going to close out a lot of citizens. We act like everybody can do everything technologically, and that's just not true. Everyone doesn't have the same access; they don't have the same ability. You've probably got every 13-year-old in the United States who could manage it, but they're not the ones who are supposed to be on the IRS website. And it's very concerning because I haven't seen the contract. I would love to; maybe it would give me some assurance. The government is often exempt from privacy laws. So when you see a vendor pop up and say, oh, we comply with CCPA or this or that or whatever, it's meaningful in one way, but it's not particularly meaningful in a government contract, because it's apples and oranges. One thing that comes up in public sector contracts, and maybe it's different in this one, because again, I haven't seen it: when I was in public sector contracting, I used to look at a contract and think, from the vendor's perspective, whatever is not prohibited is allowed. So what are they allowed to do that has not been explicitly prohibited with respect to the data? And frankly, I'm just really concerned that there will be barriers to people being able to access their own tax information, and that's deeply disturbing to me.


Debbie Reynolds  24:36

Yeah, I share your concerns. This definitely got my attention for many, many reasons. One, obviously, is the access issue. Not everyone is going to want to have their face scanned, and not everyone has the capability. It almost assumes everyone has a smartphone and the capability to scan, and that they can do this at home. And it does raise access concerns, because if you're a person who works and you're a taxpayer, that system, especially for government use, should be accessible to everyone, not just to some people. Then one thing that concerns me is that a lot of the data they want to collect is already collected by the Department of Motor Vehicles. In the US, we have this Real ID. With Real ID, they're actually recollecting the same information that the Department of Motor Vehicles collects, but Real ID is basically a national database of that stuff. So if you look at what they want to collect, that stuff is literally already collected through this Real ID system. So one question is, why isn't the IRS talking to the Department of Motor Vehicles to get this information from Real ID, because I think that's tied in with Homeland Security or something like that. And then the other thing is, what is being done with this data by this third-party vendor? I read some bits and pieces of a contract, not all of it. They're allowed to keep the data for seven years, for some reason; I have no idea why. To me, once you authenticate the person, that's it; you really don't need the information beyond that. And then, like I said, there's obviously an issue with access: who's going to do this? So, unfortunately, I think what's going to happen is that a lot of people are going to fall back into a paper process; they're not going to want to deal with this at all. So they're going to say, well, let's go backwards. 
Let's, you know, do a paper process instead; I don't want to deal with this. And I wonder if the IRS has the staff to deal with the onslaught of people who aren't going to go this route? What are your thoughts?


Robin Michelle Meyer  27:10

Right. Well, one thought about your observation that the data has already been collected: it brings up a really foundational issue for the public sector, and that is that they really don't share data. They're very territorial with their data as a general rule. You talked about castles; in the government, there are a lot of different castles, and they do have thick walls, and they don't have any breezeways between them. So one huge issue is working out the data-sharing agreement between two government agencies; frankly, I don't even know if you could accomplish it to everyone's satisfaction. The other thing you just said, about going back to paper: I saw something earlier that said you will not be required to use the technology to file your taxes. I don't know if that's the case or not. But you may see people who just don't even cooperate, because they don't know what to do; they just throw their hands up. And the seven-year rule is very interesting. There's a common clause in public sector contracts: the vendor will keep all records related to their performance for seven years. Where that seven years comes from, I'm not sure. We had it in the contracts that I worked on, and I tried to trace back where it came from; I never did exactly find it. It was five years plus two, so it added up to seven. And if somebody isn't really thinking it through, "records related to your performance" could be interpreted to mean all the data, unless it's limited somewhere else.


Debbie Reynolds  29:18

Right. Yeah, this is a sticky one. I think this is an example for people who don't understand the difference between, for example, Europe and the US regarding Data Privacy rights. Because we don't have privacy as a fundamental human right, there's a gap there, right? And obviously, even governments in the UK exempt themselves from certain things, but we don't have a right not to share. We can't just say, I don't want to share. There's a very big gap.


Robin Michelle Meyer  30:01

I remember, some years back, I think it was Google, maybe, that had a data processing agreement; it was back around 2014 or so. And there were eight countries, I think, in the European Union that didn't want to allow Google in because of their Data Privacy practices. I went and looked at that data processing amendment that they entered into with those countries, and I read it, and I thought: so all these things they promise not to do there, they are doing in the United States, because our legislation comes from that consumer perspective. When you think about that, and this is just my opinion, but imagine somebody came to it fresh, with no background, never having heard of any particular framework, and you told them: information about me is mine, but only if I buy something from a company that's big enough to sell to lots of people in my state. In that situation, I get to have some say about what happens to my information; but if that's not the situation for a particular sector, then I don't. That wouldn't sound logical to me.


Debbie Reynolds  31:29

No.


Robin Michelle Meyer  31:29

But that is where we are,


Debbie Reynolds  31:31

It totally is. Well, I think people aren't really up in arms about it that much because they assume they have more rights than they actually have. They think, okay, we're a free country, we're the land of the free, home of the brave, and then privacy is nowhere to be found in that. So these organizations take full advantage of the fact that it hasn't been stated that, hey, you need to respect the rights of individuals. And then we have companies like Apple that say, okay, we're going to treat your data as if privacy were a fundamental human right. That's great, right? But not everyone uses Apple. If you can't afford to use Apple products, you're using the data in other places that don't have that. So it's definitely a patchwork. And also, because our laws are very consumer-based, as you said, a lot of these rights don't kick in unless you're consuming. If you're not consuming, you don't really have any rights, other than that,


Robin Michelle Meyer  32:36

And consuming with a large company that has certain thresholds or something.


Debbie Reynolds  32:41

Right. So, in terms of privacy as a profession, you know, I love your kind of executive advice, right? How do you get buy-in within an organization around privacy? You're a bit lucky, where you're at a company whose ethos is partly about preserving or protecting data, right? So privacy is kind of part of that. But also, being a person with a legal background, working especially with developer groups and different things, how do you get buy-in within the organization for privacy initiatives?


Robin Michelle Meyer  33:35

Well, of course, like you said, at TokenEx it's front and center; it is part and parcel, every day, of the very core of what we do. So we don't have a problem internally with that buy-in, for the reason you just said. But at a different place, I think it would be really important, and this has changed, I think, in the last couple of years, to be increasingly the case: the risk of reputational harm is really coming to the forefront. Years ago, the analysis might have been, does this apply to us? No? Then we're not doing it. Today, even if something doesn't apply to you completely, or only partially applies, is that the messaging you want to send out? That we don't have to, you know, we don't have to take certain precautions, or we don't have to protect your information in certain ways? You know how that sounds. I can't imagine who would adopt that as their messaging.


Debbie Reynolds  34:54

Yeah, well, we're definitely seeing you guys take advantage of this, which I'm happy about: privacy is now becoming a bottom-line revenue issue, right? Because if companies can't gain the trust of customers, those customers are going to go somewhere else. So it has a direct bottom-line impact, where before people were thinking, well, obviously, we'll adhere to this regulation, which is fine, right. But then, like you said, what type of story do you want told about your company? What is your value system? And now we see people can vote with their feet, right? They can decide, okay, we don't want to do this. And we're also seeing that companies are holding each other more accountable, right? Like with the third-party data transfer things. What are your thoughts about that?


Robin Michelle Meyer  36:05

Right, I think GDPR has really helped us. It has brought us to the party without Federal legislation, or even without state-specific legislation. Today, companies want to be international companies; they want to have a global presence, and that's part of it. And not only GDPR; other areas of the world are following suit in various ways as well. So definitely, there is a regulatory aspect to it. But to your point, the story and the branding of companies is really important. It's become a lot of your calling card, right? What do you do? Do you do what you say you'll do? Do you stand behind it? Do you avoid doing what you say you won't? And you're right, too, that companies are holding each other accountable. I also think that companies, more and more, in business to business, are open to alternatives: if there's an alternative solution that would be more protective, companies want to look at it. So if you're a solution provider, and that's core to what you do, you can unseat somebody who might have been in place but who hasn't really been paying attention or staying on their game in this area.


Debbie Reynolds  37:50

So true. There's definitely an opportunity in the marketplace right now for those who are looking at it closely enough; it definitely can become an issue. So, if it were the world according to Robin, and we did everything you said, what would be your wish for privacy anywhere, whether it's technology, law, or human stuff? What are your thoughts?


Robin Michelle Meyer  38:13

Hmm, I mean, do I have the ability to have some magic involved? Totally. Okay, well, then, there's magic involved. I want the cat to go back into the bag. And let's be a little more thoughtful and intentional on the front side, and not just, you know, screaming out of the gate and kind of messing it up to begin with. I think part of what we're trying to do here is to mitigate things that have happened and stop them from happening in the future. If we had just been a little more intentional on the front side. But here in the United States, particularly, we're instant gratification people, and you give us the opportunity to have a carrot, and we run after it, and we throw caution to the wind: here's my information. That's what I would do.


Debbie Reynolds  39:09

Yeah, it's true. Right. So the problem that we have now is, in some organizations, they're like, okay, you want this thing right now. So you get the instant gratification of it, but then the harm is way down the line, right? You don't see the harm immediately. It just happens over time. So I think being more thoughtful, like you say. I tell people, if you're going to share your data, share it for a good purpose. Don't give away your fingerprint for a $10 coupon, you know, so they think about it.


Robin Michelle Meyer  39:46

Right, I think another thing I often think about in this area, something I get concerned about, is the invasion of our mental privacy. There's information out there about us. There's my name, address, and a zillion other things about me. But, you know, I don't say everything that I think. And there's a reason for that. And sometimes, I might even say too much. But I really don't like the idea of algorithms and such being applied to my thinking. And that sounds kind of woo-woo, but no. I've had some talks with people who know much more about that than I do, and as they describe it, and you start to see some of those documentaries that get right to the edge of that, I just think that kind of seems to be where we might be. And that's really uncomfortable to me; there's a reason things stay in my mind and not someone else's mind.


Debbie Reynolds  40:57

Absolutely, yeah, I'm very concerned with things like emotional AI, where it's trying to read your facial expression and possibly take action against you based on that. It's not really science-based, but, I mean, people want to believe that you can do this; maybe they watched some movies. But what concerns me now is kind of the next step from that, and this is kind of a Metaverse type thing, where if you're wearing a device or something, let's say a camera can read your facial expression, right? And it's saying, okay, Robin is happy, Robin is sad, she's upset, or whatever. But once you start wearing these devices, they can match your expression with the physiological things that are happening, right. So, say, if you already have a headset on and it shows you something that makes you upset, they're matching that with maybe your heartbeat racing or something. To me, that's a different category of stuff. And even that, to me, can be problematic because you're still trying to make an inference about someone, or take some type of action, as a result of this data that you put together, and it may not be right.


Robin Michelle Meyer  42:17

Exactly. It may not be right. Exactly. And listening to what you're just saying, Debbie, I think about what I just said: I wish we could go back and put the cat back in the bag and be more intentional. Well, is this a different bag with a different cat? And are we being intentional here? I don't know the answer. Everybody wants in on the Metaverse, and everybody wants to partake, and they can't get there quick enough. And it's all the buzz, and I think that's great. But are we thinking through that? I don't think that we are.


Debbie Reynolds  42:56

Yeah, it's concerning. It's concerning. I tried to decide where I wanted to put my focus. So my focus really is on kind of that development cycle, that early stage when people are talking about the Metaverse, talking about IoT, talking about, you know, tokenization. Because I feel like that's where I can have the most impact, because I feel like there isn't going to be an adequate redress for the harm that will happen as a result of some of these things, like AI and emotional AI. You know, in Illinois, they have a video AI law, right, saying that you have to tell someone you're using AI algorithms for video interviews, so they have rules around that. But the problem with that is, let's say you're doing an interview, and the algorithm says, you know, Robin is angry, so we're not going to give her this job. You're like, well, why didn't you give me the job? And they wouldn't tell you; they're like, oh, well, you know, whatever. But what if that information or that inference follows you throughout your life? That's problematic.


Robin Michelle Meyer  44:05

Right. And then, to your point, who even knows whether the algorithm was well developed to begin with. So as it's reading my smile and thinking I'm something I'm not: it doesn't think I'm smiling and agreeing; it thinks I'm something beyond that or less than that. I don't know.


Debbie Reynolds  44:28

It's a challenge, it's a challenge. We'll definitely have more to talk about for sure. Well, thank you so much for being on the show. This is great. I know the audience will really love this episode, and I'm sure we'll chat.


Robin Michelle Meyer  44:42

Debbie, it's been great to talk to you. Thank you. You're welcome.