"The Data Diva" Talks Privacy Podcast

The Data Diva E180 - Jesse Tayler and Debbie Reynolds

Season 4 Episode 180

Debbie Reynolds “The Data Diva” talks to Jesse Tayler, Team Builder, Startup Co-Founder, App Store Inventor, and Founder of TruAnon. We discuss Data Privacy, Digital Identity, and Jesse Tayler's background, as well as the shift from physical to digital products, highlighting the friction-free nature of the digital world and its implications for the economy. Jesse shares stories about creating the app store software model and meeting with Steve Jobs. The conversation then shifts to the intricacies of digital identity and data security, drawing attention to the challenges posed by the transition to a purely digital environment and its impact on privacy and security. Jesse Tayler emphasizes the need for a service that banks and social networks want to use to ensure online safety. He discusses the importance of individuals owning and controlling their identity while also providing value to the services they use. Debbie Reynolds expresses concerns about the sustainability of current verification processes and advocates for a system where individuals have more control over sharing their data rather than having to provide their identity information to multiple accounts.

Jesse Tayler passionately advocates for a purely digital solution to address the challenges of digital identity and privacy violations. He emphasizes the need for a future where digital identity is seamlessly integrated into everyday life, ensuring safety and trust for individuals and governance. Tayler expresses concerns about the lack of trust in the digital realm and the challenges of transmogrifying between the physical and digital worlds. Overall, the conversation highlights the need for a digital solution prioritizing trust, ownership, and transparency in identity verification, and Jesse shares his hope for Data Privacy in the future.

49:51

SUMMARY KEYWORDS

people, digital, data, digital identity, app store, identity, curates, talk, accounts, world, happening, control, dmv, software, physical, universe, startup, fraudsters, privacy, private

SPEAKERS

Debbie Reynolds, Jesse Tayler


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show today, I'm really not kidding. This is Jesse Tayler. He is a team builder, startup co-founder, App Store inventor, and founder of TruAnon.


Jesse Tayler  00:43

Well, thank you for having me, Debbie.


Debbie Reynolds  00:46

Well, it's almost unfair that I have you on the show and introduce you almost as if we don't know each other. We met on LinkedIn; I think I made a comment on something, you reached out to me, and you and I have had several hours of conversation and back and forth about stuff. You're just like a bolt of lightning, a breath of fresh air, and you and I connected a lot because we're kind of geeky technical people. So it was really fun to chat with you. But you have an extraordinary career, and I love the way that you think; you and I connected on a lot of technical stuff, and we like to geek out about that. But I want you to talk a little bit about your trajectory in tech. You are, I'm going to reiterate this, the App Store inventor, and I want you to make sure you weave in your Steve Jobs story and then talk about how you came to be the founder of TruAnon.


Jesse Tayler  01:47

Sure. Of course, that interesting moniker, App Store inventor, has become the question that everyone wants to ask and talk about, which is fine. I had a number of Steve Jobs-related startups, and I felt that story was not perhaps the most interesting. But since the advent of the iPhone, there has honestly been a lot of maneuvering, using the App Store as a business mechanism to dominate that marketplace. Those are not things that we were inventing. So, in a certain way, a lot of the things, whether you like them about the App Store or hate them about the App Store, weren't necessarily our creation. But you're right, Debbie, you and I were able to connect on that tech level; I prefer to tell people that I don't know that I'm a rock star, but other geeks, we recognize each other. There's something interesting about entrepreneurship, success, Steve Jobs, and the App Store, without going into too much detail because, of course, we're here to talk about trust and privacy. But Steve Jobs was a very interesting character in the tech industry; he was able to understand communication from engineers in the way that we spoke. There are very few people who could sit through a demo of something like electronic distribution and really, in that one session, lasting I don't know how many minutes, be able to basically grasp it. As I recall, Steve had only three words of response to the whole demo, which were, I like it, and he just walked away. That was enough to tell us that there was interest, and sure enough, come Monday of the following week, the phone rang, and it was Steve. Of course, he wanted to talk to Peggy, who was my business partner and the real sort of CEO of the startup; I was the tech guy. But what people don't always realize about that story is that it's really not a success story. It's, in fact, a failure story. After all, what was I doing pitching Steve Jobs on electronic distribution from my Seattle startup? This was the end of the runway; we had no more funding for what was a rather expensive thing to create that had taken some years, and sales were not where they needed to be. So we had shown the app wrapper to IBM, to Microsoft, to the big tech players of that era in 1990, and there was no interest. In fact, very few people understood it at all. We had just proven the value of putting software in a box. Microsoft was taking over the world by showing that software could be sold. You see, software was essentially free before then because there wasn't really a way to buy it. People didn't want to buy nothing, and so when you show them electronic distribution, they say, where's the box, where's the book? People don't want to come home with nothing. This is just not what people want. Now, today, you look back, and you think, how could we have expected people to drive to a store called Egghead and pay $80 to take home something they had to install themselves? It doesn't make sense from this direction. But at that time, Steve Jobs was the only person we could show it to who could understand it. So frankly, that was a Hail Mary pass, the last-ditch effort; it was already over. But this was something that you could do to try to get the technology into the hands of someone else who might be able to do something with it. I don't seem to remember anyone offering me a job. I think that I was moving on to other things by then, anyway. But there was no camera crew; there was no applause. This was a fire sale, and I don't recall anything of the nature of the deal. But I have some Apple stock today to show for it.
But it was a failure story. So the irony is that my success story of today, some 20, 30 years later, is really a failure story from that other era, and I think the thing that's interesting about that is just the nature of success and failure because it is a success. Right? It is a success. What does an inventor want but to change the world? It's just that, as a startup, it did not have that famous trajectory that we want to see in the world of unicorns and fun stories. So I think it's a fun story because it's exactly not what you might expect.


Debbie Reynolds  07:13

Yeah, but it's important, right? I think a lot of truly evolutionary things, people don't recognize their importance until later. It's almost like being a Van Gogh, right, where that value doesn't come until after time. So I think it was a pivotal moment because I remember having to drive to CompUSA, or whatever, buy a box of floppy disks, and throw it into a computer, and now we just don't do software that way anymore. It may have seemed insignificant at the time, but it's quite significant because that's the way software is now. It would almost be hard for you to go into a store now; there are very few stores where you buy something that you have to install yourself on your computer. First of all, thank you for thinking that way. I appreciate it; I love your mind, the way that you think about these things, and I love that you shared that story with us. Let's move on. TruAnon, this is another big thing. Let's say your app store story was the start of the way that we download or use software. Now, to me, the next phase that is going to change or revolutionize the way that we work with data will be centered, ideally, around a person; the way things are now, you log on to your computer, you go to Google or Microsoft, you're part of their ecosystem, and they're managing your permissions and different things like that. But in the future, I think we'll be in a place where there's more focus on the individual, and that control happens at the individual level, as opposed to ceding that control to a corporation and having an individual change the way that they work for every service they use. But I'm getting a little bit ahead of myself. So tell me a little bit about why you've done TruAnon, and what you're doing here. It's fascinating.


Jesse Tayler  09:25

I think you've touched on a couple of different points that are important. My generation, our generation, what have we seen since, say, 1980, with the compact disc and digital TV? What were we watching in the 70s? We had electronics, it was entirely analog, and now virtually everything is digital. The App Store is a really good example because what did we do? We took something that was already digital, software, and made it physical. We put it on a floppy to make it into a product that you could buy. That's because in the previous world, we went to stores to buy products, and products were physical, and we needed to bridge this digital divide. But ultimately, the world becomes entirely digital, purely digital, because in that world, digital is friction-free. It's a different universe. It has different laws of physics. Now, in the world of software, there's an economy, just like the real-world economy, as measured by how often money changes hands between people. Here in New York, you might spend a bit of money getting a donut, and then you get on the subway, and then, you know, blah, blah, blah; all of these little things are the reason that New York has an amazing economy. In the software world, the economy stopped at the rate at which physical goods could be transferred. Imagine I find a bug in my app and have to release 1.1. I just printed up a bunch of boxes. I wrote manuals with incorrect information. These things have all been printed, manufactured, and shipped; how do I possibly update that bit of software? That's the same thing as saying, I went to work that day, but I only got to buy the donut, and from there, the economy stopped. That was it, a donut. So that's what was happening in a transmogrification world, right, one where we took digital things and made them physical just so people could buy them. Now, let's think again about what's happening in identity. In the real world, I walk into a bank, and you say, hello, sir, can I help you? Yes, I'd like to make an account. Well, sure, sit right down. Oh, I see by your driver's license that you live right down the street. In fact, that's right next to my sister's house, you know, oh, right, I've seen her walk her dog. There is a context of authenticity. I can't walk into a bank and be somebody else; I can't look different; I can't be a different age; there is so much context in the physical world that surrounds that physical ID. Now, when we took that into the digital universe, what did we get? None of that. All of a sudden, I could script up a version of my introduction to the bank. Right, I can attack the bank from a distance. I can steal pictures of other people's identities, I could submit them, and you can't even tell. So the real way to identify fraud is by examining the data, and by the time you find fraudulent activity, it has already happened; all you did at the beginning was put up a big gate. That added friction for legitimate people but didn't really make a difference for the fraudster. They still got through the gate, got in, and did what they wanted to do. So I find that there is an interesting bookend here between my first invention, perhaps, of the App Store, and TruAnon, because they are both doing something that is the same. They're both taking something that should be entirely digital, and they are solving various problems along the road to pure digital solutions.
Today, it's impossible to look back and imagine buying software at Egghead or Best Buy again, and as software developers, we can't imagine trying to keep up with releases and fixes in that economy. So let's talk for a moment about what it means in the data. When I transition membership to a higher-value, verified state, however I do this, verified data is always worth more. If you're training AI, you can lose the outliers, you can focus on truth, you can focus on reality. If you are a member in a social environment, you can focus on the other good people sharing that expectation and avoid those with fraudulent intent. So what does TruAnon do that is very interesting, and how does that relate, perhaps, to that first app store? What TruAnon does is, instead of requiring us to surrender some private documents, physical things that we take pictures of or however we submit them, it lets me confirm and connect a variety of digital counterparts, the links and profiles by which I want other people to know me: my blog, my LinkedIn, my GitHub page, right? You and I probably both have one. These are very legitimate; fraudsters very seldom set up GitHub profiles and spend 15 years checking code in and out of other people's repositories. It's a joke, of course, but imagine for a moment how subtly genuine that truly is. So, the idea of digital identity has been around, but what TruAnon does that's unique and important is it tabulates the depth of interconnections between all these different properties. This tabulation provides a rank and a score that make you visibly trusted, without my having to see which properties you confirmed ownership of to boost that score, and this leaves privacy in your hands. Now, all of a sudden, the owner of this digital identity controls how other people view and share it; I get to say how you view me, but TruAnon has an unbiased tabulation; it simply reflects what you decide to make presentable in public. And this has an effect where it naturally transitions membership to this higher-value, verified state without force; people want to do it, and credible people share the benefit of being more credible. But this also makes your intention visible. If your intention is fraud, well, you need people to trust you enough to interact. So there's this visible thing that says, I'm unverified; please ask me to confirm my identity. Now, if I have 15 years of context on my profile, that's not a problem; trust me anyway. But if I come on there and there's anything the least bit suspicious or thin about my profile, you might well ask yourself, why would I trust this person? It only takes 10 minutes to boost your score, and you have control over your privacy; there's no real reason to avoid transparency. If someone is avoiding transparency, one might assume that they don't want to be known, that they perhaps have fraudulent intent. So by offering a transparent way, you benefit credible people but add risk for fraudulent people. What TruAnon does is provide a way that people can back their own claim of identity and give it measured confidence. But it also creates a natural split along the user journey, where credible people become more so, and fraudsters become outliers; this happens immediately, and the more people interact, the more these two sides become visibly different. And this is transitioning membership without force.
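To make that tabulation idea concrete, here is a minimal sketch in Python of how confirmed, interlinked public properties could be rolled up into a single visible score. The property kinds, weights, and scoring rule here are illustrative assumptions made for this conversation, not TruAnon's actual algorithm.

```python
# Illustrative sketch only: the property kinds, weights, and scoring rule are
# assumptions made for this explanation, not TruAnon's actual tabulation.
from dataclasses import dataclass

@dataclass
class ConfirmedProperty:
    kind: str          # e.g. "github", "linkedin", "blog" (hypothetical labels)
    age_years: float   # how long the profile has existed
    cross_links: int   # how many of the owner's other confirmed properties reference it

def trust_score(properties: list[ConfirmedProperty]) -> int:
    """Roll depth and interconnection up into a single 0-100 score.

    Only the aggregate number is shown to other users; which properties
    were confirmed stays under the owner's control.
    """
    score = 0.0
    for p in properties:
        depth = min(p.age_years, 15) / 15          # older profiles count more, capped
        weave = min(p.cross_links, 5) / 5          # interlinked profiles count more, capped
        score += 20 * (0.6 * depth + 0.4 * weave)  # each property adds at most 20 points
    return min(round(score), 100)

# A long-lived, interlinked GitHub profile moves the score far more than a
# freshly created, unconnected account ever could.
me = [
    ConfirmedProperty("github", age_years=15, cross_links=3),
    ConfirmedProperty("linkedin", age_years=8, cross_links=2),
    ConfirmedProperty("blog", age_years=4, cross_links=1),
]
print(trust_score(me))  # prints a visible rank without exposing the properties behind it
```

The point of the sketch is the shape of the incentive: long-lived, interconnected profiles move the number in a way a throwaway account never can, while the list of properties itself stays under the owner's control.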
This is turning your data from unknown data, where you do not know if it is valid for advertisers or valid for whatever it is that you're doing, into known, verified data. If you ever want to know how valuable this is, just think about Twitter's takeover. Pointing a finger, I could simply say, I think you have 50% fake accounts, and you might say, I think I only have 5% fake accounts. Who's right? There's no science to it. You see the whole valuation of the company halve and double based on the idea that they might have more or less control over identifying legitimate accounts. So, to sum it up, there's value in getting rid of fake accounts, identifying fraudsters, and getting them off your system. But as important as that cure is, if you can identify legitimate people, that's where the money is, right? That's the difference between a $4 Facebook profile and a $40 Facebook profile: confidence that this is a real person who could buy the stuff that I'm advertising. So whatever your data is, it's going to be more valuable if the account holders are identified as legitimate. That's far more important to the business than the whack-a-mole that we play, trying to identify the fraudsters.


Debbie Reynolds  21:17

We could talk for hours, and we have in the past, because I love the way you're thinking about this. You said something that just made something click with me right now. You and I have talked about this; the way that some companies are going about identity verification, it's like they're trying to take an analog process, like your example of going into a bank and sharing your ID, and move it into the digital world, and it doesn't work. The result is that, as opposed to what you're doing with TruAnon, where you're trying to take the risk away from real people, the current model, where you're surrendering some private information, is adding risk to real people. The way that you're trying to do it, you're shifting the risk to the fake people, as opposed to the real people.


Jesse Tayler  22:17

That is a great point, and in fact, I wish I could think as simply as you do. Of course, I just spent 20 minutes describing what you just summed up in a sentence; it doesn't work like that, right? These are just tipping points. This is natural human behavior. This is just how it works. You're presented with an option; you take the one that makes sense. If transparency adds risk to your fraud, then of course you're going to avoid it, you know. Whereas if verifying doesn't require you to surrender anything and leaves you in control, well, suddenly it's more valuable. That difference, however slight, is universal. You don't come with fraudulent intent and sometimes find this doesn't add risk. You don't come as a credible person and sometimes find verifying doesn't add value. This natural behavior journey is, I think, the critical thing; applying force and coercion has got to be a last-ditch effort. And I think what people have been left with is a legacy. When we went online, we thought, how do we do this? Well, we'll show your ID, okay, no problem. We've got a camera; we'll take a picture of it. Well, it turns out that's easy to duplicate. Oh, you know, all these problems arise from it that we didn't anticipate, and 20 years later, these things have grown up from pickpocket crimes to rocking democracies across the globe. Everyone is online, Debbie. My children have grade school accounts. Of course, they do. How do you think I communicate with their teachers and get their grades? Go in and get a piece of paper? Sorry, for those of you who are younger than I am, these things are probably not shocking. But as a parent in my generation, I'm like, wow, things are different. You know, and this is interesting. In the past, I could take that card that my school would give me, and on the way home, I could have a few friends perhaps alter it on the bus, and when I got home, my grades might look a little better. In the digital world, I can't do that anymore. The data is curated on my behalf, on my children's behalf, on all our behalf, right? This is the same all over the place; when I go to pay my parking tickets here in New York City, I'm also logging in to the account that the Department of Motor Vehicles uses to allow me to control my driving record; I need access to that data. Now, the DMV curates this data on my behalf; I'm not supposed to take my driving record and alter it. But I am allowed to make a copy of it and show it to you, so that you could look it over and say, aha, I can see Jesse has a clean driving record; we can hire him here at Uber or Lyft. And in the physical world, this is fine; I would come in, and I would show you my papers, and these papers would represent a paper somewhere else or a row in a database somewhere else. But in a purely digital environment, there is no paper backing up these documents, right? These documents are physical copies of things that were already digital, just like the software example we mentioned with the App Store; we benefit by making the world purely digital and making this leap. But we also change the way all these things work. The laws of physics are different in the digital universe. So where I could cheat my card on the bus on the way home in my generation, my kids no longer have access to alter the curated data, just like I can't change my driving record. But if I make a copy of my driving record, two things can happen. One, I might alter it.
Okay, now we get into blockchains and comparisons and encryption things that try to put fingerprints on things, and that's all well and good, but it makes things more complicated. More importantly, I've now taken my private data out just to answer a question: do I have a clean driving record, yes or no? I've taken all of this data out of the DMV, out of the hands of those curators whose job it is to maintain that security. That's their job; they wake up every day and think of how to secure that data appropriately. Now I've made a copy of it. Nobody can detect that copy in the digital universe. It has no footprints in the digital universe; you can make unlimited copies without detection, forever. So every time I go to one of these institutions that curates private data on my behalf and I make a copy of it, I forever break the bond that these people have worked for; they don't even know, but they still wake up and try to secure my data. But what happened to it? I transferred it to Uber so they could review it, then they transferred it to other people to store it and dispose of it appropriately, and all these other GDPR considerations and privacy laws and blah, blah, blah. Well, what if we do this entirely digitally? We have a completely different idea. My digital identity can act as a broker between these two digital properties. I could say the one and only identity that has confirmed ownership over my DMV account, where I pay my parking tickets, is the same one and only identity who also confirmed ownership of my health care, my insurance, and my Uber application; these are all purely digital things. In the normal version of identity, we think of identity as being the private data that we own. It's not. Private data is just a sad artifact that we've tried to use for identification because it's something that only you should know. But of course, that means people steal your private data so they can answer that same question. So I think there is tremendous value in brokering an answer between two parties, where really all they need is the answer. Are you authorized, yes or no? Have you been COVID boosted, yes or no? Do you have insurance to drive? Do you own this car? Right? Do you have a clean driving record? Are you over 18?
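The brokered answer he is describing could look something like the sketch below: a relying party asks the identity layer a narrow yes/no question, the question is relayed to the canonical source, the DMV, a school, an insurer, and only the answer travels back; no copy of the record ever leaves the curator. Every name here, the classes, the question strings, the DMV lookup, is a hypothetical illustration of the idea, not an existing API.

```python
# Hypothetical sketch of brokering an answer at the source: the class names,
# question strings, and DMV lookup are invented for illustration, not a real API.
from typing import Callable, Dict, Tuple

class CanonicalSource:
    """A curator of record (e.g. a DMV) that answers narrow questions about its own data."""
    def __init__(self, name: str, answerers: Dict[str, Callable[[str], bool]]):
        self.name = name
        self._answerers = answerers

    def answer(self, account_id: str, question: str) -> bool:
        return self._answerers[question](account_id)   # the record itself never leaves here

class IdentityBroker:
    """Relays yes/no questions to sources whose accounts the owner has confirmed."""
    def __init__(self) -> None:
        self._links: Dict[Tuple[str, str], Tuple[CanonicalSource, str]] = {}

    def confirm_ownership(self, identity: str, source: CanonicalSource, account_id: str) -> None:
        self._links[(identity, source.name)] = (source, account_id)  # owner links the account once

    def ask(self, identity: str, source_name: str, question: str) -> bool:
        source, account_id = self._links[(identity, source_name)]
        return source.answer(account_id, question)      # only the answer crosses back

# Illustrative data: the DMV keeps the driving record; the asker learns only yes or no.
dmv = CanonicalSource("ny_dmv", {"clean_driving_record": lambda acct: acct == "JT-1234"})
broker = IdentityBroker()
broker.confirm_ownership("jesse", dmv, "JT-1234")
print(broker.ask("jesse", "ny_dmv", "clean_driving_record"))  # True, with no copy of the record made
```

The design point is that the curator stays the single source of truth; the relying party learns only that the answer is yes or no, so there is no copied document left lying around to be stolen later.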


Debbie Reynolds  30:10

Are you over 18? Right.


Jesse Tayler  30:13

Are you under 13? My children can answer questions that are critical. They can prove they are students at PS 116, which they couldn't do in the past, right? PS 116 already curates those accounts for every student, including retiring them when a student leaves by whatever means. This is a canonical source, as folks in a geek business like yours and mine would say, and that just means these are the guys who have the answer that you believe, right? If you want to know if my driving record is clean, ask the DMV here in New York State. So I think that's what's at stake. It's the difference between us risking our private data, not just the data we surrender, but also the copies we were making and handing around for other people to review. The other side of that data is the companies that we use every day; you can't make everybody safe online if you don't have a service that the services we use want to use. So it's one thing to protect our data; a lot of blockchain solutions do that, at least somewhat adequately. But if you don't have something that the banks want to use, that the social networks, Facebook and Twitter, want to use, then you're not going to be able to be safe in those environments. So it's really critical that we solve this digitally for all parties and give everybody something, allowing us to own and control our identity and control how other people view and share it. That's what we want. But the services that we use, LinkedIn for example, would like you to identify yourself, just because when you do, your data is worth two times, or five times, or even 10 times as much. That's why LinkedIn wants you to verify. Now, if we could verify and get value for ourselves, and we could recognize the masqueraders who are avoiding it, well, suddenly LinkedIn's ability to make more money off my data is fine with me, because I've got what I want, and at the same time, LinkedIn or Twitter or whatever service I'm using can adopt something that gives them value, that makes their service safer and more trusted, but most importantly, brings them more business. That's what business is. So we've got to give them something that makes money, frankly, if we want to make ourselves safe.


Debbie Reynolds  33:22

I agree with that. First of all, what you're explaining really is data plus context over time. That really is the key to this verification that you're talking about. But the issue that I have, and you and I have talked about this, is that we're in an age where a lot of companies, for whatever reason, may want that verification for all the reasons that you said, but when I think of things like the FIDO Alliance, they have a statistic that says the average person has 90-plus different online accounts. Would you have to give your ID to 90 different accounts? I mean, this is ridiculous. So to me, it's unsustainable, and it has to go the way that you say, where it's more about the person having control and being able to share that data out, as opposed to the individual going one by one, trying to share their identity information in an analog fashion with each company. What do you think?


Jesse Tayler  34:24

You know, that's interesting. I hadn't heard that statistic, but honestly, I'm not surprised. We forget how many accounts we create. Often, we're exploring things, and we create throwaway accounts, and I've noticed that the throwaway accounts we create are the ones that get hijacked later by fraudsters. So, while we're exploring services, and services are trying to get people involved, it's also a place where a bunch of junk gets created that later causes all kinds of problems. But I think what services run into is a problem where there's friction, there are privacy issues, there are all kinds of known problems with traditional verification, and sometimes it's just very expensive. There are a lot of times when even social verification just doesn't work. It doesn't give people what they want; it doesn't give the business what they want. But it's the only option out there. When you think about the evolution of these tools, I think there's always a circle leading back to a purely digital solution, and you don't always know why. But there are a couple of facts about the digital universe that make all the difference. Every time we have to reassert our identity, you're asked again to submit the very same data; the moment you press submit, you've transmitted the very package that, when stolen, is used to pass the very same gate. Now, every time you do that, you're not just adding the risk that your stuff is going to be stolen and your bank account is going to be emptied or something; you've unwittingly contributed to the same crime that you're trying to avoid, that you're seeking protection from. You've put your private data into the honeypot pool of stolen data. Now, yes, the company is going to assure you they've got triple-blockchain, double-something protection around it. But then it turns out, six months later, you read in the news that some employee was just being paid to siphon data off there and found it easy to write a script or something, and nobody even knew, nobody noticed. How could they? Right, again, digital data can be copied indefinitely without detection. So every time we go to reassert our identity, we're really kind of contributing to it. Services that force and coerce people through identity gates are also effectively contributing to identity crime, because the value of your Social Security number used to be nothing. All it does is represent your insurance with the government of the United States. Who cares, right? But it's a unique number, and so computer people thought, oh, well, there's a unique number only Debbie Reynolds knows. That works until there's value in it being stolen. Once I can steal that number and pretend to be Debbie, then the number is valuable. So by using private data in the way we do, that's why people are stealing our private stuff. So to get back to your question, when we don't own and control that identity, when we don't take control over our data, which has been curated for us, that's the purpose of it: these people are paid to protect and curate that data on our behalf. Whether it's my kids at school or your driving record, your private data should just remain private, and it should remain digital. The fact that other entities need to know something, like, you're about to board an airplane, can we check your COVID status? Sure. I swipe my boarding pass; the boarding pass doesn't need to know anything about my identity. All it needs to ask is: TruAnon, what has the owner who confirmed ownership over this account given me for data? And TruAnon can respond: unknown.
Nobody, nothing. Or TruAnon can give you whatever that owner has offered: yes, I have been COVID verified; here's my date; it was at Lenox Hill Hospital on this day, and that comes from the hospital or my health care provider. It's an irony that the value of all this data is really only starting to be used by us right now. It's already there. Why shouldn't I be able to use GitHub to get confidence on Facebook? Why shouldn't I be able to securely use Uber? Yeah, I've got a clean driving record; check it with the DMV; no opportunity for somebody to steal anything, no opportunity to take that card and change your grade from a C to an A on the way home on the bus. Data has integrity, and these answers are provided at the source. It's a beautifully simple equation: trust, ownership, transparency, and this is what people want and need. So rather than constantly reasserting our identity over and over again, if you own and control your identity, the more you use it, the more it builds up confidence. It's no wonder that people don't use Friendster anymore. People don't use MySpace anymore. Some people are stopping using Twitter; there will be a day when people don't want to use this service or that one, or they move from California's Department of Motor Vehicles to New York's Department of Motor Vehicles. All of these things build up confidence in my one and only genuine digital identity, which already exists; it already is the activity that I do online. Why shouldn't I be able to prove that I own it and broker it? That's the whole point. So I think we're on another one of these digital precipices where, in 5 or 10 years, we'll look back, and we won't be able to believe that we used to force people to show a driver's license picture or pull out their passport just to prove who they are, or to prove that I go to the school that I go to. MIT already curates your MIT email, right; it already curates your diploma for you. All of these things are already ours, and our digital identity is the only way that we can wield this stuff in the digital universe. Forget the physical stuff; it has to happen.


Debbie Reynolds  42:31

I agree with that. I'm smiling; this is amazing. If it were the world according to you, Jesse, and we did everything you said, what would be your wish for privacy or digital identity anywhere in the world, whether that be technology, human behavior, or anything in regulation?


Jesse Tayler  42:52

Well, I would like to see, of course, that we secure a future world for our children that isn't like what is happening now; I think I alluded to this earlier, this growing up of pickpocket crimes to the point where they're rocking democracies. I don't think we realized it while we were marching online, and I mean we as in humanity, society, the world; perhaps COVID finally put a nail in the coffin of the holdouts of the digital universe. But now the digital universe is simply an integral part of our lives. My children log into their school. I mean, we couldn't be more digital. It's happened. It's done. So I see a lot of things in the world, from fakes and masquerading and politics to laws attempting to control things like what you and I were discussing; in the United States, some of our States are adopting laws for a State, a region, a physical, real-world reality, but they want to map this into the digital universe, into digital reality, where there is no State of Louisiana. But what does Louisiana want? They just want people who are under 13 not to be able to access certain parts of certain sites. There are lots of these rules. Some of them are laws, some of them are policies, some of them come from parents, and some from ourselves. We all have reasons to be true but anonymous; we all need to control how the world sees us and how we are seen in the digital world, and the digital universe has different rules and different realities. So I want to see a way that these laws can be enacted, a way that parents can feel like they have control. When our world rolled into the digital realm, we got more of everything. I mean, we got more geography, right, we got more distance. You and I, you're in Chicago, I'm in New York City, we're having this conversation as if we were right next to each other, and then we're going to publish it online for many people that we don't even know to experience. That's amazing. If you had told my 20-year-old self that was going to be happening, I would have just been awestruck. But we've gotten less of something, too, haven't we? In fact, I think we've gotten less of only one thing, and maybe I'm wrong about this, but folks can comment, because I'd really like to know. I think we've gotten less of one thing, and that's trust, and I think we've gotten a lot less of it. And I think this is because of this transmogrification, a word, by the way, that I absolutely don't know the meaning of, but I think it has something to do with taking things in the digital world, making them physical, and going back and forth. Something is lost in translation. So when you have purely digital things, they have a different sort of laws of physics, and what I think we need to do is create a purely digital solution, because that's the only one in which these laws, these interests, whether they be privacy, governance, or controls, make any sense; otherwise, we are transmogrifying, we are getting lost in translation. So I think a purely digital solution is still necessary in identity because of the privacy violations, and also because our world depends on it. Parents depend on it. Kids depend on it. Governments depend on it, and this is global. This is a whole generation. This is a future that we cannot avoid. Every generation will be just digital from here on.
I think, in a way, I'm very proud to have what I think of as bookends in the digital invention space. I ended the need for physical inventory in software with the app store, and that took 20 or 25 years to become what it is. I think the same is happening in digital identity. These laws, these practices, these values, keeping our children safe, keeping ourselves safe, all of these things are frustrations right now because of this transmogrification. I think a purely digital solution, once adopted, will simply have such smoothness and benefits, many of which I haven't even discovered yet, some of which I'm sure you and I will discover in the next weeks and months as we see these ideas roll out and how people react to them. There are going to be a lot of aha moments. I hope maybe we'll do another podcast in 20 years, and we'll talk about how the world used to be, and the younger audiences will have no idea why anyone was driving to a store called Egghead or carrying around a physical driver's license. How nonsensical it would be to take a picture of your passport just to prove that you already are who you already are. That's a funny notion, right? So I hope we'll look back at this as a medieval, torturous time.


Debbie Reynolds  49:01

Well, I love this. I love this. Well, thank you so much for being on the show. I adore you, and I love the work that you're doing. I just love the conversations that we have. So, I'm looking forward to us chatting more and being able to collaborate further.


Jesse Tayler  49:16

Absolutely, we'll put bookends on this one. We'll talk about the results as they begin to make the world a safer place because, one way or another, it's going to have to happen.


Debbie Reynolds  49:27

I agree. I agree. Well, thank you so much for being on the show and we'll talk soon.


Jesse Tayler  49:33

Absolutely see you online.


Debbie Reynolds  49:35

All right, see you online.