"The Data Diva" Talks Privacy Podcast

The Data Diva E233 - Peter Cranstone and Debbie Reynolds

Season 5 Episode 233


Debbie Reynolds, “The Data Diva,” talks to Peter Cranstone, CEO of 3PMobile, about digital ecosystems and consumer choice. They discuss his personal journey in technology, beginning with his early work on data compression inspired by his uncle. He describes the creation of the Do Not Track web standard, aimed at enhancing user privacy, which faced challenges because of consumer preferences for convenience. Despite the introduction of privacy regulations such as the GDPR and CCPA, he notes that users often prioritize instant gratification over privacy. His collaboration with a Kaiser Permanente executive shifted his focus from IT architecture to business strategy, broadening his understanding of how technology can be tailored to meet individual needs in healthcare.

Cranstone also recounts the historical evolution of windshield wipers, illustrating how innovation can take time to gain public acceptance. He highlights the contributions of Mary Anderson and Robert Kearns, emphasizing the importance of gradual acceptance in automotive technology. Additionally, he discusses the complexities of engaging patients in their health management, proposing a dynamic app that allows continuous interaction with healthcare providers and thereby addresses the challenges posed by an aging population.

The conversation shifts to data privacy and decentralization, with Cranstone advocating for a secure wallet system that empowers users to manage their data. He argues for a trusted web model where individuals are compensated for sharing their information, contrasting it with current practices that often exploit user data. Cranstone also addresses the need for equitable resource distribution, suggesting that the value generated by major tech companies could be redirected to alleviate issues like food insecurity. He concludes by emphasizing the importance of AI in personalizing user interactions while maintaining privacy, advocating for a moral approach to data management that respects individuals and promotes equitable distribution, and sharing his data privacy hope for the future.




[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:25] Now, I have a very special guest on the show all the way from Florida, Peter Cranstone. He is the CEO of 3PMobile. Welcome.

[00:35] Peter Cranstone: Hello. Thank you for having me, Debbie.

[00:37] Debbie Reynolds: Yeah, well, we met on LinkedIn. We've been connected on LinkedIn for a number of years, and I had actually done a podcast with Nigel Scott, and you had made some really stunning comments about that conversation.

[00:56] I think your background is unbelievable, and I love the way that you think about technology. And we need people like you to think about technology and privacy and innovation. So why don't you tell your backstory in technology and how you came to be the CEO of 3PMobile?

[01:13] Peter Cranstone: Well, really, thank you, Debbie. It's more about a lifetime of curiosities.

[01:18] So I've never been focused 100% on a particular career. My very first career was as an airline pilot, and the focus there was understanding flight, flying, and commercial flying. But as soon as I realized that I'd figured out everything I needed to do and I'd obtained all the licenses, I was kind of bored.

[01:41] And so I had the opportunity to start something new. I said to my wife, I think I'm going to start something new. And interestingly enough, and not many people know this, I started with my uncle, who is a British mathematician.

[01:54] I think he was probably at the genius level. He'd focused a lot on AI and he had some new compression technology.

[02:02] And so without further ado, I just jumped into it and started off down the road of compression. And I got a very early lesson in intense mathematics and really pushing the frontiers of data compression.

[02:18] And I later learned that there was a lot of science involved, including the second law of thermodynamics.

[02:25] And I really shouldn't have attempted it, but what it really did teach me to do was to think differently and really expand my mind.

[02:35] And so from there I went on, and I met a friend of mine, and we worked together for 16 years.

[02:41] The very first problem we came up with, and this was in 1996, just when the Internet was really starting to get going, was: why don't we compress the Internet?

[02:52] And the Internet was, I don't know, maybe a couple of million websites. There was no traffic on it. SSL, which everyone knows powers e-commerce, had just been invented in February of '96.

[03:03] And Bezos, with Amazon, didn't actually take his first credit card until July of 1996. So we were right there, right at the beginning, and obviously the silly idea was, well, let's just compress the whole Internet and make it go faster.

[03:17] So it took us about four years to figure it all out and no one believed us. And back in those days, I really couldn't think of a business model for this technology.

[03:27] So we decided to put it in the public domain.

[03:30] Within 90 days, it went global.

[03:33] All the Fortune 500 companies downloaded it and started using it. And today, over 50% of the web uses it on a daily basis. So that's billions of users every day.

[03:46] And then, just as I said before, I'm bored again. So now what's the next project? So why don't we make the web more secure?

[03:54] So I actually met the lead architect for Intel's Itanium chip at Hewlett-Packard, and I recruited him. I only asked him one question. I said, well, what are the secrets inside Itanium?

[04:07] And he told me, to make a long story short, it's security and performance. Incredible security and incredible performance.

[04:16] So we built the world's first secure operating system for Intel's Itanium chip. And then later that morphed into secure DNS and the company is still going today. It's profitable, it's cash flow positive, and it's been deployed.

[04:33] The secure DNS product has been deployed with some of the largest telcos in the world: Ooredoo, which is the Qatari telco; Jio, which is the Indian telco; and Telefónica, which covers South America.

[04:47] Everyone knows T-Mobile and Verizon; they, along with the IRS, are all using it. So it was incredibly powerful technology.

[04:55] But again, once again, it worked. We'd finished it, we figured it out. And I also realized I'm not the guy who goes on to build the companies, but I love solving the thorny problems.

[05:07] And so the next problem we set out to solve, and this is your wheelhouse, was: let's make the web more private. How hard can that be?

[05:18] Well, that took many, many years and, and there are a lot of technical issues that go into all of this. But in the end, we did solve the problem. We came up with a way to extend the web.

[05:32] And not many people know this, but the web is an extensible protocol. So what does that mean? A protocol is a communication layer. And what we did was to take a look at it and say, my goodness, if we could extend it and we could offer customers or users more privacy, we thought that would be a good thing.
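
To make “extensible protocol” concrete: HTTP lets a client attach additional headers to any request, and a server that recognizes them can act on them. A minimal sketch in Python, with a hypothetical preference header; this is not 3PMobile's actual extension, just an illustration of the mechanism.

```python
# HTTP is extensible: a client can send extra headers, and a cooperating
# server can act on them. "X-Privacy-Prefs" and its value are hypothetical.
import requests

response = requests.get(
    "https://example.com/profile",
    headers={
        "X-Privacy-Prefs": "share=zipcode;ads=contextual-only",  # custom extension header
        "DNT": "1",  # the real Do Not Track header discussed later in the episode
    },
)
print(response.status_code)
```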

[05:51] And so before we wrote any code whatsoever, and this was in 2006, we did a survey. We asked people on the street, we asked our parents, our grandparents, what do you want?

[06:03] And they wanted three things. And I think these three things are still relevant today. They wanted convenience, they wanted privacy, and they wanted control.

[06:13] And that was one of our constituents. So we went to the next constituent, which was the enterprise, and we said, well, what do you want?

[06:21] And they wanted two things. They wanted simple integration. They didn't want to change any knowledge bases or infrastructure, and they wanted to make money.

[06:30] So what we had to do, we had our two stakeholders, and we had to figure out a way to resolve those two issues.

[06:38] And in the end, we figured out how to do it, and we invented something that later on became the Do Not Track web standard. And it was probably the simplest solution to privacy that we've ever seen, and yet nobody is using it.

[06:56] And I'll get to that in just a moment.

[06:58] But the easy way to do this was to have the browser have a setting in it and say, grandma or your parents, they would just go to this setting. They would turn on this switch that says, do not track me.

[07:11] And from that point on, it would send a message to the enterprise or to the server, and they would, and here are the operative words, respect that message.

[07:22] Well, everybody very quickly figured out that they didn't want to respect the message because they were too busy using my data to earn money.

[07:31] And so essentially, the DNT standard is still in three quarters of all browsers around the globe. So, I don't know, some 3 or 4 billion devices.

[07:44] And we ended up patenting Do Not Track. So our IP, just half of it, is in 3 or 4 billion browsers deployed around the world.

[07:54] The problem was nobody turned it on, and the companies didn't really want privacy, because it would mean losing revenue. Certainly not Google and certainly not Facebook.

[08:06] And so very quickly we looked at that and said, okay, so convenience trumps privacy. And we really have learned that, Debbie. I think even in today's world, consumers will trade privacy for convenience in a nanosecond.

[08:22] We live in a world of instant gratification. We want it yesterday. We just want the answers. And if it means trading some data, that data doesn't seem to have any value to us, as long as what we get is that instant gratification and we get it for free.

[08:36] So this is where advertising comes into play. Obviously search with Google and obviously Facebook and we're addicted to these free services where we trade our valuable data and they use it.

[08:51] And you've only got to look at their revenues. I mean, I think Google does $320 billion a year in revenue. It's a phenomenal money-generating system based on our data.

[09:02] So there is really no desire to change that, I don't think. And obviously we've heard about Web3 and the word decentralization, which really just means taking control of my data away from Google and Facebook.

[09:20] And the way I look at it is, no one's ever going to stop searching and no one's ever going to stop oversharing on Facebook.

[09:27] And decentralizing my data just means it appears in more places around the web. So really, what's the purpose of all of this? And obviously we've now enacted some privacy regulations, GDPR and CCPA, but to me they're not so much privacy regulations; they're more about respecting my data when it's stored online.

[09:53] And I think that's a very useful approach to it: how you encrypt it, how many days you have to notify somebody if there's a breach, and so on and so forth. That's really all they're designed to do.

[10:05] And that was as far as I got. And that took me all the way to 2014.

[10:11] And in 2014 we sold the patents; we did obtain a license back to use the methods claimed in the patents inside our IP.

[10:21] But everyone was done. And once again I'm back to Mr. Curious sitting here going, I still haven't solved the other piece of the puzzle. And what was the other piece of the puzzle was how to monetize this new capability.

[10:35] And I was incredibly lucky. I got a call from a gentleman, an executive at Kaiser Permanente, and they're about a $100-billion-a-year payer-provider for healthcare. And he said, we've got a problem.

[10:50] And I said okay, I love problems.

[10:52] And I said, well, how can I help you? And he wanted a new precision mobile care delivery strategy that engaged each person based on their individual variability and that could scale down to a single person and then up to every person in the world.

[11:13] So essentially he was asking, can you build a tailored app, something that adapts to each person's needs in the moment?

[11:22] And of course I thought, well, yes, we can easily do that. But this individual really did change a lot of my thinking.

[11:31] But it took years and years and years before the light bulb finally went off.

[11:37] And the simplest explanation I can give you is comparing the windshield wiper with the intermittent windshield wiper. So Mary Anderson invented the windshield wiper in 1903, and I don't think it was used until about 1924, when her patents had expired.

[11:58] And back in those days, she got the idea from a tram. And they would stop the tram every so often, and the driver would jump out and he would use a rag and clean the windshield.

[12:07] And she thought this was kind of silly. And so she invented the windshield wiper. But everyone laughed at her and said, well, we'll just get out and clean the windshield by hand.

[12:15] We don't need a windshield wiper. Well, look at a car today. There's not a car on the planet that doesn't have a windshield wiper.

[12:23] So eventually she was proven right, but at the time, nobody believed her. And then it wasn't until 61 years later that a gentleman came along, I think it was Robert Kearns, and in 1964 he invented the intermittent windshield wiper.

[12:40] And the story is wonderful. It's pure happenstance. So he got married, and they were celebrating, and someone opened a bottle of champagne and the cork flew out and hit him in one of his eyes.

[12:51] So he had a problem with one of his eyes. So anyway, he's driving home one night and it's misty, and he was really having a problem with his good eye, with seeing through the windshield.

[13:05] And he had a thought. He said, why don't I modify the dwell time, as he called it?

[13:09] The dwell time of the windshield wiper, so that it would pause and then it would wipe again. And what did he choose for the dwell time?

[13:21] This was brilliant. It was the blink of his eye, his good eye.

[13:25] So he invented the intermittent windshield wiper 61 years later.

[13:31] And so the analogy that I draw is we worked on the privacy piece of the web, where each individual could control the flow of their information, but we didn't work on the return loop.

[13:45] And the return loop was, how does the enterprise make money?

[13:49] So it all is easy in hindsight, but when you're sitting there in the middle of it, you're working your way through this, and you go, I have no idea how they make money.

[13:58] I really don't. I know they've got the data, but as it turned out, it's all about engagement.

[14:04] And engagement is the single biggest problem that we have out there. Because if you and I walk into Walmart, Walmart knows us. They track us around the store, we're engaged.

[14:16] But the second we leave Walmart, they have no idea where we go.

[14:20] So part of it, when it comes to health is everybody's health is different. We're all individuals and we all engage based on different criteria.

[14:30] And so you might like a favorite brand of whatever it is, but if that brand isn't available inside that app, you disengage.

[14:42] And what do we do? We go download another app. But we've left one ecosystem for another ecosystem. Well, that wouldn't work for Kaiser.

[14:51] So they said, we've got 14 million patients, they're all unique, but we want you to be able to offer products, content, and services, and for the doctors to be able to prescribe services from the users' favorite brands that would keep them engaged.

[15:09] But we want to do this all in one place.

[15:13] And obviously everybody knows you can't do that. That's simply not possible. If you look at the App Store, there are over 350,000 health-related apps on the App Store.

[15:26] Essentially what they're saying to me is take the 350,000 apps that are on the App Store with all the different capabilities because they are aligned with different people's needs in the moment, and condense all of them into a single app.

[15:40] And obviously you just sit there and laugh because you simply can't do that.

[15:44] Essentially what you're asking me to do is, and this is Nigel Scott's great phrase, to create a web in a box.

[15:53] And there are four key ingredients when you build this solution that have to be present. If they're not, no one will engage with it. And so the app had to be dynamic, iterative, interactive, and it had to support my intrinsic motivators.

[16:10] How can I put the Internet in a box? Because our first approach didn't put the Internet in a box. It merely set the privacy settings for each individual to control when they went to each website.

[16:25] This required the equivalent of the intermittent windshield wiper. And then one day I finally figured it out and I said, oh my goodness. What we could do is remain as the central network provider for the service.

[16:42] But they could now seamlessly integrate using this extended protocol, which was all web standard.

[16:49] Any ecosystem partner that I needed in real time to remain engaged.

[16:55] And so that's what we did. And here's where it takes the big leap: we started with “an app for that,” and everybody knows that now we've kind of morphed into the super app.

[17:09] And the super app is controlled by what I call the bigs. Only very large companies can really afford a super app because what they're doing is they're creating a narrow, tightly integrated walled garden of just their services.

[17:26] So you can't have a narrow, tightly integrated walled garden. What you need is a broad, tightly connected, open ecosystem so that any ecosystem vendor that the consumer wants or that drives that engagement could now engage with that particular person in real time.

[17:47] And so once we figured that out, you've gone from a regular app for that to a super app. And now you head into the arena of an ultra app. And an ultra app is really that it is a broad, tightly connected, open ecosystem where other vendors can seamlessly integrate using current web standards with your customer.

[18:15] And the beauty of this is everything is tailored for that individual in real time. So to me, this is where I think the next wave of the Internet goes.

[18:25] It's something that drives revenue for collaborative vendors, partner ecosystems, and for the entity that sits in the middle. The way I look at it, it's Google 2.0.

[18:40] It's what comes next. Because Google, to me, they built a fabulous business, but it's search and it's advertising.

[18:48] So advertising is their ecosystem partner, all the advertisers, and the value is my intent and my data and serving an ad. But now imagine if we take that exact same business model and concept and we say, why don't we run the whole web and all of the ecosystems, every one of them through a single point?

[19:11] Normally that's impossible, because the app can't support all of those different vendors' ecosystems in real time; the navigation of those ecosystems is simply too complex.

[19:22] But what if we simply rewrote the app in real time? What if we figured out a way to have the app open and running, but then modify the user interface so that navigation became incredibly simple? And we came up with what I call the 0-1-2-3 rule: zero behavioral changes.

[19:43] So our parents and grandma, there was no behavior change. They would know how to use it instantly.

[19:49] One: single sign-on, it would recognize me. Two: a two-second response time, which we got by taking our content acceleration technology and making it even smarter, to get them instantly engaged. And then the key was the last one, three: three clicks to relevant content.

[20:06] So I should be able to get where I'm going, no matter where it is in the world in three clicks and through any vendor's ecosystem.

[20:14] So the beauty of this is you're actually engaging each person in the world because it scales to everybody with all of the things that they could possibly want in real time.

[20:25] And as their needs, values and preferences change over time, even daily, even via a calendar, the app could adapt.

[20:35] And that takes us all the way through to us sitting here chatting with you right now.

[20:41] Debbie Reynolds: Yes. Well, a couple things I want to drill down on so that people really understand. So obviously I'm a geek and a nerd and a very curious person like you, so I totally understand what you're doing.

[20:54] But I want to make sure that the audience understands how this improves or impacts privacy or control or agency, and how decentralization of data plays into this, certainly.

[21:09] Peter Cranstone: So let's start with the privacy and the control.

[21:12] My overall viewpoint on this is the privacy horse has bolted out of the field. I think it's been going on for so long that our data is all over the place.

[21:22] But let's turn this around and say, how can we increase the value of our data? We know it's already gone. But what if we could put in a mechanism by which someone could say, you know what, if I trust you, and to me trust is the next frontier on the web, then I will share more valuable, consent-based data.

[21:48] So for this, we called it the three R's. It would recognize that it's me. And there's a great article written by a guy called Kim Cameron. He was a Microsoft researcher, and he wrote the Seven Laws of Identity.

[22:02] And the problem with the web is it was built without an identity layer, which funnily enough, was probably a good idea because it would violate your privacy.

[22:10] But he came up with seven laws. And so we looked at those laws and one through four have already been solved. Five through seven have not yet.

[22:19] But let's focus on how do we make the data more valuable. So our first thought was put it under the control of the user.

[22:28] So how do we do that? So back in 2006, so it's almost 20 years old, is we built a secure wallet.

[22:36] And what we thought of is, well, why don't we allow the consumer to put all of the data that they want to put into this wallet, but give them control over every aspect of it.

[22:48] And then what we would say is, if I trust you, Mr. Web Server, Mr. Content Provider, I will share more of my data, but what I'm going to do is to attach some terms and conditions.

[23:01] So one of the fields that we put in there was you can use my data for free for these three things, but if you sell my data, then you must compensate.
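
As a rough illustration of what one entry in such a wallet might look like, here is a minimal sketch in Python. The class and field names are hypothetical, not 3PMobile's actual schema; the point is simply that each piece of data carries its own terms.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class WalletEntry:
    """One piece of user data plus the terms its owner attaches to it (illustrative only)."""
    field_name: str
    value: str
    free_uses: List[str] = field(default_factory=list)  # purposes allowed without payment
    resale_requires_compensation: bool = True           # selling the data triggers compensation
    consent_expires: Optional[str] = None                # e.g. an ISO date, or None if open-ended

entry = WalletEntry(
    field_name="home_zip",
    value="33101",
    free_uses=["order fulfilment", "fraud checks", "personalization"],
    resale_requires_compensation=True,
    consent_expires="2026-12-31",
)
print(entry)
```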

[23:12] So if you think about it in broad brush strokes, the situation we're in right now is that they don't have accurate data; they don't have real-time, financially viable data.

[23:23] So the choice is really simple. I will share my data if you compensate me for it. But if you don't want to compensate me for it, then guess what? I now have the ability to control the flow of what you see.

[23:36] And I think this is unique when it comes to privacy. And this is something that Professor Helen Nissenbaum talks about: data wants to flow. It wants to flow like a river.

[23:48] What we really want to do is to build a trusted web, an environment where I can comfortably share my data, because it now has much more value to the people who want to use my data and reward me in some way, shape, or form.

[24:08] It could be financially, it could be loyalty points or what have you.

[24:12] Now, the first thought is, once the data is gone, guess what? We have no way of controlling who it's shared with around the world.

[24:22] But in this ultra app central network system, I have the final say. I can open up my wallet and I can turn it all off.

[24:33] And I think the first part of privacy is empowering each individual, give them a say in something. We don't have a say right now. Still to this day, I can't go into any browser and say do not track.

[24:49] I mean, it was such a simple and elegant solution that was totally disrespected because they want to make money. So I look at the person who uses this first, and they set up a trusted central network value exchange.

[25:04] And the gentleman who wrote about this is John Hagel III, a former Deloitte consultant, and he calls this an infomediary: where I am willing to share my data and I get paid for sharing my data.

[25:17] So when you think about it, Debbie, it's an incredible business model, because in one respect, you're getting great data. It's accurate, it's viable, and it's always up to date, versus what Google gets, which is not as valuable.

[25:34] But you have to stop the bigs competing with your business model so the bigs don't reward you for your data. Imagine a world where you get rewarded for your data.

[25:45] So it's not something the incumbents can challenge very quickly.

[25:49] So that to me, I think was really where privacy has to go. It has to go through a trusted environment and you have to be compensated for it. And you have to be in control of it.

[26:03] So now let's go to the second part of your question. The magic words that everyone loves to talk about are decentralization, and it does make me smile. So there are really only two economies in the world when it comes to tech: ad tech and e-commerce.

[26:19] Either we're the product or I buy a product, content or service.

[26:24] But what they really wanted to do was to say, we want to take away from Google. And Google is a centralized collection point for my data. They are a financial control point.

[26:37] So they said, well, why don't we decentralize that data?

[26:43] So this is where blockchain came along. The current web is read-write, whereas blockchain is read-write-own.

[26:53] So what they're saying is you can own your own data when it comes to blockchain. But really, when you think about it, you have to go build a whole new infrastructure.

[27:03] The web runs on TCP/IP, HTTP, and HTML. Blockchain is blockchain; it runs on layer 0, layer 1, and layer 2 infrastructure that all has to be built.

[27:14] So we have the Bitcoin blockchain, and we have the Ethereum blockchain. And I think the last time I checked, there was like 30,000 different chains.

[27:24] And I keep going like, yeah, but what's wrong with HTTP and HTML?

[27:30] So essentially what you're really telling me is you want to decentralize the profits from Google onto your chain.

[27:40] That's what you're really talking about. Because once I'm on your chain, you're seeing my data. And again, if I don't have any control over my data, then what's the difference between that and HTTP and HTML?

[27:55] So it's not about decentralizing the data. I think it's about decentralizing the profits.

[28:02] But it sounds so much better from a marketing standpoint to say we're giving you back control of your data.

[28:10] But they never finish it with, well, yes, but now it's on your blockchain. And how are you monetizing your blockchain? Well, guess what, it's e-commerce, or you're speculating on a coin.

[28:25] So I think we're just swapping one horse for another horse, and you're just shifting control to a different place. And I think that's really what it's all about.

[28:38] And I've spoken to grandparents and I've spoken to young children and introduced those terms, and they just look at me like I'm crazy. They really don't understand any of it.

[28:49] But what I would rather talk about, or bring into the context, is, first of all, I want to give each consumer in the world a semblance of control over the flow of their data.

[29:02] It has value. You should be respected. But then on the other side, I want to give the enterprise the ability to make money with simple integration.

[29:11] And if you build a trusted central network value exchange, you have decentralized data away from Google, and you've also decentralized the profits away from Google. But here's the best part about this technology: you can flow search, which is Google, through your network value exchange; it's just another web service.

[29:40] But now I would sit down with Google and I would say, certainly as your service grows, would you like access to this data? This data is far richer, far more valuable than what you have right now.

[29:52] And the best part about it, Debbie, from a privacy standpoint, is it has my real time consent.

[29:59] I authorize you to do this with it for this period of time, and as long as I get compensated.

[30:07] So then it's a negotiation between the new value exchange and the old value exchange. Google still resolves the search, Google still serves the ad, but the value of the ad goes up because the data is better and it's consent based.

[30:25] Well, what Helen Nissenbaum talked about, and I thought this phrase was wonderful, was the equitable distribution of costs and benefits that can be distributed across social domains. And what she talked about with social domains was wealth, health, literacy, food, things like that that we all need on a daily basis.

[30:48] So one of my favorite focus points right now is food is health. I think obviously everything starts with what we eat. And there are 43 million people in America who go hungry each day.

[31:03] And that just, to me, blows my mind. In as much as we're a superpower, why are people going hungry?

[31:09] And I come back to what Helen said, the equitable distribution.

[31:14] So what we've got right now with the lack of privacy controls, is the inequitable distribution.

[31:21] Google gets it all, Facebook gets it all. It's a zero sum game. They win, I lose.

[31:28] Why can't I? If the data is so valuable and it generates 320 billion a year for Google, why can't we take a portion of that data or the value of that data and feed people?

[31:40] I mean, everyone needs to eat. So there is authentic demand already built into the human who needs to eat two to three times a day. Well, why can't we use the value of their data? And we know how valuable it is.

[31:53] We've only got to look at the revenue numbers for Google and Facebook or Meta and say, why can't we take some of that value and turn it back into food for that particular person?

[32:05] And the only dependency is a phone. And pretty much everybody on the planet has a phone.

[32:14] So I think there is value sitting in the person. And they say data is the new oil. So you and I have significant value in our data. All we want to do is to reflow that data in a more equitable way where we get compensated from a privacy standpoint and the enterprise gets compensated from a revenue generation standpoint.

[32:41] And really the beauty of this, Debbie, is it's incredibly capital efficient because you're leveraging value each time I connect.

[32:50] You're increasing mind and wallet share simply because you're connecting ecosystems for me.

[32:56] And it's incredibly low customer acquisition costs because it's viral. You will tell your friend, who will tell another friend that this app, it's amazing. It adapts to me, it respects my privacy, it recognizes me and it responds just to me with the things that I need.

[33:16] And I think as we kind of headed towards the end, this is where AI makes an appearance.

[33:23] And I think AI right now is still in search of its perfect use case.

[33:28] But I think in the future there will be two forms.

[33:33] The first one will be predictive AI. And this is science that has been around an awful long time, some incredibly solid mathematics, and really predictive AI is all about the next best action.

[33:46] Based on all of these things I know about Debbie and Peter, can I predict with higher probability what the next best action is? And it can do it, and there's no hallucination.

[33:58] So then there's really only one more step. And this is where to me AI agents come into play. And there's a wonderful article, it's called Decentralizing Self Driving Money.

[34:10] And I thought one day, why wouldn't I have an AI agent whose one job is to find and connect the value exchange with the right collaborative partner? They can negotiate the contract.

[34:25] And now the next best action is that particular person or vendor, and they instantly connect to me. And the app simply adapts in real time. And we figured out how to do that using web standards, so that it totally engages me on a daily basis.

[34:43] And I think that, to me, is where you get to the bookends of where we're going now: I have to deliver on privacy for you. I have to give you a better method to give you some control back.

[34:56] And I want to reward you as an individual, but I can't not reward my shareholders and pay my employees.

[35:06] And so there has to be an equitable distribution. And the beauty of this mechanism now is there's an equitable distribution that has a foundation in privacy, a foundation of control and flowing that data, of simple integration.

[35:24] And the final foundation is making money.

[35:28] And once you turn on what I call the financial spigot, it doesn't shut off.

[35:33] Debbie Reynolds: Right. Now, one thing I wanted to get your thoughts on, to fill in the gaps, and you and I have talked about this, is what is happening now with data that will make this a very attractive option.

[35:51] And part of what I think is, first of all, people are mistrustful of companies and how they handle their data. So a lot of them either give them partial data or data that's incorrect.

[36:07] There's, you know, AI; there's data that's flooding the market, and it's not super high quality. So I think the fight now is for that high-quality, more accurate, consented information.

[36:23] And so I think that's where the fight is. But I want your thoughts there.

[36:27] Peter Cranstone: That's a wonderful question. And I can answer it with one word, authenticity.

[36:34] So let's dive into that just a little bit.

[36:37] So think of it as the metadata.

[36:42] So unfortunately, with this current crop of gen AI, the bigs decided, we're just going to go ahead and do it. We're going to suck everybody's data off the Internet, read all of these documents and everything else, and we'll deal with the aftermath later on down the road.

[37:02] But where they've got to right now is that they haven't really been able to turn it into something, because, as you say, what is the value of the data?

[37:11] So what is now starting to come out is what is the authenticity of the data? So I look at this as provenance of the data. So let's say you and I go to an art gallery and we're looking at a van Gogh, the two of us, and the price tag is 60 million.

[37:26] And we check our bank accounts and you've got 30 million in yours and I've got 30 million in mine. We decide to buy the painting.

[37:33] The first thing we're going to say to each other is, how do we know it's a van Gogh?

[37:37] And so there is this whole provenance chain that has to be proven. And this actually, I think, in one respect has its roots in security and something called a root of trust.

[37:46] So I want to know unequivocally that it really is an authentic van Gogh.

[37:52] Well, the way to do that, or one of the ways to do that, is with a notary and notarizing my data. And one of my favorite Web3 technologies is a company called Geeq (geeq.io), and it's run by Dr. Stephanie So, who I just think the absolute world of.

[38:11] And she showed me this demo one day, and it was that they were selling coffee beans. A farmer in South America was selling coffee beans to a broker, who was turning around and selling them to a barista in New York.

[38:23] And I went, oh my goodness, do you realize what you're looking at here is a new value chain. But the problem is, how do I prove that those coffee beans are authentic?

[38:37] And so what she did was to create a provenance engine, and it was a notary. And I kept looking at this and saying, well, what if we tokenized our data, put it on the blockchain, and notarized it, so that now we can unequivocally prove that this is me and this is my data?

[39:02] So what we have now, Debbie, is accountability.

[39:05] What I want then to do is exchange the value of that data with the central value exchange. So the way she did it was to create an NFT, a non-fungible token.

[39:18] And I went, oh my goodness, that's brilliant. Because what I could do is transfer the non-fungible token using our extension protocol of the web. So we would bury it inside, and we could put our terms and conditions around that data at the top.
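
As a rough sketch of the notarization idea: fingerprint the data, wrap it in a provenance record with the owner's terms, and log each use for accountability. The structure and field names below are assumptions for illustration, not Geeq's or 3PMobile's actual format.

```python
# Illustrative provenance record: a hash proves the payload hasn't changed,
# the terms travel with it, and every use is logged for accountability.
import hashlib
import json
import time

def notarize(data: dict, owner: str, terms: str) -> dict:
    payload = json.dumps(data, sort_keys=True).encode()
    return {
        "owner": owner,
        "data_hash": hashlib.sha256(payload).hexdigest(),  # fingerprint of the data
        "terms": terms,
        "notarized_at": time.time(),
        "access_log": [],  # each use gets appended here
    }

record = notarize({"allergies": ["penicillin"]}, owner="debbie", terms="compensated use only")
record["access_log"].append({"who": "clinic.example", "when": time.time()})
print(record["data_hash"])
```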

[39:36] So not only have we proven ownership of our data, we've increased the value of the data.

[39:43] So now, as you say, we want that trusted infomediary who looks at the value, recognizes the value and says, you know what, I got a couple of collaborative ecosystem partners, which, let's just think about it, Van Gogh painted it, put the painting in the art gallery, and you and I pulled up to buy it.

[40:03] Well, it's the same value chain. So I can now authenticate that that data is 100% accurate, and I can also record to the blockchain who uses it.

[40:15] So there is much higher levels of accountability and that builds trust.

[40:21] And as George Shultz said, if trust is in the room, you can do amazing things. If it's not present, exactly what you say happens: I don't trust anyone with any of this data.

[40:35] I Don't know where it came from. I don't know how you generated that answer.

[40:41] And so I'm very, very skeptical.

[40:44] And so this way, it removes a level of skepticism. It puts accountability into it. And for you, as a privacy expert, you say, they are respecting my privacy. And this goes back to the Do Not Track signal.

[41:01] It just sent a single byte across the web in a header, and it said DNT = 1. It was the most incredibly simple thing, Debbie.
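
To show how small that mechanism is: DNT is literally sent as a request header with the value 1, and a cooperating server only has to check for it. The sketch below is purely illustrative; the handler and tracking function are hypothetical, not any production implementation.

```python
# Minimal sketch of honoring the Do Not Track signal server-side.
def handle_request(headers: dict, user_id: str) -> None:
    if headers.get("DNT") == "1":
        # Respect the signal: skip profiling and ad-targeting writes entirely.
        return
    record_for_ad_targeting(user_id)  # hypothetical tracking call

def record_for_ad_targeting(user_id: str) -> None:
    print(f"tracking {user_id}")

handle_request({"DNT": "1"}, user_id="visitor-123")  # nothing is tracked
```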

[41:14] And the person who read that signal was supposed to instantly respect it and not track you. Well, obviously the complete reverse happened; they just ignored the signal. But this way, if you can build a monetization structure around data, around convenience, privacy, and control, we go back to our three things, and we can do seamless integration across the entire value chain.

[41:43] And we combine that with the ability to make money through customer engagement.

[41:49] I mean, there used to be a time at Google that if you interviewed with them or you wanted to sell them your product, they would say, tell me why it's like a toothbrush.

[41:58] Why will people use it twice a day?

[42:00] And why would people use this new service based on this new technology twice a day? Because it's convenient, it respects my privacy, and I have control. And for you, it's simple integration with a broad, tightly connected, open ecosystem that generates money from your data.

[42:24] And it aligns perfectly with both ad tech and the e commerce economy. And in fact, McKinsey came out and said the next economy is the ecosystem economy.

[42:35] And it's really the combination of taking ad tech and e-commerce, and they project it at, you know, $100 trillion a year by 2030.

[42:45] So it's a very large economy, but it's tied directly back to your passion, which is privacy combined with innovation that equitably distributes costs and benefits across social domains.

[43:02] Debbie Reynolds: Very good. If it were the world according to you, Peter, and we did everything you said, what would be your wish for privacy anywhere in the world? Whether that be technology, human behavior, or regulation.

[43:16] Peter Cranstone: It respects me as n of one, as an individual. It asks everybody to just step up a little bit, but it respects me as an individual, as a human being on the planet.

[43:29] And I think that's what is missing. There's the old phrase: sooner or later you sit down to a breakfast full of consequences. And I think we're sitting down to a breakfast full of consequences right now with social media and everything else that's going on in the world.

[43:43] And if I had one wish for privacy, it would be to enable us to return to a more moral compass, where we respect human beings and we distribute equitably to lift everybody up.

[44:00] So it was no longer a zero sum game where some people end up with 300 billion in the bank account and then another person is starving.

[44:09] I think we can do better, not only as a nation but as a civilization, and that would be my defining wish for this technology: that the people who get to use it do something that lifts everybody up.

[44:26] Debbie Reynolds: Very good. I support you in that. So thank you so much for being on the show. This is great. I'm glad we were able to have this chat. This is tremendous and I'm sure the audience will like it as much as I do.

[44:38] Peter Cranstone: Thank you very much for having me. Debbie, it's been an absolute pleasure and.

[44:42] Debbie Reynolds: We'll talk soon for sure.

[44:44] Peter Cranstone: Absolutely. Thank you so much. You're welcome.